Environmental Forensics: Principles and Applications - Chapter 6


6 Forensic Review of Environmental Trial Exhibits

An accurate picture is worth a thousand words.

6.1 INTRODUCTION

Environmental exhibits that are clear, accurate, and simple are a prerequisite for explaining the technical elements of an environmental case. Exhibits must also be factually and scientifically correct. Exhibit errors are either unintentional, arising from transcription or preparation mistakes, or intentional, as identified by a pattern of bias (Tufte, 1983, 1990, 1997). Intentional errors include:

• Exaggerated vertical or horizontal scales
• Selective data presentation
• Data contouring (manual and computer-generated)
• Color-coded data that obscure source areas
• Contaminant transport models based on biased data

When trial exhibits are exchanged, a concerted effort is required to validate their accuracy. Obtain the underlying information, such as chemical results, early in the discovery stage and preferably in an electronic format, so that your expert witness and/or confidential consultant can quickly review the underlying data used to produce the trial exhibits. Determining that a trial exhibit is scientifically accurate benefits all parties.

6.2 EXAGGERATED VERTICAL AND HORIZONTAL SCALES

Exhibit scales are often exaggerated, especially for geologic cross-sections and fence diagrams. When portraying a relatively small vertical scale, such as shallow soil contamination (<100 ft), relative to a substantially larger horizontal scale (>1000 ft), exaggeration is a reasonable way to present the data. Conversely, the depiction of subsurface contamination is skewed by excessively increasing the vertical scale relative to the horizontal scale. When vertical or horizontal scale exaggeration occurs, it should be posted on the exhibit and described in the testimony so that the viewer is informed. Plate 6.1* depicts the concentration of trichloroethylene (TCE) in soil with a 1:1 and a 1:10 vertical-to-horizontal scale (Morrison, 1998).

©2000 CRC Press LLC
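The distortion introduced by vertical exaggeration can be checked with simple trigonometry: on a section drawn with a vertical exaggeration (VE), a feature's apparent dip satisfies tan(apparent) = VE × tan(true), a standard structural-geology relation. A minimal sketch; the 2-degree dip is a hypothetical value, not taken from Plate 6.1:

```python
import math

def apparent_dip_deg(true_dip_deg, vertical_exaggeration):
    """Dip angle as drawn on a section with an exaggerated vertical scale."""
    return math.degrees(
        math.atan(vertical_exaggeration * math.tan(math.radians(true_dip_deg)))
    )

# A gently dipping contaminant layer (2 degrees) drawn at 10x vertical
# exaggeration appears to plunge at roughly 19 degrees.
for ve in (1, 5, 10):
    print(f"VE {ve:2d}x -> apparent dip {apparent_dip_deg(2.0, ve):5.1f} degrees")
```

A reviewer can run the relation in reverse to recover true geometry from an exaggerated exhibit before judging how steep or extensive the depicted contamination really is.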
The TCE distribution is represented as an iso-surface for the purpose of depicting the volume of contaminated soil. While the respective horizontal-to-vertical ratios are accurate in both versions, the perception regarding the extent of contamination is different. Exhibits relying on this technique are routinely employed in cases that address the reasonableness of remediation costs. When an exhibit prejudices the observer's perspective, prepare a rebuttal exhibit with a 1:1 vertical-to-horizontal scale using the same data, or decrease the three-dimensional area so that the exaggeration bias is reduced.

6.3 SELECTIVE DATA PRESENTATION

It is the author's experience that the selective omission of data is common in environmental exhibits. Observed practices include:

1. Data omission
2. Use of averages or mean data (i.e., obtaining the average of quarterly data, moving averages, geometric means, time-series presentations; using averaged values, averaged values with standard deviation, means plus confidence intervals, measured values plus the percentage of relative standard deviation or coefficient of variation, etc.), which results in an underestimation of contaminant concentrations and plume geometry
3. Selection and presentation of the higher or lower value from split samples
4. Creation of multi-chemical composite contour maps (i.e., combining all solvent measurements and reporting them as total volatile organic compounds [VOCs] rather than for each compound) to mask source identification
5. Arbitrary elimination of anomalous data
6. Data presentation generated from imprecise or non-specific analytical methods
7. Data filtering to reduce or eliminate reported measurements
8. Aerial photo cropping
9. Arbitrary revisions to the original data

There is usually client reluctance to spend the money required for exhibit validation, especially when large data sets are used.
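Where the budget for full validation is limited, a double-entry comparison is an inexpensive first pass: two independent transcriptions of the same laboratory sheets are compared field by field, and only the disagreements are re-checked against the originals. A minimal sketch with hypothetical well readings:

```python
# Two independent transcriptions of the same (hypothetical) laboratory
# results, keyed by monitoring well; ppb values are illustrative only.
entry_one = {"MW-1": 24.0, "MW-2": 210.0, "MW-3": 5.5}
entry_two = {"MW-1": 24.0, "MW-2": 120.0, "MW-3": 5.5}  # transposed digits

# Any well where the two transcriptions disagree is re-checked against
# the original laboratory data sheets.
discrepancies = [
    well for well in entry_one
    if entry_one[well] != entry_two.get(well)
]
print("re-check against lab sheets:", discrepancies)
```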
For large unvalidated data sets (>1000 data entries), a 5 to 15% transcription error rate between the laboratory data and the computer spreadsheets is common. If the data entered onto the spreadsheet are double entered or cross-checked, this error is significantly reduced.

* Plate 6.1 appears at the end of the chapter.

Validating a large data set (e.g., ≥100,000 entries) when the exhibit is based on 500 data points is unproductive. Identification of transcription errors is cost-effectively performed by validating only those locations and compounds used in key exhibits. This strategy requires that the underlying chemical data sheets be quickly accessible once the exhibits are exchanged. Once the data used to create an exhibit are validated, they are used with the identical modeling and/or visualization software to determine if the trial exhibit is reproducible. If the animation software is proprietary, additional cost and time can result from purchasing or licensing the software from the company. In addition, the software may require unique hardware as well as a person fluent with the software. These hardware, software, and personnel issues should be resolved in advance of receiving the exhibits.

Confirmation of the validity of data used to generate an exhibit may not be straightforward. Consider 100 split soil samples collected and tested for trichloroethylene. Is it more appropriate to use the lowest, highest, or average of the two values in the exhibit, or to plot all three? If a trial exhibit relies on averaged values in some instances and alternates between high and low values for others, determine if a pattern of intentional data bias exists. A consistent method should be used and the rationale for the selection clearly stated on the exhibit and/or in testimony. Exhibits that rely upon the geometric mean of a data set are often encountered, as water quality results are generally distributed geometrically in time and space.
The geometric mean is obtained by taking the log of multiple values, averaging the logs, and then taking the anti-log of the averaged log values. This technique tends to dampen the impact of data outliers or individual anomalous values that may be important in identifying contaminant sources. Similarly, other statistical averaging techniques that assume a normal distribution should be confirmed. Minimization of biases due to concentration averaging, geometric means, and mean values is accomplished by using the actual values for a point in time. This latter approach improves the validity of the data interpretation, the transport modeling, and ultimately the effectiveness of the remediation design (Martin-Hayden and Robbins, 1997).

The interpretation of non-detect (ND) results can skew the results of the data set used to create an exhibit. A sample reported as ND can be interpreted as zero, as the value of the method detection limit, or as one half the detection limit, or it can be omitted from the data set. If the geometric mean of a data set is used, the central tendency of the geometric mean will be significantly different if non-detects are excluded vs. if values equal to one half the method detection limit are used.

For time-series data using single or averaged data (e.g., 10 years of quarterly groundwater sampling data), graphing data from a single quarter or averaging values for several quarters can skew the viewer's perception if the chosen quarter(s) are anomalous relative to historical trends. Combining 6 or 12 months of non-sequential groundwater data (e.g., annual, quarterly, and biannual sampling) for an aquifer with a high velocity (e.g., >1000 ft per year) onto one exhibit when monitoring wells are spaced less than 1000 ft apart results in an unrepresentative portrayal of contaminant distribution.
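The dampening effect of the geometric mean, and its sensitivity to how non-detects are substituted, can be illustrated with a short sketch; the concentrations and the 5-ppb detection limit are hypothetical:

```python
import math

# Hypothetical TCE results (ppb); "ND" = non-detect, with an assumed
# method detection limit (MDL) of 5 ppb.
results = [12.0, 8.0, "ND", 950.0, 10.0, "ND"]
MDL = 5.0

def substitute_nd(values, nd_value):
    """Replace non-detects with a chosen surrogate (zero is invalid for logs)."""
    return [nd_value if v == "ND" else v for v in values]

def geometric_mean(values):
    """Average the logs, then take the anti-log of the averaged logs."""
    logs = [math.log(v) for v in values]
    return math.exp(sum(logs) / len(logs))

detects_only = [v for v in results if v != "ND"]
half_mdl = substitute_nd(results, MDL / 2)

print(f"arithmetic mean (detects only): {sum(detects_only) / len(detects_only):.1f}")
print(f"geometric mean (detects only):  {geometric_mean(detects_only):.1f}")
print(f"geometric mean (ND = MDL/2):    {geometric_mean(half_mdl):.1f}")
```

The single 950-ppb outlier, which may be the source signal, dominates the arithmetic mean but is heavily dampened by the geometric mean, and the ND substitution rule shifts the geometric mean further still.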
Creating a rebuttal exhibit depicting seasonal or more consistent historical trends, including the anomalous sampling quarter data, places the trial exhibit in a more balanced historical perspective.

If sample integrity is suspect due to sampling bias, especially for volatile compounds collected by multiple consultants, plotting the chemical results vs. time and labeling the tenure of the various consultants may identify whether this potential exists. Figure 6.1 illustrates trichloroethylene (TCE) concentrations in groundwater samples collected from multiple wells by three consultants between 1991 and 1994. In Figure 6.1, the TCE values for samples collected by Consultants A and B between 1991 and 1993 are smaller than the TCE concentrations from samples collected by Consultant C. The higher TCE concentrations collected by Consultant C may indicate the use of different sampling equipment or procedures.

Valid reasons exist for eliminating anomalous (e.g., outlier) values from a data set used to create an exhibit; however, the presence of anomalous data may be the only indication that the data are skewed and hence may be one of the most important data points in the population. If data are omitted, the rationale should be prominently posted on the exhibit.

An example of omitted data is the presentation of changes in groundwater flow direction via rose diagrams. Figure 6.2 depicts the frequency of the groundwater flow direction from quarterly monitoring reports. The purpose of Figure 6.2 is to demonstrate that a contaminant plume in groundwater is captured by a groundwater extraction system located downgradient of the source. The groundwater extraction system was designed to capture the contaminant plume when the groundwater flow direction is west to southwest (Figure 6.2A). Groundwater flow to the north results in the transport of contaminated groundwater beyond the capture zone of the extraction system.
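The effect of omitting adverse quarters from a rose diagram can be mimicked with a simple frequency count; the directions below are hypothetical, chosen to mirror the nine-of-13 pattern the rose diagrams illustrate:

```python
from collections import Counter

# Hypothetical quarterly groundwater flow directions from 13 monitoring reports.
all_quarters = ["SW", "W", "SW", "NE", "SW", "W", "NE", "SW",
                "NE", "SW", "NE", "W", "SW"]

# An exhibit built only from the quarters flowing toward the capture
# zone (W/SW) silently drops the adverse NE quarters.
selected = [q for q in all_quarters if q in ("W", "SW")]

print("exhibit (selected quarters):", Counter(selected))
print("rebuttal (all 13 quarters): ", Counter(all_quarters))
print(f"{len(all_quarters) - len(selected)} quarters omitted")
```

Tabulating the full record alongside the exhibit's subset makes the omission, and its direction, immediately visible.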
Figure 6.2A shows nine quarters of groundwater flow that is predominantly to the west to southwest. Figure 6.2B is a rebuttal exhibit depicting all 13 quarters, with the direction of flow alternating between the southwest and northeast. Figure 6.2A does not contain a reference to the omitted data.

FIGURE 6.1 TCE concentrations from five groundwater monitoring wells collected by three consultants between 1991 and 1994.

While omission of anomalous data adverse to one's position is usually obvious, subtle permutations are also encountered. An example is a chemical or geologic cross-section. A cross-section is a slice through the subsurface, with the information intersected by the slice displayed two or three dimensionally. A common cross-section manipulation is the inclusion or omission of data points not intersected by the cross-section. Figure 6.3(3a) depicts a plan view of a cross-section (A-A′) that intersects total petroleum hydrocarbon (TPH)-impacted soil. Figure 6.3(3b), however, is the actual transect line reflecting the sampling points from which soil chemistry was used in the cross-section. In the case of the transect A-A′ in Figure 6.3(3b), data along the transect that were not used included locations S-EX7 and S4-3-PL. Sampling locations within 5 ft of the A-A′ transect from locations S1-EX3 and S9-EX5 (see 3a) were also omitted. Sample locations located 30 ft to the east (S5-5-PL, S9-7-PL), however, were projected onto the A-A′ transect in 3a and incorporated into the accompanying cross-section.

Another example of data omission is the exclusion of non-CLP (Contract Laboratory Program) data. Contract Laboratory Program data are the documentation required for sample testing associated with Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA, or Superfund) and Resource Conservation and Recovery Act (RCRA) investigations.
The primary components of this program include field and/or trip blanks, field duplicate sample results, and internal laboratory quality control results (e.g., matrix spikes, matrix spike duplicates, and laboratory method blanks). Historical CLP and non-CLP data (e.g., from Phase I or II investigations) may not be available. If non-CLP data are excluded from an exhibit, plot the CLP and non-CLP data and compare the results. If one component of the CLP documentation is unavailable or has been violated (e.g., broken travel or field blank bottles) and is included in the exhibit, create the same graph or figure with and without the suspect data.

FIGURE 6.2 Rose diagrams showing historical groundwater flow directions. (From Morrison, R., in Environmental Claims Journal, 11(1), 93–107, 1998. With permission.)

Another data omission example is the case of split samples from one laboratory using CLP procedures and a second data set with non-CLP documentation. Plot the split CLP and non-CLP sample data collectively and individually to determine if significant differences in interpretation occur. If the non-CLP data are significantly dissimilar, the non-CLP data can be used for a different purpose (e.g., qualitative vs. quantitative) or weighted differently. For example, the CLP data may be used for risk assessment purposes or to provide a quantitative measure of the volume of soil exceeding a clean-up concentration. The combined CLP and non-CLP data can be used to establish the boundaries of the contamination.

Determining the reasonableness of an analytical method relied upon to create an exhibit may be required. In Plate 6.2,* 24 soil samples from a soil excavation are split into three discrete samples, with each sample forwarded to an analytical laboratory and tested for total petroleum hydrocarbons as gasoline. When each data set is contoured, different contaminant source areas, as well as different volumes above a remediation concentration of 100 mg/kg, result.
A plan view of the contours from the Method 3 data depicts three source areas, while the Method 1 and 2 data indicate two source areas. An exhibit relying on the Method 3 data or on an average of the three data sets will result in significantly different interpretations of the distribution of the TPH in the soil. The data set selected for the exhibit influences the interpretation regarding the location of TPH contamination. The solution is to perform an analysis of the representativeness or accuracy of each analytical method to determine which data set is most representative. In Plate 6.2, Method 1 introduced false-positive readings, while the methanol extract used in Method 2 was less effective in contaminant removal than Method 3. For this soil type and contaminant, Method 3 is the most representative data set.

* Plate 6.2 appears at the end of the chapter.

FIGURE 6.3 Plan view of cross-section transects A-A′. (From Morrison, R., in Environmental Claims Journal, 11(1), 93–107, 1998. With permission.)

A variation of the Plate 6.2 example is reliance on a testing technique such as EPA Standard Method 418.1 to detect total petroleum hydrocarbons (TPHs) in soil samples used to guide the excavation of hydrocarbon-impacted soil. EPA Method 418.1 is a non-chromatographic technique and detects the presence of biogenic compounds in the soil (i.e., peat, pine needles, organic matter), resulting in false-positive measurements (George, 1992; Zemo et al., 1995). The author has observed cases in which EPA Method 418.1 is used to define where to excavate, until the excavation is inhibited by the presence of a building or road. The consultant then changes to an analytical method that does not introduce a false bias (i.e., EPA Method 8015). Testing using EPA Method 8015 results in non-detect sample measurements, which become the basis for halting the excavation.
Whether the original excavation using EPA Method 418.1 was warranted becomes not only a source of contention but also affects the reliability of an exhibit combining test results from EPA Methods 418.1 and 8015. Figure 6.4 is a cross-section of a soil excavation where soil samples were tested for TPHs using EPA Standard Methods 418.1 and 8015. Soil samples collected within the interior of the excavation were tested via EPA Method 418.1, while EPA Method 8015 was selected for confirmation soil sampling along the excavation perimeter. The potential implications of this observation are that over-excavation probably occurred and that the consultant may have intentionally relied upon the false-positive results inherent with EPA Method 418.1 to excavate non-petroleum-contaminated soil as a means to generate income. EPA Method 8015 was then used to halt the excavation, in this case when its proximity to subsurface piping presented significant complications to continued excavation. Once the excavated soil is remediated or co-mingled with other petroleum-impacted soil, it becomes problematic whether subsequent test results of these excavated soils can determine if the original EPA Method 418.1 results were valid.

FIGURE 6.4 Excavation cross-section using EPA Methods 418.1 and 8015.

Figure 6.5 depicts a plan view of excavated gasoline-contaminated soil. The organic-rich subsurface soils provided consistent false-positive measurements when using EPA Standard Method 418.1. Once the excavation proceeded close to a cooling tower and manufacturing building, the consultant switched to soil analysis using EPA Standard Method 8015, which resulted in non-detect sample results. Excavation near the surface structures then ceased. The distribution of analytical methods used for soil analysis relative to the above-ground structures in Figure 6.5 suggests an intent to create non-detect boundaries in areas in which extensive shoring was required.
It may be warranted to retain an analytical chemist to reconstruct the validity of the test method(s) used to direct a soil excavation. The chemist can identify data believed to be unreliable, which should be omitted from an exhibit. Conversely, if no quality assurance analysis is performed, both parties may erroneously assume that the detection of a particular compound is correctly identified. It is the author's experience that, in the case of gas chromatography/mass spectrometry (GC/MS), it is not unusual to find that 5 to 10% of the compounds are misidentified, especially if the interpretations are not manually examined.

FIGURE 6.5 Plan view of soil excavation and selective use of EPA Standard Methods 418.1 and 8015.

Data filtering is the revision or omission of data based on identification of the removed data as anomalous and/or non-representative. An example is the detection of 20 parts per billion (ppb) of TCE in a rinsate sample collected from a groundwater bailer. The bailer is subsequently used to collect a groundwater sample that results in a reading of 24 ppb. The datum (24 ppb) is omitted from the data set based on a concentration of 4 ppb (lower than the maximum contaminant level of 5 ppb) obtained by subtracting the equipment blank value from the measured groundwater sample. Another example of data filtering is assigning a new detection limit at five times the contamination level detected in the rinsate sample. The new detection limit is therefore 5 × 20 = 100 ppb. The detection of 24 ppb in the groundwater sample is now regarded as non-detect, as are trichloroethylene concentrations up to 100 ppb. This method results in significant data omissions.

Data filtering may be represented as justified through re-sampling. For example, monitoring wells may be re-sampled immediately after contamination is detected, or re-sampled several times until contamination is not detected.
The non-detect sample is then reported in the quarterly groundwater monitoring report and relied upon for the trial exhibit. Another technique is repeated groundwater sampling at the same location using a cone penetrometer test (CPT) rig or a less quantitative technology (soil gas), with the re-sampling occurring days, months, or years after the original results to confirm the use of the non-detect measurements shown on a trial exhibit.

For data sets where measurements are omitted, the major difficulty often lies in identifying the omissions. For large data sets (>1000 entries), omissions may not be apparent without a thorough review. Another difficulty is the testing of split samples by multiple laboratories, with only those sample results supportive of a particular position being reported. One technique for identifying data omissions is to aggressively pursue any electronic databases kept by the consulting firm or facility operator. Another option is to subpoena the original laboratory sheets and create a separate database.

Aerial photo cropping is a technique that can remove undesirable information. Figure 6.6 shows two versions of a 1925 aerial photo of a tank farm, uncropped and cropped; the cropped version deletes a tank under construction in the upper left corner. Be aware that when a person selects an aerial photo from a repository or dealer, the portion selected for the hard copy is usually a subset of the original, usually due to the scale of the parent aerial photograph. When ordering aerial photography, a number of scales and coverage dates are available. It is the author's experience that all of the coverage dates are rarely ordered. This can result in the omission of aerial photo information if the opposing side obtains copies during discovery and relies on these rather than independently obtaining their own aerial photographs.
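Rebuilding a separate database from subpoenaed laboratory sheets, as suggested above, reduces the search for omissions to a set difference between the laboratory record and the data underlying the exhibit. A sketch with hypothetical records:

```python
# Records are (well, date, analyte, ppb); all values are hypothetical.
lab_sheets = {
    ("MW-1", "1993-03-01", "TCE", 24.0),
    ("MW-1", "1993-06-01", "TCE", 31.0),
    ("MW-2", "1993-03-01", "TCE", 210.0),
    ("MW-2", "1993-06-01", "TCE", 190.0),
}
exhibit_data = {
    ("MW-1", "1993-03-01", "TCE", 24.0),
    ("MW-1", "1993-06-01", "TCE", 31.0),
}

# Anything in the laboratory record but absent from the exhibit's data
# set is a candidate omission to investigate.
omitted = sorted(lab_sheets - exhibit_data)
for record in omitted:
    print("omitted:", record)
```

Here the high-concentration MW-2 results are exactly the records missing from the exhibit, the kind of pattern that suggests selective presentation rather than chance.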
When forensically evaluating a trial exhibit, examine all of the underlying foundational information, especially field and laboratory notes. Figure 6.7 depicts a field and a final soil-boring log contained in an environmental report. The field log depicts a 3-ft zone of contamination, while the final log contained in the environmental report shows a contaminant zone that is 7 ft thick. The final boring log was used with other boring logs to estimate the volume of contamination and associated remediation costs. While the difference between the field and final boring logs is small (≈4 ft), this difference extrapolated over a large area results in substantial differences in contaminant distribution and associated remediation costs.

6.4 DATA CONTOURING

Data contouring (manual or computer-generated) is the interpolation of numbers of equal value in space (i.e., connecting the dots). Contouring provides useful visual […]

FIGURE 6.6 Uncropped (top) and cropped (bottom) aerial photograph.

[…] three-dimensional exhibits can be created that, while scientifically correct and based on a complete and validated data set, obscure key information. An example is selection of the contour interval. The contour maps in Plate 6.5* were prepared using identical data. In the color-shaded contour map in Plate 6.5, the contour interval selection is not constant […]

A means to emphasize or minimize environmental data in a computer-generated contour map is to adjust the contour intervals. The selection of a large contour […] while a smaller contour interval tends to emphasize potential source areas (Erickson and Morrison, 1995). Figure 6.10 depicts two-dimensional contour maps for trichloroethylene concentrations in groundwater where […] The 100-ppb contour map depicts multiple potential sources obscured on the 500-ppb map due to the contour interval selection. The 100-ppb contour map also contains posted data, thereby allowing confirmation of the contouring.

Common contouring methods used in constructing two- and three-dimensional contour maps and animations include inverse distance, kriging, minimum curvature, Sheppard's method, and polynomial […] distance and kriging. The differences between these two methods may visually appear minor; however, for large areas with limited data, contaminated volume calculations, and hence the cost and time required for in situ remediation, can be dramatically impacted. While Figure 6.11 presents a two-dimensional example, identical issues exist with three-dimensional representations.

[…] Figure 6.8 reflect a sufficient data density so that the biases observed in frames A through D are not generated. Examine the contours relative to individual data points. If closed contours are offset from discrete data points, the contour and site map may be improperly scaled. The framed contours in Figure 6.9 illustrate these types of features. The closed […] contours in the upper and lower right-hand quadrant are similar to the framed regions in A through D on Figure 6.8.

[…] these surfaces default to a confidence and/or probability level used to create these surfaces. Plate 6.8* illustrates contaminant iso-surfaces with two different contouring confidence levels for an identical data set. The iso-surface in both panels is set to include all of the soil concentration data between non-detect and 150 ppb. The upper panel illustrates an iso-surface with a 95% confidence level […]

6.4.3 COLOR-CODED DATA

Two- and three-dimensional […]

FIGURE 6.8 Examples of computer-generated contour biases.

FIGURE 6.9 Example of contouring errors and scales between data point coordinates and a base map.

FIGURE 6.10 Adjustment of data contour interval for source identification.

* Plates 6.3 and 6.4 appear at the end of the chapter.
* Plate 6.5 appears at the end of the chapter.

[…] analytical and extraction methods and corresponding contour maps.

Plate 6.3 Contoured concentration of TCE in soil gas.

Plate 6.4 Color and size ramping of circles to indicate benzene concentrations in groundwater.

Plate 6.5 Two-dimensional contouring with color-coded data and different contour intervals.

Plate 6.6 Computer […]

©2001 CRC Press LLC

REFERENCES

[…] Mathematical Geology, 16(7):685–718.

[…] Hydrocarbon and Organic Chemicals in Ground Water: Prevention, Detection, and Restoration Conference, National Ground Water Association, Houston, TX, pp. 35–52.

Hughes, J. and D. Lettenmaier, 1981. Data requirements for kriging: estimation and network design, Water Resources Research, 17(6):1641–1650.

Joseph, G., 1996. Modern Visual Evidence, Law Journal Seminars Press, New York, p. 474.

Journel, A. and E. Isaaks, […]

Martin-Hayden, J. and G. Robbins, 1997. Plume distortion and apparent attenuation due to concentration averaging in monitoring wells, Ground Water, 35(2):339–347.

Morrison, R., 1998. Forensic review of environmental trial exhibits, Environmental Claims Journal, 11(1):93–107.

Morrison, R. and Erickson, R., 1995. Environmental Reports and Remediation Plans: Forensic and Legal Review, […]
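Of the gridding methods named in Section 6.4, inverse distance is the simplest to reproduce when checking whether a contoured exhibit can be regenerated from the validated data. A minimal sketch; the sample coordinates, concentrations, and power parameter are hypothetical:

```python
import math

# Hypothetical measured points: ((x_ft, y_ft), concentration_ppb).
samples = [((0.0, 0.0), 500.0), ((100.0, 0.0), 50.0), ((0.0, 100.0), 10.0)]

def idw(x, y, points, power=2.0):
    """Estimate a value at (x, y) as a distance-weighted average of samples."""
    num = den = 0.0
    for (px, py), value in points:
        d = math.hypot(x - px, y - py)
        if d == 0.0:
            return value  # grid node coincides with a measured point
        weight = d ** -power
        num += weight * value
        den += weight
    return num / den

# At a node equidistant from all three samples, IDW reduces to their mean.
print(round(idw(50.0, 50.0, samples), 1))  # → 186.7
```

Changing the power parameter, or switching to kriging, shifts the interpolated surface between the same data points, which is exactly why the method and its settings should be disclosed on the exhibit.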

Ngày đăng: 11/08/2014, 09:21

Từ khóa liên quan

Contents

  • Environmental Forensics: Principles and Applications

    • Table of Contents

    • Chapter 6: Forensic Review of Environmental Trial Exhibits

      • 6.1 Introduction

      • 6.2 Exaggerated Vertical and Horizontal Scales

      • 6.3 Selective Data Presentation

      • 6.4 Data Contouring

        • 6.4.1 Manual Contouring

        • 6.4.2 Computer Contouring

          • 6.4.2.1 Inverse Distance Method

          • 6.4.2.2 Kriging

          • 6.4.2.3 Minimum Curvature Method

          • 6.4.2.4 Sheppard’s Method

          • 6.4.2.5 Polynomial Regression

        • 6.4.3 Color-Coded Data

      • References
