Trace Evidence Proficiency Testing Guidelines by SWGMAT (Forensic Science Communications, July 2001)
July 2001 - Volume 3 - Number 3
Standards and Guidelines
Trace Evidence Proficiency Testing Guidelines
Scientific Working Group for Materials Analysis (SWGMAT)
Contents
1. Introduction
2. Proficiency Testing
3. Reference Documents
4. Terminology
5. Types of Proficiency Tests
6. Proficiency Test Design
7. Preparation of Tests
8. Predistribution Testing
9. Distribution
10. Submission of Participant Results
11. Assessment of Participant’s Results
12. Summary Reporting of Results for All Participants
1. Introduction

1.1. This document is based in part on the European Network of Forensic Science Institutes’ (ENFSI) Guidance on the Conduct of Proficiency Tests and Collaborative Exercises Within ENFSI. SWGMAT gratefully acknowledges the collaboration of the ENFSI Working Groups on this document.
1.2. The goals of SWGMAT are to assist in the advancement of forensic science and to promote a commitment to excellence among the members of the forensic community. Proficiency testing is one of the key measures of performance.
1.3. Any trace evidence proficiency test provider (see 4.2) should conform to the Guidelines for the Requirements for the Competence of Providers of Proficiency Testing Schemes (ILAC-G13:2000).
1.4. Prior to participating in trace evidence proficiency testing, the appropriate laboratory personnel should read ASTM E 1301 Part B, Selection and Use of Proficiency Testing Programs by Laboratory Accreditation Bodies, and its annexes: A1, Statistical Methods for Evaluating Proficiency Testing Data; A2, Quality Assurance of Proficiency Testing Programs; and A3, Selection and Use of Proficiency Testing Programs by Laboratory Accreditation Bodies.
2. Proficiency Testing

2.1. Proficiency testing is the use of intra- or inter-laboratory testing for the following purposes (as outlined in ISO/IEC Guide 43-1:1997(E)):
2.1.1. To determine the performance of individuals using specific methods or measurements.
2.1.2. To monitor the continuing performance of laboratories against known expected results.
2.2. It is noted in ISO/IEC Guide 43-1:1997 (E) that “a major distinction exists between the evaluation of the competence of a laboratory by the assessment of its total operation against predetermined requirements and the examination of the results of a laboratory’s participation in proficiency testing which may only be considered as giving information about the technical competence of the testing laboratory at a single point of time under the specific conditions of the test (or tests) involved in a particular proficiency testing scheme.” This is particularly important to keep in mind regarding trace evidence proficiency testing results.
3. Reference Documents

American Society of Crime Laboratory Directors Laboratory Accreditation Board Proficiency Review Program, December 1997.
ASTM E 1301-95, Standard Guide for Proficiency Testing by Interlaboratory Comparisons, 2000.
Guidance on the Conduct of Proficiency Tests and Collaborative Exercises Within ENFSI, European Network of Forensic Science Institutes Yearbook 1998–1999, 1998, pp. 63–69.
Guidelines for the Requirements for the Competence of Providers of Proficiency Testing Schemes, ILAC-G13, 2000.
ISO Guide 43-1, Proficiency testing by interlaboratory comparisons—Part 1: Development and operation of proficiency testing schemes, 2nd ed., 1997.
National Association of Testing Authorities, NATA Accreditation Requirements for Forensic Science, Rhodes, New South Wales, Australia, 2000.
SWGMAT Quality Assurance Guidelines, Forensic Science Communications [Online]. (January 2000). Available: www.fbi.gov/programs/lab/fsc/backissu/jan2000/swgmat.htm
4. Terminology

4.1. Technical expert: An individual who can conduct the analytical tests that may reasonably be expected to be applied to the samples contained in the test and correctly interpret their results. A technical expert is a scientist who has successfully passed a proficiency test in that particular discipline (e.g., hairs, fibers, paint) within the past two years.
4.2. Test provider: An entity that holds the primary responsibility for designing, preparing, validating, distributing, and reporting the results of proficiency tests and ensuring the accuracy thereof.
4.3. Predistribution testing laboratory: A laboratory that examines and analyzes a proficiency test prior to its general distribution to validate its design, quality, and manufacture.
4.4. Predistribution testing: Taking a proficiency test prior to distribution to all participants (e.g., trial run).
4.5. Participant: A laboratory or an individual participating in a proficiency test.
4.6. Analyst: An individual who is qualified to participate in a proficiency test.
5. Types of Proficiency Tests

5.1. Proficiency tests can serve as a check on various types of performance:
5.1.1. Qualitative characterizations, identifications, and/or comparisons; and
5.1.2. Quantitative measurements.
5.2. Proficiency tests can be carried out using material supplied to all participants for concurrent examination or using material for sequential examination by the participants on a round-robin basis. Sequential examinations can result in lengthy exercises, problems with the stability or integrity of the material involved, and delays in the overall assessment and reporting. Concurrent examinations can suffer from problems of heterogeneity among specimens distributed to various participants. Such heterogeneity must be assessed during the predistribution testing, and its effect on the interpretive portion of the test must be evaluated prior to distribution of test samples.
5.3. Proficiency tests can be conducted as open or blind tests.
5.3.1. An open proficiency test is a test that is identified to the participants as such. The participant knows it is a proficiency test.
5.3.2. A blind proficiency test is a test that is submitted to the participants as a real case. The participant does not know it is a proficiency test.
It is unlikely that blind tests (those submitted under the guise of being an actual case) will be practical, although individual laboratories may benefit from them.
6. Proficiency Test Design

6.1. Proficiency tests must be realistically designed to reflect analytical concerns in forensic casework as closely as possible. Refer to ISO/IEC Guide 43-1:1997(E) Section 4 for a list of proficiency testing schemes and to ASTM E 1301 Section 6 for an overview of the organization and design of proficiency tests. Proficiency tests should not be unnecessarily time-consuming or expensive.

6.2. It is essential that tests are designed so that:
6.2.1. The tests are straightforward, and their intent is not overly ambitious.
6.2.2. They avoid any confusion about their objective(s).
6.2.3. The tests are properly designed to achieve the stated purpose with minimum effort.
6.2.4. The information derived from the tests is maximized and used to the best benefit of the forensic science community.
6.3. During the design stage of each test, the test provider must be, or must collaborate closely with, one or more technical experts. When the test or exercise requires highly specialized or non-forensic knowledge, or when statistical treatment of the results is appropriate, it is essential to involve relevant experts or statisticians at the design stage.
6.4. In designing a proficiency test, the test provider is responsible for the following:
6.4.1. Establishing the specific analytical and/or procedural question(s) posed by a test.
6.4.2. Establishing the time frame, scheduling, and due date for the test.
6.4.3. Identifying laboratories for predistribution testing of samples.
6.4.3.1. Predistribution laboratories must be chosen based on the expertise of available personnel, analytical resources, and time commitments appropriate to the type of test being constructed.
6.4.3.2. Predistribution laboratories must have at least one technical expert in the field of the test being reviewed.
6.4.3.3. Predistribution laboratories must have the required instrumentation to complete the test adequately.
6.4.4. Establishing that the reported results, upon which performance will be assessed, are unambiguously attainable through the predistribution testing protocol.
6.4.5. Obtaining and providing the test materials.
6.4.6. Producing clear, unambiguous instructions for participants regarding what they are required to do and the format for reporting their results.
6.4.7. Making arrangements for packaging and shipping.
6.4.8. Ensuring that the design and distribution of the test comply with any legal, health, and safety requirements.
6.4.9. Establishing the protocol for communicating the results.
6.4.10. Establishing the protocol for assessing results.
7. Preparation of Tests

7.1. The test providers must ensure that all test materials provided are uniform and, where appropriate, homogeneous, and that they will not deteriorate. The degree of homogeneity should be such that differences between test items will not significantly affect the evaluation of a participant’s results. The tests should be reviewed by a technical expert to guarantee their integrity. The same standards, practices, and techniques that should be used in forensic laboratories to ensure sample integrity, stability, and security shall be used throughout the processes of test material preparation and distribution.
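For measurable properties, the homogeneity requirement above can be screened quantitatively. The sketch below is illustrative only: the function name is hypothetical, and the 0.3 factor is a common rule of thumb from interlaboratory-comparison practice (ISO 13528-style checks), not something these guidelines prescribe.

```python
from statistics import pstdev

def homogeneity_ok(item_measurements, sigma_eval, factor=0.3):
    """Screen a measurable property across prepared test items.

    Returns True when the between-item spread is small relative to
    sigma_eval, the standard deviation that will be used to evaluate
    participants' results. The 0.3 factor is a common rule of thumb
    (an assumption here), not a SWGMAT requirement.
    """
    return pstdev(item_measurements) <= factor * sigma_eval

# Example: refractive-index-like readings on five prepared items.
readings = [1.5191, 1.5193, 1.5192, 1.5191, 1.5192]
print(homogeneity_ok(readings, sigma_eval=0.0005))  # small spread passes
```

A screen like this belongs in predistribution testing (Section 8), so that any heterogeneity is found before the test items are shipped.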
7.2. The test materials must be properly and adequately characterized by one or more predistribution laboratories before being issued for testing. Any available information regarding the primary or reference standards, or their analytical data, used in the predistribution process must be available to all test participants on request after the test deadline has passed.
7.3. Replacement test materials, including test packaging, are to be made available to participants in the event their original test materials are lost or compromised in any way and to assist in resolving any issues that may arise concerning the integrity of the test materials. The materials must be retained by the test provider for at least three years after the due date of the test, unless the material(s) will degrade and are not useful for reanalysis.
7.4. The test provider must fully document all details of the test materials, their preparation, and characterization. Samples and all related data and documentation from each step of the sample preparation must be maintained for at least three years after the due date of the test, unless the material(s) will degrade and are not useful for reanalysis.
7.5. The test provider must identify any unusual and/or possible health and safety considerations associated with the test materials and/or their examination. This information must be provided to the participating laboratories with the test materials. Alternative test samples must be utilized when the potential health and/or safety of the test participants could be compromised.
7.6. The test provider must retain an archival sample of each original material used to prepare a test and at least 10 prepared tests, or 10 percent, whichever is less, for at least three years after the due date of the test, unless the material(s) will degrade and are not useful for reanalysis.
8. Predistribution Testing

8.1. Before a test is sent to participating laboratories, predistribution testing must be performed by at least two laboratories that are independent of the test provider and of each other. Predistribution testing assesses and confirms the quality of the test design, the relevance of the proposed scenario and questions posed, the suitability of the test materials, and the type of packing methods used for the test materials. Forensic laboratories should be used for predistribution testing.
8.1.1. The predistribution testing laboratories should not use the pretest as the annual proficiency test. Another comparable test should be provided.
8.2. The test provider must have a protocol in place to deal with the resolution of any test design problems revealed through quality control or predistribution tests.
8.3. Tests shall not be released until the quality of the predistribution test and its results have been verified by the technical expert and all problems have been rectified.
8.4. The test provider must have a procedure in place whereby the technical expert acknowledges the quality of the test by a signed statement. This statement must be filed with the test documentation.
8.5. The predistribution tests must be made of exactly the same materials as the actual proficiency tests.
8.6. The instructions for the test must conform to ISO/IEC Guide 43-1:1997(E) Section 6.2.
9. Distribution

9.1. Test materials must be packaged to ensure their integrity, stability, and security while in transit. Any specific requirements for their handling and storage must be explicitly stated, particularly if the health or safety of participants could be affected.
9.2. Details of the packaging and distribution must be fully documented by the test provider and retained at their facility for at least three years. This information must be made available on request.
9.3. Any special storage or handling requirements must be stated on the outside of the packaging and on the enclosed paperwork.
9.4. Information about requirements addressed in this section must be made available on request.
10. Submission of Participant Results

10.1. The manner in which participants’ results are to be returned must be identified before the test is distributed. The use of forms greatly facilitates the tabulation of the test results but can constrain valuable comment. Space must be allotted in each test form for participant comments.
10.2. If the test design allows, provisions should be made for allowing a participant to word a conclusion in a manner that is consistent with the individual laboratory’s standard method for reporting results.
10.3. Unit measurements, when appropriate, must be included by the participants.
10.4. A statement of uncertainty and/or error must be stated when appropriate by the participants.
10.5. The test provider must specify the due date on the proficiency test form and in the test’s cover letter or instructions for submitting the participant’s test results.
10.6. The test provider will retain all returned tests and related documentation for at least three years after the release of the report containing a compilation of test results.
11. Assessment of Participant’s Results

11.1. The reported results and other information elicited by the test, and upon which performance will be assessed, must have been part of the test design and predistribution testing. During the design stage, the test provider must establish a method of assessing the participant’s test results.
11.1.1. The data generated during the test must be considered in the participant’s performance assessment. How the test answers were determined is as important as obtaining the correct answers.
11.1.2. This does not preclude summary questions that provide a yes/no or a match/no match form of question.
11.2. Technical experts in the tested discipline, as individuals or panels, will review the participant’s test results. Statisticians may be used in the assessment of performance.
11.3. Any significant discrepancy between the expected answer(s) and the participant’s answer(s), together with the source and cause of the discrepancy, must be addressed by the test provider and the involved laboratory or laboratories. Discrepancies that affect the expected results of the test are considered significant. A discrepancy will typically originate from the analyst, the participating laboratory, the test provider, or the test sample.
11.3.1. If the discrepancy originates with the analyst (e.g., analytical or interpretive discrepancies), the participating laboratory should have a written procedure in place to address and mitigate the situation.
11.3.2. If the discrepancy originates with the participating laboratory (e.g., systematic discrepancies), then the participating laboratory’s written procedure for mitigation must be appropriate for the type and source of the discrepancy.
11.3.3. If the discrepancy originates with the test provider or the sample, then the test provider must declare that test void and issue a report containing the predistribution test results, a written explanation for the discrepancy, and a schedule for the distribution of the next test in that discipline. Assuming that the problem was not found during pretesting, the test provider must address any gaps in the pretesting review process that were not previously recognized. Notification to test recipients that the test has been declared void must be made within ten working days of that decision.
12. Summary Reporting of Results for All Participants

12.1. The test provider will establish a protocol during the design stage for reporting the summary results for all participants. This should conform to ASTM E 1301 Section 7.5. There must be an established release date for when the summary report will be provided, and the release date shall be communicated to the test participants when the testing materials are provided. The summary report must be checked for accuracy before publication.
12.2. For purposes of reporting and disseminating participants’ results, the laboratories will be traceable by coded references.
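One way to satisfy the coded-reference requirement is to assign each laboratory an opaque code for the summary report while the test provider retains the code-to-laboratory key for traceability. This is a hypothetical sketch: the guidelines do not prescribe a coding scheme, and the `LAB-###` format and function name are invented for illustration.

```python
import secrets

def assign_lab_codes(lab_names):
    """Map each participating laboratory to an opaque code.

    The summary report uses only the codes; the returned key (code ->
    laboratory) stays with the test provider for traceability.
    """
    shuffled = list(lab_names)
    # Fisher-Yates shuffle with a cryptographic RNG so that code order
    # reveals nothing about enrollment or alphabetical order.
    for i in range(len(shuffled) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        shuffled[i], shuffled[j] = shuffled[j], shuffled[i]
    return {f"LAB-{n:03d}": lab for n, lab in enumerate(shuffled, start=1)}

key = assign_lab_codes(["Lab East", "Lab West", "Lab Central"])
print(sorted(key))  # codes published in the report: LAB-001..LAB-003
```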
12.3. The report must include the names of the consulted technical experts, predistribution testing laboratories’ data, and source (manufacturers’) data.
12.4. The report will provide summaries of the returned results, techniques used, and a discussion of performance based on the specific declared test objectives and expected results. The report should be objective and report the facts. Many proficiency tests yield useful information that is incidental to the specific test objective. This information should also be collected and reported in a manner that does not obscure the primary test objectives, interpretation of the test results, or assessment of performance.
12.5. A copy of the report will be sent to every participating laboratory and shall be made generally available to others on request.