What Is the Standard Deviation Index?
The Standard Deviation Index (SDI) is a quality metric used in clinical laboratory proficiency testing programs. It measures how far a laboratory's result deviates from the peer group mean, expressed in units of the peer group standard deviation. SDI is essentially a z-score applied to inter-laboratory comparisons.
SDI is used by organizations like the College of American Pathologists (CAP), the WHO, and other accreditation bodies to evaluate laboratory performance. Consistently large absolute SDI values may indicate systematic bias that requires corrective action, such as recalibration of instruments or investigation of reagent issues.
SDI Formula
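Following the definition above, the SDI divides a laboratory's deviation from the peer group mean by the peer group standard deviation:

```latex
\mathrm{SDI} = \frac{x_{\text{lab}} - \bar{x}_{\text{peer}}}{s_{\text{peer}}}
```

where \(x_{\text{lab}}\) is the laboratory's result, \(\bar{x}_{\text{peer}}\) is the peer group mean, and \(s_{\text{peer}}\) is the peer group standard deviation. This is the familiar z-score, computed against the peer group rather than the lab's own distribution.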
Interpretation
| SDI Range | Performance | Action |
|---|---|---|
| \|SDI\| ≤ 1.0 | Acceptable | No action needed |
| 1.0 < \|SDI\| ≤ 1.5 | Marginal | Monitor closely |
| 1.5 < \|SDI\| ≤ 2.0 | Poor | Investigate bias |
| \|SDI\| > 2.0 | Unacceptable | Corrective action required |
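The formula and the interpretation table can be combined into a few lines of code. This is a minimal sketch (the function names and the example values are illustrative, not from any standard):

```python
def sdi(lab_result: float, peer_mean: float, peer_sd: float) -> float:
    """Standard Deviation Index: lab bias in units of the peer group SD."""
    return (lab_result - peer_mean) / peer_sd

def interpret(sdi_value: float) -> str:
    """Map |SDI| onto the performance categories in the table above."""
    magnitude = abs(sdi_value)
    if magnitude <= 1.0:
        return "Acceptable"
    if magnitude <= 1.5:
        return "Marginal"
    if magnitude <= 2.0:
        return "Poor"
    return "Unacceptable"

# Hypothetical example: a lab reports 105 mg/dL for an analyte whose
# peer group mean is 100 mg/dL with a peer SD of 4 mg/dL.
value = sdi(105.0, 100.0, 4.0)   # 1.25
print(value, interpret(value))    # falls in the "Marginal" band
```

Note that the interpretation uses the absolute value: an SDI of −1.25 is just as marginal as +1.25, only biased in the opposite direction.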
Frequently Asked Questions
What causes a high SDI?
A high SDI indicates systematic bias. Common causes include calibration drift, expired or improperly stored reagents, instrument malfunction, or differences in analytical methodology compared to the peer group. A negative SDI means the lab's results fall below the peer group mean (negative bias); a positive SDI means they fall above it (positive bias).
How is SDI different from CV?
SDI measures bias (accuracy) relative to peer labs, while CV measures precision (reproducibility) within your own lab. A lab can have low CV (precise) but high SDI (inaccurate), which indicates consistent but biased results that need calibration adjustment.
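The precise-but-inaccurate case described above can be illustrated numerically. In this sketch (all values hypothetical), a lab's replicate measurements cluster tightly, giving a low CV, yet sit well above an assumed peer group mean, giving a high SDI:

```python
import statistics

# Hypothetical replicates from one lab: tight spread -> low CV (good precision).
replicates = [110.1, 110.4, 109.9, 110.2, 110.3]

# Assumed peer group statistics: the lab mean sits far above the peer mean,
# so the SDI is large (poor accuracy) despite the excellent precision.
peer_mean, peer_sd = 100.0, 3.0

lab_mean = statistics.mean(replicates)
cv = statistics.stdev(replicates) / lab_mean * 100   # precision, in percent
sdi = (lab_mean - peer_mean) / peer_sd               # bias, in peer SDs

print(f"CV  = {cv:.2f}%")   # well under 1%: precise
print(f"SDI = {sdi:.2f}")   # above 2.0: unacceptable bias
```

A result like this points to a calibration problem rather than random measurement noise: the instrument is repeatable but consistently offset from its peers.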