QA and QC Essentials for Laboratory Interviews
Preparing for a laboratory interview often means more than just knowing how to run tests—it means understanding quality. Quality Assurance (QA) and Quality Control (QC) are foundational to laboratory science, ensuring that results are accurate, reliable, and defensible. Whether you’re interviewing for a clinical, research, environmental, or industrial lab role, a clear grasp of QA/QC concepts can set you apart.
This guide breaks down the most common QA and QC terms, explains how they apply in real lab settings, and shows how to talk about them confidently in interviews.
QA vs. QC: The Big Picture
Quality Assurance (QA) and Quality Control (QC) are closely related but not interchangeable.
- Quality Assurance (QA) refers to the system-wide processes that ensure quality. This includes SOPs, training, audits, documentation practices, and compliance with standards like ISO 17025 or GLP. QA is proactive—it focuses on preventing errors before they happen.
  Example: Creating and approving an SOP for sample preparation, training analysts on it, and conducting annual internal audits to ensure the SOP is followed.
- Quality Control (QC) focuses on the operational techniques used during testing to verify accuracy and precision. QC is reactive and ongoing, involving control samples, calibration checks, and replicate analyses.
  Example: Running a control sample at the start of each batch to confirm an instrument is performing within acceptable limits.
Interview tip: A strong answer shows you understand that QA builds the framework, while QC confirms day-to-day performance.
Core Measurement Concepts
Understanding how results are evaluated is central to QA/QC discussions.
- Accuracy: How close a result is to the true or accepted value.
  Example: A certified reference sample has a true value of 100 mg/L, and your result is 99.8 mg/L.
- Precision: How consistent results are when the same sample is tested repeatedly.
  Example: Running the same sample three times and obtaining results of 50.1, 50.0, and 50.2 mg/L.
A method can be precise but inaccurate (consistent yet wrong), or accurate on average but imprecise (centered on the true value but scattered). High-quality methods strive for both.
- Uncertainty: The range around a reported value within which the true value is expected to lie, expressed with a defined confidence level.
  Example: Reporting a result as 25.0 ± 0.5 mg/L at 95% confidence.
Interview tip: Be ready to explain how labs report uncertainty and why it matters for decision-making.
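The three concepts above can be computed directly from replicate data. Here is a minimal Python sketch using invented replicate values, and treating replicate scatter as the only uncertainty component (real uncertainty budgets include more sources, such as calibration and sampling):

```python
# Illustrative replicate results (mg/L) for a reference sample with a
# certified value of 50.0 mg/L -- made-up numbers, not real data.
from statistics import mean, stdev

certified = 50.0
replicates = [50.1, 50.0, 50.2]

avg = mean(replicates)
percent_error = abs(avg - certified) / certified * 100   # accuracy
rsd = stdev(replicates) / avg * 100                      # precision (%RSD)

# Expanded uncertainty with coverage factor k = 2 (~95% confidence),
# based only on replicate scatter for simplicity.
u_expanded = 2 * stdev(replicates)

print(f"mean = {avg:.2f} mg/L, %error = {percent_error:.2f}%, %RSD = {rsd:.2f}%")
print(f"result: {avg:.1f} ± {u_expanded:.1f} mg/L (k = 2)")
```

In this toy data set the method is both accurate (0.2% error) and precise (0.2% RSD), which is exactly the combination described above.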
Equipment and Method Control
- Calibration: Comparing an instrument's readings against a known reference standard, and adjusting if necessary, to ensure accurate measurements.
  Example: Calibrating a balance daily using NIST-traceable weights before weighing samples.
- Traceability: The ability to link measurements back to recognized standards (often national or international).
  Example: Using certified reference materials with documented traceability to national standards.
- Validation: Demonstrating that a method performs as intended—covering accuracy, precision, linearity, LOD, LOQ, and robustness.
  Example: Validating a new HPLC method before using it for routine sample analysis.
- Verification: Confirming that an already validated method or instrument performs correctly under your lab's specific conditions.
  Example: Running known standards to verify a published method works with your instrument.
Interview tip: Employers value candidates who understand that validation is more extensive than verification.
Samples, Controls, and Checks
Quality control relies heavily on strategic sample use:
- Control Sample: A sample with known values analyzed alongside unknowns to confirm system performance.
  Example: Analyzing a low- and high-level control with every analytical batch.
- Blank: A sample without analyte, used to detect contamination or background interference.
  Example: Running a reagent blank to ensure solvents are not contaminated.
- Replicate: Repeated testing of the same sample to assess precision.
  Example: Analyzing one water sample in duplicate to confirm reproducibility.
- Spike: A known quantity of analyte added to a sample to evaluate recovery and method accuracy.
  Example: Adding a known concentration of pesticide to a soil sample to assess recovery.
- Internal Standard: A known compound added to all samples and standards to correct for variability during analysis.
  Example: Adding an internal standard in GC-MS analysis to compensate for injection variability.
- Outlier: A result that deviates significantly from others and requires investigation, not automatic rejection.
  Example: One replicate result falls outside control limits and triggers a review per SOP.
Interview tip: Emphasize that outliers must be documented and justified according to SOPs.
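Spike recovery is one of the simplest of these checks to quantify. A minimal sketch, with invented results and a common (but method-dependent) 80–120% acceptance window:

```python
# Spike-recovery check: a known amount of analyte is added to a sample,
# and percent recovery shows whether the method loses or gains analyte.
# Acceptance limits vary by method and matrix; 80-120% is used here
# purely as an illustrative window.

def percent_recovery(spiked_result: float, unspiked_result: float,
                     amount_added: float) -> float:
    """%Recovery = (spiked - unspiked) / added * 100."""
    return (spiked_result - unspiked_result) / amount_added * 100

native = 1.20   # mg/kg found in the unspiked sample (invented)
added = 5.00    # mg/kg spiked in
found = 5.95    # mg/kg found in the spiked sample

rec = percent_recovery(found, native, added)
within_limits = 80.0 <= rec <= 120.0
print(f"recovery = {rec:.1f}%  (acceptable: {within_limits})")
```

A recovery outside the window does not automatically invalidate the batch; like any outlier, it triggers the investigation and documentation steps the SOP prescribes.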
Detection and Quantitation Limits
- Limit of Detection (LOD): The lowest concentration that can be reliably detected, but not necessarily quantified.
  Example: Detecting a contaminant at 0.01 mg/L but not reporting a precise value.
- Limit of Quantitation (LOQ): The lowest concentration that can be measured with acceptable accuracy and precision.
  Example: Reporting results confidently only above 0.05 mg/L.
These limits define what a method can confidently report and are critical in regulated testing environments.
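One widely used way to estimate these limits (the ICH Q2 approach, among several alternatives) is from the standard deviation of blank or low-level responses and the calibration slope: LOD = 3.3·σ/S and LOQ = 10·σ/S. A sketch with invented values:

```python
# ICH-style LOD/LOQ estimates from blank scatter and calibration slope.
# sigma_blank and slope below are made up for illustration.
sigma_blank = 0.003   # std. dev. of blank responses (signal units)
slope = 0.10          # calibration slope (signal per mg/L)

lod = 3.3 * sigma_blank / slope   # lowest reliably detectable level
loq = 10 * sigma_blank / slope    # lowest reliably quantifiable level

print(f"LOD ≈ {lod:.3f} mg/L, LOQ ≈ {loq:.2f} mg/L")
```

Note that LOQ is always higher than LOD: between the two, a lab can say an analyte is present but cannot attach a trustworthy number to it.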
Documentation, Compliance, and Accountability
- Standard Operating Procedure (SOP): Written, approved instructions that ensure tasks are performed consistently.
  Example: Following a step-by-step SOP for sample digestion and documenting any deviations.
- Chain of Custody: Documentation that tracks sample handling from collection through analysis and disposal.
  Example: Signing and dating sample transfer forms from field collection to laboratory receipt.
- Audit: A systematic review of lab operations to verify compliance with internal procedures and external standards.
  Example: An internal ISO 17025 audit reviewing training records and QC logs.
- Accreditation: Formal recognition that a laboratory meets defined quality standards, such as ISO 17025.
  Example: A lab maintaining accreditation through regular external assessments.
Interview tip: Mention experience following SOPs exactly—and documenting deviations when they occur.
Managing Problems: CAPA
No lab is error-free. What matters is how issues are handled.
- Non-conformance: Any deviation from SOPs, methods, or quality standards.
  Example: Using an expired reagent during analysis.
- Corrective Action: Steps taken to fix an identified problem and address its root cause.
  Example: Reanalyzing samples, retraining staff, and documenting the incident.
- Preventive Action: Proactive steps taken to reduce the risk of future issues.
  Example: Implementing automated alerts for reagent expiration dates.
Together, corrective and preventive actions (CAPA) demonstrate a mature quality system.
External Performance Evaluation
- Proficiency Testing: External assessment where labs analyze blind samples and compare results to peers or reference values.
Participation in proficiency testing shows competence, transparency, and commitment to quality improvement.
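Proficiency-testing results are commonly scored as z-scores: z = (lab result − assigned value) / σ_PT, where σ_PT is the standard deviation set for the scheme. A small sketch with invented numbers, using the conventional interpretation bands (|z| ≤ 2 satisfactory, 2 < |z| < 3 questionable, |z| ≥ 3 unsatisfactory):

```python
# Proficiency-testing z-score: how far a lab's result sits from the
# assigned value, in units of the scheme's standard deviation.
# All values are illustrative.

def z_score(result: float, assigned: float, sigma_pt: float) -> float:
    return (result - assigned) / sigma_pt

z = z_score(result=102.5, assigned=100.0, sigma_pt=2.0)

if abs(z) <= 2:
    verdict = "satisfactory"
elif abs(z) < 3:
    verdict = "questionable"
else:
    verdict = "unsatisfactory"

print(f"z = {z:+.2f} -> {verdict}")
```

Being able to explain what a questionable z-score triggers (investigation, possible corrective action) ties proficiency testing back to the CAPA process described above.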
Final Interview Tips
- Use real examples: daily QC checks, calibration schedules, audit preparation, or handling a non-conformance.
- Reference standards like ISO 17025, GLP, or CLIA when relevant.
- Keep explanations clear and concise—interviewers often test understanding, not memorization.
Mastering QA and QC terminology shows that you don’t just run tests—you understand the systems that make laboratory results trustworthy. That mindset is exactly what hiring managers look for.
