Verifying design verification records is a critical quality control activity, especially in regulated industries (medical devices, aerospace, automotive, pharmaceuticals). It ensures that the evidence proving the design meets its requirements is complete, accurate, traceable, and compliant. Here is a comprehensive guide on how to do it effectively.

Core Principle: The goal is to confirm that the verification process itself was executed correctly and that the conclusion (the design meets its requirements) is supported by objective evidence.
1. Understand the Context & Requirements
- Review the Design Input Document: Know exactly what requirements were defined and need verification.
- Review the Design Verification Plan/Protocol: Understand the intended scope, approach, test methods, acceptance criteria, responsibilities, and schedule.
- Applicable Standards & Regulations: Identify relevant standards (e.g., ISO 13485, IEC 62304, ISO 9001, AS9100, FDA QSR, GMP) and regulatory requirements governing design verification.
2. Verify the Verification Plan/Protocol
- Completeness: Does it cover all design inputs? (Use a traceability matrix).
- Clarity & Appropriateness: Are test methods, procedures, and acceptance criteria clear, unambiguous, and technically sound? Are they appropriate for the requirement?
- Feasibility: Can the tests actually be performed as described? Are resources (equipment, personnel, materials) available?
- Risk Assessment: Does the plan adequately address risks associated with high-criticality requirements?
- Approval: Is the protocol formally reviewed and approved by authorized personnel (QA, Engineering, Management)? Signatures/dates present?
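The completeness check above reduces to a set comparison between the design inputs and the requirements each planned test claims to cover. A minimal sketch (all requirement and test IDs here are hypothetical, not from any real project):

```python
# Sketch: flag design inputs with no planned verification activity.
# Requirement IDs and the test-to-requirement mapping are illustrative only.

design_inputs = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}

# Protocol: each planned test lists the requirement(s) it verifies.
planned_tests = {
    "TP-10": {"REQ-001"},
    "TP-11": {"REQ-002", "REQ-003"},
}

covered = set().union(*planned_tests.values())
uncovered = design_inputs - covered

if uncovered:
    print("Protocol gap - no planned verification for:", sorted(uncovered))
else:
    print("All design inputs have a planned verification activity.")
```

The same comparison run in the reverse direction (tests citing requirement IDs that do not exist in the design input document) catches stale or mistyped references in the protocol.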
3. Verify the Execution of Tests/Activities
- Adherence to Protocol: Were tests performed exactly as described in the approved protocol? (Check deviations).
- Qualified Equipment: Was all test equipment used calibrated and qualified? Calibration certificates/records available?
- Qualified Personnel: Were personnel performing tests trained and competent? Training records available?
- Environmental Controls: Were required environmental conditions (temperature, humidity, etc.) monitored and recorded?
- Raw Data:
  - Completeness: Are all raw data sheets, test logs, instrument printouts, photos, videos, etc., present? Nothing missing?
  - Legibility: Is all raw data clear, legible, and permanent?
  - Traceability: Can raw data be traced directly back to the specific test procedure, equipment, operator, date/time, and sample/unit tested?
  - Annotations: Are any corrections or changes to raw data initialed and dated? (No "white-out" or erasures.)
- Witnessing: Were critical tests witnessed? Are witness signatures present?
- Deviations & Non-Conformances:
  - Were any deviations from the protocol documented (e.g., equipment failure, sample issue, procedural change)?
  - Were deviations formally assessed (e.g., via a Deviation Report or Non-Conformance Report (NCR))?
  - Was the impact on verification results determined and documented?
  - Were deviations approved by authorized personnel before proceeding?
  - Were NCRs raised for any test failures or unexpected results, then investigated and dispositioned?
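The raw-data traceability check above lends itself to a simple field-presence audit: every record must name the procedure, equipment, operator, date, and sample tested. A sketch (field names and records are invented for illustration):

```python
# Sketch: audit raw-data records for the traceability fields listed above.
# Field names and example records are hypothetical.

REQUIRED_FIELDS = ("procedure", "equipment", "operator", "date", "sample_id")

records = [
    {"procedure": "TP-10", "equipment": "DMM-07", "operator": "J. Lee",
     "date": "2024-03-01", "sample_id": "SN-0042"},
    {"procedure": "TP-11", "equipment": "DMM-07", "operator": "J. Lee",
     "date": "2024-03-02"},  # missing sample_id -> record is not traceable
]

def missing_fields(record):
    """Return the traceability fields that are absent or blank in a record."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

# Map record index -> list of missing fields, for records with any gap.
findings = {i: missing_fields(r) for i, r in enumerate(records) if missing_fields(r)}
print(findings)
```

In practice the same idea is applied with a checklist rather than a script, but automating it is worthwhile when verification produces hundreds of data sheets.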
4. Verify the Analysis & Conclusion
- Test Report: Does the report accurately summarize the raw data?
- Analysis: Is the analysis of the raw data correct? Does it logically lead to the conclusion?
- Acceptance Criteria: Did the results meet the pre-defined acceptance criteria stated in the protocol? Is a clear "Pass/Fail" or "Meets/Does Not Meet" verdict documented for each test/requirement?
- Traceability Matrix: Is there a final traceability matrix showing each design input, the corresponding verification test/activity, and the result (Pass/Fail)? Does it confirm all inputs were verified?
- Overall Conclusion: Does the report state a clear, unambiguous conclusion that the design outputs meet the design inputs? Is this conclusion justified by the evidence presented?
- Signatures: Are the test report and conclusion reviewed and approved by authorized personnel (e.g., Test Engineer, Design Lead, QA Representative)? Signatures/dates present?
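To make the acceptance-criteria check concrete, the reviewer's job is to confirm that each verdict follows mechanically from the pre-defined limits and the recorded result. A sketch with invented limits and measurements:

```python
# Sketch: recompute Pass/Fail verdicts from pre-defined acceptance criteria.
# Requirement IDs, limits, and measured values are illustrative only.

criteria = {
    # requirement: (lower_limit, upper_limit), taken from the approved protocol
    "REQ-001": (4.75, 5.25),   # e.g., output voltage in volts
    "REQ-002": (0.0, 50.0),    # e.g., leakage current in microamps
}

results = {"REQ-001": 5.10, "REQ-002": 61.2}

verdicts = {
    req: "Pass" if lo <= results[req] <= hi else "Fail"
    for req, (lo, hi) in criteria.items()
}
print(verdicts)
```

Recomputing verdicts independently like this, rather than trusting the ones written in the report, is exactly how a reviewer catches the "unjustified conclusion" pitfall where a borderline result was recorded as a pass.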
5. Verify the Closure & Record Integrity
- Issue Resolution: Were any open issues, deviations, or NCRs from the verification process resolved and closed before final approval?
- Record Assembly: Are all documents (Protocol, Raw Data, Deviations/NCRs, Test Report, Traceability Matrix, Approvals) collected into a single, coherent record set?
- Indexing & Accessibility: Is the record set clearly indexed and organized for easy retrieval? Does it follow the company's document control procedures?
- Storage & Retention: Is the record stored securely according to company policy and regulatory requirements? Is the retention period defined and met?
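The record-assembly check reduces to confirming that every required document type appears in the final package. A sketch using the document types named above (the package contents are invented):

```python
# Sketch: confirm the final record set contains every required document type.
# Document-type names follow the list above; the example package is hypothetical.

REQUIRED_DOCS = {"Protocol", "Raw Data", "Deviations/NCRs",
                 "Test Report", "Traceability Matrix", "Approvals"}

record_set = {"Protocol", "Raw Data", "Test Report", "Traceability Matrix"}

missing = REQUIRED_DOCS - record_set
print("Missing from record set:", sorted(missing))
```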
Essential Tools & Techniques
- Traceability Matrix: The single most important tool. It visually links each design input to its verification activity and result. Verify its completeness and accuracy.
- Checklists: Develop specific checklists based on your quality system, standards, and project requirements for each phase (Protocol Review, Execution Review, Report Review).
- Auditing Principles: Apply audit techniques:
  - Sampling: For large datasets, ensure sampling is statistically valid or representative.
  - Traceability: Follow the evidence trail forward (Input -> Protocol -> Test -> Raw Data -> Report -> Conclusion) and backward (Conclusion -> Report -> Raw Data -> Test -> Protocol -> Input).
  - Interviews: Talk to the personnel involved (testers, engineers, reviewers) to understand the process and clarify any ambiguities in the records.
  - Comparative Analysis: Compare verification records across similar products or historical projects to identify anomalies or inconsistencies.
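The forward and backward traces can be sketched as lookups over the same matrix: forward from a design input to its test and result, backward from a test to the input(s) it claims to verify (all identifiers hypothetical):

```python
# Sketch: follow the evidence trail in both directions through a traceability matrix.
# Each row is (design input, test, result); the contents are illustrative only.

matrix = [
    ("REQ-001", "TP-10", "Pass"),
    ("REQ-002", "TP-11", "Pass"),
    ("REQ-003", "TP-11", "Fail"),
]

def trace_forward(req):
    """Input -> test -> result: every test and result recorded for a requirement."""
    return [(test, result) for r, test, result in matrix if r == req]

def trace_backward(test):
    """Test -> input: every design input a given test claims to verify."""
    return [r for r, t, _ in matrix if t == test]

print(trace_forward("REQ-003"))
print(trace_backward("TP-11"))
```

A forward trace that returns nothing is a coverage gap; a backward trace that returns an unknown requirement ID is a stale or mistyped reference.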
Common Pitfalls to Look For
- Missing Requirements: Traceability matrix gaps.
- Inadequate Test Methods: Methods not suitable for proving the requirement.
- Vague Acceptance Criteria: Criteria that allow subjective interpretation.
- Incomplete Raw Data: Missing logs, printouts, photos.
- Uncontrolled Deviations: Tests run out-of-spec without approval/assessment.
- Incorrect Data Analysis: Math errors, misinterpretation of graphs/charts.
- Unjustified Conclusions: "Pass" stated despite ambiguous or borderline results.
- Lack of Approvals: Missing signatures from required personnel.
- Poor Organization: Records scattered, difficult to follow the logic.
Who Performs the Verification?
- Quality Assurance (QA): Often the primary function responsible for verifying the verification records as part of design reviews, audits, or release processes. They ensure compliance with procedures and regulations.
- Design Engineering: Responsible for the technical content and ensuring the verification proves the design meets inputs. They review the technical adequacy.
- Project Management: May oversee the overall verification process and record completeness.
- Regulatory Affairs: Ensures compliance with specific regulatory requirements related to verification.
- Independent Auditors: Internal or external auditors verify the entire design verification process, including the records.
Summary
Verifying design verification records is not just about checking paperwork; it's about confirming the integrity and reliability of the evidence used to declare that a design is fit for its intended purpose. It requires a systematic, traceable approach focusing on completeness, adherence to plan, accuracy of data, soundness of analysis, and clear justification of the conclusion. Using tools like traceability matrices and applying rigorous audit principles are essential for this critical quality assurance activity.