
Out of Specification and Atypical Test Results


Presentation Transcript


  1. Out of Specification and Atypical Test Results

  2. Contents • Objective • Definitions and terminology • General investigation principles • Investigation stages • Initial Laboratory Investigation • Full-Scale OOS Investigation • Additional Laboratory Testing • Review of Production • Importance of corrective and preventive actions

  3. Process overview

  4. Cause for an investigation • An Out of Specification result arises: production error? analytical (laboratory) error? • Investigation • Root cause assigned? • Corrective and Preventive Action (CAPA)?

  5. References • FDA Guidance for Industry: Investigating Out-of-Specification (OOS) Test Results for Pharmaceutical Production, October 2006 (contains nonbinding recommendations) • USP–NF General Chapter <1010> Analytical Data – Interpretation and Treatment • ICH Q7A Note for Guidance on Good Manufacturing Practice for Active Pharmaceutical Ingredients, November 2000 • EU Guidelines to GMP, Part I, Chapter 6 Quality Control, Section 6.32

  6. Definitions and Terminology (1) • Out of Specification (OOS) Result • A reportable result that falls outside established specifications or acceptance criteria. • Does not include situations where the control test is not continued because system suitability test (SST) limits are violated, or where gross operator errors are identified during a run (e.g. wrong instrument settings) • Such deviation cases should still be tracked so that CAPA can be applied where appropriate • Reportable Result • The result that must be compared to a particular acceptance criterion. • May be “an average value, an individual measurement, or something else”, as defined in the control test [USP]

  7. Definitions and Terminology (2) • Established Specifications or Acceptance Criteria • Approved specifications or acceptance criteria (e.g. limits) applied to the final (reportable) result in order to evaluate the quality of a product or sample. • May be official (part of a regulatory file) or internal (e.g. additional or alternative control tests) • These do not apply to acceptance criteria used during the course of analysis (before the reportable result), such as system suitability tests (SST), warning limits, or in-process control limits for process adjustments • Atypical Test Result (also out of trend / out of expectation, suspect, …) • A result within specification, but different from those usually obtained or expected • Subject to the same general investigation principles as OOS results • The Site Quality Control Manager must determine the extent and depth of any investigation

  8. Definitions and Terminology (3) • Re-analysis • Analysis of retained sample preparations: repeating some part of the testing, e.g. re-measurement, re-injection, re-dilution of sample and standard preparations, further extraction • Fresh analysis • Performing the testing as normal (a new execution of the control test), as described in the control test • Only allowed if the initial results could be invalidated

  9. Definitions and Terminology (4) • Retest • New preparation and analysis of a portion of the original sample • Part of a full-scale OOS investigation • Performed using a predefined procedure/plan with a MAXIMUM number of retests • Used to verify or reject the possibility of laboratory error • Control Sample • A sample of material that has previously been tested and approved, or that is well characterised • Re-sampling • Collecting and analyzing a new sample from the product • Must have a documented rationale for re-sampling (e.g. the original sample was not representative, or was compromised/contaminated) • Used for confirmation of product failure

  10. Presentation Scope • Applicable to QC laboratories, manufacturing sites and affiliates • Covers all GMP-relevant laboratory testing, including in-process controls, where approved specifications or acceptance limits are established • Note: for compendial tests where the interpretation and handling of test results are described, those procedures are to be followed

  11. Principles of Investigation (1) • All OOS or atypical test results must be thoroughly investigated and documented • Includes identification of the root cause • OOS or atypical test results may only be invalidated if a laboratory error can be assigned • Demonstrated, or indicated with high probability, and/or a product failure definitely excluded • Supported by information and/or data • Timely initiation (e.g. within 2 business days) and completion (e.g. within 20 business days); otherwise a written rationale is required • Sterile products: a parallel investigation in the manufacturing area must be initiated immediately • A risk assessment and notification are required in case of a confirmed OOS for a product already on the market, or where other batches or products on the market could be affected

  12. Principles of Investigation (2) • Consider: • Whether the problem has occurred previously (historical data) • The possibility that other batches or products are affected • Corrective actions must be defined as a conclusion of the investigation and followed through to prevent recurrence • A system must be in place for the tracking of OOS results • Investigation phases • Initial Laboratory Investigation • Full-Scale OOS Investigation • Additional Laboratory Testing • Review of Production

  13. Responsibilities (1) • Analyst • Achieves accurate analytical results • Should not use systems that do not meet system suitability (SST) requirements; if an SST failure occurs during analysis, identifies the data collected during the suspect time period (e.g. reference standard injection runs in a chromatographic system) • Should not continue a test in case of an obvious error (spilling of a sample solution, incorrect dilution, etc.) • Immediately reports the OOS to his/her supervisor • Retains the original test solutions, materials and source data • Participates in the investigation

  14. Responsibilities (2) • Supervisor / QC Manager • Conducts an objective and timely investigation • Informs the Site Quality Manager (immediately in the case of a sterile product) • Decides upon investigation of atypical results • Site Quality Manager • Ensures specific site procedures and systems exist • Ensures appropriate training is given • Makes batch-related decisions when an OOS or atypical result is confirmed • Informs all relevant departments • Ensures that corrective actions are taken to prevent recurrence • For sterile products, promptly initiates a parallel investigation in the laboratory AND the manufacturing area.

  15. Initial Laboratory Investigation • OOS/OOT result identified by the analyst; supervisor informed immediately • Initial laboratory investigation: assess whether the test procedure was followed correctly (calculation, sampling, sample preparation, equipment functionality and qualification, training, etc.) • Laboratory error identified? YES: see slide 18 (lab error IS identified) • NO: full-scale OOS investigation required (see slide 20)

  16. Initial Laboratory Investigation: examples (1) • Method discussed between analyst and supervisor: confirm the analyst's knowledge of, and performance of, the correct procedure • Verification of calculations • Review of samples for correct labelling and identification • Verification of correct laboratory test methods • Examination for proper documentation • Review of chromatograms, spectra, data and calculations • Review of reagents, media, diluents, controls and standards • Examination of the instruments and laboratory systems (qualification, calibration, maintenance) • Review of other samples run concomitantly (if any)

  17. Initial Laboratory Investigation: examples (2) • Verification of the analyst's training history • Inspection of the prepared sample • Re-analysis/re-dilution of test samples (if stable) • The supervisor fully documents and preserves records of the initial assessment • Everything except new sample preparation

  18. Laboratory Error Is Identified (from the Initial Laboratory Investigation, slide 15) • Invalidate the original test results • Consider implications for other analytical tests/products • Establish a CAPA plan • Perform a fresh analysis and evaluate; if the evaluation is positive, implement the CAPA action plan and follow through • New evaluation and conclusion

  19. Categorising Errors • Consider actions for improvement of processes and prevention of errors, e.g. training of analysts, instrument re-qualification, updates to procedures.

  20. Full-Scale OOS Investigation (laboratory error is NOT identified, from slide 15) • Comprises a full laboratory investigation, a Review of Production (slide 22) and additional laboratory testing • Review of Production, areas to review will include: batch dossier evaluation; equipment validation, qualification and calibration; training of personnel; risk management evaluation • Additional laboratory testing: follow company protocols on re-testing and re-sampling • Deviation explaining the OOS confirmed? NO: OOS confirmed? (see slide 21) • YES: Batch Failure Investigation (see slide 21)

  21. OOS Confirmed? (following Additional Laboratory Testing, slide 20) • OOS confirmed? NO: lab error identified? (see slide 18, above) • YES: Batch Failure Investigation • Evaluate rejection of the batch • Risk assessment • Quality alert notification • Take all results into consideration • Impact on other products? If yes, further evaluation is required • Document everything in the investigation report

  22. Review of Production • Requires the involvement of all departments that could be implicated: Manufacturing, Process Development, Maintenance, Engineering, sub-contractors • Timely, thorough, well-documented review • Review of: batch dossier, equipment (qualification, calibration), personnel training, risk management evaluation, etc. • Once the root cause is identified and the CAPA plan implemented, the OOS investigation may be terminated

  23. Additional Laboratory Testing: retesting (maximum number; design) • Variables (number of retests specified in advance, analyst, control samples, …) must be defined before testing • Case-by-case decision, e.g. depending on: • The possible cause of the OOS (analyst, equipment, sample, …) • The level of confidence required • Analytical variability • The “extent” of the OOS

  24. Additional Laboratory Testing: Resampling • Resampling is to be carried out: • in accordance with predetermined procedures and sampling strategies • where it is identified that the original sample was improperly prepared and not representative of batch quality • Use the same sampling method unless it has been demonstrated to be inadequate

  25. Reporting Test Results: Averaging (General Considerations) • Mean • An estimate of the true value • Reliability increases with the number of determinations • However, it may hide (unacceptable) variability • Reportable results and acceptance limits • Are correlated • Evaluation of single results requires wider limits because of the broader distribution • If the mean is defined as the reportable result, the variability should be checked (standard deviation or range)

  26. Reporting Test Results: Appropriate and Inappropriate Uses of Averaging • Appropriate • When the sample is assumed homogeneous and the written, approved test method specifies that the average of multiple replicates (assays) is considered one test and represents one reportable result • Inappropriate • When testing is intended to measure the variability of a product (blend uniformity, content uniformity) • In the case of additional testing during an OOS investigation, because variability is hidden, specifically when some results are OOS and others are within specification (see the sketch below) • Provide all individual results to the quality management responsible for approving or rejecting the product
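
Purely as an illustration of the variability check described in the two slides above (it is not part of the original presentation), the minimal Python sketch below uses invented replicate values, hypothetical acceptance limits and a hypothetical standard deviation criterion. It shows how a mean can sit comfortably within specification while an individual result is OOS, which is exactly the situation where averaging must not be used to hide variability.

```python
# Hypothetical sketch only: limits, SD criterion and replicate values are invented.
from statistics import mean, stdev

SPEC_LOW, SPEC_HIGH = 95.0, 105.0   # hypothetical acceptance limits (% of label claim)
MAX_SD = 2.0                        # hypothetical internal variability criterion

replicates = [96.1, 97.3, 104.9, 88.2]  # hypothetical individual assay results

avg = mean(replicates)
sd = stdev(replicates)

print(f"mean = {avg:.1f}, sd = {sd:.1f}, range = {max(replicates) - min(replicates):.1f}")
print("mean within spec:", SPEC_LOW <= avg <= SPEC_HIGH)

# Averaging alone would pass this set, but looking at the individual values and
# the variability reveals a suspect (OOS) replicate that must be investigated.
oos = [x for x in replicates if not (SPEC_LOW <= x <= SPEC_HIGH)]
print("individual OOS results:", oos)
print("variability acceptable:", sd <= MAX_SD)
```

Here the mean (96.6) passes, yet one replicate (88.2) is OOS and the standard deviation far exceeds the hypothetical criterion, so the result set cannot simply be averaged away.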

  27. Reporting Test Results: Outlier Tests • Outlier result (definition): a value obtained, on rare occasions, that is markedly different from the others in a series obtained with a validated method • Outlier testing: a statistical procedure for identifying, from an array, those data that are extreme (a sketch follows this slide) • An SOP is mandatory (including the minimum number of results required to obtain a statistically significant assessment) • Used when it is not possible to reveal the root cause of the OOS/deviation • May be used as an investigative tool • Should not be used as the sole criterion to invalidate data
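
Purely as an illustration (not part of the original presentation), the sketch below implements one common outlier test, a two-sided Grubbs' test for a single suspect value; the data set and significance level are invented. In practice the SOP defines which test applies, the minimum number of results and how a flagged value may be used, and a flagged result remains an investigative lead rather than grounds on its own to invalidate data.

```python
# Minimal sketch of a two-sided Grubbs' test for one suspect value (hypothetical data).
from math import sqrt
from statistics import mean, stdev
from scipy.stats import t

def grubbs_is_outlier(values, alpha=0.05):
    n = len(values)
    if n < 3:
        raise ValueError("Grubbs' test needs at least 3 results")
    m, s = mean(values), stdev(values)
    g = max(abs(x - m) for x in values) / s        # test statistic
    t_crit = t.ppf(1 - alpha / (2 * n), n - 2)     # critical t value
    g_crit = ((n - 1) / sqrt(n)) * sqrt(t_crit**2 / (n - 2 + t_crit**2))
    return g > g_crit, g, g_crit

suspect_series = [99.8, 100.1, 99.9, 100.2, 95.0]  # hypothetical assay results
flagged, g, g_crit = grubbs_is_outlier(suspect_series)
print(f"G = {g:.2f}, G_crit = {g_crit:.2f}, flagged as outlier: {flagged}")
```

In this invented series the low value (95.0) exceeds the critical Grubbs statistic and is flagged, which would prompt further investigation, not automatic exclusion of the data point.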

  28. Concluding the Investigation • Results evaluated, batch quality determined, release/rejection decided by Quality Management • OOS cause revealed: • Either the suspect result is invalidated and not used • Or the OOS is confirmed as caused by a factor affecting batch quality, and the OOS result is used to evaluate the quality of the lot • OOS cause not revealed: • Inconclusive investigation; the OOS is not confirmed • The OOS result is given full consideration in the batch decision

  29. Reporting and Documentation • The OOS report should include the following: • Chronology of the investigation • Full description of the initial laboratory investigation, including an analysis of all data derived from all testing • Batches or products involved • Justification for invalidating any data • Conclusion, including conformity of the final results and batch disposition • Corrective and preventive actions taken to prevent the OOS from occurring again • Reports are uniquely numbered to permit tracking • Maintained in a controlled and retrievable manner

  30. OOS Investigations • Clear definition of the reportable result • Thorough, timely, well documented, scientifically justified, with no preconceived assumptions • Clear and methodical escalation • The objective is to assign a cause: • Product failure • Laboratory (analytical) error • Corrective actions to prevent recurrence • Atypical results are a chance to increase data quality, e.g. in stability studies

  31. Examples: Identification of Analytical Errors

  32. Thank You. Any Questions?
