
LAB-055 Laboratory Results Out of Specification Investigation

Department: Laboratory
Document no: LAB-055
Title: Laboratory Results - Out of Specification Investigation
Prepared by:                Date:                Supersedes:
Checked by:                 Date:                Date Issued:
Approved by:                Date:                Review Date:

Document Owner

Laboratory Manager

Affected Parties

All Laboratory staff

Purpose

This procedure describes the actions taken by an Analyst in the event that a test result does not conform to Raw Material/Component or Finished Product specifications for physical and chemical tests.

Scope

It is the responsibility of the Laboratory Manager to ensure that this procedure is followed in a timely manner, without preconceived assumptions as to the cause of the Out-of-Specification (OOS) result.

An OOS result does not necessarily mean that the batch under investigation fails and must be rejected. The OOS result shall be investigated, and the findings of the investigation, including any re-test results, shall be interpreted to evaluate the batch and reach a decision regarding release or rejection.

Definition

OOS: An Out-of-Specification test result is one that lies outside the specifications to which it is required to conform.
DR: Deviation Report

 

Related Documents

Form-305: Laboratory Investigation and Report Form
Form-310: Raw Material Out of Specification Investigation Form
LAB-025: Laboratory Workbook

 

EHS Statement

When using chemicals, Laboratory staff wear safety glasses and must be aware of any potential hazards associated with the chemicals in use.

Procedure

Retaining the sample preparations will greatly facilitate OOS investigations. Before discarding the test solution and standard preparations, Laboratory staff will check the data for compliance with specifications.

The Laboratory phase of the OOS investigation shall normally be performed within 1 working day of generation of the original OOS result.  The purpose is to confirm or invalidate the original OOS result.

1. Determination of OOS Causes

1.1. For a result to be Out-of-Specification (OOS) or Out-of-Limit, a limit must be provided.
If a result is determined to be outside the defined limit for the sample, the Analyst is required to complete Part A of the appropriate OOS Report and Investigation Form (see Form-305 and Form-310).

1.2. Out-of-Specification test results are entered into the OOS Report and Investigation Form, either electronically or on the hardcopy.

1.3. Upon completion of Part A of the form, it is submitted to the Laboratory Manager. The Laboratory Manager immediately raises a Deviation Report (DR) for the OOS investigation in the case of Laboratory errors and process manufacturing failures that are outside expiry specifications.

The Laboratory Manager and/or Technical Service Manager will review Out-of-Release-Specification results to decide whether a DR is to be raised.

1.4. At this time an initial evaluation of the need for further testing is undertaken (Part B of the OOS Report and Investigation Form).

1.5. The out-of-specification result is evaluated for a determinate error (Laboratory error). The Laboratory Team Specialist or a second Analyst reviews all pertinent aspects of the analysis and any other sources of potential error that could cause an aberrant result. The initial Analyst may also discern the source of error and may complete this section.

1.6. If a determinate error is found, it is explained in Part B of the OOS form and the result is rejected by notation, which is signed by the Analyst and the Laboratory Manager.

1.7. Repeat analyses are initiated in the same manner as the first analysis (refer to Section 2).

The results from the second testing are recorded in part C of the OOS form.

The Analyst and Laboratory Manager sign the OOS form. The results are entered onto the Finished/Raw Material Specification and Test Reports.

1.8. For finished goods, stability and trial samples: if a determinate error is not found, an action plan is identified based on Part D of the OOS Investigation and Report Form, covering the investigation of the production process.

The action plan is reviewed by the Laboratory Manager and the Analyst.

2. Samples for Retesting

Re-testing should be done in duplicate.

2.1. Raw materials: Samples to be retested include the original pooled sample, a resample of the bulk material and a previously passed retention sample.

2.2. Finished Goods: Samples to be retested include the original set of samples (if available), a resample of the entire load, a recently passed batch from retention samples and a stability sample of the same finished goods code.

2.3.  Stability, Trial, Process Validation and Complaints: Samples to be retested include the original set (if available) or additional samples from the stability or trial batches and, if applicable, a retention sample of the same solution or finished goods code.

The original test preparation may be used as one of the samples for a retest.

2.4. The number of samples required for a relative quantitative retest is nine (9), tested in duplicate.  In cases where there are not enough samples available, the maximum number of samples available should be used.

2.5. For empirical quantitative tests, e.g. pH, specific optical rotation, colour of solution, refractive index, etc., the number of retests is three (3) or, for pH, as specified by the Laboratory Manager.

3. Procedure for Retest

Based on the type of analysis to be performed, the retest will follow the procedure outlined in Part C of the OOS Investigation and Report Form.

3.1.  Quantitative External Standard

This category covers any test which generates a quantitative result determined by comparison to a standard reference material, e.g. HPLC, UV-Vis, etc.

3.1.1. Two dilutions from a common stock standard, or two duplicates from a common stock standard, are NOT ACCEPTABLE. Prepare either two independent standard preparations or, if a standard preparation already exists, prepare one additional standard preparation. It is essential that each standard be prepared individually.

3.1.2. Ensure that the instrument is in calibration and that all system suitability tests have been run.

3.1.3. Each standard preparation is analysed before the retest samples are analysed.

3.1.4. The relative error between the two standard preparations is calculated by the following formula:

Relative Error (%) = 200 x [(C1/r1) – (C2/r2)] / [(C1/r1) + (C2/r2)]

where C1, C2 = the concentrations of standard preparations 1 and 2 respectively
and r1, r2 = the average responses for standard preparations 1 and 2 respectively.

The relative error must not be more than 2.0%.  If the error is greater than 2.0% discard the standard preparations and repeat using two new standards.

If realistic precision cannot be attained, the Analyst may proceed with the consent of the Laboratory Manager.
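As an illustration only (not part of this SOP), the sketch below shows one way the calculation in 3.1.4 could be performed; the function name, variable names and figures are assumptions, and the magnitude of the difference is compared against the 2.0% criterion.

# Illustrative sketch: relative error between two independently prepared standards (3.1.4).
def relative_error(c1, r1, c2, r2):
    """Percent relative error between two standard preparations.

    c1, c2 : concentrations of standard preparations 1 and 2
    r1, r2 : average responses for standard preparations 1 and 2
    """
    rf1, rf2 = c1 / r1, c2 / r2                 # response factor for each standard
    return 200 * abs(rf1 - rf2) / (rf1 + rf2)

# Hypothetical example: two standards at nominally 0.50 mg/mL.
err = relative_error(c1=0.501, r1=12345, c2=0.499, r2=12410)
print(f"relative error = {err:.2f} %")          # about 0.92 %, so the 2.0 % criterion is met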

3.1.5. Calculate the result for each retest.  These are the reported values.

3.1.6.  Acceptance Criteria.  If the results are within specification then a classification of acceptance is given.

Rejection of data
(i) If an aberrant result is known to be caused by Laboratory error, the value can be excluded from the set of data used to evaluate the final result.

(ii) Otherwise, the Dixon’s Outliers Test can be applied to establish whether a result may be rejected from the set of data.

A result may appear to be an “outlier” and subject to bias even though the reason for the extreme value is not certain and cannot be clearly established. Under these conditions it is reasonable to apply an outliers test to the data in an attempt to reach an objective decision.

Dixon’s Test for Extreme Values

(i) Arrange the results in increasing order of value.

(ii) Calculate a ratio (Rcalc) where:

Xn = the suspected value (either the largest or the smallest)

Xn-1 = the value nearest the suspected value

Xn-2 = the second nearest value to the suspected value

X1 = the value furthest from the suspected value

X2 = the value second furthest from the suspected value

X3 = the value third furthest from the suspected value

For sets of three through seven values:

Rcalc  =  (Xn – Xn-1) / (Xn – X1)

For sets of eight through ten values:

Rcalc  =  (Xn – Xn-1) / (Xn – X2)

For sets of eleven through thirteen values:

Rcalc  =  (Xn – Xn-2) / (Xn – X2)

For sets of values greater than 13:

Rcalc  =  (Xn – Xn-2) / (Xn – X3)

The calculated R value is compared to a critical value in the Table below for a given confidence level (usually set at 95%) and sample size (N).

If Rcalc is > Rcrit (from tables) the result may be rejected from the set of data.

Table of Critical Values for Dixon’s Outliers Test

N     Rcrit (95%)   Rcrit (99%)        N     Rcrit (95%)   Rcrit (99%)
3     0.941         0.988              14    0.546         0.641
4     0.765         0.889              15    0.525         0.616
5     0.642         0.780              16    0.507         0.595
6     0.560         0.698              17    0.490         0.577
7     0.507         0.637              18    0.475         0.561
8     0.554         0.683              19    0.462         0.547
9     0.512         0.635              20    0.450         0.535
10    0.477         0.597              21    0.440         0.524
11    0.576         0.679              22    0.430         0.514
12    0.546         0.642              23    0.421         0.505
13    0.521         0.615              24    0.413         0.497
                                       25    0.406         0.489

 

It is important to realise that Dixon’s Outliers Test uses a range approach to estimate the standard deviation, or precision, of the assay. The test is therefore influenced by distributions that exhibit skewness; where this is the case, an extreme result is more likely to be flagged as an outlier when it is in fact simply in the tail of the skewed distribution.
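For reference only, the sketch below is one possible implementation of the Dixon's test logic described above, using the 95% critical values transcribed from the table; the function names and the assay figures in the example are illustrative assumptions, not requirements of this SOP.

# Illustrative sketch: Dixon's Outliers Test as described in 3.1.6.
# Critical values at the 95% confidence level, transcribed from the table above.
R_CRIT_95 = {
    3: 0.941, 4: 0.765, 5: 0.642, 6: 0.560, 7: 0.507,
    8: 0.554, 9: 0.512, 10: 0.477, 11: 0.576, 12: 0.546,
    13: 0.521, 14: 0.546, 15: 0.525, 16: 0.507, 17: 0.490,
    18: 0.475, 19: 0.462, 20: 0.450, 21: 0.440, 22: 0.430,
    23: 0.421, 24: 0.413, 25: 0.406,
}

def dixon_ratio(results, test_highest=True):
    """Rcalc for the suspected extreme value, using the ranges given in 3.1.6."""
    x = sorted(results)
    if not test_highest:
        x = x[::-1]                        # make the smallest value the "suspect" end
    n = len(x)
    xn, xn_1, x1, x2 = x[-1], x[-2], x[0], x[1]
    if n <= 7:
        return (xn - xn_1) / (xn - x1)     # sets of three through seven values
    if n <= 10:
        return (xn - xn_1) / (xn - x2)     # sets of eight through ten values
    xn_2, x3 = x[-3], x[2]
    if n <= 13:
        return (xn - xn_2) / (xn - x2)     # sets of eleven through thirteen values
    return (xn - xn_2) / (xn - x3)         # sets of more than thirteen values

def may_reject(results, test_highest=True):
    """True if Rcalc exceeds the tabulated critical value at 95% confidence."""
    n = len(results)
    if not 3 <= n <= 25:
        raise ValueError("critical-value table covers N = 3 to 25 only")
    return dixon_ratio(results, test_highest) > R_CRIT_95[n]

# Example: six assay results where 101.8 % is the suspected high value.
print(may_reject([99.1, 99.2, 99.3, 99.4, 99.5, 101.8]))   # Rcalc = 0.852 > 0.560, so True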

3.1.7.  Averaging

Whether test data may be averaged depends upon the sample and the purpose of the test.

i) When the sample can be assumed to be homogeneous, averaging can provide a more accurate result, but

ii) Reliance on averaging can hide variability among individual test results, e.g. incomplete mixing of liquids. Averaging of results for liquids is therefore rarely used during Parts A and B of the OOS Investigation. A hypothetical illustration of this point follows.
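The sketch below is a hypothetical illustration (not from this SOP) of how averaging can mask individual variability, assuming a specification of 95.0–105.0%.

# Hypothetical illustration: the mean passes while an individual result is OOS.
results = [96.2, 104.8, 93.7]                  # the last value is outside 95.0-105.0 %
mean = sum(results) / len(results)             # 98.2 %, comfortably within specification

print(f"mean = {mean:.1f} %")
print("individual OOS values:", [r for r in results if not 95.0 <= r <= 105.0])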

3.1.8. Non-conforming criteria. For all other possibilities, the classification is to accept the result as non-conforming. The Laboratory Manager then requires further evaluation to be carried out.

The classification of REJECT is used for all non-conforming samples (RM’s or F/G’s).

The Q.A. Manager is notified after the data has been reviewed. The Laboratory Manager signs and dates the OOS Investigation and Report Form.

If necessary, an Incident Meeting is initiated and completed within 4 working weeks of the initial OOS test results being reported.

3.2. Titrations

3.2.1. If required, make additional volumetric solution(s) and/or test solutions.

If necessary, re-standardise the volumetric solution(s) in triplicate.

Each of the individual concentrations determined for a particular solution must be within 0.5% of the average concentration.

3.2.2. Retest each of the three samples in triplicate and record the results.

3.2.3. Calculate the result for each retest determination by using the average volumetric solution factor.
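As an illustrative aid only, the sketch below shows how the 0.5% standardisation check in 3.2.1 and the average factor used in 3.2.3 might be computed; the concentration figures are hypothetical.

# Illustrative sketch: triplicate standardisation check (3.2.1) and average factor (3.2.3).
concentrations = [0.1002, 0.1005, 0.1003]       # mol/L, triplicate standardisation results
avg = sum(concentrations) / len(concentrations)

# Each individual concentration must be within 0.5 % of the average concentration.
within_limit = all(abs(c - avg) / avg * 100 <= 0.5 for c in concentrations)

if within_limit:
    print(f"Standardisation accepted; average factor = {avg:.5f}")   # used for each retest calculation
else:
    print("Re-standardise the volumetric solution.")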

3.2.4. Apply points 3.1.5, 3.1.6 and 3.1.7 for accepting, accepting as non-conforming, or rejecting a titration result.

3.3. Empirical Measurements

Quantitative measurements of sample attributes include specific optical rotation, pH, absorptivity, colour (instrumental), refractive index, viscosity, etc.

3.3.1. The state of calibration of the instrument should be verified.

3.3.2. The three Retest Samples should be prepared independently.  Calculate the results for each retest sample.

3.3.3. Apply points 3.1.5, 3.1.6 and 3.1.7 for accepting, accepting as non-conforming, or rejecting empirical measurements.

3.4. Semi and Non Quantitative Method

Examples include Thin Layer Chromatography (TLC) and Identifications (Infra-red, Ultra Violet, precipitation/colour change, etc).

3.4.1. For instrumental methods, verify the state of calibration of the instrument making the measurement.

3.4.2. Three retest samples are to be prepared independently.

3.4.3. Review the results for each of the three Retest Samples.

3.4.4. Apply points 3.1.5, 3.1.6 and 3.1.7 for accepting, accepting as non-conforming, or rejecting semi- and non-quantitative results.

4. Reporting of results

4.1. All results are reported electronically on an OOS Investigation and Report Form (Form-305/Form-310), along with a discussion of any pertinent information regarding the analysis that has a bearing on the decision-making process. This information is then incorporated into the Deviation Report.

The Incident Meeting investigation is to be thorough, unbiased, well documented and scientifically defensible.

Investigation of the Production Process (Part D of the OOS form) assists the Analyst in explaining the OOS result and forms part of the decision-making process to accept or reject a result.

Stability trends are filed in the Product Technical Documents held by the Technical Department.

4.2.  The OOS Investigation and Report Form is forwarded electronically to the Quality Assurance Officer as supporting data when an Incident Meeting is called.

4.3.  The OOS Investigation and Report Form is supporting data in the discussion of changes in production processes or analytical procedures and is forwarded to the Technical Manager for further action when required.

4.4. Results and final decisions should be recorded in the Laboratory Notebook according to SOP LAB-025.

5. Extended Investigation

When the laboratory investigation does not determine that laboratory error caused the OOS result and testing results appear to be accurate, an Incident Meeting is held.

As part of the OOS investigation, a general review provides a list of other batches and products possibly affected and establishes whether the problem has occurred previously.

6. Appendix: Flowchart – Handling of OOS Results

7. Summary of Changes

Version #    Revision History
LAB-055      New