
Manual – 069 The Validation of Facilities and Systems

1. Purpose

The purpose of this guideline is to provide requirements for the Validation of Facilities and Systems and to outline recommendations on how to achieve compliance.

2. Scope and Applicability

This guideline is applicable to any Operations site, function or department undertaking work, or providing support services, required to meet Good Manufacturing Practice (GMP). The guideline applies to all Facilities and Systems used in the manufacture and control of registered stages of Drug Product or Active Pharmaceutical Ingredient (API) for validation or sale. The guideline applies to all projects involving the introduction of, or significant change to, any Facility or System that potentially impacts product quality.

NB: Where this guideline refers to product quality, consideration should also be given to product safety and efficacy.

3. Definitions

3.1 System

A collection of components organized to accomplish a specific function or set of functions.

3.2 Component

A constituent part or aspect of something more complex. In programming and engineering disciplines, a component is an identifiable part of a larger program or construction. A component provides a particular function or group of related functions.

3.3 Commissioning

The process of verification that new or modified assets can meet their design intent, while bringing them from a constructed state into beneficial operation, as defined by the acceptance criteria and agreed with the Project Sponsor.

3.4 Design Qualification

The documented verification that the proposed design of the facilities, systems and equipment is suitable for the intended use.

3.5 Installation Qualification

Documented verification that all physical aspects of a facility or system which affect product quality adhere to the approved specification and are correctly installed.

3.6 Operational Qualification

Documented verification that all functional aspects of a facility or system which affect product quality perform as intended throughout all anticipated operating ranges.

3.7 Performance Qualification

Performance Qualification provides documented verification that all aspects of a Facility or System which can affect product quality perform effectively and reproducibly, based on the approved process method and specifications.

3.8 Process Validation

Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality characteristics.

3.9 Cleaning Validation

Establishing documented evidence that a specified cleaning procedure will provide a high degree of assurance that it can be used to consistently clean a piece of equipment or a facility to a predetermined acceptable level of cleanliness.

3.10 Direct Impact System / Component

This is a system / component that is expected to have a direct impact on product quality. These systems / components are subject to qualification. In some instances, Direct Impact Systems / Components will depend on Indirect Impact Systems / Components for effective operation and therefore, any interfaces need to be carefully assessed.

3.11 Indirect Impact System / Component

This is a system / component that is not expected to have a direct impact on product quality, but typically will support a Direct Impact System / Component. These systems / components are not subject to qualification, but are subject to Good Engineering Practice (GEP).

3.12 No Impact System / Component

This is a system / component that will not have any impact, either directly or indirectly, on product quality. These systems / components are not subject to qualification, but are subject to Good Engineering Practice.

3.13 Good Engineering Practice

Established engineering methods and standards that are applied throughout the project lifecycle to deliver appropriate, cost-effective solutions [ref. 5.2.1].

3.14 Validation Documentation 

Documentation necessary to meet the requirements of cGMPs, over and above that required for Good Engineering Practice (sometimes referred to as “Enhanced Documentation”). Validation Documentation should complement (and not repeat) that which is created through Good Engineering Practice. In addition to being approved by a technical/engineering representative, it should also be approved by QA.

4. Responsibilities

Each site shall have in place procedures for the validation activities detailed in this guideline. The procedures shall identify the responsibilities associated with the technical and QA approvals of Validation Documentation. As a minimum these should be:

Technical Approvals: To ensure that all the engineering and operational aspects have been considered and, where appropriate, that the work has been performed in accordance with the approved program / protocol.

QA Approvals: To ensure that all cGMP and regulatory requirements have been considered, met and documented as appropriate.

It is the responsibility of each site to appoint a person accountable for validation for each project. For large projects, this may be a full-time validation manager.

5. Guideline

5.1 Validation Lifecycle for Facilities and Systems

This guideline applies to all types of Facilities and Systems. The validation lifecycle model is illustrated at Appendix 1. The lifecycle phases indicated in the appendix follow the same sequence as those for computerized systems. The relationship between computerized system validation requirements and the requirements of this guideline should be explained within each individual Validation Master Plan (VMP) or Validation Plan (VP) in the context of the validation being planned.

5.2 Validation Master Plan (VMP)

A VMP is a strategic document that identifies the elements to be validated; it shall be approved at an early stage in the project. It is recommended that these elements are identified by conducting a Systems Impact Assessment.

5.2.1 Systems Impact Assessment (SIA)

A Systems Impact Assessment is the process of determining which Systems should be subject to qualification, as part of a risk-based approach to validation. The assessment is made by evaluating the impact that a System has on the quality of the product. The Systems should each be categorized as one of the following:

 – Direct Impact System

 – Indirect Impact System

 – No Impact System

Only Direct Impact Systems are subject to qualification, though all systems are subject to GEP (for further information, refer to Appendix 1). Indirect Impact Systems can affect the performance or operation of a Direct Impact System and therefore:

* Any interfaces need to be carefully assessed

* It should be ensured that Direct Impact Systems can detect or prevent a product quality-threatening problem with any Indirect Impact System linked to them.

Where a system can be used as both a Direct and an Indirect Impact System, the requirements of the Direct Impact System shall take precedence to ensure compliance with cGMPs. It should be noted that this is a very high-level assessment; borderline cases should be treated as Direct Impact. Examples of factors which can determine the impact on product quality are given in Appendix 3.

Figure: Examples of Systems Showing Spectrum of Impact (ISPE Baseline Pharmaceutical Guide, Volume 5 – Commissioning and Qualification)

The SIA will document all the Systems and the rationale as to whether they should or should not be qualified. This rationale should be developed by a multi-disciplinary team (e.g. user representative, engineering representative, process engineer, validation manager, QA representative) undertaking an Impact Assessment of all Systems. Each System should be categorized by type, as defined above. The final output of this assessment shall be approved as part of the VMP by technical and QA representatives. On projects where there is no requirement for a VMP, the systems impact assessment will be recorded within the VP.
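Purely as an illustration (this guideline does not prescribe any electronic tool), the classification and precedence logic described above could be captured in a simple record structure where a site chooses to maintain the SIA register electronically. The field names and example systems in the following sketch are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class Impact(Enum):
    DIRECT = "Direct Impact"      # subject to qualification
    INDIRECT = "Indirect Impact"  # GEP only; interfaces assessed
    NONE = "No Impact"            # GEP only

@dataclass
class SystemAssessment:
    system_id: str
    description: str
    impact: Impact
    rationale: str                # documented by the multi-disciplinary team
    borderline: bool = False      # borderline cases are treated as Direct Impact

    def effective_impact(self) -> Impact:
        """Borderline cases default to Direct Impact, per the guideline."""
        return Impact.DIRECT if self.borderline else self.impact

def requires_qualification(assessment: SystemAssessment) -> bool:
    """Only Direct Impact Systems are subject to qualification."""
    return assessment.effective_impact() is Impact.DIRECT

# Hypothetical entries in an SIA register
register = [
    SystemAssessment("SYS-001", "Purified water generation", Impact.DIRECT,
                     "Water is a product contact material"),
    SystemAssessment("SYS-002", "Chilled water to HVAC", Impact.INDIRECT,
                     "Supports the HVAC system; no product contact"),
    SystemAssessment("SYS-003", "Office lighting", Impact.NONE,
                     "No influence on product quality"),
]
for a in register:
    print(a.system_id, a.effective_impact().value, "-> qualify:", requires_qualification(a))
```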

5.3 Validation Plan (VP)

Each Direct Impact System shall be covered by a Validation Plan detailing the validation activities to be undertaken to assure product quality. Consideration may be given to the logical grouping of systems within the VPs, for example: Process Equipment; Process Services; Building / Building Services; Process(es) / Product(s). Each VP should demonstrate that the validation activities have been considered and will be organized in a structured manner. The VP may include acceptance criteria, which would then be reported against when the corresponding report is written.

NOTE

For projects involving a small number of Systems, it may be desirable to incorporate the contents of the individual Validation Plans into the VMP or one single Validation Plan.

The VP should identify the validation lifecycle activities to be undertaken. For each validation lifecycle activity, the procedures to be followed and the associated responsibilities, by role or function, for generation, review and approval should be recorded in accordance with the requirements for Validation Documentation (see 3.14, above). The VP should identify and document the ruling SOPs and the relationship between them. The VP should record the approach to the Component Impact Assessment for each system (see 5.8.1, below) and where the results of the assessment will be recorded. The VP should identify the documentation system to be used for the management of change control and the extent to which formal change control applies to the documents at each stage.

For projects involving significant changes to existing Systems, a new VP should be prepared; the existing VP may be used as a reference document. The VP shall be prepared and approved at an early stage in the project. Thereafter, it shall be subject to formal change control.

5.4 User Requirement Specification (URS)

The URS shall define what the System is required to do. It should be comprehensive and reflect the intended functional use of the System. It should address operational, performance, regulatory, engineering, EHS and commercial requirements. Requirements associated with product quality shall be clearly identified (e.g. in tabular form, numbered and prioritized) and shall be capable of verification during subsequent qualification. The URS shall be prepared, commented on and approved as a minimum by technical and QA representatives. The URS shall be approved prior to purchase of the equipment, and thereafter shall be subject to formal change control. For projects involving the introduction of a new drug product or substance, the URS should relate to the best information available from Development reports and reviews.
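As an illustration only, the requirement that quality-related URS items be numbered, prioritized and verifiable during subsequent qualification lends itself to a simple traceability structure. The sketch below is hypothetical; the identifiers, priority scheme and test references are not prescribed by this guideline:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str             # e.g. "URS-001" (hypothetical numbering scheme)
    text: str
    quality_critical: bool  # quality-related requirements are clearly identified
    priority: str           # e.g. "Mandatory" / "Desirable"
    verified_in: list = field(default_factory=list)  # IQ/OQ/PQ test references

urs = [
    Requirement("URS-001", "Chamber temperature controllable at 121 +/- 1 C",
                quality_critical=True, priority="Mandatory"),
    Requirement("URS-002", "Operator panel available in local language",
                quality_critical=False, priority="Desirable"),
]

# During qualification, each quality-critical requirement is traced to a test.
urs[0].verified_in.append("OQ-TP-014")

untraced = [r.req_id for r in urs if r.quality_critical and not r.verified_in]
print("Quality-critical requirements not yet verified:", untraced)
```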

5.5 Supplier Selection

Typically, the Supplier Selection process will consist of up to five stages:

 – Supplier’s Proposal

 – Supplier Audit

 – Supplier Selection

 – Contract Negotiation

 – Order Placement

The decision on whether or not to audit a supplier should be supported by existing Supplier Information such as Supplier Audit Reports / Questionnaires and Supplier Performance Reviews. The rationale should be recorded in the relevant validation document. Limitations of supplier capabilities, measures to minimize the risk of these limitations and recommended approach should be identified. Any Supplier Audit Reports acquired or prepared as part of the supplier selection process, should be referenced within the relevant validation document (e.g. DQ report).

5.6 Functional Specification (FS)

The FS shall define how the System meets the operational, performance, regulatory, engineering and EHS requirements defined in the URS. It should be comprehensive and reflect the intended functional use of the System. Requirements associated with product quality – as identified in the URS (see 5.4, above) – shall be clearly identified (e.g. in tabular form, numbered and prioritized) and shall be capable of verification during subsequent qualification. The FS shall be prepared, commented on and approved as a minimum by technical and QA representatives. Ordinarily, it should be prepared by the supplier of the System. The approval may effectively be achieved by inclusion of the FS at DQ report approval. Other document types may specify system function. If other types of document are used they should be identified.

5.7 Commissioning Strategy

The strategy for commissioning of Facilities and Systems, as an element of GEP, should be considered at an early stage in the project. For each stage of the commissioning process this should identify the following:

 – Activity (see Appendices 1 and 2).

 – Scope of the activity.

 – Interface with formal qualification activities

 – Persons responsible, e.g. testers, witnesses, approvers.

 – Process by which the activity is to be undertaken, including any documentation requirements.

 – Records to be produced, and how they should be registered and archived.

The strategy should avoid duplication of testing, wherever this is justifiable. All rationale/justification should be documented.

Some qualification activities, described later at 5.9 and 5.10, may be undertaken at the System supplier’s factory, provided that the location and environment have no effect on the installation or operation and that they are conducted in accordance with cGMPs. In this instance, site testing of these elements should be sufficient to demonstrate that the transfer to site has not affected the installation or operation.

5.8 Design Qualification (DQ) 

The purpose of DQ is to assure that the design of a proposed new or modified facility, system or equipment meets cGMP requirements and is suitable for its intended purpose. In addition to the equipment operation, particular attention should also be paid to the design for routine cleaning of the equipment. Compliance with cGMP and suitability for use should both be documented.

DQ integrates the URS, the FS and relevant design documents (e.g. design specifications) in assuring that what has been designed will meet both regulatory and internal requirements. The underlying theme of assuring design compliance prior to the construction or manufacture of the facility, system or equipment is inherently logical and avoids costly errors in judgment.

The documentation of the DQ aspects of the qualification scheme shall be prescribed in the VP. The actual specification documents to be reviewed and the responsibilities for both the technical and cGMP compliance review of the design shall be clearly mandated. QA shall perform the cGMP review. All review activities shall be documented. The DQ Program / Protocol and Report shall be prepared, commented on and approved by the persons identified in the Validation Plan. Subject to local practice, there may not be a need for a separate DQ Program / Protocol or Report if this information is distinctly captured as a specific section or part of an earlier document, e.g. the VP.

5.8.1 Components Impact Assessment (CIA) 

A risk assessment shall be conducted to determine which components of a Direct Impact System shall be subject to qualification. The approach to the assessment will depend upon factors such as system or equipment size, complexity, maturity, availability of (and access to) vendor design specifications, and locally established practices. It is recommended that a Component Impact Assessment is conducted with reference to Appendix 3: Examples of Factors Which Can Determine Impact on Product Quality. The CIA, as part of the risk assessment, is made by evaluating the impact that a component of a System has on the quality of the product. The components should each be categorized as one of the following:

 – Direct Impact Component (can also be called critical component)

 – Indirect Impact Component (can also be called non-critical component)

 – No Impact Component (can also be called non-critical component)

Only Direct Impact Components are subject to qualification, though all components are subject to Good Engineering Practice. Indirect Impact Components can affect the performance or operation of a Direct Impact Component and therefore:

 – Any interfaces need to be carefully assessed

 – It should be ensured that Direct Impact Components can detect or prevent a product quality-threatening problem with any Indirect Impact Component linked to them.

Where a component can be used as both a Direct and an Indirect Impact Component, the requirements of the Direct Impact Component shall take precedence to ensure compliance with cGMPs.

The components within Direct Impact Systems, Indirect Impact Systems and, in some cases, No Impact Systems should all be assessed for criticality. This is suggested to ensure that systems previously judged to be Indirect Impact or No Impact in the early, high-level assessment have not subsequently acquired a critical function.

The assessment should document the components of the System and the rationale as to which of its components should or should not be qualified. This rationale should be developed by a multi-disciplinary team (e.g. user representative, engineering representative, process engineer, validation manager, QA representative). The use of detailed drawings will enable system boundaries to be identified. It is recommended that the output from the assessment is recorded in the DQ report. The final output of this assessment shall be approved by technical and QA representatives.
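As with the SIA, a site that records the CIA electronically could extend the same pattern to component level. The sketch below is illustrative only (the component tags, descriptions and output format are hypothetical) and simply shows how a component criticality assessment might be collated for inclusion in the DQ report:

```python
from dataclasses import dataclass

@dataclass
class ComponentAssessment:
    tag: str           # component tag from the detailed drawing, e.g. "TT-101"
    description: str
    critical: bool     # Direct Impact (critical) components are qualified
    rationale: str

def dq_report_table(system_id: str, components: list) -> list:
    """Collate the CIA output for inclusion in the DQ report."""
    return [
        {"system": system_id, "tag": c.tag, "critical": c.critical,
         "rationale": c.rationale}
        for c in components
    ]

components = [
    ComponentAssessment("TT-101", "Product temperature transmitter", True,
                        "Controls a quality-critical process parameter"),
    ComponentAssessment("PI-205", "Utility pressure gauge (indication only)", False,
                        "No influence on product quality; GEP applies"),
]
for row in dq_report_table("SYS-001", components):
    print(row)
```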

5.9 Installation Qualification (IQ)

5.9.1 Installation Qualification Program / Protocol

The IQ shall verify that all physical aspects of a Facility or System which can affect product quality comply with the requirements in the approved specifications and are correctly installed. The activities (and their location) comprising IQ will have been identified in the VP. The IQ Program / Protocol should be kept specific to product quality related items only.

The IQ Program / Protocol shall define the acceptance criteria for the IQ. The IQ Program / Protocol shall be prepared, commented on and approved by the persons identified in the VP. The IQ Program / Protocol shall be approved prior to carrying out the IQ and thereafter shall be subject to formal change control. The IQ Program / Protocol must define how deviations from the IQ, or failures to meet acceptance criteria, are to be documented and reviewed; this may involve reference to an approved SOP if appropriate.

For projects involving changes to existing equipment, a new IQ Program / Protocol shall be prepared or the modifications qualified using a change control procedure.

5.9.2 Execution of the Installation Qualification

The IQ shall be performed by the persons identified (by name or job role) in the IQ Program / Protocol. The results of the IQ should comprise the original completed IQ Program / Protocol with its check sheets, drawings, etc. marked up with the results observed, comparisons with the pre-determined acceptance criteria, and references to any documents (e.g. Operating & Maintenance manuals, calibration records) retained elsewhere, signed and dated by the persons involved, together with any relevant attachments, such as additional raw data and deviation reports.

The IQ should be satisfactorily completed before the start of Operational Qualification (OQ). There may be outstanding items, and a technical judgment is required on whether these should delay the start of OQ. The rationale for commencing OQ prior to satisfactory completion of IQ should be formally documented and approved. This applies even if combined IQ / OQ programs / protocols are to be executed.

The completed IQ results should be presented as a report, or a number of completed IQ results may be combined into a summary report. The IQ Report shall be prepared, commented on and approved by the persons identified in the Validation Plan.
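For illustration, where IQ results are logged electronically rather than on paper check sheets, the comparison of observed results against the pre-determined acceptance criteria, and the identification of deviations, could be recorded along the lines of the hypothetical sketch below (field names and example data are not prescribed by this guideline):

```python
from dataclasses import dataclass

@dataclass
class IQCheck:
    item: str                  # check sheet item, e.g. "Installed filter grade"
    acceptance_criterion: str
    observed: str
    checked_by: str
    date: str

    @property
    def passed(self) -> bool:
        # In practice the comparison is made and signed by the tester;
        # a simple string match stands in for that judgement here.
        return self.observed == self.acceptance_criterion

checks = [
    IQCheck("Filter grade on supply air", "HEPA H14", "HEPA H14",
            "J. Smith", "2024-01-15"),
    IQCheck("Vessel material certificate", "316L stainless steel",
            "304 stainless steel", "J. Smith", "2024-01-15"),
]

deviations = [c for c in checks if not c.passed]
for d in deviations:
    print(f"Deviation: {d.item} - expected '{d.acceptance_criterion}', observed '{d.observed}'")
```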

5.10 Operational Qualification (OQ)

5.10.1 Operational Qualification Program / Protocol

The OQ shall verify that all functional aspects of a Facility or System which can affect product quality operate as intended throughout all anticipated operating ranges. The activities (and their location) comprising OQ will have been identified in the VP. Phases, sequences of operation, and the operation of controllers, alarms, displays, recorders, etc. should be documented in the description.

As in the IQ, the OQ Program / Protocol should include detailed test instructions. The test instructions should include some assessment under ‘worst case’ conditions, for example prolonged operation, extended operation at the limit conditions, minimum/maximum loads, etc. It is essential that the test design take into account the full range of the process conditions to be used. To demonstrate consistency of operation, the OQ should include multiple runs, particularly for critical or sensitive functions.

The OQ Program / Protocol should be kept specific to product quality related items only. The OQ Program / Protocol shall define the acceptance criteria for the OQ. The OQ Program / Protocol shall define, where applicable, critical instrument calibration as an essential prerequisite. The OQ Program / Protocol shall be prepared, commented on and approved by the persons identified in the VP. The OQ Program / Protocol shall be approved prior to carrying out the OQ and thereafter shall be subject to formal change control. The OQ Program / Protocol must define how deviations from the OQ, or failures to meet acceptance criteria, are to be documented and reviewed; this may involve reference to an approved SOP if appropriate.

For projects involving changes to existing equipment, a new OQ Program / Protocol should be prepared or the modifications qualified using a change control procedure.
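To illustrate the ‘worst case’ principle described above, the following sketch enumerates test conditions at the extremes of the anticipated operating ranges and repeats them over multiple runs to demonstrate consistency. The parameters, limits and number of replicates are hypothetical examples only:

```python
from itertools import product

# Hypothetical operating ranges taken from an approved specification
ranges = {
    "mixer_speed_rpm": (20, 120),   # (min, max)
    "batch_load_kg":   (50, 400),
}
REPLICATES = 3  # multiple runs for critical functions, to show consistency

def worst_case_conditions(ranges: dict) -> list:
    """All combinations of the range extremes (full factorial of min/max)."""
    names = list(ranges)
    return [dict(zip(names, combo)) for combo in product(*(ranges[n] for n in names))]

for run in range(1, REPLICATES + 1):
    for condition in worst_case_conditions(ranges):
        print(f"Run {run}: test at {condition}")
```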

5.10.2 Execution of the Operational Qualification

The OQ should be performed by the persons identified (by name or job role) in the OQ Program / Protocol. The results of the OQ should comprise the original completed OQ Program / Protocol with its check sheets, etc. marked up with the results observed, comparisons with the pre-determined acceptance criteria, and references to any documents (e.g. commissioning records, justification reports) retained elsewhere, signed and dated by the persons involved, together with any relevant attachments, such as additional raw data and deviation reports.

The OQ should be satisfactorily completed before the start of PQ. There may be outstanding items, and a technical judgment is required on whether these should delay the start of PQ. The rationale for commencing PQ prior to satisfactory completion of OQ should be formally documented and approved. This applies even if combined OQ / PQ programs / protocols are to be executed.

The completed OQ results should be presented as a report, or a number of completed OQ results may be combined into a summary report. The OQ Report shall be prepared, commented on and approved by the persons identified in the VP.

5.11 Performance Qualification (PQ)

5.11.1 Performance Qualification Program / Protocol

PQ follows IQ/OQ. Although described below as a separate activity, it is acceptable to include PQ testing as part of the OQ exercise. The PQ is the final qualification activity prior to performing Process Validation (PV). PQ assesses whether the equipment and ancillary systems, as connected together, can perform effectively and reproducibly. The PQ is performed using production materials, qualified substitutes or simulated product, subject to processing conditions encompassing upper and lower operating limits or ‘worst case’ conditions. PQ bridges OQ, with its emphasis on demonstrating equipment function, and PV, with its emphasis on process capability and consistency. PQ takes OQ one step further due to the requirement to include production materials, qualified substitutes or simulated product. A properly executed OQ and PQ means that PV can be conducted using routine process conditions.

There are no specific requirements for the number of runs to be performed in PQ; however, one of the goals of PQ is to demonstrate consistency, so multiple runs or trials, especially for the critical elements of PQ, should be included.

The PQ Program / Protocol shall verify that all aspects of a Facility or System which can affect product quality perform effectively and reproducibly based on the approved process method and specifications. The activities comprising PQ will have been identified in the VP. The PQ Program / Protocol should be kept specific to product quality related items only. The PQ Program / Protocol shall define the acceptance criteria for the PQ. The PQ Program / Protocol shall be prepared, commented on and approved by the persons identified in the VP. The PQ Program / Protocol shall be approved prior to carrying out the PQ and thereafter shall be subject to formal change control. The PQ Program / Protocol must define how deviations from the PQ, or failures to meet acceptance criteria, are to be documented and reviewed; this may involve reference to an approved SOP if appropriate.

For projects involving changes to existing equipment, a new PQ Program / Protocol shall be prepared or the modifications qualified using a change control procedure.

5.11.2 Execution of the Performance Qualification

The PQ shall be performed by the persons identified (by name or job role) in the PQ Program / Protocol. The results of the PQ should comprise the original completed PQ Program / Protocol with its check sheets, comparisons with the pre-determined acceptance criteria, and references to any documents (e.g. commissioning records, justification reports) retained elsewhere, signed and dated by the persons involved, together with any relevant attachments, such as additional raw data and deviation reports.

The PQ should be satisfactorily completed before the start of Process Validation. There may be outstanding items, and a technical judgment is required on whether these should delay the start of Process Validation. The rationale for commencing PV prior to satisfactory completion of PQ should be formally documented and approved.

The completed PQ results should be presented as a report, or a number of completed PQ results may be combined into a summary report. The PQ report shall be prepared, commented on and approved by the persons identified in the Validation Plan.

5.12 Supplier Performance Review

A Supplier Performance Review should be performed before project closure, following completion of system validation activities. GMP elements of the supplier performance should be recorded in the appropriate validation report. This should provide the following internal feedback for future projects:

 – Supplier performance

 – Prompts for future improvements

 – Recommendations for future projects

5.13 Process Validation (PV) and Cleaning Validation (CV)

For appropriate process and cleaning validation, please refer to Manuals 035, 036, 037, 038 and 040.

5.14 Validation Report (VR)

A Validation Report (or Reports) shall be produced, corresponding to each of the project VPs, confirming that all validation activities identified in the VP have been completed and that any anomalies have been satisfactorily resolved. The VR shall be prepared, commented on and approved by the persons identified in the VP.

5.15 Validation Master Report (VMR)

Where a project-specific VMP has been prepared, a corresponding VMR shall be produced, confirming that all validation activities identified in the VMP have been completed and that any anomalies have been satisfactorily resolved. The final approval for use within the GMP regulated process/processes needs to be demonstrated through a clear statement in the VMR or through other documentation.

Final approval of the VMR will mark the completion of the validation project. However, approved completion of certain elements of the exercise, with supporting documents, may permit beneficial operation to commence prior to VMR approval. As a minimum, all IQ, OQ and PQ work, including any computer validation work, must be complete with approved reports, and CV and PV must be complete with approved reports. It is recommended that a formal review, with documented output, is conducted to record the status of the facility prior to commencement of beneficial operation (manufacture of product ultimately for sale).

6. Appendices