
VAL-045 Impact Assessment for Computerised Systems

Department: Validation/Technical Services          Document no: VAL-045
Prepared by:          Date:          Supersedes:
Checked by:           Date:          Date Issued:
Approved by:          Date:          Review Date:

Document Owner

Validation Manager

Affected Parties

All Validation, Technical Services, Operations, Quality Assurance, Engineering and Project staff involved in validation projects.

Purpose

The purpose of this guideline is to provide a method of assessing and determining the validation requirements for computerised systems and controllers.

This SOP identifies the typical qualification activities required for systems having a Direct or Indirect impact on product/process quality and data integrity, should the system fail or malfunction.  These activities are in addition to Good Engineering Practice (GEP), which is appropriate for all systems and is also outlined here.

Scope

Computer validation, as with other types of validation, is to be performed to avoid any intolerable risk to product quality (including integrity of stored GxP data), customer safety and to maximise business benefits from the particular system.

Impact Assessment should be applied to new projects, change requests and re-validations of Computerised Systems.

Impact Assessment is the process by which Computerised System Items are identified and evaluated.  Impact Assessment will guide the level of validation that is required for the task at hand.  All validation requirements will be detailed or attached to the Validation Plan or Change Request forms and approved by the validation or change request committee respectively.

The extent of validation to be performed will reflect the novelty or complexity of an Item (i.e. whether it is a standard Item in wide-use, or a purpose-built development with no history) and its GxP Impact (i.e. Direct, Indirect or None).  From these factors, a risk profile can be developed.  This allows the project team to provide an objective approach to support the validation requirements.

Definition

Business System Owner – The Manager of the functional department using a Computerised System.

Computerised System – A grouping of Items that interact electronically (with one or more members of the group). The “group” may also be one stand-alone device. Also see “System Boundary”.

Firmware – Software on a silicon chip.

Functional Testing – Testing that compares the expected output of a function with the actual output of that function when a known input is provided. Test cases include normal and abnormal inputs. This method only qualifies the particular function tested.

Function – An aspect of the internal operation of the Item (listed on the Impact Assessment form and rated for GxP Impact).

GAMP – “Good Automated Manufacturing Practice” – guidance developed by ISPE for Computerised System Validation.

GEP – “Good Engineering Practices” – established engineering methods and standards that are applied throughout the project lifecycle to deliver appropriate, cost-effective solutions.
GxP – Generic abbreviation for ‘Good ….. Practices’ related to Medicinal Products. Includes:

GMP – “Good Manufacturing Practices”, defined by regulatory agencies.

GLP – “Good Laboratory Practices”.

Item – An individual inventory item of a Computerised System.

Source Code – The human-readable form (programming language) of a computer program. Source code must be translated into a machine-readable form before the computer can execute it.

Structural Testing – Examining the internal structure of the source code against general programming standards and specific system requirements.  Requires specific computer science and programming expertise.  Structural testing chiefly occurs during software development.

System Boundary – A limit drawn around the Computerised System to logically define what is, and is not, included in the system. A system may interface with other systems of different functionality without being regarded as an integral part of those systems (e.g. to pass data one way).

Related Documents

Form 705 – Impact Assessment Form for Computerised Systems
VAL-005 – Validation – Concept and Procedure
VAL-040 – Computerised Systems Validation
VAL-030 – Equipment Specification & Qualification

EHS Statement

While this guideline refers to product quality, consideration should also be given to requirements for safe equipment operation and building these issues into the User Requirement Specification phase.  It may be appropriate to increase validation requirements if computerised systems perform a safety function.

1. Responsibility

The Business System Owner, Change-Control Coordinator or Project Coordinator is responsible for carrying out Impact Assessments during new projects, Change Requests and re-validations of computerised systems.

It is also the responsibility of the Business System Owner or project coordinator to have relevant representatives from Engineering, IS and Quality Assurance (Validation) review (or input to) this Impact Assessment.

2. System Identification

A Computerised System usually includes multiple Items that interact electronically, but it may be a single Item acting in isolation.  The Master Inventory Item in the Computerised System is identified, based empirically on the ‘seniority’ of its controlling function and its direct electronic contact with the majority of related Items within the Computerised System.  It is usually a PLC, where one is present.

 The system boundary is defined by listing all of the related items of the system on the Impact Assessment form (Form 705) for the Master Inventory Item.

Below are a few examples of computerised systems.

– Solution Preparation PLC
– Autoclave Controller
– Process Monitoring System (e.g. SCADA)
– Laboratory Computerised System (e.g. HPLC)
– Purified Water System PLC
– HVAC Controller

 

Figure 1: An illustration of the structure of a Computerised System – a Finishing Line PLC

3. Criticality Assessment (Quality Impact)

Validation is concerned with assuring that our manufacturing processes, activities and systems deliver product that reliably meets quality standards.  Product Quality attributes are Identity, Safety, Efficacy, Purity, and Evidence.  Examples of quality characteristics are shown in the table in Section 10.  Consideration of the significance of the electronic records generated may influence the Criticality Assessment, as outlined in Section 11.

3.1. Direct Impact

Impact Assessments look individually at the Items within Computerised Systems to evaluate the effect of their Functions on product quality.

The following questions assess whether an Item/Function has a “Direct Impact” on the quality of a product/process or the integrity of stored data.
(If the answer to any question is “YES”, the Item/Function has Direct Impact.)

The Criteria below should be used to assist in formulating a judgement based on the comprehensive understanding of the product, process and the nature of the system.  They should NOT be used to replace the exercise of professional judgment by appropriately qualified personnel.

1.     Is the Item used to demonstrate compliance with the registered process?
2.     Will the normal operation or control by the Item have a direct effect on the product/process quality (including ingredients and product components)?
3.     Will failure of the Item or its alarms have a direct effect on product quality or efficacy?
4.     Is information from this Item recorded as part of batch record, lot release data, or other GMP related documentation?
5.     Does the Item control critical process functions that may affect product quality (and there is no independent verification of the performance of the control system)?
6.     Is the Item used in analytical tests associated with a product specification, analytical method or compendia?

7.     Is the Item used in Cleaning or Sterilising (e.g. Purified Water, Clean Steam, Autoclave)?

8.     Does the Item control or measure the preservation of product status (e.g. “Sterile”, “Clean”)?

9.     Does the Item produce data which is evaluated to accept or reject product (e.g. analytical results, chart records, printouts or process reports)?

10.   Does the Item interact or integrate with other GxP systems (i.e. creation or transfer of GMP data)?

11.   Is the Item relied upon to protect patient safety in case of a product quality failure (e.g. Distribution records, Quarantined materials records)?

Note 1 – The presence of any one Direct Impact Function within an Item results in the Item being regarded as Direct Impact. Direct Impact Items may also contain Indirect Impact and No Impact Functions.

Note 2 – Component criticality may be multi-faceted. For instance, an instrument that is used for engineering diagnostics (i.e. non-product quality related) may be installed in contact with the product.  In this case the installation and construction materials must be qualified as part of the Installation Qualification and Validation Plan or Change Request, while the functionality of the instrument as an Item of a Computerised System is No Impact.

Note 3 – The Criticality assessment may be used to reduce the scope of validation of Items and their functions. Validation must address all Direct Impact functions of Items.  Other Items/functions may be qualified at a lower level.  GEP as described in Section 8.1 is the minimum requirement.
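The decision rule above (any “YES” answer gives a Direct Impact rating) can be sketched as follows.  This is an illustrative sketch only: the abbreviated question wording and the function name are invented for the example and are not part of the procedure.

```python
# Illustrative sketch of the Direct Impact decision rule (Section 3.1):
# the Item/Function is rated Direct Impact if ANY question is answered "YES".
DIRECT_IMPACT_QUESTIONS = [
    "demonstrates compliance with the registered process",
    "operation/control directly affects product or process quality",
    "failure of the Item or its alarms directly affects quality or efficacy",
    "information recorded in batch records or other GMP documentation",
    "controls critical process functions without independent verification",
    "used in analytical tests tied to a specification or compendium",
]

def assess_impact(yes_answers):
    """Return 'Direct' if any listed question is answered YES; otherwise the
    Item must be assessed further against the Indirect/No Impact criteria."""
    if any(yes_answers.get(q, False) for q in DIRECT_IMPACT_QUESTIONS):
        return "Direct"
    return "Assess against Indirect / No Impact criteria"

# Example: an autoclave controller whose cycle data forms part of the batch record.
answers = {"information recorded in batch records or other GMP documentation": True}
print(assess_impact(answers))  # Direct
```

As Note 1 states, a single Direct Impact Function is enough to rate the whole Item as Direct Impact, which is why a simple “any YES” rule suffices here.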

3.2. Indirect Impact

Items/Functions may be assessed as having an Indirect Impact on product quality, where:

a. The Item’s operation supports a Direct Impact Item (e.g. an actuator in a critical-control process).

b. The Item’s functionality is important to maintain GxP compliance but is not strictly required for the assessment or release of product (e.g. Thermocouple Validation System, Training Records database).

c. Another Item independently verifies the quality function.  For instance, a Label printer has Indirect Impact when independently verified by a (Direct Impact) vision system, or by 100% manual inspection.

For an Item to have an overall rating of Indirect Impact it must not perform any Direct Impact functions.

3.3. No Impact

Some functions have no, or negligible, effect on product quality and can be rated No Impact.  Typically, these are used for activities such as simple relocation of materials or product, provision of non-GxP services for operations, or data display for non-GxP purposes, e.g.

a. Most Servo Drives,

b. Most HVAC systems,

c. Some Operator Panels,

d. PLCs for operation of boilers, robots, granulate systems, vacuum supply, etc.

4. Complexity Assessment (GAMP Categorisation)

All Items can be categorised into one of the following five categories. The GAMP category reflects the degree of novelty or complexity of an Item and will influence which validation activities are applicable.

4.1. Category 1 – Operating Systems (Compilers and System Configuration Files)

Specific validation of commercially available operating systems (and compilers), which are established in the market, is not required.  The validation of the application software running on the operating system is considered to validate the operating system.  Operating systems rely on system configuration files that can impact on system performance and data usage and therefore should be recorded.

Typical examples:   Windows NT and Unix

4.2. Category 2 – Firmware (Standard Instruments, Micro controllers, Smart Instrumentation)

This category is essentially hardware with onboard firmware that cannot be programmed by users but can be configured to set up a run-time environment and process parameters.  Custom firmware should be considered Category 5.

Typical examples:   Printer, Barcode Reader, Check Weigher.

4.3. Category 3 – Standard Software Packages (Commercial Off-The-Shelf or COTS)

COTS packages are items that are exposed to high volume use in the marketplace, such that validation of the package itself is not required.  COTS packages are not configured to define the business or manufacturing process, apart from establishing the run-time environment (e.g. network and printer connections).  Process parameters may be input into the application.  Supplier audits may be needed for highly critical or complex applications or where experience with the application is limited.

Typical examples: 

– Excel, Word (used as word processors),

– Artwork Generation packages,

– Statistical Analysis packages,

– Diagnostic tools.

4.4. Category 4 – Configurable Software Packages

These packages are also widely-used but provide the ability for significant tailoring of functionality to suit the specific requirements of a business or process.  The package provides a number of standard modules, functions and interfaces which can be tuned, selected or assembled as required.  The standard elements being configured would each typically contain significant operational depth and their configuration would be a high-level activity.  (Some packages permit the development of fully customised modules.  These developments should be managed as Category 5.)  Category 4 packages normally require a vendor audit to be performed for critical and complex operations with emphasis on design qualification of the package (documented evidence of a quality approach to system development and structural testing).  The outcome of the audit may dictate the testing approach required at the user site, and this should form the basis of a validation rationale.

Typical examples:   

– SCADA,

– Building Management System

– Unsophisticated Excel spreadsheets, e.g. Particulates results, Cleaning criteria, Bio-burden graphs.

– Autoclave Control System (as-standard from Original Equipment Manufacturer but utilising a configuration file of cycles)

– Filter Integrity Test System

– Access Databases

– Configured Operator Panels

– HPLC (laboratory instrumentation),

4.5. Category 5 – Custom (Bespoke) Software (Custom-Built Items)

For custom-built items (built to a user specification) the validation approach follows the full life-cycle model.  These items are often complex, consisting of several components spanning many of the GAMP categories.

If a third party supplier is providing the custom-built item, an audit of this supplier is recommended to examine their quality systems.  A Validation Plan can be more appropriately developed after reviewing the audit report, as the amount of validation activity is often dictated by the information obtained from the supplier.

If a third party supplier has developed the software as part of a standard machine and there is evidence of previous testing and adequate software development standards it may be appropriate to designate the item as Category 4.  Before this classification, it is important to determine the scope of any ‘customisation’ to suit user specifications, especially where these may be critical functions.

Typical examples:  

– Solution Preparation Control System

– Packing Line Control System

– Robot PLCs

– Servo Drives

– Automated Dispensary System

– Spreadsheet applications which manipulate data, use custom macros, sophisticated logic or lookup functions, e.g. Assay Calculator

4.6. Combining GAMP Ratings

Many complex Items consist of a combination of GAMP ratings.  Thus, different components of an Item may require varying degrees of validation.  The highest category number established dictates the overall validation requirements for the Item (this already incorporates the requirements of lower level category components).

For instance, consider a typical PC-based item which includes a customised application:

Figure 2: Technology Level Vs GAMP Ratings

Each operational level in the Item has a corresponding GAMP rating.  The level with the highest rating determines the overall Item rating; in this case it would be GAMP 5.  Changes to lower levels should be assessed on the basis of their GAMP rating; however, their interaction with higher levels within the Item must also be considered.

Note that the Data level has no GAMP rating.  If this level, however, has a GMP role it may determine the overall Impact of the system (i.e. Direct).  Examples of ratings for various types of records are shown in Section 11.  Data layers with GMP Impact require control measures to preserve their documentation attributes (i.e. accuracy, authenticity, availability and integrity).  These control measures should be recorded on the Impact Assessment form and included in the Validation Plan.
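As a sketch of this combination rule, the overall GAMP rating of a multi-level Item is simply the maximum of its components’ ratings, with unrated levels (such as the Data level) excluded.  The component names below are illustrative, not prescribed by this guideline.

```python
# Sketch of Section 4.6: the highest GAMP category among an Item's
# components dictates the overall validation requirement for the Item.
def overall_gamp_rating(component_ratings):
    """Take the maximum category, ignoring unrated levels (e.g. Data)."""
    rated = [r for r in component_ratings.values() if r is not None]
    return max(rated)

# Illustrative PC-based Item with a customised application (cf. Figure 2).
pc_item = {
    "Operating system (Windows NT)": 1,   # GAMP 1
    "COTS utility package": 3,            # GAMP 3
    "Configured SCADA package": 4,        # GAMP 4
    "Custom application code": 5,         # GAMP 5
    "Data level": None,                   # no GAMP rating; assessed for GMP Impact
}
print(overall_gamp_rating(pc_item))  # 5 -> Item treated as GAMP 5
```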

5. Overall Risk-Profile Classification

Combine the Impact rating and GAMP category to determine a Validation strategy (C number):

Impact Assessment (Criticality):

– No Impact on GxP Functions: no impact on the performance or operation of GxP Functions.

– Indirect Impact on GxP Functions: Items that may affect the performance or operation of other Items which have Direct Impact on GxP Functions.

– Direct Impact on GxP Functions: Items that have a direct effect on the performance or operation of GxP Functions.

GAMP Category (Complexity)                  No Impact    Indirect Impact    Direct Impact
1. Operating systems                        C1           C1                 C1
2. Firmware (instruments and controllers)   C1           C1                 C2
3. Standard packages                        C1           C2                 C3
4. Configurable packages                    C2           C3                 C4
5. Custom-built                             C3           C4                 C5

Validation strategies:

– C1 Validation: Record Version & GEP Functional testing.

– C2 Validation: Record Configuration and Version No. & GEP Functional testing.

– C3 Validation: Minimal Functional testing.

– C4 Validation: Some Functional, Minimal Structural testing.

– C5 Validation: Extensive Functional, Extensive Structural testing.
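The classification in this section amounts to a lookup from (GAMP category, Impact rating) to a C number.  A minimal sketch, with illustrative names:

```python
# Section 5 matrix: (GAMP category, Impact rating) -> validation strategy.
RISK_MATRIX = {
    1: {"No Impact": "C1", "Indirect": "C1", "Direct": "C1"},
    2: {"No Impact": "C1", "Indirect": "C1", "Direct": "C2"},
    3: {"No Impact": "C1", "Indirect": "C2", "Direct": "C3"},
    4: {"No Impact": "C2", "Indirect": "C3", "Direct": "C4"},
    5: {"No Impact": "C3", "Indirect": "C4", "Direct": "C5"},
}

def classification(gamp_category, impact):
    """Return the C number for an Item's complexity and criticality."""
    return RISK_MATRIX[gamp_category][impact]

# e.g. a Direct Impact configurable package:
print(classification(4, "Direct"))  # C4
```

Note that complexity and criticality reinforce each other: the C number rises along both axes, so the most validation effort falls on custom-built, Direct Impact Items.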

6. Documenting the Impact Assessment Process

Form 705 is used to document Impact Assessments of computer inventory Items, and to identify potential impact on product quality, on the manufacturing process, or on the integrity of stored GxP data.

Per Item:

– An Impact Assessment is documented for each Item on the inventory list.

– The form for each Item also references the Master Item in its Computerised System.

– The Impact Assessment for the Master Item in each Computerised System also lists all of the other Items in that Computerised System.

– Each Item may consist of several internal Functions.  These are listed on the Impact Assessment, along with their GMP significance.  This includes the presence of user-configurable files, e.g. recipes, databases, vision system models, etc; and the generation of GxP data /files.

– Interfaces with Item(s) in other Computerised Systems are listed, i.e. where some electronic interaction exists but the other systems have a clearly separate functionality.

– Security attributes (controls on file access/modification) of configuration files used by the Item and of data files generated by the Item should be recorded.

– The highest GAMP Category and Criticality Rating for the Item is recorded.

– Circumstances which may justify variations from the validation requirements are recorded.

Hardware, operating systems and system configuration files are listed in IQ documentation.  Accordingly, these do not need to be listed on Impact Assessments.  Changes to hardware, operating systems or system configuration files can affect the operation of computerised items and this needs to be assessed in Change Requests.

Impact Assessments can be filed in hard copy by the Validation Officer.  Electronic images of Assessments can also be attached.

 Figure 3: Flow-chart of the Impact Assessment updating process:

Once the overall risk-profile classification has been determined, as described in Section 5, the Validation Requirements described in Section 7 can be cross-referenced for the particular Item. For example:

Impact Assessment for Items of a Computerised System

Computerised System Name: Building Management System (with 4 Items)

1st Item: Direct Impact, GAMP 3 – C3 Validation: Minimal Functional

2nd Item: Direct Impact, GAMP 4 – C4 Validation: Some Functional, Minimal Structural

3rd Item: Indirect Impact, GAMP 1 – C1 Validation: Record version & GEP Functionality

4th Item: No Impact, GAMP 1 – C1 Validation: Record version & GEP Functionality

This figure illustrates that the validation requirements within the Computerised System will vary with the criticality of each Item.  The validation efforts should appropriately address the risk of each Item.

7. Validation Requirements

7.1. Standard Requirements

The following table is to be used as a guide for determining an appropriate Validation strategy. The table is not intended to be entirely prescriptive. The Validation Plan (or Change Request) is used to justify any variations. Typical examples where non-standard approaches are justified are outlined in the next section.

 

ASSESSMENT

COMPUTER VALIDATION REQUIREMENTS
C1 Validation

Record Version & GEP Functionality

Installation Qualification

Record Version

Some functional testing should be performed during commissioning, in line with Good Engineering Practices.

C2 Validation

Record Configuration setting and Version & GEP Functionality

Installation Qualification

Record name of operating system

List Software and Version No.

List of Hardware

Configuration Setting – record version

Some functional testing should be performed during commissioning, in line with Good Engineering Practices.

C3 Validation

Minimal Functional Testing

 Supplier Audits:

May be requested for critical and complex operations.

 

Validation Plan 

Standard Validation Plan applies here

 

Design Qualification – Validation

User Requirement Specification

Functional Specification

 

Installation Qualification

Record name of operating system

List Software and Version No.

List of Hardware

Configuration Setting – record version

Some Functional Testing should be performed during commissioning, in line with Good Engineering Practices.

 

Operation Qualification – Validation

Minimal Functional Testing (may be combined with IQ, but requires QA sign-off).

 Validation Report
C4 Validation:

Some Functional, Minimal Structural Testing

Supplier Audits

Normally done for critical and complex operations. The findings from the audit may reduce the amount of internal verification and validation carried out by the site.  This will depend on the Supplier having a Quality Management System in place or a recognised third-party certification such as ISO 9001.  It should also be based on the site’s history with this supplier.

 

Validation Plan

A Computer Validation Plan may be required for a Custom-Built System that requires the site to manage the Life Cycle Development; otherwise the standard Validation Plan will be applied.

 

Design Qualification – Validation

User Requirement Specification

Functional Specification

 

Installation Qualification

Record name of operating system

List Software Version No.

List of Hardware

Configuration Setting – record version

Verification of source code availability

Review / Verification of Supplier Testing (Software Integration).

Structural Testing – Minimal (Source code inspection)

Some Acceptance Testing (e.g. FAT / SAT) should be performed, along with commissioning, in line with Good Engineering Practices.

 

Operation Qualification – Validation

Some Functional Testing (may be combined with IQ, but requires QA sign-off).

 

PQ – Validation

Final User Testing

 

 Validation Report
C5 Validation:

Extensive Functional, Extensive Structural Testing

Supplier Audit

The findings from the audit may reduce the amount of internal verification and validation that is carried out by the site.  This will depend on the Supplier having a Quality Management System in place or a recognised third party certification such as ISO 9001, and/or a good history of supply with the site.

 

Validation Plan

A Computer Validation Plan may be required for a Custom-Built System that requires the site to manage the Life Cycle Development; otherwise the standard Validation Plan will be applied.

 

Design Qualification- Validation

User Requirement Specification

Functional Specification

Hardware Design Specification

Software Design Specification

Software Module Specification

 

Installation Qualification

Record name of operating system

List Software Version No.

List of Hardware

Configuration Setting – record version

Verification of source code availability

Review / Verification of Supplier Testing (includes Module Testing and Software Integration Testing).

Structural Testing – Extensive (Source code inspection).

Acceptance Testing (e.g. FAT, SAT) and commissioning should be performed, in line with Good Engineering Practices.

 

Operation Qualification – Validation

Hardware Testing

Functional Testing (Extensive)

 

Performance Qualification – Validation

Final User Testing

Validation Report

7.2. Examples Justifying a Non-standard Approach

On occasions the Validation strategy might vary from that in the above table.  Variations may be justified by business requirements or in response to risk.  Some examples include:

– Custom-built Firmware.  Where purpose-built firmware is used it is more appropriate to treat this at a higher GAMP level (e.g. GAMP 5).

– Customised Alterations.  Where an existing program is modified the overall rating would likely be GAMP 5.  The standard approach for this rating (e.g. ‘Extensive Functional, Extensive Structural’) need not necessarily apply to the entire program.  Rather, this higher level of validation attention should be focused on the sections being altered and their interface with the remaining program.  The unaffected areas of the software may be treated according to the rating they would have received without the customisation.  Such an approach relies on the changes being well constrained and the original program being well described.

– Copies of Customised Software.  Where customised code is re-used it may be possible to refer to some previous test results without repeating these tests.  For this to be appropriate the requirements of the original and new systems must be highly similar.  Typically, the need for test duplication varies with the level of detail being assessed; low-level tests (i.e. sub-routine functions) are less likely to require repetition than high-level tests (i.e. overall system operation).  Similarly, modules that are commonly-used within a program do not require detailed testing with all input and output combinations.

– Supplier Quality Systems.  Where a Vendor Audit demonstrates that a supplier has a well-developed Quality System, the extent of confirmatory testing required to be generated by the site can be reduced.  Tests recorded on protocols developed by suppliers do not require copying to site formats (so long as the content is appropriate).  Conversely, where there is reason to be concerned about the assurance provided by a supplier, additional testing and input by the site may be necessary.  For maximum benefit, any extra involvement from the site should be provided as early as possible within the Development Lifecycle.

– Other Compliance Requirements.  Where assurance of software performance is required for other reasons (e.g. compliance with EHS regulations) additional testing might be considered.  Such testing may utilise the formats and structures of Validation protocols, as appropriate.

8. Qualification

As illustrated by the example in Section 6, the overall qualification of the Computerised System comprises the validation activities corresponding to the categories of the individual Items, once the criticality and the complexity of the Items are established (refer to Section 5).

8.1. Good Engineering Practices

All systems, regardless of their Quality Impact, are to be supplied and developed in accordance with Good Engineering Practices (GEP).  For many systems there will be no separate requirement for Validation and GEP alone is sufficient.  Evidence of GEP includes:

a. Developments are designed or specified against agreed requirements

b. Competent personnel (including contractors) are selected for the task

c. Full consideration is given to EHS, Operating, Maintenance and Standards requirements

d. Completed works are inspected, tested, commissioned and recorded appropriately.

The “GEP-alone” approach does not imply an absence of documentation; rather there is a reduced need for review and approval of this documentation by Quality Assurance personnel.

8.2. Structural Verification

Structural verification involves inspection and assessment of the actual source code by a suitably qualified person (who is not the programmer).  Documentation used to support verification includes logic diagrams, descriptions of modules, definitions of all variables and specifications of all inputs and outputs.  Structural verification is used to assess:

1. General application of good programming standards and practices.  Some programming practices are known to be associated with operational failures, or to hamper ongoing maintenance.  Depending on the software being programmed, the following are examples of issues that might be verified:

a. Code layout is logical and the flow is easy to follow, including adequate comments that aid understanding by others who may need to maintain it in the future.

b. Dead code and open-loops have been removed or commented out,

c. Variables are sensibly named.

d. Confounding special values are excluded, e.g. division by zero or the square root of a negative number.

e. Operations with invalid or missing data are prevented (e.g. by checking for empty strings, correct data type, values within limited range)

f. Parameters are initialised to a known value prior to use to prevent unexpected results.

g. The program operates safely when abnormal (error) conditions occur.

h. Invalid, illegal or adverse conditions such as alarms, alerts, errors and hardware failures, are identified and highlighted to the user in a way that allows appropriate response.

i. Databases and fields are correctly indexed.

j. Memory allocation/de-allocation does not conflict with the requirements of the operating system or of other programs likely to be in operation.

The issues for assessment may be defined in a checklist and should be understood and agreed with the vendor beforehand. A vendor’s quality system would normally generate such evidence.
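A hypothetical fragment illustrating checks d, e and f from the list above (special values excluded, invalid or missing data rejected, parameters initialised before use).  The function name, inputs and range limits are invented for illustration only.

```python
# Hypothetical calculation illustrating structural-verification checks d-f.
def potency_ratio(assay_result, reference_value):
    ratio = 0.0  # (f) parameter initialised to a known value before use
    # (e) operations with invalid or missing data are prevented
    if assay_result is None or reference_value is None:
        raise ValueError("missing input value")
    if not (0.0 <= assay_result <= 200.0):  # (e) value within a limited range
        raise ValueError("assay result outside expected range")
    # (d) confounding special value excluded: divide-by-zero
    if reference_value == 0:
        raise ZeroDivisionError("reference value must be non-zero")
    ratio = assay_result / reference_value
    return ratio

print(potency_ratio(98.5, 100.0))  # 0.985
```

A structural review would confirm that each such check exists, is reachable, and raises an error the calling program handles safely (item g above).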

2. Conformance with the specified requirements and design:

a. None of the stated requirements are missing, incorrect or incomplete.

b. Critical algorithms and calculations (i.e. Direct or Indirect Impact) are correctly coded.

c. Program paths follow their specified sequences.

d. Branching logic aligns with the specified conditions.

The specification documents are the appropriate reference for this assessment.  Emphasis should be placed on areas of the program with the most critical role.  Early inspection can avoid delays during testing; however, errors can be difficult to detect until the code is used.

Structural verification should also be used to inform the range of cases to be challenged in Functional Verification.  The results of structural verification must be documented.

8.3. Functional Verification

Functional testing is a thorough and systematic comparison of the observed output values of a program with the expected output values, for a specified range of defined input values and other specified parameters.

8.3.1. Minimal Functional Testing

This testing level ignores the internal mechanism or structure of a software system and focuses on the outputs generated in response to selected inputs and execution conditions (sometimes this is described as “Black-box” testing).

An example is evident from the Figure below.

Figure 3:  Minimal Functional Testing (Signal “Level too high”  >>  Alarm for “High Level”)
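In code, minimal functional (black-box) testing treats the module as opaque: only selected inputs and the corresponding expected outputs are compared, with no reference to internal structure. The alarm function and set-point below are hypothetical stand-ins for the signal-to-alarm logic in the figure.

```python
HIGH_LIMIT = 80.0  # assumed alarm set-point, for illustration only

def level_alarm(level):
    """Hypothetical module under test: True when the level exceeds the high limit."""
    return level > HIGH_LIMIT

# Black-box test: expected outputs for a specified range of input values.
test_cases = [
    (50.0, False),   # normal operating level
    (80.0, False),   # exactly at the limit: no alarm
    (80.1, True),    # just above the limit: alarm raised
    (120.0, True),   # far above the limit: alarm raised
]

for level, expected in test_cases:
    observed = level_alarm(level)
    assert observed == expected, f"level={level}: got {observed}, expected {expected}"
```

Note that boundary values (at and just above the limit) are included, since that is where black-box testing most often finds defects.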

8.3.2. Some Functional Testing

This testing takes into account the structure of a software system.  Types include branch testing, path testing and statement testing for the ‘Direct Impact’ or ‘Indirect Impact’ functions or modules.  Test cases shall be generated to ensure that selected branches/paths/statements are executed.  This is illustrated in the figure below.  Test cases A-D (4 cases) are required to test this module.

Figure 4:  Some Functional Testing
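The four-case count (A–D) follows from combining two independent branch conditions, since 2 × 2 = 4 input combinations. A minimal sketch, with hypothetical interlock conditions chosen only for illustration:

```python
def interlock(door_closed, pressure_ok):
    """Hypothetical module with two branch conditions.

    Exercising every combination of the two conditions requires four
    test cases: A (T,T), B (T,F), C (F,T), D (F,F).
    """
    if not door_closed:          # branch 1
        return "STOP: door open"
    if not pressure_ok:          # branch 2
        return "STOP: pressure fault"
    return "RUN"

# Test cases A-D cover every combination of the branch conditions.
assert interlock(True, True) == "RUN"                    # case A
assert interlock(True, False) == "STOP: pressure fault"  # case B
assert interlock(False, True) == "STOP: door open"       # case C
assert interlock(False, False) == "STOP: door open"      # case D
```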

8.3.3. Extensive Functional Testing

Similar to “Some Functional Testing”, this testing takes into account all branch testing, path testing and statement testing for the majority of modules developed.

In certain circumstances, the site may accept statements of assurance from suppliers, based on user experience or software development standards, together with any provisions required from the supplier to satisfy the site’s requirements.

8.4. Vendor Audit Report

Any contract software developers, and software developers of custom-built systems, should be audited using a supplier audit document based on GAMP, ISO and industry knowledge.  The audit report provides assurance of the quality of the supplied package and may be used to support a reduction of qualification testing through the use of GEP.

8.5. Test Environment Operations

Test environments for Functional Testing may be used to increase assurance.  A test environment enables functional testing without the inherent risks of a working plant, for example by challenging branch conditions that are physically difficult to generate.  The test environment, however, must be carefully designed to be representative of the final installation, and it is important that software functionality is maintained during and after the testing.  GEP requires that final verification occurs on the actual equipment, although the operation of a test environment may permit the scope of this testing to be more focused.
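A test environment can be approximated in software with a stub standing in for the plant signal, letting tests drive branch conditions (such as a sensor failure) that would be difficult or risky to generate on the real equipment. All names and values here are illustrative assumptions, not part of the procedure.

```python
class SimulatedLevelSensor:
    """Test double standing in for the plant level transmitter."""

    def __init__(self, reading):
        self.reading = reading

    def read(self):
        if self.reading is None:
            # Condition that is hard to create safely on the real plant.
            raise IOError("sensor failure")
        return self.reading


def check_level(sensor, high_limit=80.0):
    """Control logic under test: returns an alarm state for a sensor reading."""
    try:
        return "HIGH ALARM" if sensor.read() > high_limit else "OK"
    except IOError:
        return "SENSOR FAULT"  # abnormal condition handled safely


# The simulated environment lets all three branches be challenged off-line.
assert check_level(SimulatedLevelSensor(95.0)) == "HIGH ALARM"
assert check_level(SimulatedLevelSensor(40.0)) == "OK"
assert check_level(SimulatedLevelSensor(None)) == "SENSOR FAULT"
```

Consistent with GEP, such a simulation narrows the scope of, but does not replace, final verification on the actual equipment.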

9. Figure 5:   Impact Assessment Process Overview

10. Typical Product Quality Characteristics

Examples only

Quality Attribute | Tablet / Solid Dose Product | Sterile Liquid Product
Identity | Correct label | Correct label
 | Correct packaging material | Correct packaging material
 | Correct packaging components | Correct packaging components
 | Correct Batch & Expiry details | Correct Batch & Expiry details
 | Readable Batch & Expiry details | Readable Batch & Expiry details
 | Correct patient information | Correct patient information
Safety | Reject segregation | Reject segregation
 | Package security & integrity | Package security & integrity
 | Stable product | Stable product
 |  | Container closure integrity
Efficacy | Availability (Disintegration / dissolution) | Strength / Concentration
 | Correct dose | Correct dose
 | Physical integrity | 
Purity | Chemical purity | Chemical purity
 |  | Sterility
 |  | Apyrogenicity
 |  | Low particle count
 |  | Biological purity
Evidence | Batch Records | Batch Records
 | In-process testing | In-process testing
 | Materials Specifications | Materials Specifications
 | Laboratory Test Results | Laboratory Test Results
 | Pack appearance (some markets) | Pack appearance (some markets)
 | Traceable product components | Traceable product components
 | Process stability | Process stability
 | Process Instructions | Process Instructions
 | Operating Procedures (some) | Operating Procedures (some)
 | Product Distribution Records | Product Distribution Records

11. Impact Assessments for Record Types

The following table illustrates how various types of records might be assessed for Impact.

Type of Record | Impact | Comment / Justification
Adverse Event reports | Direct | Adverse Event management records are used to control potentially harmful product, implying Direct Impact for associated records.
Batch production records | Direct | These contain the final records documenting a decision to release product.
Calibration records | Direct, Indirect, No Impact | The Impact of these records depends on the criticality of the process for which the calibrated item is used and the presence of any independent confirmatory activity. For instance, records that are assessed as part of a batch release decision (e.g. probes which generate sterilising-cycle data, where a calibration failure may cause a review of product manufactured since the previous calibration) have Direct Impact. Other reports permitting engineering analysis or troubleshooting may have Indirect or No Impact.
Clinical Trials | Direct | Results demonstrate the safety and efficacy of products and set product performance standards for Regulatory Submissions.
Contracts / Technical Agreements | Direct, Indirect, No Impact | Agreements that set product quality requirements have Direct Impact. Equipment specifications that are necessary to meet product quality requirements have Indirect Impact, or less.
Dispensing / weighing | Direct | These records form part of the batch records.
Distribution Records | Direct, No Impact | Records that support product return and recall have Direct Impact. Other records, such as intervening logistics, have No Impact (with the exception of controlled drugs).
Equipment cleaning records | Direct, Indirect | Inspectors can request these records during inspections, and problems related to cleaning (cross-contamination) can have serious consequences for product quality. The Impact of these records depends on the scope of any confirmatory activities (e.g. Quality Control checks before product release).
Equipment operating / cycle reports | Direct, Indirect, No Impact | The Impact of these records depends on the criticality of the process. For instance, records that are assessed as part of a batch release decision (e.g. sterilising-cycle, Filter Integrity Test) have Direct Impact. Other reports permitting engineering analysis or troubleshooting may have Indirect or No Impact.
Maintenance Records | Indirect | These records support machine capability and are evidence of a system of management control.
Master production records | Direct | These contain all the critical instructions and control points supporting product release decisions.
Material / Procurement records | Direct | These records support traceability of product ingredients and components and are therefore relevant to batch recall.
Material Specifications | Direct | These contain definitions of attributes that are critical to product quality.
Monitoring records | Direct, Indirect, No Impact | The Impact of these records depends on the criticality of the parameters being monitored. For example, microbiological and environmental performance could be Direct Impact (for a sterile area) or Indirect Impact (for secondary packaging and warehouse areas), while building management records of office environments have No Impact. Management reports on the progress of validation, internal audits or other investigations have No Impact.
Planning documents | Indirect, No Impact | This document type can have different impacts. Some schedules (e.g. cleaning, calibration or maintenance) are of interest to inspectors as evidence of GMP compliance. These can have Indirect Impact, as the absence of such plans may increase the risk of the company not achieving the GMP-required results. Other plans for management information, such as project plans, have No Impact.
QA Audits, Investigations (including Deviations) | Direct, Indirect | QA investigations, required by GMP to assess and improve an organisation’s Quality Management System, usually do not affect individual product quality decisions and have Indirect Impact. However, an investigation (e.g. into an Out-Of-Specification result) used in a batch release decision has Direct Impact.
Quality Control Analysis results | Direct | These records are used for critical release decisions.
Patient Records | Indirect | Traceability of Clinical Trial data.
Regulatory Submissions | Direct | These set product performance criteria and manufacturing standards.
Standard Operating Procedures (SOPs) | Direct, Indirect, No Impact | The criticality of an SOP depends on its subject matter. For example, SOPs that govern the validation of computerised systems are not as critical as SOPs that govern Quality Control operations (including final batch release). The criticality of a set of SOPs should be the same as that of the most critical GMP records they manage.
Training / personnel records | Direct, Indirect | While these records (and definitions of roles and responsibilities) are GMP requirements, most have limited impact on product quality. Critical decisions typically follow SOPs and involve more than one responsible person. Some specific training qualifications have Direct Impact on batch release, e.g. Sterile Operator, Authorised Person.
Validation documents | Indirect | Examples include Validation Plans, Protocols, Results and Reports. While the correct function of equipment and systems has immediate potential to create harmful product, GMPs require Quality Control checks before product release.

12. Summary of Changes

Version # | Revision History
VAL-045 | New