
VAL-040 Computer System Validation

Department: Validation/Technical Services          Document no: VAL-040
Prepared by:                Date:                Supersedes:
Checked by:                 Date:                Date Issued:
Approved by:                Date:                Review Date:

Document Owner

Validation Manager

Affected Parties

All Validation, Technical Services, Operations, Quality Assurance, Engineering and Project staff involved in computer validation projects.

Purpose

To outline the procedure to be followed for the Qualification/Validation of computerised systems.

Scope

This procedure applies to all computer systems (including embedded systems) directly associated with, or supporting, regulatory compliance requirements for the development, testing, manufacture and distribution of medicinal products.

Definition

21 CFR Part 11: Part 11 of Title 21 of the Code of Federal Regulations of the Food and Drug Administration of the USA.
Application Software: A program designed to perform a specific function for the user or, in some cases, for another application program.  Roles include data collection, data manipulation, data archiving or process control.

Computerised Systems: Computerised systems cover a broad range of systems, including automated manufacturing equipment, control systems, automated laboratory systems, manufacturing execution systems and computers running laboratory, manufacturing or distribution database systems.  A computerised system consists of hardware, software and any network connections, together with the controlled function and the associated documentation, that collectively perform a certain function.  The computer is just one part of the process.
Configuration: The documented physical and functional characteristics of a particular item or system.  A change converts one configuration to a new one.
Configuration Management: The process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the system life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of the configuration items.
Design: The process of transforming requirements into specified characteristics or into the specification of a product, process or system.
Design Specification: A complete definition of the equipment or system in sufficient detail to enable it to be built.  It is derived from the Functional Specification and links to the Installation Qualification (which checks that the correct equipment or system has been supplied and installed to the required standards).
Embedded System: A system, usually microprocessor or Programmable Logic Controller (PLC) based, whose sole purpose is to control a particular piece of automated equipment.
Firmware: Configurable software on a silicon chip.
Functional Specification: A description of the system in terms of the functions it will perform and the facilities required to meet the User Requirements Specification.  The supplier should write the Functional Specification in a way that can be understood by both users and developers.
Functional Testing: Testing that compares the expected output of a function with the actual output of that function when a known input is provided.  Test cases include normal and abnormal inputs.  Since this approach ignores the internal mechanism (i.e. coding) of a system, it is also known as "black-box" testing.  This method only qualifies the particular function tested.
Life Cycle Model: An approach to computer system development that covers the phases of Planning, Specification, Design, Construction, Installation, Testing, Acceptance, Operation and De-commissioning.  It begins with identification of the user's requirements, continues through design, integration, qualification, user validation, control and maintenance, and ends only when commercial use of the system is discontinued and its data is redeployed.
Quality: Conformance to specified requirements.
Quality Management System: A management system to direct and control an organisation with regard to quality.  A QMS formally documents roles, responsibilities and procedures.  Important elements include planning, performance standards, documentation, training, monitoring and assessment, and the handling of changes and deviations.
Qualification: The process to demonstrate the ability to fulfil specified requirements.
Review: An activity to evaluate how well the subject matter (e.g. a proposed system) meets all established objectives.  Review attributes include suitability, adequacy and effectiveness.
Risk: An exposure to the chance of future injury or loss.  The risk of an event may be measured by a combination of its severity and probability.  In this SOP, risk is mainly concerned with GxP non-compliance, either in actual characteristics (i.e. product safety, identity, strength, quality and purity) or in documented supporting evidence of conformance and traceability.  Other relevant risk types to consider include EHS and Business Impact.

Software: A collection of programs, routines and subroutines that controls the operation of a computer or a computerised system.
Source Code: A computer program presented in human-readable form (a programming language).  Source code must be translated into machine-readable form before the computer can execute it.
Structural Testing: Using computer-programming expertise to examine the internal structure of a system's source code.  Two aspects are of interest: the performance (e.g. correct branch conditions, algorithms, no "dead code") and the presentation (e.g. programming standards, commenting and structure).  Structural testing chiefly occurs during software development.
User Requirements Specification: A specification that describes what the user expects the system or equipment to do.  To achieve this, the document contains at least a set of criteria or conditions that have to be met.

Responsibilities

Information Services Department: Supervising and maintaining the environment that supports business-wide computer systems.  This consists of standard networked hardware and standard software.  Roles include data access and security; back-up; contingency and disaster recovery; and change control.  Changes with potential impact on compliance (GxP) require approval of the Business System Owner.
Electrical Services Department: Supervising and maintaining automated and embedded systems used within Production.  Roles include development of standards, calibration of instruments, back-up and recovery of applications, implementation of changes, and administration of SCADA security.
Project Managers / Change Coordinators: Ensuring that quality and compliance activities form an integral part of projects/changes throughout all phases.  This includes ensuring that documented evidence to support validation is generated, maintained, reviewed and approved for the computerised system being developed.
Business System Owners: Ensuring that the quality and compliance requirements for their systems have been met, that systems are correctly used and that system integrity is maintained during operation.  Roles include acceptance of the system; provision of training; and authorisation of user access, SOPs and changes.  The Owner also backs up programs and files where the system is not networked (e.g. embedded systems).  Responsibility for individual tasks may be delegated to users and others.
Users: Conducting tasks delegated by the Business System Owner, which may include contributions to system planning, specification, testing, acceptance, operation and maintenance.  Activities are to follow SOPs and policies for system use, security and change control.
Validation Committee: Approving planned Validation, and completed Validation, for Change Requests and Projects (see SOP VAL-005).
Validation Department: Monitoring compliance with Validation and Change Control procedures, and ensuring that training programs are established, followed and recorded.  Relevant documentation, such as the User Requirements Specification, Validation Plan, acceptance test plan, test reports and results, should be reviewed against site and regulatory requirements.  As a minimum, approval of the Validation Manager is required for the planning and conclusions of validation activities.

Related Documents

VAL-005   Validation – Concept and Procedure
VAL-010   Revalidation
VAL-025   Validation of Laboratory Instruments
VAL-030   Equipment Specification and Qualification
VAL-045   Impact Assessment for Computerised Systems
VAL-050   Computerised Systems – Functional Testing
VAL-055   Design Qualification Guidelines
VAL-060   Protecting the Reliability of Electronic GMP Records

EHS Statement

While this procedure is concerned with the compliance of a system with GxP regulations, the methodologies could also be useful for managing EHS risks.

Procedure

1. General Approach

Computerised System Validation is the process of:

“Establishing documented evidence which provides a high degree of assurance that a computerised system will consistently function in accordance with its pre-determined specifications and quality attributes throughout its lifecycle.”

The inherent complexity of software presents major challenges when attempting to establish such assurance.  It is usually impossible to test every potential combination of input, output and function of a system.  Consequently, it is difficult to gain assurance by depending on post-development testing and qualification activities alone.  The intention of this SOP is to build assurance by the management of:

Design – using a structured process to specify the intended outcomes of an activity

Quality – using a documented process to trace and approve the deliverables of each activity

Risk – taking an approach that focuses efforts on the areas that are most critical

Lifecycle – maintaining a system from its original concept to eventual retirement from use

Commonly used acronyms:

ER/ES   Electronic Records/Electronic Signatures
FAT     Factory Acceptance Test (i.e. Acceptance Testing at the supplier's premises)
FS      Functional Specification
GAMP    "Good Automated Manufacturing Practice", a guidance document published by ISPE
GxP     The range of good practices, i.e.:

· GMP (Good Manufacturing Practices)

· GLP (Good Laboratory Practices)

· GCP (Good Clinical Practices)

· GDP (Good Distribution Practice)

PLC     Programmable Logic Controller
SAT     Site Acceptance Test (i.e. Acceptance Testing within the site environment)
SDS     Software Design Specification
SMDS    Software Module Design Specification
URS     User Requirements Specification

2. Validation Process

The Validation process envisaged by this SOP is prospective and encompasses the entire life of the Computerised System, from initiation onwards.  It includes many activities in common with other Validation and Project activities.  A typical lifecycle is made up of Planning, Specification, Design, Construction, Testing, Installation, Acceptance, Operation and De-commissioning.  The activities in these phases are illustrated in Figure 1 and outlined below.  Appendix 1 also summarises these activities and refers to related SOPs and guidance within the GAMP document.  (Note: compliance with this SOP is the expected standard; GAMP is a useful reference, however other approaches may be acceptable.)

Figure 1: Typical Validation Process within Lifecycle

2.1. Planning

A Validation Plan must be produced for each GxP computerised system that requires computer validation.  The Validation Plan defines activities, deliverables, procedures and responsibilities for establishing that the system is fit for purpose and complies with regulatory requirements.  The Validation Plan should be maintained as a live document until issue of the Validation Report.  See SOP VAL-005 for further details of Validation Planning.

Not all systems require computer validation deliverables.  The level of validation activity depends on two attributes: the type of computer system and the potential impact of that system on GxP compliance.  SOP VAL-045 describes the process (Impact Assessment) to follow to determine the Validation deliverables.  This process must be completed prior to approval of the Validation Plan.

The scope and depth of Validation will also depend on the type of Quality Management System that the system supplier follows.  The Impact Assessment will highlight where a supplier audit is appropriate.  Information from independent audits can also be referenced.  Validation activities may be adjusted in accordance with the level of assurance that a supplier can provide.  It must be remembered, however, that post-development testing cannot deliver assurance where a supplier's processes are ill-disciplined.  For this reason, auditing should be completed before purchasing from a supplier.

The Validation Plan and Validation Test Protocols must address the functions of the system that have regulatory impact.  While the Impact Assessment will assist in determining the components and functions at a high level, complex systems may require more detailed consideration.  A Risk Assessment process is recommended to identify the specific areas to be addressed by the testing program and to plan a scaled response.  The identification of functions, and assessment of risks, can only be done as the requirements of the system are specified.  Consequently, the Planning phase will overlap with the Specification phase.  As the development proceeds and additional details are described, or requirements change, the Validation Plan and Protocols must continue to be updated.  A document (e.g. a Requirements Traceability Matrix) is recommended to maintain visible alignment between the various levels of requirements, specification, testing and deliverables.
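For illustration only, a fragment of such a traceability matrix might look as follows (the document identifiers are hypothetical):

    URS Ref    FS Ref    Test Protocol    Result
    URS-001    FS-1.2    OQ-TP-005        Pass
    URS-002    FS-2.1    OQ-TP-009        Pass
    URS-003    FS-2.4    OQ-TP-011        Fail (deviation raised)

Each row allows a reviewer to confirm that every requirement with regulatory impact is covered by an approved specification and an executed test.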

2.2. Specification

The starting point for development of a Computerised System is the User Requirements Specification (URS).  The URS defines what the system is required to do, as well as what it must not allow (where that is important).  The URS should be complete and reflect the intended functional use of the system; who will use the system; where it will be located; as well as more general needs, e.g. performance, security and regulatory compliance.  It is important that the URS conveys the user's needs and ideas to those persons responsible for system development.  Each requirement should be defined in a way that is testable and that highlights business and GxP criticality; for example, "the system shall record the identity of the user and the date and time of each batch release entry" is verifiable, whereas "the system shall be easy to use" is not.  Guidance on the preparation of URS for equipment is found in SOP VAL-030.

The URS may be a contractual document.  Users, or a third party (even a supplier), will write the URS for the Business System Owner, who will review and approve it.  It is recommended that the URS be reviewed by QA or Validation to ensure that regulatory requirements are adequately identified and are testable.  The URS should also detail the specific documentation required of the supplier to support validation.

Specifications continue to be a useful reference after the project is complete and need to be stored and maintained as live documents.

2.3. Design and Construction

During Design and Construction the developer should translate the functions defined in the URS into a computerised-system product through a series of documented, detailed software and hardware specifications.  The supplier is responsible for developing these documents; their titles and formats will depend on the nature of the system and the supplier's procedures.  Where the system is not complex, these specifications may be combined in one document; however, a suggested hierarchy of documents is that of Functional Specification and Design Specifications.

A Functional Specification is a more general response to the URS and describes the detailed functions of the system (i.e. what the system will do) in a manner that is understandable to the user.  Design Specifications translate the URS into full details for system developer use (i.e. how the system will work).  They should be derived from the Functional Specification and contain enough detail to enable the system to be built and maintained.  Various Design Specifications may be required to address separate aspects (e.g. software, hardware).  All specifications should be written in a manner that allows requirements to be verified (e.g. by inspection or testing).

Specifications should be subject to documented review processes prior to implementation.  Ideally this will include approval by site management.  The review should address the completeness of the design (i.e. are all URS functions addressed?) and the integrity of the design (i.e. what are the risks of potential failures in this design?).  SOP VAL-055 identifies a number of methods for Risk Assessment and Design Review that should be considered for use in this process.  It is expected that site analysis will be to the Functional Specification level; the supplier may manage more detailed analysis.
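As a minimal illustration of the severity-and-probability approach described under the Risk definition above, a design-review risk score could be derived as sketched below (the scales, categories and scores are hypothetical; the formal methods are given in SOP VAL-055):

    # Minimal risk-priority sketch; scales and categories are illustrative only.
    SEVERITY = {"minor": 1, "major": 2, "critical": 3}         # impact on GxP compliance
    PROBABILITY = {"rare": 1, "occasional": 2, "frequent": 3}  # likelihood of the failure

    def risk_priority(severity: str, probability: str) -> int:
        """Combine severity and probability into a single priority score."""
        return SEVERITY[severity] * PROBABILITY[probability]

    # A critical but rare failure mode scores 3; a major, frequent one scores 6
    # and would attract proportionately more design review and testing effort.
    assert risk_priority("critical", "rare") == 3
    assert risk_priority("major", "frequent") == 6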

Developers should establish suitable techniques to ensure that the developed or configured software meets the specified requirements.  They should implement a Quality Management System to monitor the software development and configuration; maintain specifications throughout the development; and control the deployment of version changes.  The supplier should also ensure that programming rules and conventions are followed.

A site representative should monitor progress of the development against an agreed timeline.

The Impact Assessment (SOP VAL-045) will identify where review of the actual code is appropriate.  This is known as Structural Testing and uses programming expertise to assess the code for compliance with technical (i.e. specified functionality) and quality (i.e. programming standards) requirements.  The system programmer must not perform the Structural Testing.

Regulatory authorities may expect to be able to inspect a copy of the source code for application software.  Availability may also be critical for long-term support, maintenance and enhancements. Arrangements should be made with the supplier regarding access to the source code.  The access arrangement should be recorded in the IQ report.

2.4. Testing, Installation and Acceptance

This phase is a planned process of challenging and evaluating a system, and its components, throughout its development.  The Impact Assessment, and any Risk Assessments, will guide the overall scope and depth of testing.  SOP VAL-050 describes the process for functional testing.  Each test should be part of an overall strategy designed to make the whole process coordinated, efficient and effective.  Since it is impossible to test every potential combination of input, output and function of a system, testing should be structured to:

a. Consider those aspects that are of critical importance

b. Specify what coverage can be achieved

c. Find errors in the software (not merely confirm correct operation in normal conditions).

Test Protocols (sometimes called Test Plans, Test Specifications or Test Scripts) should define in detail the areas to be tested, the test data to be used, and the expected results.  The Test Protocols must be reviewed and approved prior to commencing formal testing.  Tests are conducted at various levels, corresponding to the hierarchy of details developed in the specification phase.  Documented traceability should be provided between Test Protocols and their controlling documents, such as Functional and Design Specifications, to demonstrate complete coverage of specified requirements.  This is illustrated in Figure 2.
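As a minimal sketch of what a single protocol entry embodies (known input, pre-defined expected result, explicit pass/fail statement), the example below expresses one hypothetical test in code; the function under test, its inputs and the tolerance are all illustrative, and formal testing would follow the approved Test Protocol and SOP VAL-050:

    # Illustrative test case: known inputs, pre-defined expected output, explicit pass/fail.
    # dispense_weight_check() is a hypothetical system function under test.

    def dispense_weight_check(target_g: float, actual_g: float, tolerance_g: float) -> bool:
        """Accept a dispensed weight only if it is within tolerance of the target."""
        return abs(target_g - actual_g) <= tolerance_g

    def run_test(test_id: str, inputs: tuple, expected: bool) -> None:
        actual = dispense_weight_check(*inputs)
        verdict = "PASS" if actual == expected else "FAIL"
        print(f"{test_id}: inputs={inputs} expected={expected} actual={actual} -> {verdict}")

    run_test("TP-001", (100.0, 100.4, 0.5), True)   # normal input: within tolerance
    run_test("TP-002", (100.0, 101.0, 0.5), False)  # abnormal input: must be rejected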

Testing must be documented as it is performed and this raw data is to be referenced, dated and retained to demonstrate the testing was performed to an agreed standard.  Each test result should contain a clear pass or fail statement.  All results will be kept, as primary evidence of testing, and so should be filled out with care.

Figure 2: Traceability Between Documentation Deliverables

In general terms, the developer is responsible for unit, module and integration testing and should be satisfied that the software system delivers the functionality specified before formal Acceptance Testing begins.  Test reports should be produced, reviewed and approved within the supplier's Quality Management System.  The reports that are to be reviewed and approved should be identified in the Validation Plan.

On completion of supplier testing the system is installed, or formally transferred, into the pre-production environment (i.e. the same environment that will be used in production, however without exposing the system to commercial operation).  Final Acceptance, often called Site Acceptance Testing (SAT), occurs after installation.  Factory Acceptance Testing (FAT) by a site representative may also be appropriate prior to installation, especially for automated equipment.

Acceptance testing (in the form of an Operational Qualification Report) is the final opportunity for the Business System Owner (or their delegate) to verify that the system performs in accordance with the URS under simulated operating conditions.  It requires formal testing of the complete system over a range of operating conditions, according to an Operational Qualification Test Protocol.  This protocol must be reviewed and approved within the site (QA or Validation Department).

Since user testing addresses the URS, the approach is often that of Functional Testing.  Where possible, knowledge of the internal operation of the system should be utilised to enhance the testing and to determine the range of appropriate test conditions and inputs.

Where testing undertaken by the developer is planned, executed and documented to the same level as Acceptance Testing, this may be used to reduce the site-responsibility qualification activities.

The results of acceptance testing must be formally documented and should include as a minimum:

a. Documented installation of the system in the operating environments ('test' and 'production')

b. Evidence that the system meets the mandatory items of the URS

c. Evidence that the required operational procedures are in place and work as expected

d. Evidence that the required support procedures are in place and have been tested, in particular:

o    Backup and restore

o    Data archive and retrieve

o    Access security

e. Evidence that the users have been properly trained

f. Compatibility with other systems/architectures in the production computing environment.

The expected format for this information is IQ, OQ and PQ reports (shown on Figure 2).  SOP VAL-005 provides additional guidance in the preparation of these reports.

Note that the testing process for an automated system (e.g. manufacturing equipment) will usually encompass more than the computerised component.  In this case it is not necessary to produce separate IQ, OQ and PQ reports.  It is necessary however to ensure that the requirements for Computerised System Validation are fully addressed within the scope of the Equipment Validation.

The completion of user acceptance must be formally documented in a Validation Report.  The Validation Report summarises the outcome, including anomalies, deviations or exceptions, of all activities and deliverables defined in the Validation Plan.  It must also specifically document that the system may be accepted into operational use.  This document shall be formally reviewed and approved by the Validation Committee, and the Business System Owner, prior to the system being used.

2.5. Operation

Once a system is approved for use, the operational phase commences.  Since the system and its environment are dynamic, maintaining a validated state requires compliance with formal controls (SOPs).  The required controls and support processes must be in place, including the necessary training of users in their use, prior to approval of the Validation Report.  The maintenance of these controls is required throughout the operational life of the system.

All changes to GxP regulated systems proposed during the operational phase must be subject to a formal Change Control process, and should be assessed, reviewed, authorised, documented, tested and approved before implementation.  On completion of changes, system support documentation must be updated.  This includes the Impact Assessment and Design Specification.

Backup copies of all software and relevant data must be taken, maintained and retained within safe and secure areas.  Backup and recovery procedures must be verified.  Application back-ups must be taken after system changes (e.g. version upgrades) and should also be performed before changes (as a risk mitigation).
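One possible way to verify a restore, sketched below, is to compare a cryptographic checksum of the restored file against the original (the file paths are hypothetical; the formal steps belong in the relevant back-up and restore SOP):

    # Verify a restored backup by comparing SHA-256 checksums with the original.
    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    original = Path("/gxp/data/batch_records.db")       # hypothetical path
    restored = Path("/restore_test/batch_records.db")   # hypothetical path

    if sha256_of(original) == sha256_of(restored):
        print("Restore verified: checksums match")
    else:
        print("Restore FAILED: checksums differ - raise a deviation")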

Periodic Reviews should be undertaken to verify that a system remains in a validated state and is being operated in accordance with applicable regulations and company policies and procedures.  Errors and malfunctions in computerised systems are recorded and analysed through Deviation Reporting and Change Management so that corrective action can be taken.  SOP VAL-010 describes the philosophy of Revalidation.  Computerised systems are considered to remain qualified if the equipment/system has not been subject to change and does not have repeated unexplained failures.

Computer systems and data should be adequately protected against wilful or accidental loss, damage or unauthorised change.  Security can be classified as either physical or logical (i.e. software-dependent), both of which should be implemented as appropriate.  The Business System Owner is responsible for authorising access levels and approving the addition and removal of users.

Critical instruments connected to automated systems must be calibrated on a regular, planned basis.  This calibration must include the computer system to ensure the accuracy of the full loop.

Training plans should be established for the use and support of the system.  These plans need to consider both the needs of users and of technical support staff.  This training should cover the regulatory impact of using the system.

Where suppliers are to provide certain support services for regulated computerised systems, contracts should define how this service is to be provided (in line with site procedures).  Supplier GxP training should be considered where relevant.  Performance of suppliers should be monitored and be the subject of regular auditing.

2.6. De-commissioning

This is the final phase of the computerised system life cycle, leading to cessation of use.  Where the computerised system is storing critical records electronically, plans must be prepared for their satisfactory migration to a new system or for redefining the storage medium and retrieval capability (since retention requirements may extend beyond the life of a computerised system).  The process of transferring records from either one computerised system to another, or from one storage medium to another, should be verified and documented. Formal validation will be necessary for regulated applications.  A formal design approach is recommended, including appropriate Risk Assessment.  The decommissioning process should be the subject of a Validation Plan (possibly as part of any replacement system).
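As an illustration of the kind of verification such a record transfer might include, the sketch below compares record counts and per-record hashes between exports from the old and new systems (the file names and CSV layout are hypothetical; a real migration would be verified under an approved protocol):

    # Sketch of a record-migration check: counts and per-record hashes must match.
    import csv
    import hashlib

    def record_hashes(csv_path: str) -> dict:
        """Map each record's key (first column) to a hash of the full row."""
        hashes = {}
        with open(csv_path, newline="") as f:
            for row in csv.reader(f):
                hashes[row[0]] = hashlib.sha256(",".join(row).encode()).hexdigest()
        return hashes

    old = record_hashes("legacy_export.csv")      # hypothetical export files
    new = record_hashes("migrated_export.csv")

    assert len(old) == len(new), "Record counts differ"
    mismatches = [key for key, h in old.items() if new.get(key) != h]
    print(f"{len(old)} records compared, {len(mismatches)} mismatches")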

It is recommended that decommissioning requirements be considered as part of the original system URS.

3. Special Topics

While much of the Validation approach described above is common with general requirements, there are certain additional expectations for computerised systems.  These are described below.

3.1. System Inventory

The PIC/S “Good Practices for Computerised Systems in Regulated GXP Environments”, clause 14.3, advises:

“The regulated user’s range of computerised systems needs to be formally listed in an inventory and the scope/extent of validation for each detailed in a consolidated written Validation Master Plan.”

A Computerised Systems Inventory should therefore be established and maintained.  Each inventory item should include the Validation requirement classification code determined by the Impact Assessment for that system (see SOP VAL-045).

3.2. System Description

"A written detailed description of the system should be produced (including diagrams as appropriate) and kept up to date.  It should describe the principles, objectives, security measures and scope of the system and the main features of the way in which the computer is used and how it interacts with other systems and procedures." – EU GMP Annex 11

The Impact Assessment form for each system/component serves as the primary System Description.

3.3. Entry and Maintenance of GxP Data (including ER/ES Requirements)

A hierarchy of controls may be used to protect the integrity of GxP data:

a. Firstly, physical and logical security provisions that restrict system access to trained and authorised users.  This includes protection of entered data from wilful or accidental loss or damage.

b. Next, consistency and accuracy verification of the entered or calculated data.  This may include validated, automatic checking processes or the comparison of a data re-entry by a second person (i.e. a 'blind check' where the original data is hidden).

c. Finally, a log (or 'audit trail') to permit traceability of a manual data entry to an authorised user.  An audit trail records the identity of the operator entering the data, as well as the time and date of entry.  Where alteration of a critical entry is permitted, the audit trail must retain the original data as well as that associated with the change.  Any audit trail produced is to form an integral part of the data record (i.e. not a separate file).  Ideally, measures should be provided to ensure the uniqueness, authenticity and appropriateness of a user identity.
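A minimal sketch of the information such an audit-trail entry could capture is shown below (the field names and values are illustrative only; in practice the entry must form an integral part of the data record itself):

    # Illustrative audit-trail entry for the alteration of a critical data item.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditEntry:
        user_id: str         # unique, authenticated operator identity
        timestamp: datetime  # date and time of the entry
        field: str           # the data item affected
        old_value: str       # original data, retained after the change
        new_value: str       # data associated with the change
        reason: str          # justification for altering a critical entry

    entry = AuditEntry(
        user_id="jsmith",                      # hypothetical operator
        timestamp=datetime.now(timezone.utc),
        field="dispensed_weight_g",
        old_value="100.4",
        new_value="100.2",
        reason="Transcription error corrected",  # hypothetical justification
    )
    print(entry)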

Some GxP data entry may only require the lowest level of control (e.g. selecting a machine set-up); other data the maximum (e.g. entering a dispensed weight, batch number or batch release decision).  The appropriate level of control must be determined as part of the system design and included in the development and testing program.  To support this decision, a risk assessment of the criticality of the data is recommended.

Evidence of the review and authorisation of GxP data must be maintained with the master record.  This may not require the generation of an associated electronic audit trail or authorisation.  If a record is generated on a computer solely for use in its printed form (e.g. an SOP produced in a word processor), the hand-signed paper record may be considered the 'master'.  To be consistent with such a decision, the distribution, access and use of any electronic image must be restricted.  The Business System Owner should decide which format (i.e. paper or electronic) is to be treated as the 'master' and develop procedures that document and align with this decision.

The introduction of a Computerised System must not compromise the reliability of records that support the assurance of product quality.  Record reliability comprises four separate attributes:

– Accuracy: Data is factually correct; free from error, defect or misrepresentation.

– Authenticity: Data is genuinely sourced from the reputed author, device or origin.  May include the ability to uniquely trace the data to that entity.

– Availability: Data is suitable or ready for timely, future, authorised use.  May include restriction of access to intended purposes/users.

– Integrity: Data is complete and entire; not altered in an unauthorised, unanticipated or unintentional manner.

SOP VAL-060 outlines the risk assessment process to verify these reliability attributes where computerised systems create, process, use or store data that has GMP impact.  This process confirms the adequacy of the control measures (management, technical and operational) employed by a system to support record reliability.

3.4. Retrospective Validation / Legacy Systems

Validation of an existing system, whether it was purchased or internally developed, is called retrospective validation.  Retrospective validation is employed:

a. When a system not previously validated is allocated to GxP duties.

b. When a system that was validated has lapsed to a non-validated status (including when the standard of validation performed is no longer considered adequate).

In general, where retrospective validation is required it will be based (as much as possible) on recovering the equivalent documents for prospective validation (i.e. reverse engineering).  The effort required to generate these documents depends on:

a. The adequacy of existing documentation

b. The degree of system customisation

c. The intention for future changes

The process for identifying this work is illustrated in Figure 3.

Figure 3: Retrospective Validation Approach

4. Appendix 1 – Computerised System Validation Overview

Identify system: Identify and record GxP regulated systems and the Owner.
Produce User Requirements Specification (URS): Define clearly and precisely what the user wants the system to do; define regulatory and documentation requirements.
Develop Validation strategy:
System Impact Assessment: Describe the system, assess GxP criticality and categorise system components to determine the required Validation approach.
Risk Assessment: The Validation Plan / Protocol should be supported by risk assessments carried out through the project as specifications are developed.
Supplier Assessment: Suppliers should be formally assessed (as part of the selection process and in support of the Validation Plan).  The decision to audit should be based on the System Impact Assessment.
Record Reliability: Systems that create, process or store electronic records need controls to protect the reliability of that data.  The extent of these measures depends on the System Impact Assessment, and their effectiveness should be assessed.
Produce Validation Plan: The Validation Plan defines the activities, procedures and responsibilities for establishing the system's adequacy.
Review and approve specifications: The supplier produces specifications.  These should be reviewed and approved by a site representative.
Monitor development of system: The development and configuration activities should proceed against an agreed plan.  A site representative should monitor progress.
Review source code: Where the Impact Assessment indicates, the source code should be formally reviewed during system development.
Review and approve test specifications: Site / supplier test specifications should be reviewed and approved by a site representative prior to formal testing.
Perform testing: A structured program, targeting errors in critical areas.
Supplier Testing: A site representative may be involved in testing, as a witness during test execution or as a reviewer of results.
User Testing: User testing (e.g. Acceptance, OQ) is conducted according to the Validation Plan / Protocol requirements.  Supplier testing may be used to support and reduce this testing.
Document the installation: Record the as-built configuration, including hardware and software.  Archive the specifications for future updates.
Review / approve reports: The results and reports are reviewed and approved by the site.
Produce Validation Report: The Validation Report summarises all deliverables and activities and provides evidence that the system is validated.
Maintain the System: Prior to approval to use the system, the Owner should establish adequate system management and operational procedures.
Change Control: Changes to the system are approved, managed and recorded.
System Backup: Store an electronic copy of the latest version of application, program, configuration and critical data files.  The copy is to be suitable for recovery and restoration of the system.
Performance Monitoring: Record issues with system performance and any impact on business processes.
Periodic Review: The validated state of the system is reviewed and gaps addressed.
Security Maintenance: Access privileges are maintained and limited to appropriate users.  Control measures protect data reliability.
Calibration: Critical field devices are calibrated according to a plan.
System Retirement: Manage the replacement or withdrawal of the computerised system from use.

5. Summary of Changes

Version #    Revision History
VAL-040      New