
A Complete Guide to Computer System Validation (CSV)

What is it and why do we need it?

If you wonder why computer system validation is needed, what the process looks like, or you need some examples, have a look at this page.


Introduction

This guide aims to suggest the tools and strategies necessary and appropriate for the validation of computerized systems in the (human and veterinary) pharmaceutical, pharmaceutical chemicals (APIs and excipients), biologics, biotechnology, blood products, medicinal gas, and medical device industries, used in activities related to compliance with Good Practices (GxP). These activities include:

  • Manufacturing / Production (GMP)
  • Clinical practice (GCP)
  • Laboratory (GLP)
  • Good Distribution Practices (GDP)
  • Storage (GWP)
  • Documentation (GDP)
 

It provides a suitable approach to compliance for all types of computer systems, according to national and international regulations, based on the guidelines established in the ISPE GAMP® 5 Guide. It provides an understanding of the working logic, the definition of scope, and the selection of the validation strategy that best suits the system to be validated.

This Computer Systems Validation Guide is based on the following approaches:

  • A risk-based approach
  • An approach based on the life cycle of the system
  • An approach based on the “V” model for system development and testing
  • An approach based on the process the system serves
  • An approach based on the GAMP category of the system

This guide provides a general review of the guidelines required for qualification, identifying the underlying regulatory framework (NOM / FDA / WHO) before the validation of computer system requirements is performed.

It also identifies the documentary base to support the validation of computerized systems, in accordance with the particular QMS of each organization.

This work is designed to be used by, among others, the following areas or business functions, regardless of their knowledge or experience related to validation or compliance with Good Practices:

  • Administration
  • Quality Unit
  • Research
  • Development
  • Manufacturing
  • Laboratory
  • Engineering
  • Maintenance
  • Regulatory affairs
  • Human Resources
  • IT
  • Support staff
  • Associated suppliers

Through the principles and methodologies suggested here, this guide will help the organization ensure that computer systems prove their fitness for intended use and meet industry good practices efficiently. It also provides practical guidance to facilitate the interpretation of regulatory requirements, in language and terminology that are easy to understand, and clarifies the roles and responsibilities of everyone involved in the validation of computerized systems.

Finally, this guide is designed to make the principles of computerized system validation understandable to the most diverse personnel: both those who apply this knowledge as part of their daily work and those who at some point will be involved in the effort to validate a system without any prior knowledge of Good Practices, validation, or IT terminology. It thus becomes a valuable tool for both groups, and for anyone who wants to train others in the basic working principles of Computer Systems Validation.


What do we need to validate?

Currently, health industries such as pharmaceuticals (human and veterinary), pharmaceutical chemicals (APIs and excipients), biologics, biotechnology, blood products, and medical devices are required to establish a validation program to demonstrate that any procedure, process, equipment, material, activity, or system actually leads to the expected results.

Computerized systems that have an impact on product quality, patient health, and good practices (GxP; for example, those that serve production processes, storage of inputs and finished products, quality assurance, documentation management, electronic records, etc.) must be validated in order to meet regulatory requirements and to ensure product quality and the integrity and traceability of information.

Computer systems with GxP impact are becoming particularly important today due to technological advances in process automation and in the management of data and information generated by applications, and to the increasing acceptance and use of these technologies in administrative, industrial, and production processes.

As computer systems are increasingly integrated into many of the most important business processes, they help reduce or eliminate the risks inherent in manual processes traditionally performed by qualified personnel. The risk of “human error” is thus no longer constant, and the productivity and efficiency of processes no longer depend on people performing repetitive or highly demanding tasks; human hands are left to control and maintain these systems, which leaves room for creativity and process improvement.

With the above, it should be emphasized that the use of computer systems does not completely replace the human factor, but rather enhances it and raises it to a higher level within the process, where human error still exists, but at another level. Equipment and systems still rely on humans to tell them “what to do” and “how to do it,” and any human error at this stage results in an error in the rest of the process. There is a saying: “Machines do not make mistakes; humans do.” Wrong instructions will produce erroneous results. It is for this reason that the human factor is decisive in validation, from the definition of responsibilities to the training and qualification of personnel.

Finally, the growing integration of Artificial Intelligence (AI) into technology systems, mobile device interfaces, and cloud-based systems presents new challenges for current validation schemes, which must demonstrate fitness for use and compliance with requirements at all times.

What is validation? What is a computerized system?

The validation of computerized systems involves stakeholders from parts of the company where knowledge of validation, computer systems, and information technology is not always the common factor. It is therefore necessary to use common concepts to avoid later misunderstandings or problems caused by a lack of conceptual alignment.

The following section lists some definitions of validation according to national and international regulations and guidelines:

Definition of Validation

NOM-059-SSA1-2015

“Documentary evidence generated through the collection and scientific evaluation of the data obtained in qualification and specific tests throughout the entire life cycle of a product, which aims to demonstrate the functionality, consistency and robustness of a given process in its ability to deliver a quality product.”

Food and Drug Administration (FDA)

“Validation is the confirmation by objective evidence, that the previously established requirements for the use of a process or system are met.”

WHO guidance on the requirements of good manufacturing practices (GMP)

“Establishment of documentary evidence that provides a high degree of assurance that a planned process will perform uniformly in accordance with the expected specified results.”

 

The above definitions have the following elements in common:

  • Generating evidence
  • Compliance of requirements
  • In accordance with expected results

The definitions of computerized/computer systems handled by national and international standards and guidelines include:

Definitions of computerized / computer system

NOM-059-SSA1-2015

“Computerized/computer system: any equipment, process or operation having one or more computers and associated software coupled to it, or a group of hardware components designed and assembled to perform a specific group of functions.”

Food and Drug Administration (FDA)

“Functional unit of one or more computers and input/ output devices, peripherals and associated software, used in common for all or part of a program and storing all or part of the data necessary for program execution.”

GAMP® 5 ISPE 

“System containing one or more computers and associated software, network components, functions controlled by them and associated documentation.”

Good Manufacturing Practice Guide for Active Pharmaceutical Ingredients (ICH Q7)

“Computer system: a group of hardware components and associated software, designed and assembled to perform a specific function or group of functions.”

“Computerized system: a process or operation integrated with a computer system.”

ANSI

A functional unit, consisting of one or more computers and associated peripheral input/output devices and associated software, that uses common storage for all or part of a program and also for all or part of the data necessary for the execution of the program; that executes user-written or user-designated programs; that performs user-designated data manipulation, including arithmetic and logic operations; and that can execute programs that modify themselves during their execution. A computer system may be a stand-alone unit or may consist of several interconnected units.

According to the above, computer systems can be defined as a combination of hardware and software that performs functions for the process they serve (where “process” means all of its constituent elements, such as personnel, equipment, activities, input and output elements, and related documentation, among others).

Illustration 1: computerized system (ICH (2))

In other words, the computerized system is the combination of hardware and software in conjunction with the process they serve and its operating environment.

Illustration 2: computerized system

How are computerized systems classified?

To identify what must be validated, it is important to know that computerized systems can be classified in the following ways:

  • By function and design: identifying, according to the functions performed, what kind of system it is.
  • By process: that is, according to the way the system is used. Based on user needs, you decide which features must be configured or whether a system tailored to that function must be designed (see topic: Characterization of the process).
  • By system impact: identifying the system's risks to patient health and its impact on product quality, data integrity, and the business, to determine whether the system requires validation and the scope of that validation (see topic: What is the scope / impact of GxP?).
  • By GAMP category: after identifying the type of system, how it is used in the process it serves, and its inherent risks, the system is classified by GAMP category, which identifies whether it corresponds to Category 1 (infrastructure software), 3 (non-configured), 4 (configured), or 5 (custom/tailored); from this classification, the validation methodology to be followed is derived.

What are the types of systems by functionality and design?

Depending on their functions and design, computerized systems can be classified into the following types:

| System type | Description | Examples |
| --- | --- | --- |
| Embedded equipment systems | Systems embedded in the equipment serving the various processes. | PLCs, controllers, equipment control panels. |
| COTS software | Standard software (Commercial Off The Shelf) with no customization and limited configuration capabilities. Sold as proven solutions, they require the process to be adapted and standardized to meet the requirements. Usually GAMP® Categories 3 and 4. | StatSoft® Statistica, Empower™, Minitab, data processing software supplied with measuring equipment, etc. |
| Spreadsheets | Applications for manipulating numeric and alphanumeric data arranged in tables of cells (usually organized into a two-dimensional array of rows and columns). Functions include statistical calculations, logical operations, data management, task automation using formulas and macros, pattern recognition, merging data from different spreadsheets, creating charts, and managing large sets of variables. Typical uses include handling validation results, handling analytical results, preventive maintenance and calibration programs, and statistical processing of results. The application with which spreadsheets are made is not validated; the spreadsheet itself and its functionality are. | Made with applications such as Microsoft Excel, StarOffice™, Calc™, OpenOffice®, IBM™/Lotus™ 1-2-3, Corel Quattro Pro®, KSpread, etc. |
| DMS | Document Management Systems, used for storing and tracking electronic or scanned documents. Designed to facilitate distribution, consultation, review, versioning, document creation, and capture. | QualityKick™, EASYTOOLS™, Master C. |
| LIMS | A Laboratory Information Management System (LIMS) is software for acquiring and managing the information generated in the laboratory, with specific options for each laboratory operation. | FreezerPro®, LabWare ELN©, LabCollector™, Nautilus™, Core LIMS™, etc. |
| ERPs | Enterprise Resource Planning (ERP) software is modular software designed to integrate and manage information from each of the company's processes and activities. It manages inputs, resources, and workflows, provides greater control over internal activities, and generates reports and queries in real time. | SAP®, JD Edwards®, BPCS, Microsoft Dynamics™, Macola©, Epicor, Axapta®. |
| PAT | Process Analytical Technology, defined by the FDA as a “system for designing, analyzing and controlling manufacturing through timely measurements of critical quality and performance attributes of raw and in-process materials and processes, in order to ensure final product quality.” Based on the principle that product quality must be built in from the design. | Eurotherm®, SIPAT©, etc. |
| Infrastructure software | Any software that serves as a platform for business applications to run, or that improves their functionality. It is sufficiently proven software that requires no further validation; the validation of the business applications running on it is considered indirect evidence of its operation. Classified as GAMP® Category 1. | Operating systems such as UNIX or Windows; office software such as Office, Adobe, antivirus, etc. |

Table 1: Types of computerized systems by functionality and design

CATEGORIES OF HARDWARE AND SOFTWARE FOR COMPUTER SYSTEMS

What are the known categories of computer systems?

Categorization helps determine the strategy and scope of system validation, depending on the complexity and risks inherent in each category of hardware and software. However, the appropriate strategy will ultimately depend on the corresponding risk analysis. The categories suggested by GAMP® are a good reference for determining complexity and, therefore, the risks inherent in systems.

For example, in the validation of a Category 4 ERP (such as SAP), after determining the GxP impact and performing the corresponding risk analysis for each of its modules, you can decide not to challenge the functionality of some of its modules (e.g., Finance), performing only an installation verification of that module if necessary.

How many categories are there?

There are two categories for hardware and four for software (Categories 1, 3, 4, and 5); Category 2 has been considered out of use since version 5 of the GAMP® Guide. Two important aspects are taken into account when determining the categories: the complexity of the system and its inherent risk, which depends on the degree of testing (software quality control) performed during development. Thus, the more standardized and tested the software, the lower its category and, therefore, the lower the standard of proof required for validation. Conversely, the more personalized the software, the greater the need for configuration, and the less testing conducted during development, the more numerous and detailed the validation challenges required:

Illustration 3: comparison lesser vs. largest category
Illustration 4: validation test level vs. category

Why is there no category 2?

Category 2 was deprecated: when version 4 of GAMP® was created, firmware had characteristics that set it apart from the other categories. Firmware has since evolved so that most of it can be classified under Category 3, 4, or 5, so Category 2 disappeared. That is why there are only Categories 1, 3, 4, and 5.

For example, there are cases of firmware for laboratory equipment, such as the firmware of a TOC (Total Organic Carbon) analyzer, which, based on its GxP impact and risk analysis, can be considered Category 3, requiring only a requirements review, or Category 4, requiring a more complete V-model.
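To show how the category drives the validation workload in practice, here is a minimal Python sketch; the deliverable lists are a simplified, illustrative reading of GAMP® 5, not official text, and should be adapted to your QMS and risk analysis:

```python
# Illustrative only: deliverable sets are a simplified reading of GAMP 5,
# not an official mapping. Adjust to your QMS and risk analysis.
DELIVERABLES = {
    1: ["Record version/configuration as part of IQ of the hosted application"],
    3: ["User requirements", "Risk assessment", "IQ", "Requirements-based testing (PQ)"],
    4: ["User requirements", "Functional/configuration specs", "Risk assessment",
        "IQ", "OQ", "PQ", "Traceability matrix"],
    5: ["User requirements", "Functional specs", "Design specs", "Code review",
        "Module/integration testing", "IQ", "OQ", "PQ", "Traceability matrix"],
}

def validation_deliverables(gamp_category: int) -> list[str]:
    """Return a typical deliverable list for a GAMP software category."""
    if gamp_category == 2:
        raise ValueError("Category 2 was retired in GAMP 5; reclassify as 3, 4 or 5.")
    try:
        return DELIVERABLES[gamp_category]
    except KeyError:
        raise ValueError(f"Unknown GAMP category: {gamp_category}")

print(validation_deliverables(4))
```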

What are the categories?

Category 1

Infrastructure software: established or commercially available software layers, e.g., operating systems, database engines, programming languages, firewalls, antivirus, and office software. The operation of business software in Categories 3, 4, and 5 depends on this layer. Without denying the existence of Category 1 systems, they are not tested independently but indirectly, when the Category 3, 4, or 5 systems are tested; their proper operation is demonstrated through the process they serve. This software is subjected to extensive software quality testing during design and development.

Category 3

Non-configured systems: systems without configuration, sold commercially off the shelf or supplied as part of equipment. These are called COTS (Commercial Off The Shelf). Examples include tools for statistical computing, software for data acquisition without configurability, equipment control panels, and spreadsheets used only as databases or documents without any level of configuration, etc.

In this type of software, there are few or no settings the user can personalize. They are sold as “as is” solutions because they are used as acquired (except Category 3 spreadsheets, which are not considered “as is” software). Their advantage is that their operation cannot be modified, which reduces the risks arising from incorrect modification; the corresponding disadvantage is having to adapt the process to the system's operation.

The V-model, usually the simplest, involves verification against user requirements only.

Category 4

Configured systems or products: partially configurable software packages that allow you to run a specific business process. These configurations include, but are not limited to, operating, measurement, and control parameters, and the system may use external interfaces to complete its function.

Examples of these systems are ERPs (Enterprise Resource Planning), LIMS, spreadsheet applications in Microsoft Excel with formulas and/or input data linked to specific cells (this is considered configuration), control systems of production equipment associated with the process (e.g., autoclave PLCs), process control systems such as SCADA (which, depending on the degree of customization, can also be Category 5), quality control equipment systems (e.g., an electron microscope), temperature control processes, among others.

The V-model usually includes the traceability of user requirements, functional and design specifications, and the Design, Installation, Operational, and Performance Qualification protocols.

Category 5

Custom systems are systems developed to meet the specific needs of the organization and optimize its processes.

Examples are software add-ons for Category 3 and 4 systems, MS Excel with VBA scripts, unique and dedicated systems, and ERP systems or developments built to the specific needs of an organization, among others.

In the validation strategy for this category of systems, it is recommended to place more emphasis on:

  • Specifications and testing modules
  • Design documentation and system operation
  • Service-level agreement
  • Technical support
  • Updates
  • Troubleshooting, errors and failures
  • Change control

 

It is important to note that spreadsheets, depending on their level of configuration or customization (use of macros or Visual Basic programming), can be considered Category 3, 4, or 5.
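As a rough illustration of that classification rule, here is a minimal Python sketch; the attribute names are assumptions made for the example, and the final category always belongs to your documented risk assessment:

```python
from dataclasses import dataclass

@dataclass
class Spreadsheet:
    """Illustrative attributes used to reason about a spreadsheet's GAMP category."""
    has_macros_or_vba: bool      # custom code => customized software
    has_formulas_or_links: bool  # formulas / linked cells => configuration

def gamp_category(sheet: Spreadsheet) -> int:
    """Suggest a GAMP category; the final call belongs to the risk assessment."""
    if sheet.has_macros_or_vba:
        return 5  # customized (programmed) spreadsheet
    if sheet.has_formulas_or_links:
        return 4  # configured spreadsheet
    return 3      # used as-is (plain data capture)

print(gamp_category(Spreadsheet(has_macros_or_vba=False, has_formulas_or_links=True)))  # -> 4
```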

Remember that, in the case of Mexican health regulation, design qualification based on user requirements is very important. In relation to this requirement, it can include documentary reviews of the aspects mentioned above.

How many hardware categories does GAMP® consider?

There are two hardware categories based on their level of customization:

  • Category 1, standard hardware: most regulated companies use this type of hardware. Its characteristics only need to be verified and documented, and it must be consistent with the needs of the software.
  • Category 2, custom-built hardware: includes elements added to standard hardware. These are covered by design specifications and must undergo acceptance testing aimed at evaluating suppliers. Its requirements are more robust and call for detailed design and performance specifications. Category 2 hardware implies a higher level of risk because it is less standardized and, therefore, less subject to testing before release and implementation.

Legacy systems

What are legacy computerized systems?

For the purposes of this guide, legacy computer systems can be defined in three ways:

  1. A computer system that has become obsolete but is still used, cannot easily be replaced or upgraded, or is no longer supported by the supplier.
  2. Systems that do not comply with 21 CFR Part 11 and were launched before August 20, 1997, on old computer hardware, according to FDA Compliance Policy Guide 7153.17. This definition applies only where compliance with FDA regulations is required.
  3. Systems in use before the start of automated systems validation activities (these should be included in the validation master plan).

During the preparation of the Validation Master Plan, it is necessary to identify the legacy systems and define the criteria by which they are cataloged this way. Their characteristics have advantages and disadvantages, as well as very particular risks, which must be taken into account when defining the validation strategy.

Since legacy systems often have no documentation and far fewer tests performed during the design stage, the validation strategy can take different shapes with respect to the development of requirements and specifications, for example:

Illustration 5 : Possible strategies for legacy systems

In the flow chart of the ISPE guide for the validation of legacy systems, it can be observed that, depending on its characteristics, for a legacy system you can choose to develop only the requirements, or both the Requirements Specification and the functional/design specifications. Furthermore, existing procedures should be evaluated and updated according to their risk, taking into account changes that the process and the system may have undergone. The appropriateness of each route depends on the system's risk, the required level of proof, and the availability of information.


Why validate computerized systems?

What is the validation of computerized systems?

Validation of computerized systems is a documented process to ensure that a computerized system does exactly what it was designed to do in a consistent and reproducible way (suitability for use), ensuring the integrity and security of data processing and product quality, and complying with applicable GxP regulations. The robust, documented evidence shows that the system is suitable for its intended purpose and does what it was designed to do, with the certainty that the result or final product will have the expected quality.

Suitability for use

What is suitability for use?

The suitability of the system involves verifying that it functions properly according to the needs of the process for which it was acquired. This is demonstrated during validation and routinely checked during operation.

How is the suitability for use demonstrated?

Suitability for use is demonstrated by compliance with all established (mandatory) requirements. Since the requirements are directly traceable to the Performance Qualification (PQ) protocol, and indirectly traceable to the Installation Qualification (IQ) and Operational Qualification (OQ) protocols (through the functional and design specifications, respectively, which serve to meet the requirements), it is upon the satisfactory conclusion of the PQ that we can state that the system meets its suitability for use.

For infrastructure qualification, legacy systems, and Category 3 systems, only IQ tests and OQ tests traceable to the requirements may be performed; in this case, suitability for use is likewise demonstrated by compliance with user requirements.
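To make this traceability concrete, here is a minimal Python sketch (requirement IDs, protocol names, and data shapes are illustrative assumptions, not from any standard) that flags mandatory requirements with no test coverage:

```python
# Illustrative requirement-to-test traceability check.
requirements = {
    "URS-001": {"mandatory": True},
    "URS-002": {"mandatory": True},
    "URS-003": {"mandatory": False},
}

# Which protocol tests claim to cover which requirement (assumed example data).
test_coverage = {
    "PQ-010": ["URS-001"],
    "OQ-021": ["URS-001", "URS-003"],
}

covered = {req for tests in test_coverage.values() for req in tests}
gaps = [r for r, meta in requirements.items() if meta["mandatory"] and r not in covered]

if gaps:
    print("Mandatory requirements without test coverage:", gaps)  # -> ['URS-002']
else:
    print("All mandatory requirements are traced to at least one test.")
```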

GxP

What is GxP?

It is an acronym for Good Practices, where “x” stands for one of the good practice disciplines addressed by national and international regulations and reference guides. In the acronym GxP, G refers to Good and P refers to Practices.

What good practices are applicable?

Among others, the main Good Practices that apply are the following:

  • Good Manufacturing Practice (GMP), associated with the manufacture of a product that is produced and controlled to quality standards appropriate for its use, according to regulations that help ensure reliability for the final consumer.
  • Good Laboratory Practice (GLP)
  • Good Distribution Practice (GDP)
  • Good Clinical Practice (GCP)
  • Good Documentation Practice (GDP)
  • Good maintenance practices
  • Good industrial safety practices
  • Etc.

What is the scope / impact of GxP?

The term refers to any action or omission that adversely affects compliance with any of the Good Practices defined as part of regulatory compliance. Systems, processes, activities, equipment, facilities, and personnel have a GxP impact if they serve to support compliance with one of the defined Good Practices.

Since the validation of computerized systems takes a risk-based approach, determining risk properly requires knowing the GxP impact of the systems, from which the scope of the validation study is established.

What do you gain from knowing about the impact of GxP?

Knowing the GxP impact allows for a better validation strategy, with particular emphasis on those points where there is a greater risk of negatively affecting compliance with Good Practices. It also allows, during system characterization, differentiating systems with GxP impact that require validation for regulatory compliance from those that do not. The role of the process owners and/or the Quality Unit is crucial in this determination, as they are the custodians of, and authorities on, the elements of the system in relation to each regulatory requirement.

In the inventory of computerized systems, the definition of those that have a GxP impact is important. This definition can be made by considering the following aspects:

Functionality of systems with a GxP impact:

1. The creation, maintenance, or preservation of records or documentation required by Good Practice regulations for evaluating product quality and making safety decisions.

2. The automation of Good Practice, product quality, or product safety decisions.

3. The output of data to other system modules or external systems with the above features.

4. The input of process data from other system modules or other systems with the above features.

It is also important to generate evidence of why some systems are not considered to have a GxP impact. This evidence can be provided during system characterization using a checklist.
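A minimal Python sketch of such a checklist is shown below; the question wording paraphrases the four functionality criteria above, and the field names are illustrative assumptions:

```python
# Illustrative GxP-impact checklist based on the four criteria above.
CHECKLIST = [
    "Creates, maintains, or preserves records required by Good Practice regulations?",
    "Automates Good Practice, product quality, or product safety decisions?",
    "Outputs data to modules/external systems with the above features?",
    "Receives process data from modules/systems with the above features?",
]

def assess_gxp_impact(answers: list[bool]) -> bool:
    """A system has GxP impact if any checklist question is answered 'yes'."""
    if len(answers) != len(CHECKLIST):
        raise ValueError("One answer per checklist question is required.")
    return any(answers)

# Example: a system with no GxP-relevant function at all.
print(assess_gxp_impact([False, False, False, False]))  # -> False (document why!)
```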

What are the different types of impacts of computerized systems?

There are six major types of impact to be considered for computerized systems. These should be evaluated both in the system's initial risk analysis and in subsequent risk analyses (of the requirements and during maintenance of the validated state):

Examples of GxP impacts:

  • Patient safety: involves systems that release products and manage information for patient use (batch, instructions, expiration date), e.g., HPLC systems, coding software, etc.
  • Product quality: directly involved in establishing or evaluating critical parameters, e.g., IR spectrometers, TOC analyzers, the PLC of an autoclave, etc.
  • Data integrity: ERP systems, inventory control, document management systems, etc.
  • Regulatory compliance: QMS control systems, spreadsheets with training plans or maintenance control, electronic logbooks, etc.
  • Internal policies: systems that manage the qualification of personnel, security systems, etc.

Examples of non-GxP impacts:

  • Business: this impact is determined by how the malfunction, damage, or loss of the system and/or its information can result in economic (business) losses for the company. All systems have this type of impact to a greater or lesser extent. The business impact cannot be neglected, as it affects the continuity of the organization that owns the system (the “user,” according to GAMP®).

It is important to remember that the same system can have more than one of these types of impact. For example, an ERP system directly impacts the business by managing company resources, but if it has a quality module, it can also have an impact on product quality, patient safety, and data integrity. The control system of a tablet press can have an impact on product quality, but also on the integrity of the data it manages. Depending on the type(s) of impact, the criticality of the system increases.
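As a toy illustration of how impact types can be rolled up into a criticality rating, consider the following sketch; the weights and thresholds are invented for the example, and real weightings belong in your risk management procedure:

```python
# Illustrative only: weights/thresholds are assumptions, not regulatory values.
IMPACT_WEIGHTS = {
    "patient_safety": 5,
    "product_quality": 4,
    "data_integrity": 3,
    "regulatory_compliance": 2,
    "internal_policies": 1,
    "business": 1,
}

def criticality(impacts: set[str]) -> str:
    """Rate system criticality from the set of impact types it has."""
    score = sum(IMPACT_WEIGHTS[i] for i in impacts)
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# ERP with a quality module: business + product quality + data integrity.
print(criticality({"business", "product_quality", "data_integrity"}))  # -> high
```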

Is there an order of importance for each type of impact?

The impacts on product quality and patient health are the most important GxP impacts. The impacts on product quality, patient safety, and data integrity are crucial to the decision of whether or not to validate the computerized system. In this sense, the experience and knowledge of end users are of great value for correctly weighting the impacts.

Although business impacts are not important from a regulatory point of view, they should be taken seriously, since the business must continue to exist in order to generate value through its processes. An incorrect strategy that overlooks a potential impact on the business due to a breach of good practice, or costs not covered for the maintenance of the system or required by the validation study, can result in significant losses that jeopardize the operation of the business.

For example:

A drug distributor does not manufacture and does not perform analyses to determine the purity and identity of drugs. It has a system that manages inventory and distribution. This system does not directly affect patient safety; however, it may affect product quality if distribution is not performed according to the manufacturer's instructions. If product quality is affected, patient safety may also be compromised. These impacts would generate both monetary losses and damage to business reputation.

Illustration 6: impacts

As the illustration above shows, potentially any of the other impacts can lead to an impact on product quality, which in turn impacts patient health.

How can the system impact internal company policies?

The failure of quality management system policies has the most significant effect. It is therefore important that every staff member who takes part in the validation process becomes aware of them. Any breach of Good Practice may eventually cause a business impact through loss of credibility with customers, lawsuits, plant closures, fines, or harm to patients.

How can the system impact regulatory compliance?

The main impact here is the failure of the quality management system policies. Therefore, it is important that the staff involved in the validation process (IT, Human Resources, Quality, Validation, Production, Maintenance, etc.) are aware of these policies.

An example is when a company does not adequately control the training and qualification of its personnel, resulting in unqualified personnel operating the system. This breaks the validated state, increasing the uncertainty of the system and, therefore, incurring the inherent risks.

How can the system impact business?

Examples include the absence of a validated system; failure to comply with good practice provisions; falsification of results; a strategy that does not take into account the hidden costs of maintaining or supporting the system; the purchase of a system that will soon be superseded; or loss of money because the system lacks required functionalities, forcing the process to change; etc.

How can the system impact data integrity?

When the system lacks controls for the use, archiving, backup, restoration, transmission, and modification of the data and information it manages.

An example: an employee, using inventory control software in the warehouse, makes an error when moving product “X” between stores. An IT colleague responsible for managing the system fixes the employee's error by modifying the data directly. In this case, the system is vulnerable, and its information cannot be considered to have integrity.

What is data integrity and why is it important?

The FDA states in its guide “Data Integrity and Compliance With cGMP” that data integrity means the data must remain Attributable, Legible, Contemporaneously recorded, Original (or a true copy), and Accurate; the FDA groups these attributes under the acronym ALCOA. Data must also be complete, consistent, enduring, and available. These concepts have been reproduced in other guidelines and regulations.

Illustration 7: attributes data integrity

When it comes to data, it should not be forgotten that they form part of the records managed by the system.

When dealing with data, there is a life cycle that generally includes the following steps:

  • Generation
  • Processing
  • Reporting
  • Verification
  • Use for decision-making
  • Storage
  • Disposal at the end of the retention period

 

Transfers between manual and/or IT systems may occur within these phases.

Data integrity must be maintained when the data managed by the systems are relevant to compliance with good practices, when they form part of the evidence of regulatory compliance, or when they are critical for measuring product quality attributes or patient safety.

When the system is not able to support and maintain the integrity of the data it manages, a major risk arises, as critical data can be falsified, deleted, disclosed without authorization, modified, or repudiated by their issuers. Data governance helps maintain data integrity.

What is data governance?

It is the sum total of arrangements that provide assurance of data integrity, regardless of the process, format, or technology by which the data were generated, recorded, processed, stored, retrieved, and used.

There are two types of data governance controls for maintaining integrity:

  • Organizational
    • Procedures
    • Staff training
    • Governance system design
    • Routine data verification
    • Periodic surveillance audits
  • Technical
    • Computerized control system
    • Automation

Data governance should also follow a risk-based approach (see topic: Risk Analysis).

How is data integrity verified?

Verifying the integrity of the data in an electronic records system is done in two ways:

  1. By routine verification of data and system logs at fixed intervals (checks, audits) (see topic: Maintenance of Validated Status).
  2. Through the validation study, via the installation, operation, and performance testing protocols (see topic: How to validate computerized systems?).
 

In developing the left side of the applicable V-model, requirements must be defined and risks identified in relation to the generation, processing, reporting, verification, use for decision-making, storage, and disposal of the data, while ensuring its attributes remain complete, consistent, accurate, attributable, legible, contemporaneously recorded, original (or a true copy), durable, and available. When applicable, specifications for these data should be developed accordingly.

During validation testing, the necessary challenges should be established on a risk basis: identifying and defining the location of data and electronic records (IQ); verifying the processes and procedures for creation, file transfer, and the backup and restoration of data; and generating evidence that these attributes are maintained during operation (OQ) and as part of the results (PQ).

One element that contributes to the control of data and the traceability of its integrity is the data audit trail (see topic: Data Audit), as it keeps an unalterable record of the actions performed on the system's information.
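To illustrate the “unalterable record” idea, the following minimal Python sketch implements a hash-chained audit trail, a common tamper-evidence technique; the field names are illustrative, and a real system would additionally need access control, synchronized time, and secure storage:

```python
import hashlib, json
from datetime import datetime, timezone

def _digest(entry: dict, prev_hash: str) -> str:
    """Hash the entry together with the previous entry's hash (the chain link)."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

trail: list[dict] = []

def record(user: str, action: str) -> None:
    """Append an entry whose hash chains to the previous one."""
    entry = {"user": user, "action": action,
             "timestamp": datetime.now(timezone.utc).isoformat()}
    prev = trail[-1]["hash"] if trail else "GENESIS"
    trail.append({**entry, "hash": _digest(entry, prev)})

def verify() -> bool:
    """Recompute the chain; any edited or deleted entry breaks it."""
    prev = "GENESIS"
    for row in trail:
        entry = {k: v for k, v in row.items() if k != "hash"}
        if _digest(entry, prev) != row["hash"]:
            return False
        prev = row["hash"]
    return True

record("jdoe", "moved product X from store A to B")
record("itadmin", "corrected quantity: 10 -> 9")
trail[1]["action"] = "no correction made"   # simulated tampering
print(verify())  # -> False: the modification is detectable
```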

Electronic signatures (see topic: What are electronic signatures?) also contribute to data integrity by allowing data to be attributed and verified for use in decision-making.

What are electronic records?

Electronic records are data and information that are created, modified, archived, retrieved, and distributed by a computer system. Some electronic records are GxP relevant and others are not; the difference lies in whether or not they impact compliance with the Good Practices in force.

Until a few years ago, information management was done 100% on paper: procedures, records, production orders, logbooks, maintenance programs, among others. However, with technological innovation, more and better information management tools emerge every day, such as systems that manage the quality management system, inventory management systems, and production control systems.

As a result, some companies decide to eliminate the use of paper and manage all or some of their documents electronically. The use of these tools brings benefits such as reduced costs, availability of information, ease of finding information, and environmental benefits from reduced paper consumption; but it requires security measures to ensure data integrity is established at all times.

Technological tools are becoming more accessible and versatile, in some cases allowing designs tailored to the needs of each customer and company and increasing the number of electronic records generated; however, not all records are subject to validation. It is important to discern which of them have a GxP impact and are therefore subject to verification/validation.

Testing electronic records during system validation demonstrates the integrity of the data processed by the computer system. The file is given a name and a format in which it can be reproduced. The records to be checked during validation are those with a GxP impact. For example, when a production order is accepted and the product data are then recorded and electronically secured through a system, that is an electronic record. However, if the same order was created by mail and printed for manual data recording, then signed and stored in a folder as evidence of compliance for the authority, this physical evidence is no longer considered an electronic record.

Another example is procedures. If a standard operating procedure is created in an application and is distributed, protected, and authorized electronically, it is an electronic record. However, if the procedure is written in an application but printed for approval and disseminated through physical copies, it is no longer considered an electronic record; the protected electronic original can be considered one if version control is carried out through this medium.

In the example above, “electronic authorization” was mentioned; this refers to electronic signatures validating the authenticity of the person signing the document (see topic: What are electronic signatures?).

What are electronic signatures?

An electronic signature is a set of encrypted electronic data accompanying or associated with an electronic document, whose basic functions are to unequivocally identify the signer and to ensure the integrity of the information and data contained in the signed document.

Electronic signatures require the user to be able to identify themselves electronically in a manner equivalent to a handwritten signature. An electronic signature must have the same legal validity as a handwritten signature.

The standard also allows the use of biometrics and tokens.

Characteristics of an electronic signature and related best practices:

  • All user actions can be configured to require signing, or signing and authorization.

  • Privileges for the use of electronic signatures should be set according to each user's authorization level.

  • The identification of each user should be guaranteed by deactivating accounts rather than deleting them.

  • Usually, electronic records are linked to other documents, such as procedures, that serve the same purpose and are used by the company to approve or reject the information contained in those documents.

  • For purposes of FDA compliance, electronic signatures should also include the meaning (reason) of the signature.

  • Since electronic signatures and their use may have legal implications, it is necessary to document (via a policy, a procedure, or a manual) the date from which they are implemented, their validity as equivalents of handwritten signatures, and their scope (the documents to which they apply).

  • The organization must ensure that electronic signatures remain unique and non-transferable for each user. This is achieved through signature verification in which at least one element is known only to the user. To ensure that electronic signatures cannot be misused, it is highly recommended that each enabled user accept responsibility for their electronic signature in a document, committing not to disclose their password and to report any stolen identification element.

  • To ensure that electronic signatures cannot be altered, copied, or transferred to counterfeit another electronic record, validation tests must verify their encryption and the way they are attached to the information and the document, so that they cannot be extracted by ordinary means. Several documents should be tested to verify that a specific signature (a string of characters or electronic data attached for authentication) has been placed on each document.

  • Electronic signatures should be used sparingly, implemented only in those activities and processes where their criticality and importance justify them.

What does an electronic signature guarantee?

Authenticity

The information in the document and the electronic signature unquestionably come from the person who signed it.

Non-repudiation

The person who signed electronically cannot claim that it was not them.

Integrity

The information in the electronic document has not been modified after it was signed.

Confidentiality

The information has been encrypted so that only the intended receiver can decrypt it.

What are the risks associated with the use of electronic signatures?

Among the risks associated with the use of electronic signatures is that they can be hacked. These risks, as always, are borne by the end user who uses them.

Digital signatures (not to be confused with digitized signatures) are a type of electronic signature with a higher level of security. To create a digital signature, a username and two keys, one public and one private, are used. The public key can be shown to and accessed by third parties, while the private key must never be known or accessed by anyone else, because this key embodies the signer's identity and signature.
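For illustration, here is a minimal sign/verify sketch using the third-party Python `cryptography` package; RSA-PSS was chosen only for the example, and a real deployment would also have to manage certificates, key storage, and timestamps:

```python
# pip install cryptography -- illustrative sketch, not a production signing service.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()  # shareable; the private key never leaves the signer

document = b"SOP-042 v3 approved by J. Doe"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

signature = private_key.sign(document, pss, hashes.SHA256())

try:
    public_key.verify(signature, document, pss, hashes.SHA256())
    print("Signature valid: document is authentic and unmodified.")
except InvalidSignature:
    print("Signature invalid.")

# Any change to the signed content breaks verification (integrity):
try:
    public_key.verify(signature, b"SOP-042 v3 approved by X", pss, hashes.SHA256())
except InvalidSignature:
    print("Tampered document detected.")
```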

Exposure of the private key is a very high risk, because its secrecy is what guarantees the security of the electronic signature. Anyone holding the key can create fraudulent signatures with the same legal value as a handwritten signature. Knowledge of the key by a third person can lead to identity theft, since the key can be passed around and used to sign anywhere.

It is recommended to have a clear policy for password control and protection, and to implement a secure system for managing keys. The organization must ensure (and generate evidence) that users are aware of the responsibilities associated with the use of electronic signatures. Such a system must have the elements necessary to store and manage keys and to allow access only to authorized users, so that it is known who signed, where they signed, and when they signed.

The standard also allows the use of biometric identifiers and tokens, with control measures to ensure they are not used by outsiders.

What does not require validation?

Throughout the chapter “Why validate computerized systems?”, the reasons for validating computerized systems were described; however, it is also important to identify which features or components of a computerized system do not require validation.

The clearest examples are commercial operating systems, e.g., Windows, Unix, and Linux. They are infrastructure systems required for the application to work properly. They are verified indirectly during validation, because they are commercial systems extensively tested from the design stage by the companies that develop them.

Other examples are antivirus software and firewalls which, as in the previous example, are extensively tested before release. Because new malicious agents are constantly being developed, this software must be updated constantly to keep protecting companies.

Finally, office or complementary software such as Microsoft Office (Word, Excel, etc.), Adobe Reader, or TeamViewer is not subject to validation; its operation is proven indirectly during the validation of the business software, and its characteristics are documented.


Who is responsible?

What are the main responsibilities?

Before executing the validation, the people and areas that should be involved throughout the process must be identified. Each person must know and accept the responsibilities that apply to them.

There are four main roles in the validation of computerized systems:

  • Process owner
  • System owner
  • User
  • Provider
Illustration 8: Responsibilities

Who owns the process and what is their responsibility?

The process owner is usually the head or manager of the area the process serves. There can be more than one process owner. The process owner is the main stakeholder in the success, regulatory compliance, and economic benefits the system can generate, and is called the owner because he or she is empowered to make decisions about the process by virtue of hierarchical level, knowledge, experience, interactions, and relationships.

The process owner may also be the system owner; this will depend on the type of system and the size of the operation. It is recommended that the process owner be a person who knows the company widely: not only the process variables the application serves, but also the system itself.

Part of the process owner's responsibilities will include:

  • Managing the development of user requirements
  • Having robust and comprehensive knowledge of the regulatory requirements related to the process
  • Assembling the team that will participate in the operation, validation, and test execution of the system, and providing the necessary resources
  • Participating in the risk assessment team
  • Approving the documentation resulting from the validation
  • Maintaining the validated state
  • Managing the training of system users
  • Preparing the procedures required for the process the system serves
  • Assigning system operating personnel during validation
  • Developing and reporting change controls
  • Monitoring the training matrix of the personnel involved
  • Following up on equipment calibrations as required

Usually, this person is also responsible for the availability of information, configuration, maintenance, support, training of personnel, and the access control and security of the system, and takes measures to ensure GxP compliance.

Who is a user and what are their responsibilities?

According to ISPE, the “user” refers to a pharmaceutical or consumer-oriented customer organization that engages a provider to supply a product. In this context, it does not apply only to individuals and is synonymous with customer. It is common to conceptualize the user as the personnel in direct contact with the routine operation of each activity within the process; defining the user this way causes problems when establishing user requirements, because each user conceptualizes requirements from the limited perspective of their specific activities within the overall operation.

In these terms, we can have multiple levels of users. The most important are those who have an overview of the process (managers or heads of the areas the system serves, and the process and system owners); their level of responsibility for the results of the process and system is high. Second, we have key users (who can be any of the above); they are characterized by broad skills and knowledge of the system, as well as a level of responsibility that allows them to supervise junior users or parts of the authorized process.

Finally, we have junior users, who are responsible for performing most system operations related to the entry and extraction of information. They usually have very limited responsibility, and the critical process steps run by the system often require additional monitoring and authorization.

Their responsibilities include the following:

  • Reporting errors in the system
  • Communicating the requirements the system must meet to be suitable for its intended use
  • Providing information about the process
  • Entering data into the system
  • Extracting and using information from the system
  • Supporting the execution of validation tests

What is the responsibility of the (internal / external) provider?

Providers play an important role during the system life cycle and in each of the selected stages of the V-model.

 We can classify them as follows:

  • Provider of the computer system and consumables (implementation, support, maintenance)
  • Provider of infrastructure (implementation, maintenance, support)
  • Provider of qualification and validation services
 

To reduce the possibility of any inconvenience with a provider, the provider must be qualified and approved before supplies or services are purchased. Also, one way to mitigate supplier-related risks is to ensure that the quality requirements are established from the purchase order itself.

Suppliers’ responsibilities are defined in terms of the activities for which they are hired and the criticality of their participation in the process.

Supplier qualification

Given the criticality of their activities for the system and its results, it is important to define which suppliers should be qualified and which should not. The criteria for this should be defined in each organization's internal policies.

The decision to perform a supplier qualification should be documented, based on a risk assessment and the system categorization.

Supplier evaluations may include the following:

  • Completion of an audit checklist by the supplier
  • Gathering documentation from the supplier related to the development, testing, and system maintenance, including supplier procedures
  • Site audit of the supplier’s facilities
  • Supplier’s reputation
  • Supplier’s quality systems
  • Collection of the documentation from the supplier on the system development
  • Supplier questionnaire
  • Competencies of the supplier's staff
  • Historic quality (for already approved suppliers)
  • Service-level agreement (SLA) if it meets the client’s requirements for optimal performance and system maintenance
  • Response times
  • Contracts

Usually, systems from suppliers in a lower category, and therefore with lower risk, require a lower level of verification. For example, COTS providers require a lower level of verification than providers of tailored systems.

It is important to always conduct a proper cost-benefit analysis of the appropriateness of an on-site audit of a provider versus a remote or merely documentary audit. In these cases, it is also important to support the decision with a risk analysis.

Each of the evaluated criteria should be given a grade, which should be analyzed and documented in a supplier report card indicating whether the supplier is approved or rejected, which results are out of specification, and what preventive and corrective actions the supplier should take.
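As a toy illustration of such a report card, consider the following Python sketch; the criteria, weights, and passing threshold are invented for the example and should come from your qualification procedure:

```python
# Illustrative supplier report card; weights and threshold are assumptions.
criteria = {                      # grade 0-10 per evaluated criterion
    "quality_system": 9,
    "documentation": 7,
    "site_audit": 8,
    "sla_fit": 5,
    "response_times": 6,
}
weights = {"quality_system": 3, "documentation": 2, "site_audit": 2,
           "sla_fit": 2, "response_times": 1}

score = sum(criteria[c] * weights[c] for c in criteria) / sum(weights.values())
out_of_spec = [c for c, grade in criteria.items() if grade < 6]

print(f"Weighted score: {score:.1f}/10 -> {'APPROVED' if score >= 7 else 'REJECTED'}")
print("Criteria requiring CAPA:", out_of_spec)  # -> ['sla_fit']
```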

If the supplier is not approved or provides an unsatisfactory service, a review of the quality management system of the company that hired the supplier should be carried out, or a risk analysis performed to determine the frequency and scope of that review.

Which subject matter experts can be useful?

Having experts in the validation, implementation, updating, and retirement of a computerized system is helpful, because expert judgment increases the reliability of decisions about the activities undertaken.

The informed opinion of people with experience in the subject, who are recognized by others as qualified experts and can provide information, evidence, judgments, and assessments, is what gives value to including subject matter experts throughout the system life cycle. The expert's role is critical for planning the strategy, eliminating irrelevant aspects, including essential aspects and/or modifying those that require it, selecting tests, and defining acceptance criteria.

Experts in the field can be:

  • People who develop software
  • People who were responsible for the implementation of infrastructure
  • People who implemented the software
  • People who give support to infrastructure, servers, and PCs
  • Users who operate the system daily
 

Their responsibility is to support the validation according to their area and their knowledge of the system, knowledge that will remain in the organization.

What is the responsibility of the validation area?

This area is responsible for creating the documentation necessary to validate the system and demonstrate what it does and how it works, including:

  • Meetings with the process owners for the characterization of the process and system, the development of requirements, risk analysis, and specifications, as well as the review of evidence and reports
  • Meetings with the system owners for the same purposes
  • Meetings with the users who use the software every day
  • Meetings with stakeholders to monitor the progress of the validation
  • Preparing the documentation necessary for validation
  • Running the protocols and documenting the evidence
  • Developing the validation reports
  • Reporting any deviation in the process

How are responsibilities for the system and its validation assigned?

The responsibilities of those involved in the different areas participating in software validation are outlined in the Validation Master Plan of Computerized Systems.

A validation committee (or equivalent) is formed, with specific responsibilities delimited by the VMPCS (see: Validation Master Plan).

What are the main problems that arise in the allocation of responsibilities?

The most common problems are:

  • The lack of communication between the participating areas
  • The lack of information on why you should validate software
  • Areas not accepting the responsibility assigned to them and distancing themselves from it, arguing that the activities belong to other areas (validation, IT, production, or quality)
  • Areas arguing that they do not have the time to dedicate to software validation
  • Assigning all validation activities to a single area or person, disregarding the responsibilities of other areas, owners, and users. The validation of computerized systems requires close collaboration among various roles and responsibilities, with staff responsible for integrating the information generated by the other areas and functions involved
  • Not knowing the process or setting it up incorrectly

How to validate computerized systems?

For the implementation of the computer system validation process, it is extremely useful to view it as a project, scaled to the criticality, impact, complexity, and risks of the system. Many variables are involved, and a project approach helps control them in a timely manner.

It is common for validation activities to be subject to urgency and stress, which, for systems of greater complexity (such as categories 4 and 5), increases the probability of errors if you do not have an adequate scheme for managing activities and timing. For this purpose, it is important to start by understanding both the process- or system-related activities and those related to the validation itself.

Who should coordinate the validation project?

It is recommended that the validation project of any system be coordinated by the person who has the most control and knowledge of the system and the process, for example:

  • In the case of ERP systems, it is recommended that the IT area coordinates the validation effort because the system serves several processes, and it would not be practical for the process owners to coordinate it.
  • In the case of LIMS or document management systems, it is recommended that the process owner coordinates the effort, as they have better control over the process and system requirements.
  • For spreadsheets, the user would be responsible for coordinating and implementing the validation of the sheets.
  • If your organization has a PMO or project office, that would be the responsible entity.
 

Notwithstanding the above, it should not be forgotten that the validation of computer systems is a joint effort where various stakeholders provide information for the preparation of the elements of the chosen V-model.

It is important that all areas involved support and test the system.

Life cycle

What is the life cycle of a computerized system?

The life cycle is a period of time during which a computerized system ‘lives’ from conception to retirement. For companies, the concept of life cycle begins with the need or opportunity to automate one or more processes and ends with the retirement or replacement of the system that served for the automation.

A common mistake is to confuse the life cycle of a system with the “V”-model for the development and testing of the system, calling it a “V-model life cycle”. These are two different things. The V-model will be discussed later as it relates to the stages of the life cycle of systems. 

What is the life cycle approach?

The life cycle approach allows us to consider the characteristics of each stage when planning validation activities and scope, to weigh the risks and benefits of each stage, and to implement the necessary controls.

What are the phases of the life cycle of a computer system?

The phases of the system life cycle are:

Illustrations 9 and 10: Life cycle stages

At each stage, various activities that typically accompany it can be performed. These stages become cyclical with each change, improvement, or implementation of a new system.

Life cycle approach

The GAMP® 5 Guide establishes a life cycle approach as good practice for a better understanding of the system and its implications. This approach involves systematically defining and implementing activities across the four main phases:

  1. Concept
  2. Project
  3. Operation (usually the longest phase)
  4. Retirement
Illustration 11: Life cycle stage activities

Depending on the life cycle stage the system is in, the corresponding activities and V-model will apply.

Recommendations:

  • Suppliers of products and services can, as appropriate, participate in improvement, maintenance, validation, and auditing activities, among others, throughout the life cycle. Their participation is subject to satisfactory evaluation and approval of the supplier (supplier rating).
  • It is important to maintain an inventory of existing computer systems in the Validation Master Plan of Computerized Systems (VMPCS) or the general Validation Master Plan (VMP). This inventory should include data such as the date the system was implemented and the date the last validation was completed, so that validation management can track what stage of the life cycle the system is in.
  • As part of the initial characterization, the life cycle stage the system is in should be determined prior to the validation study.

Inventory of Computerized Systems (example columns):

  • Risk-based priority
  • Code
  • Name of the system
  • Area
  • Use
  • Does it manage electronic records?
  • Does it require electronic signatures?
  • Process owner
  • System owner
  • Hardware category
  • Software category
  • Age

Table 2: Example inventory of computerized systems
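
For illustration, one inventory row could also be represented programmatically; the field names mirror the example columns above, and the values are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SystemInventoryRecord:
    """One row of the computerized-systems inventory (columns from Table 2)."""
    risk_priority: str           # risk-based priority, e.g. "high"
    code: str
    name: str
    area: str
    use: str
    electronic_records: bool     # does it manage electronic records?
    electronic_signatures: bool  # does it require electronic signatures?
    process_owner: str
    system_owner: str
    hardware_category: int       # GAMP hardware category
    software_category: int       # GAMP software category (1, 3, 4, or 5)
    age_years: int

# Hypothetical entry, for illustration only:
lims = SystemInventoryRecord(
    risk_priority="high", code="SYS-012", name="LIMS", area="QC Laboratory",
    use="Sample and result management", electronic_records=True,
    electronic_signatures=True, process_owner="QC Manager",
    system_owner="IT Manager", hardware_category=1, software_category=4,
    age_years=3,
)
```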

What are the characteristics of each of these phases?

Concept phase 

The main activity of this phase is to establish the focus of the organization to justify the start of the proposed system implementation, defining the scope needed for enterprise resource optimization.

Initial requirements for the intended use of the system are determined from operational needs and the process; in the same way, overall specifications for the system's construction and use may be drafted.

Project implementation phase

Phases of project implementation are:

  • Planning
  • Specification, configuration and coding
  • Verification 
  • Reporting and release

 

Key supporting processes for project implementation compliance are:

  • Risk management
  • Change and configuration management 
  • Design review
  • Traceability and document management

 

The results obtained during the execution of these phases provide documentary support for justifying the system as suitable for its intended use. This generated documentation can be used by the company as proof of compliance during inspections by the corresponding regulatory body.

Planning 

Project planning includes a set of activities that are generally sequential, but may run in parallel or overlap.

In this phase, the requirements and specifications should be clear enough for a risk assessment and ultimately for a correct definition of verification tests (protocols).

In this phase, activities should be carried out taking into account the following:

  • Impact of the system on patient safety, product quality, data integrity, business operations, internal policies and regulatory requirements
  • Complexity of the system
  • Supplier capability (vendor classification)
  • System age
  • System category (GAMP)
  • Existing gaps

Specification, configuration and coding

During this phase, the following activities are performed:

  • Specification: specifications are made with the level of detail required by the type of system and its use.
  • Coding and configuration: providers must choose and use the development methods and models most appropriate to the coding and configuration requirements, based on the approved specifications.

Providers should also ensure that the requirements and specifications take into account the coding and system configuration needs for the intended use, and how these developments and configurations should be documented.

Verification

This phase confirms that the specifications have been met, through inspections and testing of the system (depending on the type of system). This phase is present throughout the project.

Qualification tests and infrastructure validation for new systems run during this phase.

Reporting and release 

In this phase, the system must be acceptable for use in the operating environment, according to a documented and controlled process.

Release and acceptance for use in GxP activities should be done by the process owner and system owner.

A computerized system validation report must be prepared at project closure, summarizing the activities undertaken, any deviations from the plan, and the results of the study.

In order to effectively maintain the system during operation, a handover (or system release) involving the process owner, system owner, and operating users is required as a prerequisite.

Operation phase

This is the longest phase. At this point you can still make changes to the software, hardware, and the process for which the system has been released and authorized by the organization. These changes must be monitored and managed as part of continuous improvement and maintenance of the validated state.

System and infrastructure procedures must be continuously updated in accordance with the organization’s quality management policy.

Retirement phase

This phase involves system removal, decommissioning, and the migration of any data that must be preserved.

A system reaches this stage when it is determined to be obsolete for the process for which it was designed, among other reasons because:

  • The software is obsolete or has lost supplier support
  • The hardware is not compatible with software updates
  • The process for which it was designed has undergone significant changes that affect its original suitability and use
  • There are new and better replacement options (retooling)

What is the V-model?

The V-model is a graphical representation of software development and testing activities, including its verification and validation process.

The V-model can be viewed not only as the development and testing activities of the system, but also as their sequence, their interrelationships, and the validation deliverables applicable to the V-model selected for each system.


What is the V-model for?

It helps determine the best validation strategy according to the computerized system's categorization, specifying the documentation to be generated and the type of tests to be performed at each stage of the validation process. It shows the working logic of the system development and verification process.

Determine the V-model to be used in advance, so that those involved become familiar with the validation strategy to be followed.

There are several V-models, each suitable for a specific context.

How many V-models are normally handled?

Although several V-models, and even combinations of them, exist, 3 main models for the validation of computerized systems are proposed in the GAMP® 5 Guide. Their selection and application depend first on the category into which the computerized system is classified, and then on the evaluation of other factors discussed below.

What determines the application of each model?

It depends on the complexity, category, impact, and risks of the system, the degree of outsourcing of its components, the life cycle stage it is in, and its age and maturity.

Each V-model includes a degree of testing; this degree should be defined according to the criticality of the system or its components.

The applicable model should also be defined from a practical point of view, testing only what needs to be verified without ultimately verifying less than necessary. In this regard, the experience of the validator, an appropriate risk analysis, and a good characterization of the system are crucial for the correct choice of the V-model. In any case, when in doubt, it is preferable to increase the level of testing rather than decrease it.

The models are not restrictive: if, for reasons specific to each system, more tests than those mentioned in the model are performed for all or part of the system, they can be included in the validation strategy.

How does the life cycle approach relate to the V-model?

Depending on the stage of the life cycle that the system is in, the chosen V-model and its activities can be placed so that they are more compatible with development activities and system testing. 

In the illustration above, the 3 V-models overlap indicating the increasing complexity of validation research based on the GAMP category.

Also, depending on the time set for project implementation and system validation, the V-model can be shortened or extended within the cycle of the system.

In any case, it is important to consider the following premises:

  • The validation of the system should be completed before the system is released for use.
  • The infrastructure qualification should be completed before the system validation starts. It can be performed in parallel, assuming the risk that, if the infrastructure fails qualification, the rest of the validation cannot be approved and tests must be repeated until the infrastructure qualification is satisfactory.
  • Extended runtime of the selected V-model may lead to loss of control over the validation process, unnecessary costs, changes that require reconsideration of the left side of the model, or obsolescence of the system.
  • For new systems, the best time to begin the left side of the V-model (processing user requirements, functional and design specifications, as applicable) is during the system design phase and the best time to complete the right side of the V-model (developing protocol design, installation, operation and performance, as applicable) is before the system operation phase.
  • For legacy systems, the left side of the V-model (processing user requirements, functional and design specifications, as applicable) usually begins during operation, as does the completion of the right side of the model.
  • Systems that have not completed their validation study by the operation phase pose a risk, and therefore it is not recommended that the system be released before completing validation testing. An exception may be legacy systems, in which case controls should be established to mitigate the release risk.

What activities constitute the validation process of computer systems?

The activities in the validation process of computerized systems are:

How is each phase related to the V-model and what are the deliverables in this process?

Mandatory deliverables for process validation of computerized systems are divided into prerequisites, protocols, reports and traceability matrix:

  • System characterization (highly recommended)
  • User requirements
  • Risk analysis
  • Functional specifications
  • Design specifications
  • Design qualification protocol
  • Design qualification report
  • Installation qualification protocol
  • Installation qualification report
  • Operation qualification protocol
  • Operation qualification report
  • Performance qualification protocol
  • Performance qualification report
  • Traceability matrix

Not all deliverables are mandatory for all systems; this depends on the V-model chosen for each validation and its scope.

The GAMP® Guide suggests that functional and design specifications be established as prerequisites for categories 4 and 5, because they allow more specific and robust testing.
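
As a toy illustration of the idea behind the traceability matrix (all identifiers below are hypothetical), each requirement should map to at least one verification test, and unmapped requirements surface as coverage gaps:

```python
# Minimal traceability-matrix sketch: requirement IDs mapped to the
# protocol test cases that verify them (all identifiers are hypothetical).
matrix = {
    "URS-001": ["OQ-TC-01", "PQ-TC-03"],
    "URS-002": ["IQ-TC-02"],
    "URS-003": [],  # no test yet -> coverage gap
}

uncovered = [req for req, tests in matrix.items() if not tests]
print("Requirements without verification tests:", uncovered)
# -> ['URS-003']
```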

What is the relationship between the quality management system and the validation of computerized systems?

The quality management system is an umbrella covering all business processes. In this context, the validation activities and the processes the systems serve are no exception. There are four direct links between the validation study and the quality management system that must be taken into account to obtain the expected results:

  • Compliance with established quality policies and processes
  • Inclusion of the system in the documentation system (in line with good documentation practices), covering its operation, maintenance, design, control, definition and validation, and the electronic records it manages
  • Risk Management
  • Maintenance of the validated state through the use of Change Control tools, handling of deviations or non-conformities, internal audits, CAPAs management, staff training, supplier evaluation, maintenance and calibration programs
 
Records are documents to be considered within the Quality Management System (QMS). Procedures related to the use of systems and their management (IT procedures) must comply with the GDP and be within the QMS. Validation protocols, reports and evidence must comply with the provisions of the QMS.
 
At all times, the quality policies established by the company must be adhered to, and they must be consistent with validation expectations and compliance.
Validation studies and the management of their results must be based on the following quality processes:
  • Non-conformities
  • Corrective and preventive actions
  • Customer complaints
  • Risk management
  • Internal audits
  • Change control
  • Qualification of suppliers
  • Training and qualification of personnel

The minimum procedures for specific preparation of computerized system validation are divided as follows:

Procedures for IT system management 

  • Management of systems and new developments
  • Management of physical and logical security
  • Supplier management for computer systems
  • Management of electronic signatures (if applicable)
  • Maintenance of computer systems
  • Backup, archiving and recovery of information
  • Verification of electronic records
  • Contingency plan in case of emergencies
  • User management
  • Virus management
  • Configuration management
  • Patch management and updates
  • Commissioning and decommissioning of infrastructure (if applicable)
  • Help desk
  • Version control of spreadsheets
 
 

Procedures for IT infrastructure use and management 

  • Installation of network equipment
  • Escalation and demand management
  • Change control (HW & SW)
  • Security Management
  • Preventive Maintenance
  • Problem investigation
  • Training
  • Server startup and shutdown
  • Performance measurement
  • Capacity management
  • Help desk
  • User management
  • Virus management 
  • Backup, archiving and recovery
  • Configuration management
  • Disaster recovery
  • Commissioning and decommissioning

Procedures for system use 

  • Operational procedures that include not only the operation of the process, but also the system as part of the process

In addition, the quality management system procedures for maintaining the validated state, mentioned above, apply.

These documents do not necessarily need to have these exact names; several of them can be combined into other procedures or unified into a single one, and some may not apply to certain systems. In this regard, the decision to create or not create procedures should be based on the following assumptions:

  • They will add value to the process
  • They must be appropriate to the context and system control needs and processes
  • They must be appropriate for the users who will use them
  • They should contribute to better control and reduction of risks
 
In addition, whether some, all, or more of the items presented here are needed will depend on the characteristics of the computer system and the type of infrastructure in place. The company must establish the procedures it deems necessary.

One factor underscoring the importance of the relationship between the quality management system and the validation of computerized systems is that there are validatable systems with a high impact on the quality management system, some of which manage the QMS itself. Examples include:

  • Document Management Systems (DMS)
  • Quality control systems and Laboratory Information Management Systems (LIMS)
  • Quality modules in the ERP
  • Spreadsheets for process control and quality management
  • Etc.

As for document management systems, these are some of the points to consider during validation.

International guidelines recommend:

  • Keep documented information to support the operation of activities
  • Maintain documented information to provide confidence that the activities are carried out as planned

Therefore, it is vital to implement a document management system to demonstrate that activities meet the previously established requirements.

DMS systems reflect the processes and associated documents that are part of the documentation system activities. Computerized systems that manage the organization's working methods are critical because they manage documentation, the most important element in the implementation of good practices. In them, documentation is integrated in an orderly manner to ensure proper understanding. Implementing a documented system helps build a hierarchical structure of documents.

This system allows access to information recorded in files and documents and ensures that the information is stored securely and thus remains intact for the necessary period of time. The documents managed by these systems should be considered “electronic records”.

Because of the high importance of the information needed for validation and maintenance, a document management strategy appropriate to the type and size of the company is needed.

The structure of the quality system includes the following elements managed by the DMS:

  • Quality manual and quality policy
  • Procedures
  • Instructions
  • Records of activities

Preparation of documentation on the validation of computerized systems 

Descriptions and job profiles should be prepared for the personnel involved in validation, describing their responsibilities and activities. These responsibilities should be assigned and communicated in a way that ensures they are understood.

The next step is to establish the requirements for preparing and updating documents for validation.

It is recommended that a procedure is developed that defines at least the following points:

  • Assignment of a document identifier, title, issue date, validity and next revision
  • Definition, format, language, format code and type of format (e.g., paper, electronic)
  • Responsible person for review and approval of the document 
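
As an illustration, the points above could map onto a simple document-metadata record; the field names below are hypothetical, not a prescribed format:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ControlledDocument:
    """Metadata a document-control procedure might require (illustrative)."""
    doc_id: str        # unique document identifier
    title: str
    issue_date: date
    valid_until: date  # validity period
    next_review: date  # next scheduled revision
    fmt: str           # format type, e.g. "paper" or "electronic"
    language: str
    reviewer: str      # person responsible for review
    approver: str      # person responsible for approval

# Hypothetical entry, for illustration only:
sop = ControlledDocument(
    doc_id="SOP-IT-001", title="Backup, archiving and recovery",
    issue_date=date(2024, 1, 1), valid_until=date(2026, 1, 1),
    next_review=date(2025, 1, 1), fmt="electronic", language="EN",
    reviewer="QA Specialist", approver="Quality Manager",
)
```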

For the control of information generated, a process should be developed that includes at least the following:

  • Availability of documentation for consultation when needed
  • Security and protection against loss, misuse or loss of integrity
  • Change management

Documented information must be kept as evidence and protected from unauthorized changes. Ensure that all information used in computerized system validation is approved and up to date, and that personnel receive the corresponding training. With the quality system and documentation established, you can be confident that the validation evidence supports the results.

What requirements must the Quality Management System comply with?

Protocol and report formats must be registered in the QMS and therefore controlled by the organization's document system. They must comply not only with the internal documentation policy, but also with the aspects specified by NOM-059 for protocols and reports.

That is, the protocols must explain the method used to perform each test; each test produces a result that in turn must meet the acceptance criteria reflected in the protocol.

The report must reference the protocol code. Each protocol should have its own report, which at the end should declare, based on the analysis of each qualification phase, whether the system is validated or not validated. Each test in the protocol must have a unique identifier that allows it to be related to the corresponding requirement or specification, and the report must be accompanied by a traceability matrix.

The report of each qualification phase should address the results obtained during implementation and conclusions based on compliance with requirements and specifications. It is recommended that the report include a format for recording deviations, supplemented by an analysis of the results.

Changes to the protocol during implementation must be documented and justified. They can be handled through change controls included in the document itself, or through the internal change control procedure.

Formats and reporting protocols must be associated with an overall Validation Master Plan or a Validation Master Plan of Computerized Systems.


Project Validation Master Plan

What is the purpose of a Validation Master Plan of Computerized Systems?

The Validation Master Plan (VMP) specifies and coordinates all qualification/validation activities to ensure that the manufacture of pharmaceutical products is carried out in accordance with the standards and quality policies required by regulatory agencies. The plan establishes guidelines for conducting validation studies.

What should the Validation Master Plan of Computerized Systems include?

The Validation Master Plan should clearly define the following:

1. Introduction

1.1 Description of the project 

1.2 What is a validation master plan?

1.3 Scope of validation master plan

1.4 Definition of term validation

1.5 Members of the validation committee

1.6 Responsibilities of the validation committee 

2. Objectives

2.1 General objectives

2.2 Specific objectives

3. Qualification/validation concepts

3.1 Fundamentals

3.2 Life cycle validation

3.3 Qualification/validation elements

4. Description of facilities

4.1 Capacity of production lines

5. Description of the computer systems

5.1 General characterization

5.2 Inventory

6. Project phases

6.1 Phase 1: Diagnosis

6.2 Phase 2: Planning

6.3 Phase 3: Execution

6.4 Phase 4: Control

6.5 Phase 5: Attainment

7. References

8. Functional tests – acceptance criteria – annexes: standard mark scheme/validation and architectural drawings

9. Timeline of implementation

What is in a validation plan?

The first step in starting a validation project is to develop a validation plan. With the development of this plan, projects will be managed, and the validation of computerized systems will be better controlled.

What is a project?

A project is a temporary group activity to produce a product, service or result that is unique. It is temporary because it has a beginning and a definite end, and thus has a defined scope and resources.         

What are the characteristics of a project?

  • A project tries to solve a problem (fill a need)
  • It is temporary
  • It is unique in time and not repeatable under the same circumstances
  • It is uncertain
  • It consumes resources: time, money, materials, and labor      

Where does validation begin and end?

The validation process begins with defining the systems to be validated; validation and maintenance of the validated state must be part of continuous improvement. These are cyclical and ongoing processes that end only with the retirement of the system.

What is the characterization of the computerized system?

It is a tool used to identify the type of system in place, in order to determine the validation scope and associated strategy.

Why is it important to characterize the computerized system before starting the GxP validation?

Characterizing the computerized system serves to:

  • Determining the category of the system based on GAMP® 5
  • Determining the GxP impact

Or, in simplified form, determining the strategy and scope of computerized system validation.

Another element to determine the scope and strategy is the characterization of the process that the system serves, which establishes the functionality and delimitation of the system (system constraints) for that process.

The lack of adequate characterization of the system and process leads to difficulties in determining the scope and appropriate strategies for the validation process. 

Is the characterization of the system a deliverable?

For some standards it is not a deliverable; nevertheless, it is strongly recommended that it be part of the deliverables of the computerized system validation process, because it is the basis for the validation. 

How are computerized systems characterized?

The characterization of computerized systems is an important activity that precedes the development of a validation study. This characterization allows us to understand two fundamental aspects:

  • The computerized system
  • The process that serves the system
 
Understanding these two elements at the beginning of the validation work lets us better picture how all the validation deliverables fit together. The first deliverable, the user requirements, depends on this understanding, and all other deliverables are signed off against it. By analogy, the characterization is like knowing the plans of a house and the use it will be given, while the requirements are the foundation on which the building (the system design and validation process) is constructed.

Characterization, depending on the stage of the life cycle the system is in, can be based on user requirements (new systems) or on existing system specifications and process knowledge (legacy systems).

In some cases, a computer system can even be defined as the union of a host with other so-called “satellites” that complement the main functions of the system.

Among the elements to consider in the characterization of the computer system are the following:

  • Scope and elements of which the system is composed
  • Objectives, main features and expected results
  • Type of system
  • Category
  • Age
  • Stage of life cycle it is in
  • GxP impact, criticality and complexity
  • Risks inherent in the system
  • Definition of responsibilities, mainly for: System owner, users and their levels, vendors, experts in the field
  • Detection of existing GAPs
  • Electronic records it manages, their vulnerabilities, criticality, and risks
  • Data it manages
  • Use of electronic signatures, their vulnerabilities, criticality, and risks
  • Applicable regulations
  • Required level of security
  • Available documentation (procedures, manuals, technical specifications, manufacturer's specifications)
  • Existing interfaces
  • Etc.
 

Process characterization, in turn, serves to define these elements:

  • Start and end of the process
  • Process owners
  • Inputs and outputs
  • General requirements related to the computerized system
  • Support for the documentation process (procedures, records and reports)

Scope and definition of the system

How are the scope and boundaries of the system determined?

It is important to determine which components are part of the validation and which are out of scope. Much of this information is obtained from the process map and the system risk analysis.

Characterization of the process

A process is a set of interrelated materials, personnel, equipment, and systems that coordinates the transformation of inputs into a product or output. The aim of the process is to deliver a product or service that meets customer needs: a quality product or service.

For both computer system validation and infrastructure qualification, it is suggested that the processes be mapped to determine the scope of validation/qualification. Similarly, it is important to identify the components that are part of the validation and those that are out of scope. The inventory of systems and components covered by the validation should be included in the Validation Master Plan, including the justification or basis for determining exclusions. For the process mapping of computer systems, it is important to know why and how they are used, as this is key to determining the scope of the validation.

Since user requirements are derived from knowledge of the process, understanding the process enables the development of robust user requirements. Those requirements, in turn, have a decisive influence on the quality of the rest of the elements of the V-model selected for each particular validation.

Failure to define user requirements according to the process may result in failure to identify the risks associated with the process and, therefore, failure to implement controls to reduce the severity and occurrence of errors. Poor definition of user requirements also leads to economic losses, increases the likelihood of rework and change of objectives during validation, and reduces credibility by basing requirements not on facts but on assumptions.

Process mapping

Process mapping is a tool that provides both a global and a local view of the activities performed and helps identify their components. Prior to mapping, the processes are standardized to determine their operating parameters. Here lies the importance of the process owner, the person with full knowledge of the activities, materials, suppliers, teams, customers (which can be internal or external; see topic: Who is responsible?), and the system, and the one who leads the mapping. In addition, it is recommended that the participating staff has mastered the subject in question. When process mapping is done without users who know or master the process, the following errors may occur:

  • The objectives are not clear or the process is constantly being modified
  • The process needs to be modified, which means reformulating the objective
  • Defined activities do not match the needs of the process
  • Developing user requirements that are not achievable
  • Not meeting user expectations
  • Designing a computer system that is not fit for purpose

You must propose and define a clear objective that indicates the what, how, and why. Goals should be clear, achievable, measurable, and consistent. To better understand the process, a description can be made in which, once the objective is set, the expectations, needs, and products are identified to define the process.

It is important that the objective is clear and understood in order to move on to the next stage, which is determining the process components. As part of the process components, you should consider the following:

  • Input sources: e.g., previous processes, suppliers (internal or external), customers, other stakeholders.
  • Inputs: Identify the inputs needed for the process (raw materials, office supplies, peripherals such as printers, handhelds, applications, etc.) to obtain the good or service.
  • Operational processes: Those directly related to the product or service.
  • Support processes: Those that support the activities of the main process and provide the resources needed to carry out the activity. These processes, without being part of the main processes, are necessary for the activities to run smoothly. It is important to identify the providers of these activities.
  • Outputs: Once the support processes are identified, determine what product is expected from the transformation. The product may be tangible, such as a manufactured drug, or intangible, such as the administration of a document management system.
  • Customers: You can have internal or external customers. Internal customers are members of the organization who receive the product or service. External customers are those who are not part of the organization but need a product or service to satisfy a need.

Once the components are identified, we proceed to create the process map, identifying the interrelationships with other processes or threads. Process mapping can start at the macro level and break activities down into greater detail (macro process, process, activities, and tasks), going from the general to the particular.
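
As an illustration, the components identified during mapping could be captured in a simple structured record; the component names follow the list above, and the values are placeholders loosely based on the labeling examples that follow:

```python
# Minimal structured record of a process map; all example values
# are placeholders for illustration only.
process_map = {
    "objective": "Labeling of finished products through a computerized system",
    "input_sources": ["label supplier"],
    "inputs": ["paper labels", "labeling procedure"],
    "operational_processes": ["enter label data", "print labels"],
    "support_processes": ["quality verification", "IT support"],
    "outputs": ["printed labels meeting specifications"],
    "customers": ["packaging area (internal)"],
}

for component, items in process_map.items():
    print(f"{component}: {items}")
```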

Below are examples of process mappings for different categories of computer systems (see topic: Hardware and software categories for computer systems).

Example 1: A category 3 computerized system: labeling process using a computer system

Objective: labeling of finished products through a computerized system

Process description: The computerized system called “Tag2” is used by selecting the characteristics to be labeled and indicating the number of labels to be printed. Data is entered using the keyboard. Security is not enabled and settings are not changed.

Identify components: 

  • Input sources: Provider labels
  • Inputs: Customer need for labels to identify the product
  • The following inputs are needed: paper labels and a labeling procedure describing the label characteristics
  • Main process: It consists of the following activities: a conditioning order is received with the specifications the label should contain; the label characteristics are entered and the number of labels to print is selected. Finally, the labels are sent to print through the software.
  • Support processes: Quality for verification of label attributes, maintenance for computer support, IT support for the computer system, document management system for up-to-date documentation
  • Output: a satisfied customer with labels that meet the established quality specifications

Example 2: A category 4 computerized system: labeling process using a computer system

Objective: labeling of finished products through a computerized system

Process description: The computerized system called “Tag2” is used by selecting templates pre-configured by the user for the different products packed in the area. User levels are enabled (access to the system is by username and password), with 3 levels: Level 1 displays data, Level 2 enters data (expiry date, batch, production date, and number of labels to print), and the system administrator level configures templates and grants access to the system. The audit trail option is enabled.

Identify components: 

  • Input sources: Provider labels
  • Inputs: Customer need for labels to identify the product; label templates
  • The following inputs are needed: paper labels and a labeling procedure describing the label characteristics
  • Main process: It consists of the following activities: a conditioning order is received containing the expiration date, batch number, and production date. The system provides the template according to the product to be labeled; if the correct one is missing, it is handled by entering XX, the expiration date, batch, production date, and the number of labels to print. Finally, the system administrator can configure new templates or edit existing ones. All changes to templates are displayed in the audit trail.
  • Support processes: Quality for verification of label attributes, maintenance for computer support, IT support for the computer system, document management system for up-to-date documentation
  • Output: a satisfied customer with labels that meet the established quality specifications

Example 3: A category 5 computerized system: labeling process using a computer system

Objective: labeling of finished products through a computerized system

Process description: The computerized system called “Tag2” has an interface, developed in the JAVA® programming language, that retrieves the batch, production date, and expiration date of a conditioning order from a production system in which the orders are recorded. The “Tag2” system is used by selecting the template of the product to be labeled and entering the conditioning order; the label is then filled automatically and the number of labels to print is selected. The system has 3 user levels (access is by username and password): Level 1 displays data, Level 2 enters data (expiration date, batch, production date, and number of labels to print), and the system administrator level configures templates and grants access to the system. The audit trail option is enabled. The system required custom programming development; changes to the previously established bar code do not generate alerts.

Identify components: 

  • Input sources: Provider labels
  • Inputs: Customer need for labels to identify the product; label templates
  • The following inputs are needed: paper labels and a labeling procedure describing the label characteristics
  • Main process: It consists of the following activities: the conditioning order is received. The template for the product to be labeled is selected and the conditioning order is entered. In this way, the label is filled automatically and the number of labels to print is selected. The system administrator can configure new templates or edit existing ones. All changes are reflected in the audit trail.
  • Support processes: Quality for verification of label attributes, maintenance for computer support, IT support for the computer system, document management system for up-to-date documentation
  • Output: a satisfied customer with labels that meet the established quality specifications

Analyzing the three cases, we have the same system, “Tag2”, but the scope obtained depends on its use and is different for each case.

That is, in the first example, with a category 3 system, the system has all the tools for user control and audit trail tracking, but the way it is used does not require them. So, although the system is capable, only how it is actually used is considered for process mapping purposes. This way, you can define the scope and determine which components are part of the process and which are not.

In Example 2, you have the same system “Tag2,” but the process activities change as more options become available, such as access levels and audit trail tracking. 

In Example 3, you are still using the same system; however, it is used with an interface that automates the entry of the values required on the label, and it requires some degree of customization.

To determine the scope, consider that the process mapped is labeling, so the production system is not within the scope of the labeling process.

Thus, you can see how the scope changes in the same system according to the activities performed.

At the end of the mapping of the process that the computerized system serves, the user of this information will have more complete and robust knowledge, enabling a correct conceptualization of:

  • The scope of project validation
  • The functionality of the system
  • The inherent risks
  • User requirements to be assessed 
  • Validation activities 
  • Existing GAPs
  • Allocation of responsibility 
  • Efficient use of resources allocated for validation

Risk analysis

Why is it important to determine the risks of the system before validation?

Primarily, to determine the scope of validation and to implement controls that reduce or eliminate identified risks to an acceptable level. Detailed knowledge of the system's risks is required, according to its category, age, complexity, and degree of customization, and of the process the system serves. The more information available, the better risk can be controlled, because there are data on the controls performed.

The GAMP® 5 Guide recommends eliminating risk through changes in processes or system design. Design reviews can play a key role in eliminating risk from the start.

Risks that cannot be eliminated from the design must be reduced to an acceptable level by implementing controls. Risk reduction includes applying controls to reduce severity and occurrence and increase detectability.

A systematic approach should be used to ensure that the risk associated with a system has been eliminated or reduced to an acceptable level. The extent of verification and the level of detail of documentation should be based on the risk to patient health, product quality, and data integrity, especially taking into account the complexity of the system.

It is important to note that risk analysis is part of the risk management system implemented in each company and must be managed as established by each organization.

There are several tools for conducting risk analyses, including HACCP, HAZOP, and FMEA. The FMEA tool is recognized by WHO as the most appropriate for the pharmaceutical environment.
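
In an FMEA, each failure mode is typically scored for severity (S), occurrence (O), and detectability (D), and the product S × O × D gives a risk priority number (RPN) used to rank risks. A minimal sketch follows; the 1-10 scales, failure modes, and action threshold are illustrative assumptions, not values prescribed by WHO or GAMP® 5:

```python
# FMEA risk-priority sketch. Scores use an assumed 1-10 scale and the
# action threshold of 100 is illustrative only.
def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Risk Priority Number = S x O x D (higher means riskier)."""
    return severity * occurrence * detectability

failure_modes = [  # hypothetical failure modes for a labeling system
    ("Wrong expiry date printed", 9, 3, 4),
    ("Audit trail disabled",      7, 2, 2),
]

for name, s, o, d in failure_modes:
    score = rpn(s, o, d)
    action = "reduce risk (add controls)" if score >= 100 else "accept/monitor"
    print(f"{name}: RPN={score} -> {action}")
```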

In general, it is assumed that the inherent risk increases the more critical and complex the system is. 

The risk assessment is made depending on the stage of the life cycle the system is in.

The following sections address the risk assessment for user requirements and for maintaining the validated state.

System risk analysis

Once you have identified the risks according to the category of the system, determine the impact of those risks in the following areas:

  • Patient health
  • Product quality
  • Data integrity
  • Regulatory compliance 
  • Internal policies
  • Business impact

An integrated approach to risk analysis covering regulatory compliance, internal policy compliance, data integrity monitoring, and business impact will help maintain product quality and thus patient health. The risks identified for each system should be documented, as should the system categorization (see topic: categories of computerized systems), to justify the next steps of validation.

The impact assessment should be supported by the tools of the risk management system implemented by the organization. Organizations may have established a risk management process that includes the methods described in the GAMP® 5 Guide; the methods described in the ICH Q9 guideline can also be used. The method chosen should make the process more functional. The diagram schematizes and summarizes the steps of a risk management system:

Conduct an initial risk assessment and determine the impact on the system

The initial risk assessment should be based primarily on the user requirements and the regulatory requirements. Depending on this initial assessment and the resulting impact on the system, it may not be necessary to perform the next steps if the risk is already at an acceptable level.

Identify functions with implications for patient health, product quality, regulatory compliance, internal policies, data integrity, and business operations. Functions with implications for patient health, product quality and data integrity should be identified from the information gathered in the previous step, with reference to the relevant specifications and taking into account the system architecture, its category and components.

Perform risk analysis and identification of controls

The functions identified in the second step should be evaluated taking into account the potential risks and how to control potential failures arising from those risks. The decision to conduct a detailed assessment of the specific functions for each case must be addressed and the criteria may vary widely. 

The criteria to be considered include the criticality of the process, the specific impact of the function within the process, and the complexity of the system. It may be necessary to perform a more detailed assessment analyzing the severity of the harm, the likelihood of the failure, and the frequency with which it occurs. For this type of evaluation, we recommend using the methodology of the GAMP® 5 guide, which is the most common and widely accepted.

Implement and verify appropriate controls

The functions identified in the previous phases must be evaluated, taking into account potential risks and how to control the potential harm arising from them. It should be verified and documented that the controls are operating effectively.

Evaluating risks and monitoring controls

During the periodic evaluation of the systems, the organization should review the risks. This should include verification that controls are still effective. If necessary, corrective actions should be implemented as part of a change control.

What are the risks associated with electronic records management?

The main risks when dealing with electronic records are retention and integrity. These, in turn, generate potential risks to product quality, patient health, regulatory compliance, internal policies, and even the business.

To address the risks associated with data retention, policies and/or procedures should be established to ensure that data, regardless of where it is stored, is managed according to established standards. The lack of a plan in the event of data loss can cause irreparable damage to the organization and usually results in economic losses.

This policy should provide an action plan in case situations such as the following arise: 

  • A natural disaster
  • A human error 
  • A malicious action 
  • A technical error

If an organization wants to mitigate these risks and ensure that its operations continue despite them, it is recommended that members of the organization be made aware of the consequences of not having an electronic records management system to support their business processes, and that proper business continuity planning be undertaken.

Some supporting processes to reduce the risk of data integrity loss are:

  • Backup, archiving and restoration of information 
  • User management and access levels
  • Physical and logical security management 
  • Virus management 
  • Maintenance systems
  • Actions in case of contingencies and disasters
  • Etc.

Elements of the system itself, such as the audit trail and the use of electronic signatures, as well as infrastructure elements such as antivirus and firewall, help maintain data integrity.
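
As a small illustration of one such control, a backup procedure might store a checksum alongside each archived file and verify it on restoration to detect silent corruption. This sketch uses only Python's standard library; the approach is an example, not a regulatory requirement:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# On backup: record the checksum next to the archived file.
# On restore: recompute and compare to detect silent corruption.
def verify_restore(path: str, recorded_checksum: str) -> bool:
    return sha256_of(path) == recorded_checksum
```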

What are the risks of using electronic signatures? Are there risks associated with computer systems?

The main potential failure modes that pose risks of misuse of the most common electronic signatures are:

  • The signature is hacked, and unauthorized changes are made
  • Loss of a security element
  • Wrong assignment of authority to the user
  • Incorrect user credentials when verifying electronic signatures
  • Failure to document the date from which the electronic signature is valid and replaces handwriting

These risks typically affect the end user, the system owner, or the process owner.

To be considered an electronic signature, it must contain at least two security elements, such as a combination of username and password. 

One can even use two keys: a public one and a private one. The public key can be shown to and accessed by third parties. The private key must never be known or accessed by anyone else, because it embodies our identity and signature; it is usually encrypted and stored on a device such as a token.

Exposing our private key is a very high risk, because the security of the electronic signature depends on its exclusivity: anyone who holds it can produce fraudulent signatures with the same legal effect as handwritten ones. Knowledge of the key by a third person enables phishing and impersonation, since they can pass as the user and sign anywhere.

It is recommended to have a clear password control and security policy, and to implement a secure system for managing keys. That system should include the elements needed to store and manage keys and to grant access only to authorized users, so that it is known who signed, where, and when.
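
To illustrate the public/private key mechanics described above, the sketch below uses an Ed25519 key pair from the third-party Python `cryptography` package. This is a generic cryptographic example, not a compliant e-signature implementation, which would also need identity binding, time stamping, and audit trails:

```python
# Illustrative only: Ed25519 signing with the third-party `cryptography`
# package (pip install cryptography). The private key must stay secret.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()   # keep secret (e.g., on a token)
public_key = private_key.public_key()        # safe to share with verifiers

record = b"Batch 1234 released by J. Doe on 2024-01-01"
signature = private_key.sign(record)

try:
    public_key.verify(signature, record)     # raises if record was altered
    print("signature valid")
except InvalidSignature:
    print("signature INVALID: record or signature was tampered with")
```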


Infrastructure qualification

What is infrastructure?

Information technology (IT) infrastructure is the foundation that supports regulated operations. It collects, processes, and disseminates core information, and in some cases is critical to business processes. Uncontrolled use of this medium can directly or indirectly affect product quality, harm patient health, impact good manufacturing practices, or cause economic loss. Infrastructure is also an intrinsic part of the computer system, as part of the computing environment.

The level of complexity and size of the infrastructure will depend on the type of business, business needs, and user requirements. As part of those needs, a choice follows between in-house infrastructure, an outsourced site, or an outsourced cloud. This will depend on cost, space, availability, and application requirements, among other factors.

Infrastructure includes both software and hardware.

IT infrastructure exists to support key business systems by providing:

  • Platforms on which business applications run
  • Processes: IT processes that enable a controlled IT environment
  • Services: general IT services

Platforms, processes and services are the main components of the infrastructure. 

Why qualify the infrastructure?

The infrastructure is the medium on which the computerized business system operates and must therefore be qualified. 

The risks of not qualifying the infrastructure become evident when the system is put into operation: for example, when equipment miscommunicates with the server, when the infrastructure is vulnerable to attacks by malicious parties, when connectivity is slow, or when the infrastructure does not meet the application's needs. Therefore, it is important to take an appropriate design approach and consider the needs of the process and of the systems the infrastructure will support.

While not all applications require a dedicated server, it is important to identify the operating system specification and version, the web server version, and the RAM required, in order to decide which server can keep the company functioning properly.

In terms of national and international regulations, different standards establish the need to qualify infrastructure, as it is considered critical to the process concerned.

For compliance with good manufacturing practices, the following rules apply:

Regulation and international guidelines:

Eudralex

In the pharmaceutical legislation of the European Union, Volume 4, "Good Manufacturing Practice for Medicinal Products for Human and Veterinary Use", Annex 11 "Computerized Systems", page 2, states the following:                  

“This Annex applies to all forms of computerized systems used as part of the activities regulated by GMP. A computerized system is a set of hardware and software components that together perform certain functions. The application must be validated; the infrastructure must be qualified.” 

FDA
FDA 21 CFR Part 211 (CGMP for finished pharmaceuticals), Subpart D, § 211.68, refers to computer or related systems. The draft "Data Integrity and Compliance With CGMP: Guidance for Industry" clarifies that the term "related systems" refers to computer hardware, software, peripherals, networks, cloud infrastructure, operators, and associated documents.

 

To comply with the requirements in national and international standards, the following reference guides are available, among others: 

PIC/S (Pharmaceutical Inspection Convention and Pharmaceutical Inspection Co-operation Scheme)

The PIC/S Guide, Annex 11 “Computerised Systems”, states on page 94:

“This Annex applies to all forms of computerized systems used as part of the activities regulated by GMP. A computerized system is a set of hardware and software components that together perform certain functions. The application must be validated; infrastructure must be qualified.” 

This specifies that computerized systems that are part of, or involved in, activities affecting good manufacturing practices must be validated, and that the supporting infrastructure must be qualified.

GAMP® Good Practice Guide: IT Infrastructure Control and Compliance (2005)

This is a guide to compliance with regulatory expectations that supports infrastructure assessment.

What elements does the infrastructure consist of?

As mentioned above, the infrastructure has three main components: platforms, personnel and processes.

Each of these components is subject to specific tests conducted during infrastructure qualification.

1. Platforms

The key infrastructure platforms according to the “GAMP® IT Infrastructure Control and Compliance” guide are as follows:

  • Networks: Computer networks send and receive electrical signals, electromagnetic waves and the like to transport information. Examples: cables, connectors, switches, routers, etc.
  • Hardware: Hardware enables information transfer between the CPU and main memory and the associated peripherals. Peripherals are classified as input, output, input/output, and storage devices; a peripheral is any device that is not part of the CPU and main memory but supports input and output operations, complementing the processing. Examples: input devices (mouse, keyboard, barcode scanner), output devices (printers, non-touch screens, speakers), input/output peripherals (touch screens, modems, port drivers), storage peripherals (USB drives, magnetic disks, magnetic tapes), etc.
  • Operating systems: The software that manages the basic processes of the system and on which users install the programs required for their activities. Examples: Windows®, Linux®, UNIX®, etc.
  • Data management software: Software responsible for managing enterprise data across its life cycle. Examples: web services, SAS®, etc.
  • Servers: Computers that provide the information needed by the clients that access them. Examples: mail, proxy and web servers, etc.
  • Clients: A client is an application or computer that uses a server over a network to access a remote service. Examples: web clients (an application reached over a wireless access network), mail clients (computing devices with an active e-mail account on a mail host; web access is also required to reach the mail), etc.
  • Applications: Programs that allow users to perform tasks or activities; applications use the infrastructure platforms to perform their actions. Applications can belong to category 3, 4 or 5 according to the categories described in Hardware and Software Systems, and do not themselves fall under the infrastructure classification. Examples: an ERP system for inventory, systems recording temperature sensors, a document management system, etc.

Table 3: Infrastructure components

2. Personnel

Personnel operating, maintaining and managing the infrastructure must have: 

  • Defined roles and responsibilities
  • Established job descriptions
  • Documented qualifications and experience

Some key roles and responsibilities associated with the infrastructure must be identified and assigned:

  • Executive management (sponsor)
  • Project manager
  • System owner
  • Data owner
  • Infrastructure processes owner
  • Platform owner
  • IT quality assurance
  • IT regulatory compliance and quality

3. Processes 

The main infrastructure processes relate to formalizing procedures and documenting them so that they are standardized and reproducible.

From the above, it is clear which programs or applications require infrastructure for their operation, and how important it is that these components work properly and match the needs of the application to be used.

The risk of application failures in the operational environment increases when no prior infrastructure assessment is performed. For example, consider a company that requires wireless handheld devices for loading and unloading inventory. It should consider issues such as the size of the network (whether coverage will reach the connection perimeter) and how many clients it will have, since this may saturate the network and cause the application to run slowly or not at all, among other problems.

Life cycle of infrastructure

Like computer systems, infrastructure has a life cycle, from planning to disposal, with assessment activities at each stage. The life cycle runs from the conception of the project, where user needs are identified, to the point where the infrastructure is renewed or retired. At that point, user requirements must be reviewed to determine whether the information will be migrated, protected or removed, and what the final destination of the infrastructure will be.

The relationship between the phases of assessment and the life cycle is shown below.

Qualification phases of infrastructure and deliverables

The first step in qualification is to identify the components that affect Good Manufacturing Practices; this is helpful in determining the scope of qualification.

The next step is to determine the qualification strategy; it is important to remember that this strategy must be reflected in the validation master plan to have consistency between what is documented and what is implemented.

The assessment can be performed in conjunction with the validation of the computer system. However, it is important to perform an analysis to determine which strategy to follow for this purpose:

  • Infrastructure qualification included within the computer system validation
  • Computer system validation performed independently of the infrastructure qualification

The strategy depends on the type of infrastructure and its use. For example, if the infrastructure is specific to a single computer system, it is advisable to qualify it in conjunction with the validation of that system. However, if the infrastructure is shared with other computer systems, given the severity of the potential GxP impact, it is recommended to qualify the infrastructure independently.

The phases of infrastructure qualification are as follows:

  • Development of user requirements, taking into account regulations, the process and the manufacturer. Design specifications for the infrastructure components should be defined within the user requirements or in a separate document.
  • Performance of a risk analysis to identify potential failures and implement the controls necessary to mitigate the risk
  • Design qualification based on the user requirements
  • Installation qualification based on the design specifications and the manufacturer’s installation requirements
  • Operational qualification based on the operating conditions established by the user and the manufacturer
  • Final report

Qualified infrastructure, or infrastructure “in compliance,” means that the following aspects are controlled and documented. Pay special attention to:

  • IQ/OQ of infrastructure components
  • Configuration and change management of components and their settings in a highly dynamic environment
  • Risk management
  • Involvement of service providers in critical processes
  • Security management (access control, service availability and data integrity)
  • Archiving, backup, restore and disaster recovery

Why is infrastructure performance not assessed?

The purpose of performance assessment is to demonstrate fitness for use. In the case of infrastructure, however, it is the supported business systems whose performance is tested; infrastructure performance is therefore verified indirectly during the validation of the computerized business systems it supports. The GAMP® Guide states that infrastructure must be qualified to be considered “compliant.”

Infrastructure qualification must demonstrate its ability to maintain information security.

Requirements/user requirements

In this guide, the terms “user requirements” and “user requirement specifications” will be considered equivalent, using the acronym URS.

What is a user requirement?

User requirements are the expectations an organization has of the system to meet its needs, whether for business or for regulatory compliance. They are the “whys and wherefores” of the system.

How are user requirements classified?

User requirements are classified as:

  • Required: Those indispensable to maintaining product quality, patient health, regulatory compliance and data integrity. Non-compliance involves risks that should not be borne by the organization.
  • Desirable: Non-critical requirements that serve further improvement, the aesthetics of the system, or certain company policies; the associated risk is minimal. They are included as possibilities, and not fulfilling them does not compromise the organization.

This classification allows some requirements to be exempted from compliance without affecting the outcome of validation or regulatory compliance.

Required requirements include, but are not limited to, descriptions related to:

  • System operation
  • System functions
  • Data integrity
  • Technical requirements
  • Operating environment
  • Performance
  • System and information availability
  • Security of the information
  • Regulatory requirements
  • Use restrictions
  • Etc.

Desirable requirements include, but are not limited to, descriptions related to:

  • Data presentation format
  • Aesthetic graphical interface
  • Performance optimization
  • Non-urgent performance improvements
  • Etc.

How are the requirements/user requirements developed?

User requirements clearly and concisely describe the functions and capabilities that the computer system must meet. These requirements should be based on knowledge of the process the computerized system serves and should be written at a general level (if a requirement is too specific, it becomes a specification). Preferably, they should not name components or describe how functions operate, since that turns them into specifications.

When user requirements are established based on knowledge of the process, critical product quality attributes and understanding of regulatory requirements, these requirements can be used to apply a quality by design (QbD) approach.

User requirements form the basis for design and performance qualification testing, for the functional and design specifications, for the risk analysis, and for the traceability matrix.

In the V-model, one can see how user requirements affect each element of the model. The rest of the development and testing process depends on the requirements, so well-prepared requirements produce a V-model that is not only robust and reliable, but also flexible enough to absorb necessary changes without the requirements themselves having to change.

If we set requirements that are too specific, any change to a system component would force a change to the requirements, which would ripple through the rest of the chosen traceability V-model. Conversely, if the requirements are general enough and aimed at covering the expectation or need, changes affect only the traceability of the related specification and qualification tests. Note that changing a requirement may affect not only the specification and tests traced directly to it, but also the numbering of the requirements list, which in turn forces all requirements to be revised.

The development of user requirements requires the appropriate involvement of various areas of the company, not just the end users of the computer system. It is important that they be drafted by a multidisciplinary team, considering not only functional aspects but also regulatory, business operations, data integrity, internal policy, quality management system and patient safety issues.

Some key recommendations for creating user requirements are:

  • They should be general. The more general the wording of a requirement, the more flexibility there is to implement it in different scenarios; generality also facilitates continuous improvement, change control, and adaptability to unplanned changes.
  • The wording should be simple and concise and identify the need or expectation to be met from the user’s perspective (see user definition), for example:
    • “The [product, process, or element of the process] must … (active verb: have, be, do, make) …”
  • The wording of the requirement should not explain how the expectation will be met; it should only identify the need.
  • Each requirement should cover no more than one need or expectation, to facilitate traceability. The same expectation may be covered by a broad spectrum of solutions.
  • Each requirement should be assigned a unique code, for example URS-001 (see the sketch after this list).
  • For new computer systems: it is possible to write generic requirements.
  • For legacy computer systems: it is possible to write slightly more specific requirements.
  • Moreover, user requirements provide important information about the existing interfaces between the system and manual operations.
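To make the unique-code and classification recommendations concrete, here is a minimal sketch of a requirements list kept as structured data; the field names and the example entries are illustrative, not taken from this guide.

```python
from dataclasses import dataclass

@dataclass
class UserRequirement:
    code: str            # unique identifier, e.g. "URS-001"
    description: str     # one need or expectation, phrased generally
    classification: str  # "Required" or "Desirable"

urs_list = [
    UserRequirement("URS-001", "Access to the computer system must be controlled", "Required"),
    UserRequirement("URS-002", "Reports should be exportable in a readable format", "Desirable"),
]

# Guard against duplicate codes, which would break traceability.
codes = [r.code for r in urs_list]
assert len(codes) == len(set(codes)), "URS codes must be unique"
```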

Note that the wording and style used in standards to define regulatory requirements are themselves general: they allow flexibility for compliance in different scenarios and are conducive to continuous improvement and change control. Likewise, they do not explain how an expectation is to be met; they only identify the need to be covered.

Are user requirements needed for legacy systems?

To apply the method for validating computerized systems described in the GAMP® 5 Guide, user requirements must be established for all types of systems. This follows if we consider that “user requirements are the description of what a computerized system should be and do in order to meet a company’s expectations of such a system.” User requirements therefore do not depend on whether a system is new or legacy, but on the fulfillment of the expectations the business has of the system, always in view of the process in which it participates and the changes that process may undergo.

For newly developed systems, user requirements are created by thinking about what the system should be like. They are usually written at the design stage of the life cycle, when the process is not yet fully defined and mature; here they serve to mutually adapt the process and the system and to incorporate new requirements.

User requirements for legacy systems are developed with the needs of the process they serve already identified. In this case, they serve to check to what extent the requirements still meet the needs of the process and to support decisions to adapt the process or the requirement to the organization.

Remember that processes are living entities that change and adapt over time, and that the requirements that were set for a process at one time may, over time, no longer meet the new needs of the process. This review is also part of maintaining the validated state.


Risk analysis of user requirements

What is the risk analysis of user requirements?

Risk analysis of user requirements contributes scientific and systematic actions to detect, mitigate and monitor potential failures in systems, operations and processes that affect product quality.

Risk analysis of user requirements has two purposes:

  • To prevent potential failures associated with computer systems and process-related risks
  • From the elements identified and the risk assessed, determine the best strategy for validation scope and rigor in accordance with the risk level for each system requirement

It should be performed for each computer system and cover at least the assessment of the following effects:

  • Product quality 
  • Patient safety
  • Data integrity
  • Regulatory compliance 
  • Internal company policies
  • Business

The risk for legacy systems should be considered potentially high, as most of these systems lack technical information such as manuals or specifications. Because these systems are often old and outdated, they may no longer be supported or updated to maintain data integrity or to ensure the security of access and of the infrastructure on which they run.

What elements does the risk analysis of requirements consist of?

The following points should be considered in the risk analysis for each requirement, with possible tools to support them:

  • Identify and characterize the process (process diagram/map, the 4 questions)
  • Confirm that the needs of the process have been met and that the final version of the requirements has been approved
  • Designate a person responsible for the risk analysis
  • Identify hazardous situations (brainstorming)
  • Identify critical control points and the existing controls that prevent the hazardous situation from occurring and the damage from materializing (brainstorming)
  • Identify the damage resulting from the violation of each requirement, and the potential failure modes and effects associated with each (fault tree, brainstorming, Ishikawa diagram; for example, three potential failure modes for each defined effect)
  • Establish the severity of each damage, and the occurrence and detectability of each potential failure mode (control charts, check sheets, statistical tools)
  • Establish an acceptable RPN, determine the risk priority number (RPN) and, if applicable, the Critical to Quality (CTQ) attributes

Subsequently, for risk management:

  • Create action plan to achieve desired RPN
  • Define CAPAs
  • Re-evaluate results of actions
  • Compare with original RPN and take action

Risk = a situation that can cause damage

     What situation can cause damage or failure?

Damage = the impact

     How is it affected? How can the risk become a reality?

Impact = the type of impact: product quality, patient health, data integrity, regulatory compliance, internal policies, or the business

     What kind of impact?

Severity = the level of impact of each failure

     How severe is the damage if it happens?

Probability = the possibility that a failure will occur

     How likely is it to happen?

Risk class = severity * probability

Detectability = the ability to notice a failure before the damage occurs

     How easily do I realize it will happen (before it happens)?

Risk priority = risk class * detectability

     How important is it to prevent it from happening?

Risk = the combination of severity, probability and detectability
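The arithmetic above can be expressed directly in code. The sketch below assumes the three-level High/Medium/Low scale discussed later in this section, mapped onto the ordinal values 1-3; the bucketing thresholds are illustrative, not prescribed by GAMP®.

```python
LEVEL = {"Low": 1, "Medium": 2, "High": 3}

def risk_class(severity: str, probability: str) -> int:
    # Risk class = severity x probability, bucketed back into classes 1-3
    # (1 = high risk). The bucket thresholds here are illustrative.
    product = LEVEL[severity] * LEVEL[probability]
    return 1 if product >= 6 else 2 if product >= 3 else 3

def risk_priority(severity: str, probability: str, detectability: str) -> str:
    # Risk priority = risk class x detectability; a high risk class and a
    # low detectability both push the priority upward.
    score = (4 - risk_class(severity, probability)) * (4 - LEVEL[detectability])
    return "High" if score >= 6 else "Medium" if score >= 3 else "Low"

# These calls reproduce the category 4 and category 5 examples shown
# later in this section (Table 5):
print(risk_priority("High", "Low", "High"))   # -> Low  (category 4 case)
print(risk_priority("High", "High", "Low"))   # -> High (category 5 case)
```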

HOW DO YOU DETERMINE PROBABILITY?

Through brainstorming in which users, process owners and system owners (and, in addition, suppliers and subject matter experts) contribute their experience and historical or statistical data on how frequently the damage or failure occurs in the context of the process.

HOW DO YOU DETERMINE DETECTABILITY?

By weighing the controls implemented to prevent damage and failures and their effectiveness. At this point, the knowledge and experience of users, process and system owners, subject matter experts and suppliers is also important.

How is the risk analysis of user requirements performed?

It is recommended that risk analysis for computerized systems use the model proposed by the GAMP® 5 guide.

The purpose of risk analysis is not to control but to weigh the risks. Control comes afterwards, with the decisions made to implement controls that reduce severity and probability and increase detectability. It is not enough to have a risk analysis on paper for the sole purpose of compliance; extensive knowledge of the process is needed to make risk analysis an ally in controlling the risks inherent in the system rather than a compliance problem. Properly used, risk analysis provides real support.

Knowledge of product quality attributes and understanding of the process are critical in determining system requirements and making scientific decisions based on risk to ensure that the system is robustly tested and put to the test to demonstrate fitness for use.

Assessment against user requirements should allow identification of GxP covered by the system. User requirements should also provide the basis for demonstrating compliance with the GxP. The risk assessment and its results, including the rationale for which a risk is classified as critical or non-critical, should be documented.

ICH Q9 proposes several methods for determining which risks should be prioritized, but for the analysis of user requirements it is recommended to use the model for computerized systems proposed by GAMP® 5, because of its simplicity and practicality given the number of requirements to analyze. The following figure shows the weighting scale proposed by GAMP® 5: potential damage and its severity are identified, then a probability and a detectability are assigned, the latter depending on the controls applied. GAMP proposes three weighting levels in all cases: High, Medium and Low. The key to using these levels is that the criteria for assigning each level are objective and robustly justified.

In other words, it is important that severity, probability and detectability are all assigned against objective criteria. How high is high? How low is low? And where is the middle? These criteria should be documented.

For the objective criteria, you can use statistical measurements about the process and the experience of those involved in the operation and management of the system and process.

Once the severity of the potential failure has been determined, a value must be assigned to quantify the probability of the failure occurring. Combining the two weightings gives the three risk classes:

  • Risk class 1: High risk
  • Risk class 2: Medium risk
  • Risk class 3: Low risk

After determining the risk class, we assign a value for detectability, taking into account the current conditions of the process or system and using the same scale as the previous weightings. Finally, we obtain the risk priority number (RPN), assigned as high, medium or low, which helps focus the validation strategy. It directs attention to the user requirements that require detailed verification, or controls to reduce the severity and probability of errors and increase their detectability to an acceptable level:

  • Low-priority risks: Require implementation of specific controls for prevention and/or detection
  • Medium-priority risks: Require general implementation of preventive and/or detective controls
  • High-priority risks: Require detection and immediate correction

The following diagram summarizes the steps required to perform a risk analysis. 

How do I determine the level of criticality?

There is still no accepted criterion or parameter that determines severity for all cases. In practice, the determination is based on knowledge of the process or product itself, taking into account experience, the impacts identified for each damage or failure, and the historical events recorded over the life of the process:

Severity
  • Low: No injury to the patient; rejection of product acceptable for reprocessing; acceptable economic losses; partial loss of data recoverable by other means
  • Medium: Minor harm to the patient; rework; partial rejection; non-compliance with one or more non-critical quality specifications; major economic loss; partial loss of data; safety risks to the process operator; non-compliance with regulations
  • High: Economic losses large enough to force closure of the process; irreparable data loss; death or irreversible harm to patients; product completely out of specification; risks to operator integrity

Probability of occurrence
  • Low: Has not occurred during use of the system, or strong controls exist to prevent it
  • Medium: Has occurred once in the last 12 months of use; general controls are in place
  • High: Has occurred more than once in the last 6 months of use; controls are inadequate or non-existent

Detectability
  • Low: Detected less than 40% of the time before causing damage
  • Medium: Detected 41–70% of the time before causing damage
  • High: Detected 71–100% of the time before causing damage

Table 4: Considerations for risk priority

Two examples of risk analysis of user requirements are shown here: one for a configured category 4 ERP system and one for a category 5 system, each managing the input and output of a raw materials warehouse:

| Attribute | URS-001 (category 4) | URS-001 (category 5) |
| --- | --- | --- |
| Classification | Required | Required |
| Description | Access to computer systems must be controlled | Access to computer systems must be controlled |
| Potential failure mode | The system does not have controlled access according to a user profile to which rights are assigned or functions restricted | The system does not have controlled access according to a user profile to which rights are assigned or functions restricted |
| Severity | High | High |
| Probability of occurrence | Low | High |
| Risk class | Risk class 2 | Risk class 1 |
| Detectability | High | Low |
| Risk priority | Low | High |

Table 5: Example risk analysis of a user requirement for category 4 and category 5 systems

In the category 4 example, severity was high because of the critical nature of the data managed by the system. Occurrence was low because physical access to the equipment is controlled, with the loading and approval of data restricted by biometric access to a single person; detectability was high because the biometric access control detects any unauthorized intrusion attempt and triggers an alarm.

In the category 5 example, the severity remains high, but the probability is also high because the developer of the application, for lack of budget, did not include access controls when creating it, which in turn significantly reduces the detectability of unauthorized access.

Considerations

In both cases, the organization determined that configuring access control by user profile is the most efficient and least expensive way to reduce the risk and implement a control for the system, and thus for the process.

The introduction of controls helps reduce the occurrence of a potential failure mode and increase its detectability. The more controls are in place, the greater the detectability and the lower the probability that the failure will occur: the failure is detected more easily, and the frequency with which it can occur decreases.

A breach of internal policies or applicable government regulations carries the possibility of harm to product quality and patient health, with consequences before the authority.

Based on the risks identified, the organization can decide to:

  • Reduce the risk
  • Transfer the risk
  • Eliminate the risk

Each organization sets its own risk tolerance based on the risk itself and on how readily it can reduce, transfer or eliminate it.

One of the most important aspects of risk analysis, and an important part of user requirements, is regulatory compliance. The greater the non-compliance, the higher the risk priority for addressing the violations during computerized system validation.

As can be seen, performing a risk analysis that supports and contributes to validation requires deep knowledge of the process and the context in which the system operates. It is recommended that the risk analysis be performed by a multidisciplinary team capable of analyzing potential failure modes with a comprehensive approach, combining all available knowledge and experience of the process. During qualification and validation, particularly rigorous, detailed and thorough evidence should be provided for the requirements with a higher RPN; those with lower RPN levels need less rigorous proof.

What are the deliverables of a risk analysis?

Each risk analysis should be integrated into a formal document that includes at least an objective, scope, description of the methodology used, the criteria to take into account the weighting scale and any other element to complement the understanding of the system, process and its own risk analysis.

One element that should never be lost sight of is the traceability of each potential damage or failure back to the user requirements, regardless of the category of the system. This is easily achieved by assigning numbers or codes to each requirement and then to each potential failure mode. At the end of validation, each of these codes should appear in the traceability matrix. A basic format for documenting the risk analysis follows; the same table shows the relationship between probable failure modes and user requirements.

URS-001
  • Classification: Required
  • Description: Access to computer systems must be controlled
  • RA ID: RA-001
  • Damage or failure: Theft, accidental alteration or loss of information due to uncontrolled access
  • Potential failure mode: Anyone can enter the system and modify or steal critical information
  • Impact: Data integrity
  • Severity: High
  • Existing controls: None
  • Probability of occurrence: High
  • Risk class: Risk class 1
  • Detectability: Low
  • Risk priority: High
  • Comments and observations: Immediate implementation of controls is required; details are given in the validation verification

Table 6: Risk analysis format
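As a complement to the format above, here is a minimal sketch of the same record kept as structured data, so that each failure mode stays traceable to its URS code; the field names mirror Table 6 and the values are the illustrative ones shown there.

```python
from dataclasses import dataclass

@dataclass
class RiskAnalysisEntry:
    ra_id: str
    urs_code: str      # traceability link back to the requirement
    failure_mode: str
    impact: str
    severity: str
    probability: str
    detectability: str
    risk_priority: str

entry = RiskAnalysisEntry(
    ra_id="RA-001",
    urs_code="URS-001",
    failure_mode="Anyone can enter the system and modify or steal critical information",
    impact="Data integrity",
    severity="High",
    probability="High",
    detectability="Low",
    risk_priority="High",
)
```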

Functional Specifications

What is a functional specification?

Functional specifications (FS) describe how the functional elements of the computerized system should perform the expected actions to meet user requirements. Functional specifications are the basis for operational qualification testing.

What does a functional specification describe?

Functional specifications are based on the user requirements and on the technical specifications of the system supplier. They should describe “how the system should operate to meet the requirement (suitability for use),” whether it is a new system or a legacy system. They are usually written by the supplier, or are based on technical information provided by the supplier, and describe in detail the conditions under which the system must work and what its proper operation looks like.

What are the key functional specifications?

Depending on the computer system and process, the main functional specifications relate to the following aspects:

  • Operational configuration
  • Interface with other systems and devices
  • Security and controlled access (cells, sheets, workbooks, PCs)
  • Audit trail (if required), passwords, backup, archive and restore, disaster recovery actions, information integrity
  • Performance calculations, macros, pivot tables, links, conditional formats, logical inference
  • Operational capabilities/information management
  • Backups, information retrieval

How are functional specifications written?

It is important to state clearly that while user requirements are the general description of the expectations for the system, functional specifications are specific descriptions of the system’s functions.

Some of the key points to consider when writing functional specifications are: 

  • They should indicate how or in what way the system functions to meet the expectation described in each user requirement: “The [system] … (active verb: performs, enables, runs, etc.)”
  • They should describe configured functions that are established, authorized and documented for the system
  • Care should be taken not to include functions that the system does not perform, since ultimately each FS must be tested during operational qualification
  • Each functional specification should be assigned a unique code, e.g. FS-001
  • Care should be taken not to include operational tests whose execution or failure causes more problems than it solves (cost, damage, etc.)
  • Related to the above, it is very important to remember that the FS ultimately become the acceptance criteria for the tests in the operational qualification protocol
 

In the case of logical security, there are elements whose functional specifications must be verified:

  • User name/password combinations
  • Length and format of the password
  • Locking of fields or cells
  • Assignment of user profiles
  • Biometric controls
  • Audit trail
  • Warning messages/alerts
  • Password rotation
  • Electronic signatures

It is recommended that the lead users, experts in the use of the system, or suppliers be the ones who prepare the functional specifications and/or perform their final review.


Design Specifications

What is a design specification?

Design Specifications (DS) are the description of the components that the system must have to meet the defined characteristics from the User Requirements and/or Functional Specifications.

What does a design specification describe?

In general, we can say that design specifications describe the structure of the system and how it should be maintained: hardware specifications, screen design, program design, interfaces, configuration specifications, parameters, documentation and supporting software. In short, they describe what the system is and the conditions under which it should be able to be used.

How are design specifications classified?

There are design specifications for hardware and software.

Depending on the computer system and process, key design specifications relate to the following:

  • Formats for printing, display and data entry
  • Design templates
  • Code for macros and formulas
  • Required documentation
  • Hardware and peripheral specifications
  • Version control
  • Supporting software
  • Repositories for passwords and user information
  • Versions of programs, operating system and formats
  • Procedures for operation, creation, modification, reception, etc.

In terms of procedures, there are some that are particularly important to check, depending on the characteristics of the system and the process requirements (see topic: QMS section).

How are design specifications written?

It is important to state clearly that while user requirements are the general description of the expectations for the system, design specifications are specific descriptions of the system’s components.

Some of the key points to consider when preparing design specifications are:

  • They must be linked to a user requirement
  • A single user requirement can give rise to more than one DS
  • An FS may also give rise to more than one DS
  • A DS should not be traceable to more than one URS
  • They frequently begin with an article (“the”), followed by a subject (computer, system, etc.) and then a statement of what the system must have to meet functionality and suitability for use: “the system has …”, followed by the description of the component, document, etc. The description should include not only the component, but also its characteristics and the state in which it must remain.
  • Recall that the DS ultimately become the acceptance criteria for the tests in the installation qualification protocol

In connection with the above, it is very important to remember that the design specifications ultimately become the acceptance criteria for the installation qualification protocol tests. Therefore, each specification must be tested in the relevant protocols to demonstrate suitability for use.

It is recommended that the key users, experts in the use of the system, or suppliers be the ones who prepare the design specifications and/or perform their final review.

Why are functional specifications written before design specifications?

Because design specifications describe how the design elements must be present so that the computer system can perform its functions. This is especially important for new systems or systems in the design phase, where the functions are designed first and then the elements to perform them: first, what is to be done is identified, and then the means to do it are defined.

Traceability Matrix

It is the document that shows the relationships between the elements of the validation phases, linking each user requirement to its risk analysis(es), to the design or functional specifications as appropriate, and to each of the tests in the qualification protocols (DQ, IQ, OQ, PQ).

What are the benefits of a traceability matrix?

  • Facilitates management and tracking of requirements, specifications and RA
  • Visualizes the scope of qualification and testing 
  • Helps demonstrate that validation is complete
  • Streamlines change management and visualization of its impact on qualification

It adds great value in audits and inspections, showing an overall map of validation.

How is a traceability matrix created?

Development of the traceability matrix begins once the user requirements have been released: each requirement is linked to its risk analysis(es) and to the respective design specification, if any, or functional specification. Each test in the different qualification protocols is then entered into the document:

  • A design specification is traced to its respective user requirement and to the corresponding installation qualification test
  • A functional specification is traced to its respective user requirement and to the corresponding operational qualification test
  • Requirements necessary for system design are traced directly to the design qualification protocol
  • Requirements related to system performance are traced directly to the performance qualification protocol
| #URS | #RA | #FS | #DS | #DQ | #IQ | #OQ | #PQ |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 1 | 1 | 2 | 5 | 1 | 2 | 2 |
| 2 | 2 | NA | 3 | 6 | 2 | NA | 3 |
| 3 | 3 | NA | 4 | 7 | 3 | NA | 4 |
| 4 | 4 | NA | 2 | 8 | 4 | NA | 5 |
| 5 | 5 | 5 | NA | 9 | NA | 6 | 6 |

Table 7: Example traceability matrix
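A traceability matrix like Table 7 can also be checked mechanically. The sketch below is illustrative: it loads the Table 7 rows and flags any requirement that is not linked to a risk analysis or to at least one qualification test (“NA” marks a non-applicable link).

```python
rows = [
    {"URS": 1, "RA": 1, "FS": 1,    "DS": 2,    "DQ": 5, "IQ": 1,    "OQ": 2,    "PQ": 2},
    {"URS": 2, "RA": 2, "FS": "NA", "DS": 3,    "DQ": 6, "IQ": 2,    "OQ": "NA", "PQ": 3},
    {"URS": 3, "RA": 3, "FS": "NA", "DS": 4,    "DQ": 7, "IQ": 3,    "OQ": "NA", "PQ": 4},
    {"URS": 4, "RA": 4, "FS": "NA", "DS": 2,    "DQ": 8, "IQ": 4,    "OQ": "NA", "PQ": 5},
    {"URS": 5, "RA": 5, "FS": 5,    "DS": "NA", "DQ": 9, "IQ": "NA", "OQ": 6,    "PQ": 6},
]

TEST_COLUMNS = ("DQ", "IQ", "OQ", "PQ")

for row in rows:
    # Every requirement must be exercised by at least one qualification test.
    linked = [c for c in TEST_COLUMNS if row[c] != "NA"]
    assert linked, f"URS-{row['URS']:03d} is not covered by any protocol test"
    # Every requirement must have a risk analysis entry.
    assert row["RA"] != "NA", f"URS-{row['URS']:03d} has no risk analysis"
```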

What is an assessment protocol?

An assessment protocol is a document containing the objectives, guidelines and evidence of a validation or qualification exercise that seeks to demonstrate that the system and its elements meet the acceptance criteria specified in the protocol for each test. These tests must be traceable to the user requirements and specifications according to the chosen V-model.

Assessment protocols must comply with each organization’s internal policies and with the format and content requirements established by its Quality Management System (see topic: What Requirements Must the Quality Management System Comply With?).

Why are there qualification protocols?

Qualification protocols are not only required by regulation; they also present the methodology of each phase clearly, establish a specific purpose and scope, document responsibilities, and define acceptance criteria for each test so that results can be compared against them.

The scope and content of the protocol will be as extensive as the complexity of the system to be validated. It is neither specified nor required that an exclusive protocol be created for each stage of validation in the case of large systems, or a single protocol covering all stages for a small system. The only constraints are the order in which the tests take place and the interdependence of the results, so that the qualification steps are performed sequentially in the following order:

  1. Design Qualification (DQ)
  2. Installation Qualification (IQ)
  3. Operational Qualification (OQ)
  4. Execution or Performance Qualification (PQ)

The next phase can begin only when the previous phase has been successfully completed, or when the absence of significant open nonconformities has been demonstrated and the absence of significant impact on the next step has been assessed and documented. In any case, a phase is not closed without the previous phase having been successfully completed. Although the standard now allows proceeding with open nonconformities, it is best not to move to the next stage while nonconformities remain open.

The form of these protocols depends on the provisions of the VMP or the internal procedures of each organization, without losing sight of the mandatory issues raised by the authority.

Design Qualification Protocol

Applied to computerized systems, design qualification aims to demonstrate that the design proposed by the supplier meets the functional and regulatory requirements and is therefore fit for purpose. 

Since this is the first stage of validation, this qualification must be completed before starting the installation qualification. 

Why is design qualification required?

The purpose of this qualification is to provide a high degree of assurance that the system proposed by the vendor can, at least at the documentary level, meet the design requirements and is suitable for its intended use.

In the case of legacy systems, design qualification is run in preparation for the rest of the validation, so that gaps discovered during design qualification can be resolved before the most critical and costly validation steps. At this stage, the organization may even decide to retire the system if the potential risk that it no longer meets the current requirements of the process makes that the more sensible option.

What should the design qualification verify?

It should check the system’s technical documentation, such as:

  • Manuals
  • Technical specifications of the system
  • Service requirements
  • Electrical or ladder diagrams
  • Architectural plans
  • It must document the intended use in the user requirements
  • It must document configuration needs, modifications and the existing environment
  • It must include specific requirements regarding areas and facilities
  • It must document the general operational and support aspects showing that the system is suitable for use
  • It is very important to consider the infrastructure design qualification, since that information can be used for this protocol

The above elements are tested against the requirements to show that the stated technical elements can meet the expectations, and thus that the system is, at least at the documentary level, fit for use. In the subsequent installation, operational and performance tests, the system as installed and functioning is tested against what is stated in the documents reviewed during design qualification.

Does design qualification apply to systems already in use?

Since design qualification is a regulatory requirement, it applies not only to new systems but also to all systems already in use that are to be validated.

Protocols for the design qualification of systems already in use provide evidence that the system has the potential to continue meeting the requirements and expected use, and help identify risks of incompatibility with the intended purpose due to changes in the original requirements or in the process they serve.

For new systems, this information is additionally useful for:

  • Selecting the best option before purchase
  • Identifying the supplier of the system and requesting adequate technical information to serve as a basis for functional and design specifications

Examples of tests for design qualification

| #DQ | #URS | Test description | Acceptance criteria |
| --- | --- | --- | --- |
| 1 | 1 | Verify that a master list of the documentation supporting validation and regulatory compliance has been obtained | A master list of the documentation supporting validation and compliance must exist |
| 2 | 2 | Verify that the system provides information protection, integrity and backup | A manual must exist indicating that the system provides information protection, integrity and backup |
| 3 | 3 | Verify that access to the system is controlled with logical security | A manual must exist showing that access to the system is controlled with logical security |
| 4 | 4 | Verify that authentication credentials are used and must be changed and/or updated regularly | A system manual must exist indicating that the authentication data must be changed and/or updated periodically |
| 5 | 5 | Verify that the security policy specifies the maximum number of failed attempts to enter the system | A security policy must exist specifying the maximum number of failed login attempts; the manual and configuration must show that this limit can be set |

Table 8: Examples of design qualification tests

Installation Qualification Protocol

What is installation qualification?

Installation qualification verifies that all physical and interface components required for system operation are suitable for use and are installed in accordance with the qualification requirements and the design specifications. During installation qualification, it is verified that everything is installed, in good condition, and in accordance with the design specifications.

Why perform installation qualification?

Installation qualification verifies that the system, once its design has been accepted, has been installed according to its technical specifications and is equipped with all the hardware and software components necessary for its proper operation.

What should installation qualification verify?

As the stage prior to operating the system, installation qualification must verify all hardware and software elements of the system necessary to serve and operate the process. Elements verified at this stage include:

  • Installed software and hardware elements and interfaces
  • The required documentation
  • Technical specifications
  • Topology
  • Types of interfaces
  • User profiles
  • Compliance with environmental conditions, required services, documented policies, etc.
  • That HW and SW meet user requirements, design specifications, and predetermined procedures or standards
  • That a risk analysis has been performed
  • Adequate protection and information security
  • Installation methods and configurations
  • Site plans
  • Process diagrams of the system
  • Description and characteristics of associated hardware
  • User manuals
  • Maintenance program or provider SLA
  • List of standard operating procedures
  • Training
  • Site features
  • Verification of services
  • Existence of business continuity planning

When implementing the Installation Qualification (IQ) protocol, the use of photographs, print reports and screen captures as supporting documents is recommended.

Each installation qualification test should be traceable to the respective user requirements and design specifications associated with it. Recall that in these protocols the design specifications translate directly into the acceptance criteria for each test; if deemed necessary, a specification may be expanded during test development.

In qualification/validation protocols, the proper definition of acceptance criteria is as important as the test methods used to demonstrate compliance with them. The test method is usually direct verification, in which the existence and condition of the qualified installation element is inspected and documented (through photographs, screenshots, print reports, photocopies, etc.).
The test methods and evidence to be submitted for installation qualification cover at least the following elements:
  • Identification of the part / document / element
  • Location
  • Condition in which it is found
  • Attributes showing the item is controlled
  • Validity
  • Absence of change checks and open deviations

With legacy systems, it is common that much of the required documentation no longer exists or is no longer updated and available. In these cases, consideration should be given to developing new documentation to support the assessment. For category 5 legacy systems, it is very likely that no documentation on the system was ever developed. If such a system has been in operation long enough to give a high degree of assurance that it is reliable, it is most convenient to omit developing the missing documentation and treat the system as category 4, taking its maturity into account and adjusting the validation strategy accordingly.

| #IQ | Test description | Acceptance criteria |
| --- | --- | --- |
| 1 | Verify that the design qualification protocol and report have been obtained; verify the recommendation in the final report | The recommendation of the design qualification must be satisfactory |
| 2 | Verify that the infrastructure supporting the system has been assessed | The infrastructure assessment must exist and be satisfactory |
| 3 | Verify that a document describing the system configurations has been obtained | The document must be valid and authorized |
| 4 | Verify that a document describing the flow of information in the system has been obtained | The document must be valid and authorized |
| 5 | Verify that a system manual is available | The document must be valid and authorized |

Table 9: Examples of installation qualification tests

Operational Qualification Protocol

What is operational qualification?

Operational qualification demonstrates that the system operates as determined by its design and establishes the optimal operating values for each control variable. In this qualification stage, each function performed by the system and each security element the system has are tested according to its category.

Why perform operational qualification?

Operational qualification verifies that, in each of its functions, the system performs according to what the provider says it does. For example, on a liquid filling line where a computerized system operates the equipment, the developer states that when the user presses the “Start” button, the flasks begin to be filled. In operational qualification, the system is challenged to perform this function, based on what was defined in the functional specifications.

In this sense, the standard does not allow vendor or developer qualification testing to replace on-site testing by the customer, because the documentation provided by the vendor cannot be traced to the functional specifications and, therefore, to the user requirements. Moreover, suitability for use can only be demonstrated by verification of the system at the organization’s own location, with its personnel, facilities, equipment and procedures. A system that has been “pre-validated” or whose validation package has already been delivered cannot be considered validated.

What should operational qualification verify?

During operational qualification, all functions performed by the system and involved in the process should be tested. Functions not involved in the process may justifiably be excluded. All security elements existing in the system should also be checked.

The description of the testing steps needed to demonstrate each specification must be documented. It is therefore also important that the operating procedures reflect not only the needs of the process but also how the system operates within the process, as well as the integration between the two.

Functional specifications become the expected outcome of each function performed by the system, and may even specify multiple acceptance criteria.

The most important elements to consider in this phase are:

  • The correct operation of the installed software and hardware elements and interfaces
  • The required SOPs
  • Adequate protection and information security (testing, not just documentation)
  • Recording of output data
  • Training on SOPs
  • Security
  • Access control
  • Audit trail (if applicable)
  • Alarms, alerts, messages
  • Communication with database interfaces and subsystems
  • Starting and stopping of the system
  • Deactivation of the system
  • Control sequences
  • 21 CFR tests, if applicable: creation of electronic records, use of electronic signatures
  • Backup, archiving and restoration of information

When implementing the Operational Qualification (OQ) protocol, the use of photographs, print reports and screen captures as documentary support is recommended.

The test methods and supporting documents to be submitted for operational qualification include the following elements:

  • Whether it operates (yes/no)
  • Operating parameters:
    • Response times
    • Absence of errors
    • Stress test system
    • Backup power supply
    • Server redundancy
    • Etc.
  • Absence of change controls and open deviations

Both operational and performance qualification can be tested with a White Box or Black Box approach. White Box testing is especially recommended where there is custom (customized) software or hardware; in other cases Black Box testing is accepted, which is more practical given the level of standardization of systems outside category 5. White Box tests are those in which all process or system elements that lead to the results are inspected and evaluated; these elements must be specified and verified in detail at both installation and operation. In Black Box testing, only the functionality is tested.

During validation of a Category 5 system, some elements that are usually checked in the White Box are:

  • Programming diagrams
  • Wiring diagrams
  • Hardware configuration
  • System logical inferences
  • Programming developments and macros
  • Formulation of operations (programming notation versus mathematical notation), etc. 
 
An example of systems that should be verified White Box are SCADA systems; these are category 5 systems built to meet automation needs, and therefore all their components should be verified.
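To illustrate the difference between the two approaches, the fragment below tests a deliberately simple, hypothetical calculation function both ways; assume the function lives in an importable module, since inspect.getsource cannot read code typed into an interactive session.

```python
import inspect

def dose_per_vial(concentration_mg_ml: float, volume_ml: float) -> float:
    """Hypothetical custom (category 5) calculation, used only for this example."""
    return concentration_mg_ml * volume_ml

# Black-box: feed known inputs and compare against an independently
# calculated expected result, without looking inside the function.
assert dose_per_vial(2.5, 10.0) == 25.0

# White-box: additionally inspect the source to confirm the programmed
# formula matches the mathematical notation in the specification.
source = inspect.getsource(dose_per_vial)
assert "concentration_mg_ml * volume_ml" in source
```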

What are the security features that are challenged during operational qualification?

There are two types of computer system security: physical security and logical security. Both must be verified during system operation.

What is physical security and what is logical security?

Physical security

The physical security of a computer system includes the application of physical barriers and control procedures against threats primarily to hardware and the integrity of the data it manages.

This type of security addresses threats posed both by humans and by the physical environment in which the system resides. The main threats are:

  • Natural disasters, accidental fires and any variation caused by environmental conditions
  • Threats caused by humans, such as theft or sabotage
  • Intentional internal and external disruptions
 
Assessment and ongoing monitoring of the physical security of the system is the basis for integrating security as a primary function of the system. A controlled physical access environment helps reduce losses and provides the resources to deal with accidents.
 
Elements of physical security to be verified include:
  • Physical barriers: doors, security bars, etc.
  • Devices such as switches with key locks, etc.
 
Combinations of physical and logical security elements can even exist, such as biometric doors with access control.
In all cases, the ability of the physical security element to protect the system must be challenged and documented.

Logical security

The logical security of a computer system comprises the software barriers and procedures that protect access to the data and the information contained in the system.

Logical security aims to achieve the following objectives: 

  • Restrict access to programs and files
  • Ensure that users can work unsupervised and cannot modify programs or files that do not correspond to them
  • Ensure that data, files and programs being used are used correctly
  • Verify that information sent is received only by the intended recipient and that the information received is identical to the information sent
  • Have alternative contingency steps for sending information
 

Key elements to consider for logical security include the following:

  • User name/password combinations
  • Locking of fields or cells
  • Assigning user profiles
  • Biometric controls
  • Audit trail
  • Warning messages / alarms
  • Password rotation
  • Electronic signatures
  • Firewall and antivirus settings

Security policies must be established for the above items. These policies may be defined in the Quality Manual, Validation Master Plan, procedures or internal policies of the organization, as appropriate.
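
As an illustration of how such elements can be challenged during qualification, the following minimal Python sketch models a hypothetical login service (the LoginService class and its behaviour are assumptions, not any product's real API) and checks three typical logical-security requirements: valid credentials are accepted, invalid credentials are rejected, and repeated failures lock the account.

```python
from dataclasses import dataclass, field

@dataclass
class LoginService:
    passwords: dict                         # username -> current password (illustrative)
    max_attempts: int = 3                   # failures allowed before lockout
    failed: dict = field(default_factory=dict)

    def login(self, user: str, password: str) -> bool:
        if self.failed.get(user, 0) >= self.max_attempts:
            return False                    # account locked
        if self.passwords.get(user) == password:
            self.failed[user] = 0
            return True
        self.failed[user] = self.failed.get(user, 0) + 1
        return False

svc = LoginService(passwords={"analyst1": "S3cret!"})
assert svc.login("analyst1", "S3cret!")         # valid credentials accepted
assert not svc.login("analyst1", "wrong")       # invalid credentials rejected
for _ in range(3):
    svc.login("analyst1", "wrong")
assert not svc.login("analyst1", "S3cret!")     # locked after repeated failures
print("logical security checks passed")
```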

During operational testing, the following should always be checked for both physical and logical security:

  • The existence of the security element and its constituent elements
  • The configuration of the security element
  • That the security element has not been violated
  • The effectiveness of the security element in maintaining data integrity and system operation
  • Documentation management and configuration of the security element
  • For SCADA systems, operational verification also includes checking each of the signals in the system, both in intensity and in relation to each of the activated functions

What is Data Audit or Audit Trail?

A data audit (also called an audit trail) is a secure, time-stamped software log that provides evidence of the sequence of activities that have affected the data of an operation, procedure or event in a computerized system at any point in time. It is the chronology of the "Who", "What", "When" and "Why" of a record, and it includes monitoring of the creation, modification or deletion of data. In this context, the audit trail can have two effects on system security:

  • Allows tracking of all changes to both the system and the information it manages for delineation of responsibilities for an anomaly and reconstruction of any changes
  • It is a deterrent against attempts by intruders to make unauthorized changes
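
The following minimal Python sketch illustrates the idea of an append-only, time-stamped audit trail entry that captures the "who, what, when and why" described above; the field names and structure are illustrative assumptions, not a prescribed format.

```python
import datetime
from dataclasses import dataclass, field

def _utc_now() -> str:
    return datetime.datetime.now(datetime.timezone.utc).isoformat()

@dataclass(frozen=True)        # frozen: an entry cannot be altered once created
class AuditEntry:
    who: str                   # user who performed the action
    what: str                  # activity performed (create / modify / delete)
    why: str                   # reason for the change
    when: str = field(default_factory=_utc_now)  # secure timestamp

trail: list[AuditEntry] = []   # the log only grows; no update or delete operations

def record(who: str, what: str, why: str) -> None:
    trail.append(AuditEntry(who, what, why))

record("analyst1", "modify: pH result of batch 42", "transcription error corrected")
print(trail[0])
```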

What tests should be performed to check the audit trail?

During validation, tests are performed to demonstrate the following:

  • All activities are recorded in the log
  • No one can shut down or suspend the audit trail
  • Only personnel with appropriate privileges can view and print the report
  • The records generated cannot be altered or deleted
  • The log contains at least:
    • Date
    • Hour
    • Name of the user
    • Activity performed
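
Building on the AuditEntry sketch above, the following checks illustrate two of these tests in code: the log contains at least the minimum required fields, and recorded entries resist modification. This is a sketch of the testing idea, not a real system's interface.

```python
import dataclasses

entry = trail[0]

# The log contains at least date/time, user name and activity performed.
assert entry.when and entry.who and entry.what

# Recorded entries cannot be altered: a frozen dataclass raises on assignment.
try:
    entry.who = "someone_else"
except dataclasses.FrozenInstanceError:
    print("audit trail entry resists modification at the application level")
```
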
  1. Test: Check that the installation qualification protocol and system installation report have been obtained; verify the recommendation in the report.
     Acceptance: The recommendation of the Installation Qualification must be satisfactory.
  2. Test: Check that the user can approve the document by electronic signature.
     Acceptance: The system must allow the user to approve the document by electronic signature.
  3. Test: Check that the system administrator can grant the user rights to create, delete and modify folders according to the required root level.
     Acceptance: The user can create, delete and modify folders only according to the rights granted at the required root level.
  4. Test: Check that users cannot change the date and time on the computer equipment.
     Acceptance: Users cannot change the date and time on the computer equipment.
  5. Test: Check that the system asks for a username and password to access the system.
     Acceptance: The system prompts for a username and password before granting access.
  6. Test: Check that the system administrator can allow the user to view the data audit (audit trail).
     Acceptance: The system allows authorized users to view the audit trail.
  7. Test: Check that the data audit (audit trail) allows records to be searched.
     Acceptance: The audit trail allows searching by date range (from–to), by user and by action performed.
  8. Test: Check that the history recorded in the audit trail cannot be changed or deleted.
     Acceptance: The audit trail does not allow recorded data to be modified or deleted.
  9. Test: Check that the system activates the cycle start on the XX command.
     Acceptance: The XX command starts the cycle, displays the cycle data screen and allows the cycle to be cancelled with the cancel button.

Table 10: Examples of operational qualification tests

Performance Qualification

What is performance qualification?

The purpose of performance qualification is to verify, in a documented manner, that a computerized system is capable of performing and controlling the activities required by the process, in accordance with the approved specifications, before operating in its specified operating environment (GAMP). This specified operating environment includes the personnel, facilities, procedures, equipment and any other policies relevant to the implementation of the system in the organization. The aim is to demonstrate the effectiveness and reproducibility of the operations performed by the system once it is integrated into the process, verifying that the previously established process requirements and routine use conditions are met, always within the established operating ranges.

Why perform performance qualification?

Beyond compliance, performance qualification is the stage of validation that demonstrates the system's suitability for use by meeting all user requirements (the expectations placed on the system). These qualification tests attempt to mimic the process in which the system operates, to demonstrate that the procedures and process controls meet user requirements.

What should performance qualification verify?

During performance qualification, evidence is provided that the system, with all its elements, is performing according to the process parameters, and the system performance is verified.

The set of tests in this qualification must be built up from the procedures and process controls, with reference to the user requirements established at the beginning of the validation cycle. It should verify the consistency of results through tests that involve complete workflows and their execution by different users.

Performance qualification must be run with different users and different scenarios within the process, and execution under process conditions is expected to yield results consistent with the established user requirements. Unlike operational qualification, this qualification tests the system for expected results across the entire flow (or flows) of the process, with all components integrated. Tests are defined with reference to the identified process flow.

When executing the Performance Qualification (PQ) protocol, the use of photographs, printed reports and screen captures as documentary support is recommended.
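
As an illustration of the kind of end-to-end flow a PQ test exercises, the sketch below models a hypothetical document-review workflow in Python (the Workflow class is an assumption, not a real product API) and asserts that the expected stage transition and notification occur.

```python
class Workflow:
    """Hypothetical document workflow: preparation -> review."""

    def __init__(self) -> None:
        self.stage = "preparation"
        self.notifications: list[str] = []

    def attach(self, user_role: str, filename: str) -> None:
        # Attaching a file notifies the reviewing users (mimicking e-mail).
        self.notifications.append(f"notify reviewers: {user_role} attached {filename}")

    def move_to_review(self, user_role: str) -> None:
        assert user_role == "reviewer", "only a reviewer may move the file"
        self.stage = "review"

wf = Workflow()
wf.attach("preparer", "SOP-001.docx")
wf.move_to_review("reviewer")
assert wf.stage == "review" and wf.notifications   # whole flow, expected results
print("PQ workflow check passed")
```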

  1. Test: Check that the operational qualification protocol and report for the computerized system have been obtained; verify the recommendation in the report.
     Acceptance: The recommendation of the Operational Qualification must be satisfactory.
  2. Test: Request a user with a document-preparation role to enter the system with their username and correct password.
     Acceptance: The system must allow the user to enter the system.
  3. Test: Request all users with a preparation role to attach a file to the folder during processing and notify the reviewing users.
     Acceptance: The system should allow all users to attach a document and should send an e-mail notification to the reviewing users.
  4. Test: Request the user with a review role to move the file to the review stage and download it following the route shown in the e-mail notification.
     Acceptance: The system allows the user to download the file.
  5. Test: Request the user with a review role to process a folder and notify the processing user.
     Acceptance: The system should allow the user to attach a document and should send an e-mail notification to the processing user.
  6. Test: Request the user with a review role to attach the file and notify the user with the authorization role to review the document.
     Acceptance: The system should allow the user to attach a document and should send an e-mail notification to the user with the authorization role.
  7. Test: Request the user with the authorization role to electronically sign the document, add it to the authorization folder, and send a notification to the preparing and reviewing users.
     Acceptance: The system must allow the user with the authorization role to electronically sign the document, attach the file and send the associated notifications.
  8. Test: Have users with the development, review and approval roles approve the document by digitally signing it.
     Acceptance: The document must contain digital signatures with the date and time at which they were applied.

Table 11: Examples of performance qualification tests

Qualification and Validation Reports

At the end of each phase, a report presenting the results must be prepared. 

This report should be prepared as the conclusion of the qualification, summarizing its activities, any deviations from the validation plan, and the status of the system against the objectives of the project and the previously established requirements. It should also clearly state whether the qualification to which it relates is accepted or not. The format and additional content of each report reflect the provisions of the Quality Management System and validation procedures of each organization, without losing sight of the minimum required for regulatory compliance (see topic: What requirements must the Quality Management System comply with?).

The report can be prepared in several ways to present the results and conclusions. It is common to prepare one report per executed protocol, with an analysis and summary of results; however, a single overall report can also be created, especially for simple, low-risk systems or where multiple protocols are unified.

After performance qualification, we conclude whether or not a system can be declared validated (deemed qualified or not in the case of infrastructure). This requires the preparation of a validation report justifying such a conclusion and establishing compliance with the following regulatory aspects that should have been demonstrated during the validation phases:

  • The system ensures accuracy, reliability, functionality, consistency and the ability to discern invalid or altered records
  • The ability to protect information, maintain its integrity and back it up has been demonstrated
  • Records are protected as they are created, modified, maintained, archived, retrieved and/or transmitted
  • A system for the protection, integrity and backup of information is in place
  • Access to the system is controlled
  • It has been determined that individuals who develop, maintain or use the system have the ability, training and experience to perform their assigned duties

It must also confirm the existence of all deliverable documents identified in the validation plan, showing their release.

Finally, the report must clearly establish that all mandatory requirements have been met for the system to be considered suitable for its intended use (validated). The report should include the conclusion of all validation phases (DQ, IQ, OQ and PQ) and the system must be declared as VALID or NOT VALID.
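
The decision rule can be summed up in a few lines of Python: the system is declared VALID only if every phase concluded satisfactorily (the dictionary of phase results below is purely illustrative).

```python
# Illustrative phase conclusions gathered from the individual reports.
phase_results = {"DQ": True, "IQ": True, "OQ": True, "PQ": True}

# The system is VALID only if all phases are satisfactory.
verdict = "VALID" if all(phase_results.values()) else "NOT VALID"
print(f"System declared: {verdict}")
```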


Maintenance of validated status

Once you have qualified the infrastructure and validated the computer systems, it is important to have oversight to ensure they are kept under control.

Computer systems and infrastructure are dynamic in nature: changes, version upgrades, extensions, infrastructure upgrades and changes in operating staff all occur. Methodologies must therefore be established to maintain the qualified/validated state, since these changes may directly or indirectly affect product quality, patient safety, data integrity, internal policies, regulatory compliance or the business. Some changes are transparent to users, i.e. unobserved at the interface level and without effect on system operation, but other changes may partially or completely alter the process flow.

Ideally, qualification/validation is carried out during the design phase of the computer system/infrastructure life cycle. In the next stage, the operation of the computer system/infrastructure, changes may occur; depending on their effect, they are handled either through addenda to the qualification/validation documents or through a new assessment/full validation.

An addendum to the qualification/validation document is recommended only if the change is small, affects only part of the qualification/validation study, and its impact is not significant. That way the whole study does not have to be redone; only the affected items are repeated. It is important that the risk analysis identifies and considers all other elements of the qualification affected by the change, since these may lead to greater damage and loss of the validated status.

The retirement phase can include data migration, withdrawal for destruction, and retention of information or infrastructure, all as part of maintaining the validated state. During decommissioning, all information that needs to remain traceable to the computer system/infrastructure must be extracted and its final disposition indicated. For information that has been migrated, or that is obsolete, indicate how long it is to be retained for consultation and where this can be verified. Similarly, the risk analysis should form the basis for evaluating and deciding on the elements to be controlled at this stage.

Qualified status must be maintained by implementing programs as described below:

  • Change Control: Changes to infrastructure/computer systems must be documented and an impact assessment conducted (a minimal record sketch follows this list)
  • Training and qualification of personnel: There must be evidence of training as indicated in the training master plan, and personnel must be qualified as specified in the company’s procedure
  • Preventive/corrective maintenance: There must be evidence of preventive maintenance according to the program, and evidence of corrective maintenance for both software and hardware, as well as deviations, CAPAs and risk analyses assessing the impact on the qualified/validated state of the infrastructure/computerized system
  • Deviations: Deviations or non-conformities must be followed up according to the procedure established in the organization
  • Corrective and preventive actions: CAPAs must be followed up and their impact on the qualified/validated state assessed
  • Risk Management System: A system should be in place to identify, mitigate and monitor potential failures in the infrastructure and computer systems
  • Continuous monitoring: Routine monitoring should check the status of the infrastructure/computer system and of these programs to determine whether they remain in a state of control
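
As a minimal illustration of a change-control record with its impact assessment (field names are illustrative assumptions, not a prescribed format):

```python
from dataclasses import dataclass

@dataclass
class ChangeControl:
    change_id: str
    description: str
    affects_validated_state: bool   # outcome of the impact assessment
    action: str                     # e.g. "addendum" or "full revalidation"

cc = ChangeControl(
    change_id="CC-2023-014",                          # hypothetical identifier
    description="LIMS upgraded from v5.1 to v5.2",    # hypothetical change
    affects_validated_state=True,
    action="addendum",              # small change: re-test affected items only
)
print(cc)
```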

You must document the control methodologies for the above programs and the allocation of responsibilities for them. Maintenance of the qualified/validated condition is an activity that occurs on an ongoing basis; adequate monitoring, documentation and evaluation of program impacts can maintain the control condition.

It is imperative that all personnel involved in the operation or management of the infrastructure/system are aware of nonconformities, CAPAs and change controls, so that these are documented in a timely manner when a deviation or nonconformity, CAPA or change control occurs.

Some examples of cases where qualified/validated states are lost:

  • System changes that are not documented: changes of version of the computerized system, configuration changes, server changes, removal or addition of modules
  • Maintenance: Not performed or not according to the program, undocumented maintenance, major maintenance where components are replaced and an impact assessment is not performed
  • Training: Lack of continuity of training or new personnel are not trained in accordance with existing procedures. This may result in process flow not being respected and activities not approved.
  • Deviations: Failure to follow up on deviations that arise or are not documented
  • CAPAs: CAPAs are not implemented or their closures are not documented
  • Risk management system: Not having a risk management system, not having tools to identify errors, controls are not in place to mitigate or eliminate risks

Maintaining the validated status in outsourced activities

To keep computerized systems validated/qualified, it is very important to keep track of all outsourced activities that affect them: for example, hosting services, systems operated from the cloud, and maintenance performed by a third party.

In such cases, suppliers must be involved in the program for maintaining the validated/qualified state; the key activity to consider is:

Site audits 

Where possible, it is advisable to visit the vendor’s site (for example, the hosting provider) to verify compliance and to document activities such as backups, information cleanup, physical server maintenance and personnel training. When on-site visits are not possible, for example for cloud-based system vendors, the maintenance plan should document why an on-site audit cannot be conducted (geographic location, internal vendor policies or other reasons) and should ensure that service contracts are up to date, valid and adhered to according to the service and contract scope. This can be done as part of the ongoing vendor assessment.

Conclusions

This guide establishes the basic guidelines and the scope of validation needed to understand the process of validating computerized systems and their GxP impact in organizations. However, the activities to be performed depend on the life cycle chosen, the deliverables to be met, and the level of effort and documentation required to demonstrate regulatory compliance. These in turn depend on the complexity of each system to be validated, its age, its effects, the process it is intended to serve and the respective risk analysis, which together determine the best validation strategy.

Throughout, this guide emphasizes the importance of knowing the process the computerized system is intended to serve, in order to verify the system’s suitability for use. The process drives the validation; in this sense, a good computerized system validation study requires robust knowledge of both the computerized system and the process it serves, obtained through proper system characterization. Another very important aspect is having a complete QMS that allows the areas involved to interact in preparing the documentation that supports the validation.

In addition to the risk categorization of each system, the validation deliverables depend on the system’s age, the stage of the life cycle it is in, and the chosen V-model that best fits the need to demonstrate the system’s suitability for use.

This guide provides a comprehensive overview of the Validation of Computerized Systems, where an understanding of the logic of the work allows the methods presented here to be adapted to the needs of each organization in different scenarios. 


Basic concepts

ADDENDUM: An addition to a document (plural: addenda).

REGRESSION ANALYSIS AND TESTING: A software verification and validation task in which analysis and testing are repeated after changes are made to a previously verified software component or system, to confirm that the changes have not introduced new defects.

APPLICATION SOFTWARE: Software installed on a platform/hardware that provides specific functionality. Software or program that is specific to the solution of the application problem (GAMP5).

ARCHIVE: It is the process by which records are protected from the possibility of modification or deletion, and these records are stored under independent data control for the required retention period. Archived records must include, for example, associated metadata and electronic signatures.

AUDIT TRAIL (data audit): The audit trail is a form of metadata that contains information about actions associated with the creation, modification or deletion of GxP records. A data audit provides a secure record of life cycle details such as creation, addition, deletion or modification of information in a log, paper or electronic, without hiding or overwriting the original record. A data audit facilitates the reconstruction of the history of such record-related events, regardless of medium, including the “who, what, when and why” of the action.

AUDIT: Systematic, independent and documented process for obtaining audit evidence (records, factual statements or other information) and its objective evaluation to determine the extent to which audit criteria (set of policies, procedures or compliance process requirements used as reference) have been met.

BACKUP: A security copy; a copy of one or more electronic files made as an alternative in case the original data or system is lost or becomes unusable (for example, in the event of a system failure). Note that a backup differs from an archive: backups of electronic files are usually stored only temporarily, for the purpose of disaster recovery, and may be overwritten periodically. Such temporary backups should not be considered an archiving mechanism.

BIOMETRIC: A method of verifying an individual’s identity based on measurement of the individual’s physical feature(s) or repeatable action(s), where those features and/or actions are both unique to that individual and measurable.

BIOMETRIC DEVICE: A device used in computerized security systems to identify individuals by physical characteristics such as facial features, eye patterns, fingerprints, voice and handwriting.

GOOD DOCUMENTATION PRACTICES: In the context of these guidelines, good documentation practices are measures that collectively and individually ensure that documentation, whether paper or electronic, is secure, attributable, legible, traceable, permanently and simultaneously recorded, original and accurate.

BUSINESS CONTINUITY PLAN: A documented and maintained plan, supported and funded by management, that defines the ongoing process for ensuring the necessary steps are taken to identify the impact of potential losses, maintain viable recovery strategies and recovery plans, and ensure continuity of services through staff training, plan testing and plan maintenance.

BUSINESS CONTINUITY PLANNING: A managed process for developing and maintaining interorganizational plans to counteract disruptions to business operations.

INSTALLATION QUALIFICATION OR INSTALLATION VERIFICATION TESTS: Documented verification that a system has been installed according to written design and configuration specifications.

INSTALLATION QUALIFICATION: A documented evidence that equipment, facilities and systems have been installed according to previously established design specifications.

OPERATIONAL QUALIFICATION: A documented evidence that equipment, facilities and systems consistently operate according to previously established design specifications.

PERFORMANCE QUALIFICATION OR PERFORMANCE VERIFICATION TESTS: Documented verification that a system is capable of performing or controlling the activities of the required processes according to written user requirements and specifications, in its intended business and computing environment.

OPERATIONAL QUALIFICATION OR OPERATIONAL / FUNCTIONAL VERIFICATION TESTS: Documented verification that a system operates according to written operational specifications across the specified operation checks.

QUALIFICATION: This is the performance of specific tests based on scientific knowledge to demonstrate that equipment, critical systems, facilities, personnel and suppliers meet previously established requirements, which must be completed before processes are validated. 

CAPAs: Corrective and preventive actions

CHANGE CONTROL: The process of ensuring that a computerized system remains valid following a change. It includes assessing the effect of the change to determine when and if repetition of a validation or verification process or a specific part of it is necessary for appropriate action to ensure that the system remains in a validated state.

LIFETIME OF THE SYSTEM: The period of time that begins when a computerized system is designed and ends when the product is no longer available for use by end users. The system life cycle typically includes:

  • A requirements phase
  • A planning phase
  • A development phase, including:
    • A design phase and a programming and testing phase
    • A qualification and system release phase, consisting of:
      • A system integration and testing phase
      • A system validation phase
      • A system release phase
  • An operation and maintenance phase
  • A system withdrawal phase

LIFE CYCLE: All phases in the life of the system from initial requirements to retirement, including design, specification, programming, testing, installation, operation and maintenance.

Client-Server: The client-server model describes the interaction between a local computer (the client) and a remote computer (the server).
The client makes its requests (queries, applications, requests) to the server, which processes this request and sends the results back to the appropriate client.
Typically, clients and servers communicate with each other over a network, but they can also both be in the same system (same hardware).
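
A minimal, self-contained Python sketch of this request/response interaction (both ends run in one process purely for illustration):

```python
import socket
import threading
import time

def server() -> None:
    # The server waits for a request, processes it, and returns the result.
    with socket.create_server(("127.0.0.1", 50007)) as srv:
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024)
            conn.sendall(b"result for " + request)

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)  # give the server a moment to start listening

# The client makes its request and receives the processed result.
with socket.create_connection(("127.0.0.1", 50007)) as client:
    client.sendall(b"query 1")
    print(client.recv(1024).decode())  # -> "result for query 1"
```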

CODING: It is the process of converting information from a source into symbols for communication. In other words, it is the application of the rules of a code.
The reverse process is decoding, that is, the conversion of these symbols into information that can be understood by the receiver.

CONFIGURATION: The adaptation of a software application or hardware element to other elements of the environment and the specific needs of the user.

CHANGE CONTROL (Parenteral Drug Association (PDA)): A formal process by which qualified representatives of appropriate disciplines review proposed or actual changes to a computer system. The main purpose is to document the changes and ensure that the system remains in a state of control.

CHANGE CONTROL AND DEVIATION MANAGEMENT: A documented system for change control, including assessment of the effect of a proposed change on processes and computer systems.

COTS (COMMERCIAL OFF-THE-SHELF SOFTWARE): Commercially available software sold as a standard product; a software component supplied by the supplier of a computerized system for which the user cannot claim full control over the software life cycle, and whose suitability for use is demonstrated by a broad spectrum of users.

REQUIREMENT CRITERIA (IEEE): The criteria that a system component must meet to be accepted by a user, customer or other authorized entity.

DATA GOVERNANCE: All provisions to ensure that data, regardless of the format in which it is generated, is recorded, processed, retained and used to ensure complete, consistent and accurate recording throughout the data life cycle.

MASTER DATA: Unique data used on a shared basis by multiple users for different purposes.

DATA: All original records and true copies of original records, including source data and metadata and all subsequent transformations of these data, generated and recorded at the time of the GxP activity. Data must be accurately captured by permanent means at the time of the activity. Data may be contained in paper records (such as worksheets and logbooks), electronic records and audit trails, photographs, microfilm or microfiche, audio or video files, or any other medium by which information related to GxP activities is captured.

DATA LIFE CYCLE: All phases of the process in which data are created, recorded, processed, reviewed, analyzed, reported, transferred, stored, retrieved and monitored, until their disposal and destruction. Plans should be in place to assess, monitor and manage the data, and the risks associated with the data, in relation to their potential impact on patient safety, product quality and/or the reliability of decisions made at all stages of the data life cycle.

ACCEPTANCE TEST: Tests performed to determine whether or not the system meets the acceptance criteria and to enable the customer to determine acceptance of the system. See also Factory Acceptance Test (FAT) and Site Acceptance Test (SAT).

ENCRYPTION: The process of rendering information unreadable so that it can only be read by applying a key. It is a security measure used to store or transfer sensitive information that should not be accessible to third parties, such as passwords, credit card numbers or private conversations.
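
A minimal Python sketch of the encrypt/decrypt cycle, using the third-party cryptography package (an assumption: it must be installed separately, e.g. with pip install cryptography):

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # only holders of this key can read the data
f = Fernet(key)

token = f.encrypt(b"confidential batch record")  # unreadable without the key
print(token)                                     # ciphertext
print(f.decrypt(token))                          # b'confidential batch record'
```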

PRODUCTION ENVIRONMENT: The business and computing environment in which a computerized system is used by end users. For regulated computerized systems, it is the business and computing environment in which the computerized system is used for GxP-regulated activities.

USER REQUIREMENT SPECIFICATIONS (URS): If prepared as a separate document, a formal document defining the requirements for use of the system in its planned production environment.

FUNCTIONAL SPECIFICATIONS: The functional specification document defines the functions and technology solutions specified for the computer system, based on the technical requirements needed to meet the user requirement specifications (e.g., the specified bandwidth needed to meet the user requirements for the expected use of the system).

FDA: Food and Drug Administration, U.S. regulatory agency.

FDA Compliance Policy Guide 7153.17: FDA enforcement policy for 21 CFR Part 11; it addresses legacy systems that do not comply with Part 11 and were already operational before August 20, 1997, the date the rule came into effect.

DIGITAL SIGNATURE: The electronic signature based on cryptographic methods for authenticating the sender, using a set of rules and a set of parameters capable of verifying the overall identity of the signer and the integrity of the data.

ELECTRONIC SIGNATURE: A computer data compilation of any symbol or series of symbols executed, adopted or authorized by an individual to be the legally binding equivalent of the individual’s handwritten signature.

FIRMWARE: A program fixed in ROM that establishes the lowest-level logic controlling the electronic circuits of a device. It is considered part of the hardware, being integrated into the electronic device, but it is also software, since it provides logic and is written in a programming language. The firmware receives external commands and responds to them to control the device.

STATIC RECORD FORMAT: A “fixed” record format, such as a paper record or a PDF, that allows only limited interaction between the user and the record content. For example, once printed or converted to PDF, a static record cannot be reprocessed, nor can it reveal more detailed baselines or display hidden fields.

GAMP: Good Automated Manufacturing Practice

GAP: G: Good, A: Average, P: Poor.

CONFIGURATION MANAGEMENT: A discipline that applies engineering, management, and supervision to identify and document the functional and physical characteristics of a configuration item. Management of changes to those characteristics, recording and reporting of change processing and implementation, and verification of compliance with specified requirements.

GxP: Acronym group of good practice guidelines for preclinical activities, clinical activities, manufacturing, testing, storage, distribution and post-marketing of pharmaceuticals, biologics and regulated medical devices, such as good laboratory practices, good clinical practices, good manufacturing practices, good pharmacovigilance practices and good distribution practices.

HARDWARE (HW): The physical part of a computer or computer system: the electrical, electronic, electromechanical and mechanical components, such as cables, circuits, motherboards and peripherals; in short, whatever physical material is needed to make the equipment work.

GxP IMPACT: Action that can directly or indirectly affect regulatory compliance, quality policy, product quality and consumer perception.

INFORMATION: An organized set of data that constructs a message about a particular phenomenon or entity. Information makes it possible to solve problems and make decisions, since its rational use is the basis of knowledge.

INFRASTRUCTURE: The hardware and software, such as networking software and operating systems, that makes it possible for applications to operate.

DATA INTEGRITY: It refers to the extent to which data are complete, consistent, accurate and reliable and that these characteristics are maintained throughout the life cycle of the data. Data must be collected and stored securely so that it is attributable and readable, while being original or a true and exact copy. Ensuring data integrity requires adequate quality systems and risk management, including adherence to sound scientific and good documentation practices.

ISO: International Organization for Standardization

IT: Information Technology

MAINTENANCE OF VALIDATED STATE: Maintenance of facilities, equipment and systems is another important step to ensure that the process remains under control. Once that is achieved, the qualified/validated status must be maintained through routine monitoring, maintenance, calibration procedures and programs.

METADATA: Metadata are data that provide contextual information about other data and are needed to understand those data. This includes structural and descriptive metadata, which describe the structure, data elements, interrelationships and other characteristics of the data; they also allow data to be attributed to a person. The metadata needed to evaluate the meaning of data should be securely linked to those data and properly evaluated. For example, in weighing, the number 8 is meaningless without its metadata, i.e., the unit mg. Other examples of metadata are the date/time stamp of an activity, the identification (ID) of the operator who performed the activity, the ID of the instrument used, processing parameters, sequence files, etc.
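
The weighing example can be made concrete with a small illustrative structure (the field names are assumptions):

```python
# A measured value together with the metadata that gives it meaning.
measurement = {
    "value": 8,                           # meaningless on its own
    "unit": "mg",                         # metadata that gives the value meaning
    "timestamp": "2023-05-04T10:32:00Z",  # date/time stamp of the activity
    "operator_id": "analyst1",            # who performed the activity
    "instrument_id": "BAL-007",           # which instrument was used
}
print(f"{measurement['value']} {measurement['unit']} "
      f"weighed by {measurement['operator_id']} on {measurement['instrument_id']}")
```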

WHO: World Health Organization

PC: Personal Computer

PERIPHERAL: An auxiliary, independent device connected to the central processing unit of a computer, providing complementary input/output (I/O) functions.

VALIDATION MASTER PLAN (VMP): The document containing information about the validation activities to be performed, establishing details and timelines for each validation task. Responsibilities related to the plan should be established in accordance with GMP.

PROGRAMMABLE LOGIC CONTROLLER (PLC): Hardware element that can be programmed to make decisions based on logical arguments (in the form of electrical signals that activate or deactivate a function). The PLC and connecting elements are considered Category 2 or modified hardware.

SOP: Standard Operating Procedure

PROCESS OWNER: The person responsible for the business process.

SYSTEM OWNER: The person responsible for the availability and maintenance of a computerized system, and for the security of the data contained in that system.

PROTOCOL: The written work plan that defines the objectives, procedures, methods and acceptance criteria for an investigation.

REQUIREMENT TEST (IEEE): The stage of the software development life cycle in which the development team and the user area of an information system must ensure that the developed system conforms to the defined requirements.

USER REQUIREMENT TEST: Verification of the fully configured computerized system, installed in the production environment (or in an environment equivalent to it), against the intended computerized business process, operated by standard end users trained in the operational procedures that define the use and control of the system. User requirement testing can be part of performance qualification (PQ) or a separate step.

DISASTER RECOVERY: Process for planning or deploying resources to restore normal commercial function in the event of a disaster.

REGISTRATION OF LOTS (BATCH RECORDS): The set of records of all information relevant to the process, in physical or electronic form.

ELECTRONIC RECORD: Any combination of text, graphics, data, audio, image or other representation of information in digital form that is created, modified, archived, retrieved or disseminated by a computer system.

VALIDATION REPORT: A document that states the conclusions of the validation and determines whether the system has demonstrated its suitability for intended use and compliance with good practices.

SCADA: Supervisory Control and Data Acquisition software. Often used to automate processes, it typically works with PLC hardware components to manage the actions by which the system controls the process.

SYSTEM SECURITY: Ensuring the confidentiality, integrity and availability of systems and networks. Security is divided into physical security and logical security.

SERVER: A server is a computer that is part of a network and provides services to other client computers.

QMS: Quality management system

OPEN SYSTEM: An environment in which access to the system is not controlled by persons responsible for the content of electronic records contained in the system.

CLOSED SYSTEM: An environment in which access to the system is controlled by persons responsible for the content of electronic records contained in the system. 

CUSTOM COMPUTER SYSTEM: A computerized system individually designed for a specific business process.

COMPUTERIZED/COMPUTER SYSTEM: Any equipment, process or operation that has attached to it one or more computers and associated software or a group of hardware components designed and assembled to perform a specific set of functions.

COMPUTERIZED/COMPUTER SYSTEM: A functional unit of one or more computers and associated input and output devices, peripherals and software, utilizing common storage for all or part of a program and all or part of the data necessary for program execution. 

QUALITY MANAGEMENT SYSTEM: The set of measures taken in a planned and systematized manner to ensure that pharmaceutical products are of the quality required for their intended use. Quality management therefore incorporates GMP, GDP, GLP, GVP and Risk Management principles, including the use of appropriate tools.

COMPUTER SYSTEM: A system containing one or more computers and associated software (IEEE).

COMPUTERIZED SYSTEMS: A computerized system collectively controls the operation of one or more automated processes and/or functions. It includes hardware, software, peripheral devices, networks, and documentation, e.g., manuals and standard operating procedures, as well as personnel who interface with the hardware and software, e.g., users and IT support personnel.

LEGACY SYSTEMS: A computer system that has become obsolete but continues to be used, because the user is unwilling or unable to replace or upgrade it easily.

SLA: Service Level Agreement, a written agreement between a service provider and its customer to set the agreed level of service quality.

OS: Operating System

INFRASTRUCTURE SOFTWARE: Infrastructure platform on which business software and systems operate or improve their operation.

BUSINESS SOFTWARE: Software used for business processes, which may include those subject to regulatory compliance. Software defined by a market-driven need, commercially available, and whose fitness for use has been demonstrated by a broad spectrum of business users. Software or a specific program for the solution of an application problem.

SOFTWARE (SW): A set of computer programs, instructions and rules for executing certain tasks on a computer.

THIRD PARTIES: Parties not directly managed by the holder of the manufacturing and/or import authorization.

TOKEN: A string of characters that has a coherent meaning in a given programming language. Examples of tokens are keywords (if, while, int…), identifiers, numbers, signs, and multi-character operators (e.g. :=). Tokens are the most basic elements on which any translation of a program is built; they arise in the first phase, lexical analysis, and are still used in the following phases (syntactic and semantic analysis) before being discarded in the synthesis phase.

USER: A person who uses a device or computer and performs multiple operations for different purposes. A user is often a person who acquires a computer or electronic device and uses it to communicate with other users, generate content and documents, use software of various types and many other possible actions.

COMPUTER SYSTEM VALIDATION (CSV): Documented process of ensuring that a computer system does exactly what it was designed to do in a consistent and reproducible manner (SUITABILITY FOR USE), guaranteeing data integrity and security, product quality and compliance with applicable GxP regulations.

VALIDATION OF COMPUTERIZED SYSTEMS: It is the confirmation by verification and provision of objective evidence that the specifications of the computerized system conform to the needs of the users and intended uses and that all requirements can be consistently met.

VERIFICATION: The act of checking, inspecting, testing, verifying, auditing or otherwise establishing and documenting whether items, processes, services or documents meet specified requirements.

 

