Sunday, November 16, 2008

Qualification, Validation, and Verification

This article considers the distinction among the terms qualification, validation, and verification in the context of pharmacopeial usage. A recommendation for a standardized usage of the terms validation and verification is provided, and general requirements for validation and verification activities are given. The article also emphasizes the importance of knowing when validation or verification is necessary relative to the use of a method to satisfy pharmacopeial article requirements (for which a monograph exists).
The terms qualification, validation, and verification occur numerous times in US Pharmacopeia 29 (1). Qualification is found in Chapters ‹1035› "Biological Indicators for Sterilization," ‹1043› "Ancillary Materials for Cell-, Gene-, and Tissue-Engineered Products," ‹1046› "Cell and Gene Therapy Products," and ‹1119› "Near-Infrared Spectrophotometry," among others. Validation appears in ‹1225› "Validation of Compendial Procedures," ‹1223› "Validation of Alternative Microbiological Methods," ‹1010› "Analytical Data—Interpretation and Treatment," ‹1043› "Ancillary Materials for Cell-, Gene-, and Tissue-Engineered Products," ‹1117› "Microbiological Best Laboratory Practices," ‹1120› "Raman Spectrophotometry," and many others. Verification appears in ‹1010› "Analytical Data—Interpretation and Treatment," ‹1035› "Biological Indicators for Sterilization," ‹85› "Bacterial Endotoxins Test," and others. The terms also are present in documents from the US Food and Drug Administration (FDA), the Environmental Protection Agency (EPA), and the International Conference on Harmonization (ICH). Given the numerous definitions for the three terms, this article is intended in part to provide an approach to fostering more consistent usage of the terms.

A recent issue of the Pharmacopeial Forum (2) included a diagram from a proposed General Chapter ‹1058› "Analytical Instrument Qualification" that was intended to show "...four critical components involved in the generation of reliable and consistent data (quality data)." From most to least critical, the components were quality-control check samples, system suitability tests, analytical methods validation, and analytical instrument qualification. In this article, system-suitability tests are treated as equivalent to verification; why this usage is acceptable will be explained below. Validation and verification are considered in detail, and reference is made to analytical instrument qualification. For further discussion of the top tier, "Quality Control Check Samples," refer to Chapter ‹1058› "Analytical Instrument Qualification" (1).

Qualification

Qualification of analytical instrumentation is essential for accurate and precise measurement of analytical data. If the instrumentation is not qualified to ensure that the results it indicates are trustworthy, all other work based upon the use of that instrumentation is suspect. For the purposes of this article, the assumption is made that the validation and verification work to follow rests solidly upon well-qualified instrumentation.

Validation

Definitions. Numerous documents provide definitions of validation. A dictionary definition (3) of validation includes "...the process of determining the degree of validity of a measuring device," and for validate: "to make legally valid," with synonyms "verify, substantiate." Clearly, the synonyms do not distinguish between validation and verification, so let us now turn to definitions provided by other sources. USP chapter ‹1225› "Validation of Compendial Procedures" provides the following:

"Validation of an analytical procedure is the process by which it is established, by laboratory studies, that the performance characteristics of the procedure meet the requirements for the intended analytical applications."

From the ICH document Validation of Analytical Procedures: Text and Methodology:

However, it is important to remember that the main objective of validation of an analytical procedure is to demonstrate that the procedure is suitable for its intended purpose (4).


FDA provides a definition of validation in numerous documents. One such document, Guidance for Industry: Analytical Procedures and Methods Validation Chemistry, Manufacturing, and Controls Documentation says "methods validation is the process of demonstrating that analytical procedures are suitable for their intended use" (5). There also are numerous documents defining validation within the context of processes. From FDA's Guideline on General Principles of Process Validation:

"Validation—Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes (6)."

The same definition is provided in other FDA documents, such as Guideline on Sterile Drug Products Produced by Aseptic Processing. FDA document Guidance for Industry: Quality Systems Approach to Pharmaceutical Current Good Manufacturing Practice Regulations provides this definition:

"With proper design (see section IV.C.1), and reliable mechanisms to transfer process knowledge from development to commercial production, a manufacturer should be able to validate the manufacturing process. In a quality system, process validation provides initial proof, through commercial batch manufacture, that the design of the process produces the intended product quality (7). "

The remainder of the discussion about validation in this article will be restricted to a discussion of method validation.

Does it suit its purpose? The foregoing is clearly not an exhaustive list of the ways in which validation has been defined, but a recurring theme among the various definitions is the demonstration that the method or process is suitable for its intended use. In this article, validation is taken to be the demonstration that a method or process is suitable for its intended purpose. Accepting that, it is imperative that the intended purpose of a method or process be clearly stated at the outset of the validation. An example of the importance of such a statement can be found in Chapter ‹71› "Sterility Tests" (1). It states that "the following procedures are applicable for determining whether a Pharmacopeial article purporting to be sterile complies with the requirements set forth in the individual monograph with respect to the test for sterility." The next paragraph states

"These Pharmacopeial procedures are not by themselves designed to ensure that a batch of product is sterile or has been sterilized. This is accomplished primarily by validation of the sterilization process or of the aseptic processing procedures."

Over the years there has been concern that the tests for sterility provided in Chapter ‹71› are not adequate to prove that a batch of product is sterile. As stated previously, the tests in Chapter ‹71› were intended only to show that a pharmacopeial article complies with the monograph requirement for sterility. Such a demonstration constitutes a necessary but not sufficient condition for sterile pharmacopeial articles. If one were to validate an alternative to the procedure in Chapter ‹71›, it would not be necessary to develop one intended to demonstrate sterility of an entire lot of product.

In addition, it is appropriate to provide the conditions under which the validation was performed. Given that there are essentially countless variations on experimental conditions, product matrix effects, and so forth, a validation cannot reasonably be expected to address all such permutations. For example, Method 3 in the total-protein assay section of Chapter ‹1047› "Biotechnology-Derived Articles—Tests" includes the following note:

"[Do not use quartz (silica) spectrophotometer cells: the dye binds to this material. Because different protein species may give different color response intensities, the standard protein and test protein should be the same.] There are relatively few interfering substances, but detergents and ampholytes in the test specimen should be avoided. Highly alkaline specimens may interfere with the acidic reagent (1)."

Therefore, given the following from FDA's Guide to Inspections of Pharmaceutical Quality Control Laboratories: "Methods appearing in the USP are considered validated and they are considered validated if part of an approved ANDA" (8), the use of Method 3 would be valid if the conditions stated are met in testing the material of interest. The same FDA document states "For compendial methods, firms must demonstrate that the method works under the actual conditions of use," which, for the sake of this article, will be considered verification. Chapter ‹1047› provides several other procedures, all also validated, that could be considered given test material that does not satisfy the conditions for Method 3.

Remember the purpose. It is important to bear in mind the purpose of the method to be validated. If the method is intended to serve as an alternative to a pharmacopeial method, then one must establish its equivalence to the pharmacopeial method in terms of the end result. Remember that the purpose of a method in the pharmacopeia is to determine whether the pharmacopeial article (for which a monograph exists in the pharmacopeia) satisfies the requirements in the monograph. If instead a pharmacopeial method is used for a purpose other than demonstrating that the article complies with monograph requirements (for example, imagine that total organic carbon is to be determined using Chapter ‹643› "Total Organic Carbon"), it is not necessary to perform the validation relative to the pharmacopeial results; the validation should be conducted relative to the specific purpose for which the method is intended. The same reasoning applies to the use of a nonpharmacopeial method to determine something for which a pharmacopeial method exists, but again for purposes unrelated to satisfying a monograph requirement. In such a case, it is unnecessary to validate the method relative to that in the pharmacopeia.

Verification

The terminological situation may be clarified if the use of the term validation is restricted to mean the demonstration that a method or process is suitable for its intended purpose, and the term verification is used for the demonstration that a previously validated method is suitable for use under specific experimental conditions that may or may not match the conditions present during the validation.

[Figure 1]
These actual conditions include specific ingredients or products, specific laboratory personnel, equipment, and reagents. There are, however, instances in the literature where this distinction is not maintained. Consider the dictionary definition given previously for validation and its use of verification as a synonym. Further muddying of the waters occurs when phrases such as "system suitability tests" are used (see Figure 1 and the system-suitability section in Chapter ‹621› "Chromatography"). The phrase also appears in the "Suitability of the Counting Method in the Presence of Product" section of Chapter ‹61› "Microbiological Examination of Nonsterile Products: Microbial Enumeration Tests," the "Suitability of the Test Method" section of Chapter ‹62› "Microbiological Examination of Nonsterile Products: Tests for Specified Microorganisms," and the "Validation Test" section of Chapter ‹71› "Sterility Tests" (1). In all cases, the intention is to ensure that the validated method will work under the specific conditions the analyst plans to use.

For chromatography, this means demonstrating that the system can deliver resolution and reproducibility on par with the system used during validation. For the two microbiology test chapters for nonsterile products, one must show that microbial growth in the presence of the article to be tested is not hindered, because the method depends on unencumbered microbial growth to work; in other words, unhindered microbial growth was a condition established when the method was initially validated. The use of "validation test" in Chapter ‹71› is unfortunate because the intention was again to demonstrate that microbial growth is not hindered, as indicated in the following text:

"If clearly visible growth of microorganisms is obtained after the incubation, visually comparable to that in the control vessel without product, either the product possesses no antimicrobial activity under the conditions of the test or such activity has been satisfactorily eliminated. The test for sterility may then be carried out without further modification."

It may be advantageous, and more consistent, for the text in Chapter ‹71› to be changed to "Suitability of the Test Method," if not to "Verification of the Test Method." The latter change also may be appropriate for Chapters ‹61› and ‹62›, given that what is being assessed is whether the actual test conditions, relative to those established during the validation, permit the proper functioning of the method. Given the harmonized status of these three chapters, such changes, although possible, would certainly take longer to become official.

The same cautions provided at the end of the section on validation apply here. If a method previously in use was derived from a pharmacopeial method but is used for a purpose other than satisfying monograph requirements, it is not necessary to adopt the revised pharmacopeial method when it becomes official, and it is therefore not necessary to reverify the suitability of the test article against the revised method. Likewise, the use of a nonpharmacopeial method for purposes other than satisfying a monograph requirement, even when a potentially relevant pharmacopeial method exists, does not necessitate reverification.

General requirements for validation

There are numerous documents that describe the general approach to a validation process. They describe several characteristics (data elements in Chapter ‹1225›) that may be examined during validation, with specific sets selected based upon the nature of the test method. A brief description of these characteristics is provided herein, following the characteristics as outlined in the ICH Harmonized Tripartite Guideline, Validation of Analytical Procedures: Text and Methodology (4).

Accuracy is a determination of how close the measured value is (in the case of an analytical method) to the true value. As such, one might model the measured value as the true value plus error, where the error may contain both systematic error (bias) and imprecision of measurement. Given the potential for error, it is important to include a means of reflecting the "true value" as closely as possible; for many compendial tests, this involves the use of a reference standard. Because a method is expected to be useful over a range of true values, the accuracy should be assessed over the expected range of values to which the method is to be applied. As stated previously, the validation should also state the conditions under which the accuracy was determined. Because it is not possible to determine all possible sets of conditions for which a compendial assay might be applicable, accuracy may need to be verified before use of a validated method. The concept of accuracy is more problematic for microbiological assays.
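To make the error model concrete, here is a minimal sketch (in Python, with invented numbers) of how accuracy might be summarized as percent recovery and bias at reference-standard levels spanning the expected range; nothing here is drawn from an actual compendial method.

# Sketch: accuracy as percent recovery against a reference standard.
# All values are illustrative, not real assay data.

def percent_recovery(measured: float, true_value: float) -> float:
    return measured / true_value * 100.0

# (true value, measured value) pairs spanning the expected range
levels = [(50.0, 49.1), (100.0, 101.8), (150.0, 148.2)]

for true, measured in levels:
    bias = measured - true  # systematic error component at this level
    print(f"true={true:6.1f}  measured={measured:6.1f}  "
          f"recovery={percent_recovery(measured, true):6.2f}%  bias={bias:+.1f}")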

The precision of a method determined during validation should be representative of the repeatability (reproducibility) of the method. As was the case for the determination of accuracy, it should be determined over the expected range of articles to be measured, and the conditions used during the validation should be clearly stated. As for accuracy, the use of reference standards is common because the goal of the assessment of precision is to determine method repeatability without introducing unknown variance as a result of different test articles or test articles drawn from a heterogeneous source. The latter point also complicates the validation of microbiological assays.
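As a similarly minimal sketch, repeatability is often summarized as the percent relative standard deviation (%RSD) of replicate measurements of a single reference standard; the replicate values below are invented.

# Sketch: precision as %RSD of replicate reference-standard results.
from statistics import mean, stdev

replicates = [98.7, 99.2, 100.4, 99.8, 98.9, 100.1]  # hypothetical replicates

m = mean(replicates)
rsd = stdev(replicates) / m * 100.0  # sample std. deviation relative to mean
print(f"mean={m:.2f}  %RSD={rsd:.2f}")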

Specificity refers to the ratio of false positives to false negatives. A highly specific method would have a very low ratio, given that it should be able to detect the article of interest when present in very low quantities in the presence of much higher quantities of similar but not identical articles. As stated previously, specificity should be determined over the expected range of usage for the method, and the conditions used during the validation should be clearly stated.

Linearity, in essence, refers to the existence of a direct relationship between the quantity of article contained in the sample being analyzed and the measured value resulting from the analysis. It is not the purpose of this article to delve into statistical intricacies pertaining to data transformation, the use of linear or nonlinear regression techniques, residual analysis, and so forth. For present purposes, it is sufficient that an assay purporting to be quantitative have a demonstrable quantitative relationship between the quantity of material of interest contained in the sample and the measured response.
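Without delving into those intricacies, the following sketch shows the kind of check involved: an ordinary least-squares fit of response against quantity, with residuals and the coefficient of determination as rough summaries of linearity. Concentrations and responses are invented.

# Sketch: assessing linearity with a least-squares fit and residuals.
import numpy as np

conc = np.array([10.0, 25.0, 50.0, 100.0, 150.0])  # quantity in the sample
resp = np.array([0.11, 0.27, 0.52, 1.03, 1.49])    # measured response

slope, intercept = np.polyfit(conc, resp, 1)       # first-degree polynomial fit
residuals = resp - (slope * conc + intercept)

ss_res = np.sum(residuals**2)
ss_tot = np.sum((resp - resp.mean())**2)
r_squared = 1.0 - ss_res / ss_tot                  # coefficient of determination

print(f"slope={slope:.5f}  intercept={intercept:.5f}  r^2={r_squared:.4f}")
print("residuals:", np.round(residuals, 4))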

Range is directly related to linearity, and ties in accuracy and precision as well. It represents the lowest and highest quantities of material of interest contained within the samples under analysis that provide data with acceptable accuracy, precision, and linearity.

Detection limit represents the least amount of material of interest contained within the sample under analysis that produces a signal exceeding the underlying noise. No assertions pertaining to accuracy, precision, and linearity are necessary at this level of material of interest. For example, if a method is validated to have a detection limit of 3 ng of total protein using Method 3 (1), then a sample containing 3 ng would elicit a signal discernible from the underlying noise. It would not be possible to state from such data alone whether the sample in fact contained exactly 3 ng of protein, only that there were at least 3 ng.

Quantitation-limit determination is more demanding in that it is necessary to establish the minimum quantity of material of interest contained within the sample that produces a signal lying within the linear range of the data. That is to say, the quantitation limit represents the lowest end of the range.
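For orientation, the ICH guideline (4) describes one common way to estimate these limits from the standard deviation of the response and the slope of the calibration curve (DL = 3.3 σ/S, QL = 10 σ/S). The sketch below simply applies those formulas to invented values.

# Sketch: ICH-style detection and quantitation limit estimates.
sigma = 0.004   # std. deviation of blank/low-level response (hypothetical)
slope = 0.010   # calibration-curve slope, response per ng (hypothetical)

detection_limit = 3.3 * sigma / slope      # DL = 3.3 * sigma / S
quantitation_limit = 10.0 * sigma / slope  # QL = 10 * sigma / S

print(f"DL = {detection_limit:.2f} ng   QL = {quantitation_limit:.2f} ng")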

Intermediate precision (ruggedness in USP Chapter ‹1225› [1]) pertains to the establishment of "...the effects of random events on the precision of the analytical procedure" (4). Referring to the previous discussion under accuracy pertaining to error components, intermediate precision considers random error introduced by such factors as specific equipment, analysts, laboratories, days, and so forth. It is not meant to include systematic error (bias).

Robustness is probably most directly related to the consideration of the conditions under which a validated method is shown to be suitable. The following text from the ICH guideline is useful in considering robustness:

"If measurements are susceptible to variations in analytical conditions, the analytical conditions should be suitably controlled or a precautionary statement should be included in the procedure. One consequence of the evaluation of robustness should be that a series of system suitability parameters (e.g., resolution test) is established to ensure that the validity of the analytical procedure is maintained whenever used (4)."

General requirements for verification

One question that may be asked of the compendia is whether a method provided as official (in the compendia or supplements) requires validation. USP Chapter ‹1225› states:

"...users of analytical methods described in the USP-NF are not required to validate accuracy and reliability of these methods, but merely verify their suitability under actual conditions of use (1)."

This text is consistent with the proposal in this article that the term validation be reserved for the process whereby one determines if a given method is suitable for its intended purpose (which must be clearly defined), and that the term verification be reserved for the demonstration that the conditions under which the method is to be performed will be appropriate for the method.

Another question may be: given that verification involves demonstrating that the conditions to be evaluated are suitable for use with the validated method, how does one go about assessing that? It should be evident that a subset of the determinations performed during the validation would be appropriate. Important conditions to consider include equipment, possible matrix effects (components included in the article to be tested that were not evaluated during the validation), and other conditions for which there is no clear indication provided in the method as to their suitability. A proposed new General Chapter ‹1226› "Verification of Compendial Procedures" (see reference 9 for a discussion of this chapter) provides some guidance as to how the verification process may be executed, but ultimately the user is responsible for selecting which of the characteristics (data elements) evaluated during the validation should be examined as part of the verification. The user should establish which of those validation characteristics are critical to the successful use of the validated method.

Summary

There has been some confusion about when an analytical method should be validated and when it should be verified. In fact, there have been occasions when the terms have been used interchangeably. It is suggested that the term validation be reserved for the process necessary to demonstrate that a method is suitable for its intended purpose. Effective validation begins with a proper statement of the purpose of the method. This statement should accompany the method validation report, and in some circumstances, such as with Chapter ‹71› "Sterility Tests" (1), the statement should appear in the text accompanying the method. Depending upon the degree to which robustness is assessed during the validation process, there may be a set of conditions determined to be suitable for the use of the method, and conditions that are contraindicated. If such conditions have been established, it is helpful for them to accompany the text describing the method (for example, Method 3 in [1]).

The term verification should be reserved for the process whereby it is established that the conditions under which an article is to be tested by a validated method are indeed suitable for that method. The verification process might be considered to include a subset of the validation process, as suggested by Figure 1. The characteristics (data elements) of a validation process are contained in several documents, and which of these are incorporated in the validation should be appropriate to the method's intended purpose (and spelled out in the validation protocol). The characteristics from the validation that are assessed during the verification should be representative of the critical aspects of the method. An example of the verification of the range for Method 3 was provided. Given that verification, as described in this article, is intended to address the suitability of a particular set of conditions for use with a validated method, robustness is not likely to be important for the verification process.

For both validation and verification, one must remember the underlying purpose of the method. If the method is from the pharmacopeia and is intended to be used in demonstrating that a pharmacopeial article meets requirements (for which there is a monograph), the method is considered to be validated, and it would be necessary to verify that the test article is suitable for use with the method. If the method is from the pharmacopeia but is not intended for use in satisfying monograph requirements, it may need to be validated relative to the specific nonpharmacopeial purpose. If instead the method is not from the pharmacopeia but is intended to satisfy monograph requirements, it must be validated as providing equivalent results to the pharmacopeial method. Finally, if the nonpharmacopeial method is not intended to satisfy monograph requirements, it must be validated according to its specific purpose, and this would not require comparison to any pharmacopeial method.

David A. Porter is a pharmaceutical consultant with Vectech Pharmaceutical Consultants, Inc. (Farmington, MI).

Submitted: Oct. 26, 2006. Accepted: Dec. 11, 2006.

Keywords: qualification, USP, validation, verification

References

1. United States Pharmacopeia 29—National Formulary 24, United States Pharmacopeial Convention, (2006).

2. Pharmacopeial Forum 32 (2), 595–604 (Mar.–Apr. 2006).

3. Webster's Seventh New Collegiate Dictionary. (G. & C. Merriam Co., Springfield, MA, 1965).

4. International Conference on Harmonization Harmonized Tripartite Guideline, Validation of Analytical Procedures: Text and Methodology, (ICH, Geneva, Switzerland, 2005).

5. FDA, Center for Drug Evaluation and Research, Guidance for Industry: Analytical Procedures and Methods Validation Chemistry, Manufacturing, and Controls Documentation, (Rockville, MD, Aug. 2000).

6. FDA, Center for Drug Evaluation and Research, Guideline on General Principles of Process Validation, (Rockville, MD, May 1987).

7. FDA, Center for Biologics Evaluation and Research, Guidance for Industry Quality Systems Approach to Pharmaceutical Current Good Manufacturing Practice Regulations, (Rockville, MD, Sept. 2006).

8. FDA, Office of Regulatory Affairs, Guide to Inspections of Pharmaceutical Quality Control Laboratories, (Rockville, MD, July 1993).

9. Pappa et al., "Development of a New USP General Information Chapter: Verification of Compendial Procedures," Pharm. Technol. 30 (3), 164–169 (2006).

An unusual reclaim process

Q: Filled product recovered from dose checks or improperly stoppered vials is dumped into a holding tank in a Class 100,000 area and, once in the tank, is pumped back to the main product holding tank. There it is mixed with the product in the holding tank, refiltered, and filled again during the course of the filling operations. Wouldn't this "reclaim process" be considered a reworking of the product? Unless this process was submitted along with the ANDA originally, and validated, it seems that this would not be following cGMP practice for sterile filling operations. I don't think FDA or any other regulatory agency would find this to be an acceptable practice. What are your thoughts?

A:

Jim Agalloco replies:

I fully agree—that’s a most unusual process. I've seen reclaims of this type before, but they typically aren't a part of every batch. They are used to recover an entire lot that may be off-spec. Unless it is filed that way (along with supportive data), I think the practice described is non-compliant.

Validating the Validators

By Magaly Vega, Professor, Puerto Rico Polytechnic University; José Correa, Taratec Development Corp.; and Ivan Lugo, Executive Director, INDUNIV Research Consortium

PharmaManufacturing.com

Puerto Rico’s Validation Academy and Certification Program bridges the gap between knowledge and experience.

The pharmaceutical industry today is challenged by the need to keep up with advances in science and technology, as well as changing regulatory and compliance issues. Success demands highly skilled validation professionals who understand not only the interdependence of multiple technologies but also the complex integration practices required, and their impact on validation. Such skills are especially important in computer systems validation (CSV; see COMPUTER VALIDATION sidebar below).

To address the need for ongoing training in validation, many scientific and engineering professional associations, such as ISPE, have developed programs designed to ensure competence in key areas of practice. Within Puerto Rico, industry leaders have gone one step further, by establishing a “Validation Academy” within the Polytechnic University of Puerto Rico’s Continuing Education Department. After pilots, the program was officially launched on January 1.

Faculty within the University, concerned consultants and professionals within the industry believe that this program will be essential to fostering the professional growth of validation professionals. The program should also help validation specialists document their specific areas of expertise and sharpen their skills through additional education and training.

Certification as a Validation Professional is voluntary. However, as validation activities become more critical, we believe that more pharmaceutical companies will encourage their validation professionals to pursue certification.

The program will recognize professionals who meet basic eligibility requirements and demonstrate a minimal level of job-related knowledge and skills, based on performance in a standardized examination. To be eligible, one must have a bachelor's degree in science and three years of experience in pharmaceutical industry validation. For individuals with a significant amount of validation experience, a "grandfather clause" applies, allowing them to receive certification simply by passing a formal examination.

Faculty members — expert consultants and industry professionals — have identified areas of competency and the knowledge and skills required for each one, and have written test questions targeting these areas. Certification candidates might be asked to:
  • Develop a list of User Requirements for a given scenario;

  • Develop test scripts for a given source code;

  • Use data provided to develop a Traceability Matrix.
Each test gives professionals an opportunity to demonstrate their understanding of the variety of industry standards, protocols and technologies, and how to apply them to specific situations (see table below).
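As a hypothetical illustration of the last task above, a traceability matrix simply maps each user requirement to the test cases that cover it and flags any requirement with no coverage; all identifiers below are invented.

# Sketch: building a requirements-to-tests traceability matrix.
requirements = {
    "UR-001": "System shall require a unique user login",
    "UR-002": "System shall time-stamp every record change",
    "UR-003": "System shall export a certificate of analysis",
}

# Each test case lists the requirement IDs it exercises.
test_cases = {
    "TC-01": ["UR-001"],
    "TC-02": ["UR-001", "UR-002"],
}

for req_id in requirements:
    covering = [tc for tc, reqs in test_cases.items() if req_id in reqs]
    print(f"{req_id}: {', '.join(covering) if covering else 'NOT COVERED'}")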

Validation Academy Curriculum Modules:
1. General Concepts
2. Computer Systems
3. Pharmaceutical Process Technology: Solids
4. Pharmaceutical Process Technology: Parenterals
5. Validation of Analytical Methods
6. Biotechnology
7. Medical Devices
8. Packaging

General Concepts Module Topics:
1. Basic Validation Concepts and Documentation
2. Qualification Studies: Facilities, Utilities & Equipment
3. Regulatory Trends
4. Supplier Certification
5. Risk Management
6. Cleaning Validation
7. Statistical and Validation Matrix Approach Topics


After completing the General Concepts module, applicants must complete a specialized module of their choice, and its associated examination, for certification as a validation specialist. Once an individual has completed all eight modules within the course, and their associated exams, he or she will receive a Validation Professional certificate.

The University will issue certification for a three-year period, and those receiving certification will need to keep up to date on new developments within the field, and build competency through continuing education. Every three years, certified professionals will have to take courses adding up to 14 “Recertification Units” in order to be recertified.


COMPUTER VALIDATION

One of the specialized programs within the Validation Academy will cover computer validation, where there is a great need for certification today. In the past, IT professionals received training while in college on the formal methodologies used to test software. Later, some of these techniques found their way into a field now officially known as "Computer Systems Validation" (CSV). As technology replaced manual processes, the pharmaceutical industry realized that it needed to take a different approach to demonstrating regulatory compliance, and the formal title "CSV Professional" was born.

Today, most, if not all, manufacturing, testing and business systems are controlled by computerized equipment. These controlling systems manage controlled pharmaceutical processes or functions.

Validation is required for controlling systems that contain records that are "created, modified, maintained, archived, retrieved, or transmitted, under any records requirements set forth in FDA regulations." This requirement created the need for a professional who could establish evidence and associated documentation to prove that a computerized system:
  • Has been developed according to quality software engineering principles;

  • Provides the functional capability required by its users and the regulations, and that it will continue to do so over time.
For example, a LIMS (Laboratory Information Management System) is a complex tool used by laboratories within the pharmaceutical, biotechnology and medical device industries. A CSV Professional using the life-cycle approach starts by asking:
  • What do we want the LIMS system to do?

  • Are we using the system to receive product?

  • Are we connecting the system to other analytical equipment?

  • What reports or certificates of analysis will the system generate?
By developing User Requirements, the CSV Professional provides the company the information needed to understand and specify the system that best suits the operation, removing emotion from the decision-making process.

Software testing also requires someone capable of understanding the business, the application and computer validation. Testing (as defined by IEEE and adopted by FDA) is the process of analyzing a software item to detect the differences between existing and required conditions, i.e. bugs, and to evaluate the features of the software.

Today’s computer systems control the precise dispensing of a liquid ingredient, the air flow to a sterile area, the administration of samples in a lab and the highly sophisticated supply management handled by an ERP system. An undetected malfunction in a critical system could lead to loss of product, compliance issues, poor quality and damaging consequences to the business.

The CSV Professional should be proficient enough to create a Test Plan that captures the strategy, requirements, testing environment, acceptance criteria, responsibilities and schedule. The individual should have the competencies to assess testing performed at the software development stage. These tests should be traced back to user requirements and specifications. A few questions to ask are: Has the logic flow of the software been tested? Has the software been challenged for dead code? Does the software coding follow a structured approach? Did the developers perform source-code testing (white box) or functionality testing of the executable run code (black box)?

A critical mistake is to assume that computer system validation is a one-time event, when in reality it is an ongoing life cycle. A certification program has the benefit of providing a continuous flow of up-to-date information covering best practices in computer system validation. As technology evolves, we can ensure a workforce that keeps pace with advances.

System Validation Requirements

Kevin Lloyd is vice president, chief technology officer of State College, PA-based Centre Analytical Laboratories.

In 1997 the U.S. Food and Drug Administration, in response to requests from industry, issued a regulation specifying criteria for the acceptance of electronic records and electronic signatures as the legal equivalent of manual, paper-based systems. The regulation is codified in the Code of Federal Regulations as 21 CFR Part 11.

21 CFR Part 11, also known simply as Part 11, offers many advantages to industry, including a shortened period for agency review, nearly instantaneous reconstruction of studies, and enhanced confidentiality through limited authorized system access.

In order for an organization to reap the benefits of Part 11, there are many requirements that must be met. Due to the wide scope of Part 11, this article will focus only on the validation requirement; future articles will cover the full scope of the regulation.

For an organization to be compliant with Part 11, a number of requirements must be met. See Table 1 (in the print version) for a list of these requirements.

According to information posted on the FDA's web site, validation is the establishment of documented evidence that provides a high degree of assurance that a specific process will consistently yield a product meeting its predetermined specifications and quality attributes. Implied by this statement: "If you didn't document it, you didn't do it."

At some point in time, the FDA will inspect every company operating within an FDA-regulated environment. Although the FDA has made it clear that inspections will not be conducted solely for Part 11 compliance, it is well within its purview to inspect for Part 11 compliance within the context of a general audit.

A review of www.fda.gov reveals some common FDA-483 observations recorded at previously inspected sites. These citations pertain to absent or incomplete records and/or practices. From the 483s, the following assumptions can be made regarding what the FDA may be looking for when it audits a facility:

• High-level hardware/software documentation, including diagrams and narratives detailing all computer programs and their relationships to each other

• Comprehensive inventories, including operating systems software, terminal emulators and client PC configurations

• Change control and error reporting SOPs

• IT personnel training records

• Software versions and installation dates

• Error reporting logs

• Evidence of proper handling of raw data files and metadata

• Archiving/backup procedures and responsibilities

• Hardware maintenance records

• Software requirements/design/testing documents

• Problem tracking

• Environmental monitoring specifications

• Disaster recovery procedures

It is clear from these requirements that a comprehensive, systematic plan is necessary to address the general expectations listed above, the Part 11 regulations and any applicable predicate rules. This plan can be divided into four phases.


Phase 1: Management Buy-In
Management buy-in is essential to any significant project in any company. The role of management in the validation program is to foster company-wide support for compliance, to make personnel available for appropriate responsibilities and to approve the expenditures necessary for implementation. It is essential to the success of a validation project not to underestimate the commitment required. Our 80-person company has devoted two full-time equivalents (1.5 man-years) to this effort since Jan. 1, 1999. The project also required the creation of a new position, the computer software validation specialist (CSVS), whose full-time job is to ensure Part 11 compliance.

It is also essential to identify an internal “champion” to ensure the success of the computer validation effort. This person should be management-level within the company and have the authority to make key decisions and commit resources to the project. This person should also have the authority to set and enforce policy.

Another requirement for a successful validation is input from disparate functional areas of the company, usually accomplished by the formation of a validation committee. Within this committee, all affected departments must be represented, including IT and quality assurance. All participants must be clearly identified and held accountable to the objectives of the group.

Finally, hire a consultant! This may be costly; however, the project is large, complex and dynamic. It is my opinion that the best way to get started is with the help of an experienced consultant. Our project was nearly 100% dependent on external help at the beginning. However, we have gained experience and confidence and expect to be completely self-sufficient by mid-2001.


Phase 2: Develop SOPs To Address Operation and Maintenance of the Data Servers and Network
Even before the advent of Part 11, most companies understood the need for reliable information backup strategies, the value of a disaster recovery plan and the need for system security. It is the first task of the validation committee to identify all SOPs currently in place that relate to a company’s computer systems. Specifically, the following areas must be addressed:

• Computer System Network Backup and Recovery

• Computer System Data Archive

• Computer System Disaster Recovery

• Computer System Security

• Computer System User Administration

• Computer System Server/Network Monitoring

• Computer System Media Lifecycle

• Computer System Maintenance

As stated, some of these SOPs may already exist. If so, they should be reviewed by the committee and re-evaluated as to their relevance to current and expected practices. If any SOP is out of date or absent, it must be revised or retired and replaced. Computer system SOPs must not only be written, they must also be followed. Regulatory compliance is not satisfied by a well-written SOP; it is the job of the computer software validation specialist to ensure that day-to-day activities are consistent with the SOPs once they are implemented.


Phase 3: Validation System Development
At this point, a validation program must be formally established through a series of SOPs. It would be unusual for pre-existing SOPs to adequately address all aspects of validation system development, although some current SOPs (change control, inventories, etc.) may already be in use. A consultant can be a valuable resource for this phase, providing sample industry standard validation practices that can serve as baseline documents. It is then the task of the validation committee to modify these baseline procedures into detailed practices and procedures that are functional for their specific company. The following SOPs must be written for a successful validation system.

• Computer System Validation Policy

• Computer System Validation Program

• Computer System Change Control Policy

• Development of Computer System Requirements Documentation

• Development of Computer System Design Documentation

• Development of Computer System Test Documentation

• Computer System Configuration Management Plan

• Computer System Software and Hardware Inventory

• Computer System Environments

• Computer System Vendor Evaluation


Phase 4: Conduct a “Gap” Analysis
Once all the policy statements and SOPs are in place, it is necessary to conduct a “gap” analysis. The purpose of this exercise is to represent visually all system validation requirements. Draw a grid with the developed requirements along the ordinate and the individual computer systems along the abscissa. Physically test each system according to the requirements. Any computer system that does not meet one or more of the requirements of the validation program must be considered a “gap.” Gaps require remediation. Compliance only exists when there are no gaps.
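A minimal sketch of that grid, with hypothetical requirements and systems, might look like the following; any False cell is a gap requiring remediation.

# Sketch: gap analysis as a requirements-by-systems grid.
requirements = ["Backup SOP", "Change control", "Requirements doc", "Test doc"]
systems = {
    "LIMS": {"Backup SOP": True, "Change control": True,
             "Requirements doc": True, "Test doc": False},
    "CDS":  {"Backup SOP": True, "Change control": False,
             "Requirements doc": False, "Test doc": False},
}

gaps = [(name, req) for name, results in systems.items()
        for req in requirements if not results[req]]

for name, req in gaps:
    print(f"GAP: {name} does not satisfy '{req}'")
print("Compliant" if not gaps else f"{len(gaps)} gap(s) require remediation")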

Before we look at specific required SOP elements, it is necessary to define the software life cycle and categories of software. These definitions are fundamental to the validation effort.


Software Life Cycle
The software life cycle can be depicted graphically as a waterfall model. (There are other acceptable models, but this one, in my opinion, is the simplest and most effective.) The software life cycle is based on the concept that software moves through a series of steps: it is born, it is tested against specs, it is maintained and it is finally retired. The waterfall model depicts this as a series of successive events. At every step, there is documentation to verify compliance.
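In code, the waterfall idea amounts to an ordered series of phases, each gated by the documentation that verifies compliance; the phase names follow this article, while the document mapping is illustrative.

# Sketch: the software life cycle as ordered phases with documentation gates.
from enum import Enum

class Phase(Enum):
    REQUIREMENTS = "approved requirements document"
    DESIGN = "approved design document"
    IMPLEMENTATION = "configuration and inventory records"
    TESTING = "executed, reviewed test cases"
    OPERATION = "change-control and maintenance records"
    RETIREMENT = "data-migration and decommissioning audit trail"

for step, phase in enumerate(Phase, start=1):
    print(f"{step}. {phase.name.title()}: requires {phase.value}")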


Categories of Software
According to information presented at the Good Automated Manufacturing Practices (GAMP) ’96 conference, there are five categories of software. Each category has its own validation requirements. Table 2 (in the print version) provides an example of each category.


Computer Validation Policy
The computer validation policy is a high-level SOP that spells out in very general terms the goals of the validation program. An analogy is the corporate mission statement. Most companies have a statement that identifies broad goals and a level of commitment required of each employee of the company. The computer validation policy is similar except that its scope is limited to computer validation.

The following is an example of a computer validation policy for an analytical testing laboratory. It is important to note that the system life cycle features prominently in this policy. This is a foundation concept, around which the rest of the validation program is built.

“To support our goal of establishing ourselves as a premier supplier of analytical laboratory services, XXX shall develop and maintain standards and procedures whereby all computer systems in the company shall be implemented and maintained in a validated state according to FDA computer systems validation guidelines. It shall be the policy of XXX that validation of computer systems used to generate, manipulate, store or transmit data related to all regulated products or services shall follow a System Life Cycle (SLC) approach. This approach shall include documentation of computer system requirements and design specifications, thorough and documented testing of the computer system for compliance, and documented procedures to ensure consistent operation and maintenance of the system in a validated state until it is retired.”

The computer system validation program is another core SOP. This document also has a high-level mission statement. An example is given below:

“At its most basic level, validation is the demonstration of control through documented evidence during development and routine operation of a computer system. This control provides a high degree of assurance that a computer system will consistently yield a product or result meeting its predetermined specification and quality attributes. The process by which computer systems are brought into a validated state and then maintained in that validated state is the facilities validation program.”

Following this general statement, the computer system validation program SOP should detail specific actions to be taken to bring the facility into compliance. The first step, the procedure development phase, sets in stone all the elements necessary for the validation program. There are two distinct types of supporting SOPs required: (1) procedures for the operation and maintenance of the data servers and network, and (2) computer validation procedures. Development can be a time- and resource-consuming task, as it typically requires the efforts of the validation committee as well as help from an external consultant, over multiple iterations of each SOP, until the SOPs reflect the needs of each department.

Once policy SOPs have been developed, it is necessary to conduct a system inventory of all new and legacy systems. Assign a unique number to identify each computer system. Inventory each system and enter the records into a database. All information about the system should be recorded, including hardware configuration such as hard drive size, CPU clock speed and amount of random access memory, as well as a detailed inventory of the software on the system, including operating system version and build number. All new system records should also include the installation dates of each component.
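A minimal sketch of such an inventory database, using SQLite from the Python standard library (the field names and the sample record are invented):

# Sketch: a system-inventory table keyed by a unique system number.
import sqlite3

conn = sqlite3.connect("system_inventory.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS systems (
        system_id    TEXT PRIMARY KEY,  -- unique identifying number
        name         TEXT,
        cpu_mhz      INTEGER,
        ram_mb       INTEGER,
        disk_gb      REAL,
        os_version   TEXT,              -- including build number
        installed_on TEXT               -- installation date
    )""")
conn.execute(
    "INSERT OR REPLACE INTO systems VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("SYS-0042", "LIMS server", 800, 512, 40.0,
     "Windows NT 4.0 SP6, build 1381", "1999-06-15"),
)
conn.commit()
for row in conn.execute("SELECT system_id, name, os_version FROM systems"):
    print(row)
conn.close()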

Once the inventory is complete, a gap analysis can be conducted to identify discrepancies between requirements (the validation program) and what has actually been done for each computer system. Once the gap analysis is completed, the remediation can begin, based upon the remaining elements in the validation program.

Requirements Phase: A requirements definition must be written for each system to identify specific operating and use requirements for the computer system. Requirements are documented and approved by management. The level of detail should be sufficient to support the design phase, commensurate with the GAMP ’96 complexity model. In other words, once the requirements are defined, they will be used as the basis of the design phase. The testing phase will assure that all requirements elements have been accounted for in the design phase.

During the requirements phase, stakeholders from each department that may ultimately use the system meet to discuss features of importance to their respective groups. For example, consider a custom temperature-logging application: QA may be interested in the calibration features of the application; operations may be interested in the ease of use while management may be interested in the initial cost and cost of operation. Once again, it is imperative that all groups are represented and all requirements are defined in this phase.

Design Phase: The design phase follows the requirements phase. The purpose is to assure that each element defined in the requirements phase is addressed. Level of detail is commensurate with the GAMP ’96 complexity model, which states, for example, that there is a difference between commercial off-the-shelf (COTS) software and a fully customized system: the former requires much less validation than the latter.

Implementation Phase: During the implementation phase all elements of the system, as defined in the design phase, are brought together. Elements may include COTS or custom code, or both. The product is a fully functioning piece of software that meets the original requirements.

Testing Phase:
During this phase, the computer system is tested against specifications in the requirements definition phase. Tests are structured to provide traceability to both design and requirements documentation. Testing has three elements:

• Installation Testing – e.g., environment, power, plumbing

• Operational Testing – e.g., each individual function

• Performance Testing – e.g., maximum users, maximum data flow

Once testing is complete (successful) and the system is put into production, it enters a new phase: routine operation and maintenance. During this phase, the IT department must provide support and consultation to end-users. A possible outcome of this interaction may be maintenance. If so, all maintenance must be performed in a very controlled manner, under the conditions of the change control SOP. Any changes to the system inventory would also be updated at that time.

Finally, there is a decommissioning phase. At some point, it becomes easier to replace rather than maintain the current system. Starting over can be as simple as a new version/upgrade of currently used software or it can be as radical as replacement of the current system with that of a competing vendor. In any case, all data is archived and/or migrated to the new system. The old system is then removed from operation and a full audit trail of the dates and reasons for decommissioning is kept.

Change Control: The final SOP in the validation program defines how changes to a validated computer system are managed and documented. Changes are tested on a non-production system before they are implemented on the target system. The computer systems validation specialist must evaluate the impact of the change on the validated system. If it is a significant change, revalidation may be required. Minor changes may require only documentation.

It is my opinion that a custom database application is the best way to implement change control. This system would include the following steps, each designed to demonstrate authorization and provide a documentation trail (a minimal sketch in code follows the list):

• Initiation – Anyone within the company can identify, recommend or initiate a possible change

• Acknowledgement – IT acknowledges receipt of the initiation (by e-mail, for convenience)

• Pre-approval – IT reviews the request, evaluates initial feasibility

• Implementation – IT executes the change

• Resolution – IT documents resolution of the problem/ change

• Testing – Demonstration of compliance and documentation of results

• Final approval – Acknowledgement of the action/resolution by the initiator, network administrator and the computer software validation specialist

• Update – Manual entry to system inventories reflecting the changes

• Audit – Performed by the computer software validation specialist
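A minimal sketch of such an application's core, with the steps enforced in order so that the history itself becomes the documentation trail (identifiers and actors are invented):

# Sketch: a change-control record that only advances through authorized steps.
STEPS = ["Initiation", "Acknowledgement", "Pre-approval", "Implementation",
         "Resolution", "Testing", "Final approval", "Update", "Audit"]

class ChangeRequest:
    def __init__(self, request_id: str, description: str):
        self.request_id = request_id
        self.description = description
        self.history = []  # (step, actor) pairs form the audit trail

    def advance(self, step: str, actor: str) -> None:
        if len(self.history) == len(STEPS):
            raise ValueError("Change request already closed")
        expected = STEPS[len(self.history)]
        if step != expected:
            raise ValueError(f"Out of order: expected '{expected}', got '{step}'")
        self.history.append((step, actor))

cr = ChangeRequest("CC-007", "Upgrade LIMS client to v2.5")
cr.advance("Initiation", "initiator")
cr.advance("Acknowledgement", "IT")
cr.advance("Pre-approval", "IT")
for step, actor in cr.history:
    print(f"{cr.request_id}: {step} by {actor}")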

This article has focused on the validation requirement of Part 11, specifically, the high-level procedural SOPs that make up an effective validation program. Future articles will address specific objectives of the requirements, design and testing SOPs, as well as specific requirements of the operation and maintenance of the data servers and network.

Validation Requirements


This article is the third in a series of articles on compliance with the electronic signatures/electronic records rule found in 21CFR11. The previous articles summarized the requirements of Part 11 and specifically looked at the high-level, policy SOPs necessary to satisfy the validation requirement as well as requirements and design documentation.

This article will also focus on the validation requirement of Part 11. Here we will look at the specific elements of test documentation necessary to meet the validation requirements for each computer system.


Test Documentation
Following a System Life Cycle model, test documentation is developed based on the requirements and system functionality as defined in the system requirements documentation. The test documentation is used to verify that each requirement and function defined in the requirements documentation has been implemented in the design of the system.

Test documentation covers both hardware and software. The testing of the system is generally divided into three sections: installation testing verifies that the physical installation of the system meets the defined requirements; operations testing verifies that the system performs the defined system functionality; performance testing verifies that the system will operate at the extreme boundaries of the requirements and functionality, i.e., maximum volume and system stress. Installation testing and some form of operational testing are performed on all systems, while performance testing is used for scalable systems and custom (categories 4 and 5) systems.

First, we must discuss the guidelines for the development of test documentation, defining how to write test cases and what tests are required for each complexity category model. Then, we will define how to document the execution of the test cases.

The guidelines presented in this section for the development of testing documentation shall follow the complexity model presented in Table 1 (p. 54).

Regardless of the complexity of the system, the following sections are required in any testing document.

Header and Footer: The header and footer should follow the format generally used in company SOP documents. For example, headers usually contain standard elements such as the company name, document type ("Requirements Document," for example), and title of the document ("Requirements Documentation for LIMS version 2.5"). The footer usually contains at a minimum the page number ("Page 5 of 35").

Approvals: All testing documents shall have an approvals section where responsible persons will sign. I would suggest the following persons: the author of the document, the computer systems validation specialist and a member from IT.

Introduction: The introduction is a brief statement of what the system is, what the system does and where it is to be located.

Ownership: The ownership section is a statement identifying the owner of the system. The statement of ownership is written in a generic sense, identifying the department that owns the system and the department manager as the responsible person.

Overview: The overview is a detailed description of the system, indicating what the system is expected to do and how it fits into the operation of the department. If the system described in the requirements document is a replacement for an existing system, a brief description of the current system should be included. Enough detail should be included in this overview to give the reader an understanding of the system and what it does without going into discrete functions.


General Instructions to Testers
This section defines the procedures and documentation practices to be followed when executing the test cases. A further explanation of these procedures and practices is presented later in this document.

Discrepancy Reports: This section defines the procedures to be followed when the actual results of the test do not match the expected results.

Signature Log: This section is used to identify all personnel participating in the execution of the test cases. The person’s printed name, full signature and initials are recorded at the beginning of the execution of the test cases.

References: This section identifies the appropriate reference documents pertinent to the execution of the test cases. These documents shall include the SOP, the appropriate requirements documentation, and the appropriate design documentation.

Prerequisite Documentation: This section lists the validation documentation such as the requirements documentation and the design documentation that is to be in place before the execution of the test cases. The first test case shall be verification that these prerequisites have been met.

Test Equipment Log: This section is a log of all calibrated test and measuring equipment used during the execution of the test cases.

Test Cases: This section presents the test cases used to verify that the requirements and functions defined in the requirements documentation have been met. Each test case shall test one requirement or function of the system. In some cases, several closely related requirements or functions might be verified in one test case. (One way to represent such a test case as a data structure is sketched after this list.) Test cases shall be written in a landscape page setup using a table format and include the following elements:

• Objective – This is a brief statement indicating what requirement, function or module of the system the test case is intended to verify.

• System Prerequisite – This section describes any previously inputted data or other system status that must be in place to properly execute the test case. For example, when testing system security, users of various security levels may need to be in the system in order to test security levels at login.

• Input Specifications – This section defines any specific input data required to execute the test other than keystroke entries. This may include instrument test data files, barcode files, etc. A specific data file may be identified or the file identified may be generic.

• Output Specifications – This section defines the expected output of the test case other than output to the monitor. The output may be identified as reports, data files, etc.

• Special Procedural Requirements – This section details any special considerations that may be required to successfully execute the test case.

• Test Procedure – The test procedure is set up in a table format with the following column headings:


Procedural Steps – This is a systematic series of instructions to be followed in the execution of the test. These steps should be sufficiently detailed as to allow the test to be duplicated by any qualified personnel without changing the outcome of the test.

Expected Result – For each procedural step, the expected outcome of that step should be defined. The defined outcome should be detailed enough to allow an unequivocal determination of the pass/fail status of the step.

Actual Result – This column is to be left blank and completed by the person executing the test when the step is executed. The actual result of the step is recorded at the time the test is executed.

Pass/Fail – This column is used to record the Pass/Fail status of the test by comparing the actual result to the expected result.

Tester Initials and Date – Each step is initialed and dated by the tester at the time it is executed.

• Comments – This section is completed following execution of the test case and is used to record any discrepancies or details not captured elsewhere in the test script.

• Test Conclusion – This is an indication of the Pass/Fail status of the test case.

• Tester’s Signature – This section records the signature of the person executing the test case and the date.

• Verification Signature – This section records the signature of the person verifying the test results and the date.
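To make the elements above concrete, here is a minimal sketch, in Python, of a test case represented as a data structure. The class and field names are illustrative only; they are not prescribed by Part 11 or by any SOP, and a real test document would capture the same information on paper or in a validated tool.

    # Illustrative representation of a test case and its steps.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class TestStep:
        procedure: str            # instruction to the tester
        expected_result: str      # pass/fail criterion for the step
        actual_result: str = ""   # recorded at execution time
        passed: Optional[bool] = None
        tester_initials: str = ""
        date: str = ""            # date of execution, e.g. "2002-03-15"

    @dataclass
    class TestCase:
        case_id: str
        objective: str
        system_prerequisites: List[str] = field(default_factory=list)
        input_specifications: List[str] = field(default_factory=list)
        output_specifications: List[str] = field(default_factory=list)
        special_requirements: List[str] = field(default_factory=list)
        steps: List[TestStep] = field(default_factory=list)
        comments: str = ""

        def conclusion(self) -> str:
            """A single failed step forces failure of the whole test case."""
            if any(s.passed is None for s in self.steps):
                return "INCOMPLETE"
            return "PASS" if all(s.passed for s in self.steps) else "FAIL"

    tc = TestCase(case_id="TC-01", objective="Verify login requires ID and password")
    tc.steps.append(TestStep(procedure="Attempt login with no password",
                             expected_result="Access denied",
                             actual_result="Access denied",
                             passed=True, tester_initials="JD", date="2002-03-15"))
    print(tc.conclusion())  # PASS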


Requirements Traceability
Each requirement and function defined or described in the requirements documentation shall be reflected by one or more test cases. Following completion of the test documentation, the requirements documentation shall be reviewed to ensure that each requirement is reflected in the test documentation. The section number of the test case that fulfills the requirement shall be recorded in the second cell of the table in the right margin of the page. This provides a cross-reference between the requirement and the test case and ensures that the requirements are being completely verified.
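As an illustration of the traceability review described above, the following Python sketch checks that every requirement is covered by at least one test case. The requirement and test case IDs are hypothetical.

    # Illustrative traceability review: every requirement ID must be
    # referenced by at least one test case.
    requirements = {"REQ-001": "User login requires ID and password",
                    "REQ-002": "Audit trail records operator actions"}

    # Map of test case ID -> list of requirement IDs it verifies
    test_coverage = {"TC-01": ["REQ-001"], "TC-02": ["REQ-002"]}

    covered = {req for reqs in test_coverage.values() for req in reqs}
    untested = set(requirements) - covered
    if untested:
        print("Requirements lacking test cases:", sorted(untested))
    else:
        print("All requirements are traced to at least one test case.")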

As requirements of the system change, test cases that no longer have an active requirement shall be voided, not deleted. To void a test case, delete the test but not the section number, then enter "VOID" in place of the text. This prevents sections from being renumbered after a deletion and invalidating the references in the requirements document. This also eliminates the potential for confusion caused by re-assigning a section number previously used for an unrelated design element.


Considerations for Developing Test Cases
Installation Testing
Hardware Installation Testing – Hardware installation test documentation provides a high degree of assurance that the hardware has been installed according to the vendor’s specifications and configured in accordance with the requirements and design documentation for the system. This may include access space, power supply/UPS, network communications, interconnecting wiring/cabling, ambient temperature, ambient relative humidity and peripheral systems connections.

Software Installation Testing – Software installation test documentation also provides a high degree of assurance that the software has been installed according to the vendor’s specifications and configured in accordance with the requirements and design documentation for the system. Hardware installation testing must be performed before installing and testing the software. Software installation testing applies to all software components that are a part of the system, including operating system software, network and communications software, OEM software and custom software. It should also include virus checking, verification of required drive space, RAM space and drive configuration, software version numbers, security access for installation and operation, directory configuration, path modifications and any system parameter configuration.


Operational Testing
Operational test documentation provides documented evidence that the system fulfills the requirements and functions as defined in the requirements documentation. Each function described in the requirements documentation shall be tested independently.

When a function is defined with a specified range of inputs or a range for a data field, that function shall be tested at and just beyond each end of the range. For example, if the range of acceptable inputs is 1 to 5, the test case shall challenge the function using inputs of 0, 1, 5 and 6, with 0 and 6 expected to fail.

Test cases shall also verify that illogical inputs are not accepted. For example, test cases shall challenge a date function with non-date inputs, or a numeric field with non-numeric characters.
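The following Python sketch illustrates both kinds of challenge for a hypothetical field that accepts integers 1 through 5: the boundary values 1 and 5 are expected to pass, the out-of-range values 0 and 6 are expected to be rejected, and a non-numeric input is expected to be rejected as illogical. The accepts() function is merely a stand-in for the system under test.

    def accepts(value) -> bool:
        """Stand-in for the system's input validation under test."""
        return isinstance(value, int) and 1 <= value <= 5

    challenges = [
        (0, False),   # just below range: expected to be rejected
        (1, True),    # lower boundary: expected to be accepted
        (5, True),    # upper boundary: expected to be accepted
        (6, False),   # just above range: expected to be rejected
        ("x", False), # illogical input: non-numeric, expected to be rejected
    ]

    for value, expected in challenges:
        actual = accepts(value)
        status = "PASS" if actual == expected else "FAIL"
        print(f"input={value!r} expected={expected} actual={actual} -> {status}")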

For COTS applications, the vendor will often supply the test cases. These may be used in lieu of test cases developed in-house. There shall be a documented review of the vendor-supplied test cases against the functional documentation to ensure that they are sufficient. For applications where vendor-supplied test cases are used, in-house test cases for the various functions of the application shall not be required, provided there is documented evidence that the vendor has followed a validation methodology.


Performance Testing
Performance testing ensures that the system will stand up to daily use. Test cases during performance testing shall verify that the system functions properly during times of high user input and high data throughput.

Performance testing is not always applicable. For systems with only a single user, stress on the system is inherently limited. Performance testing is usually executed on complex systems with multiple inputs and outputs as well as network-based systems.


Test Execution
Before the start of testing, all personnel participating in the execution shall enter their names, signatures and initials in the signature log of the test documentation.

Once execution of a test case is started, that test case must be completed before moving to the next test case. The exception is a failed test case: the failure is noted and the next test case is started. Execution of the entire set of test cases does not have to be completed in one sitting, though once testing begins, the system may not be altered. Any failed tests shall be recorded and testing shall continue unless the failure prevents continuation. If testing must be discontinued in order to correct any issues in the system, then all tests must be re-executed.

As test cases are executed, information shall be recorded neatly and legibly in black ink. Sign-off and dating of the test case shall occur on the day that the test case was executed. Mistakes made during execution of the test cases shall be corrected using a single strikethrough, so as not to obscure the original data. Any corrections made in this manner shall be initialed and dated by the person making the correction.

Errors in the test script shall be corrected using a single strikethrough so as not to obscure the original data. Any corrections made in this manner shall be initialed and dated by the person making the correction. Any corrections made to the test case during execution shall be justified in the comments section of the test case.

Completed test forms shall not contain any blank spots where information might be entered. If an item does not apply to the current test, the tester should enter 'N/A' followed by initials and date. Completed tests are reviewed and approved by the Computer Systems Validation Specialist, or designee, who signs and dates the bottom of each approved test page.

Test Failures and Discrepancy Reports
Results of tests that do not match the expected results shall be considered failures. The failure of a single step of the test case shall force the failure of the test case. However, if a step in the test case fails, execution of the test case shall continue unless the failure prevents completion of the test case.

Errors in the test case that would result in test case failure if left alone may be corrected as noted above. This correction must be justified in the comments section of the test case. Steps in the test case where the expected result is in error shall not be considered test failures if corrected. Test failures shall be noted in the test case and documented on a test discrepancy form. The form is then logged to facilitate tracking of errors during system testing.

Upon resolution of the failure, the cause of the failure is examined with respect to the failed test case and any similar test cases. All test cases associated with, or similar to, the resolved failed test case shall be reviewed to determine the extent of re-testing required. This re-testing, commonly referred to as regression testing, shall verify that the resolution of the failed test has not created adverse effects on areas of the system already tested. The analysis of the testing shall be documented to justify the extent of the regression testing.

Regression testing shall be executed using new copies of the original test script to ensure that the requirements of the system are still being met. In some instances, resolution of the failed test requires extensive redevelopment of the system. In these cases, a new test case must be developed. In either case, the failed test shall be annotated to indicate the tests executed to demonstrate resolution. The additional tests shall be attached to the discrepancy report. This provides a paper trail from the failed test case, to the discrepancy report and finally to the repeated test cases.

Test Documentation by Complexity Model

Test documentation varies with complexity category. Complexity categories are defined in Table 1.

Category 1 – Operating Systems: Category 1 systems are local operating systems and network systems. These systems provide the backbone needed by all other systems to operate. Due to widespread use of these applications, they do not need to be tested directly. As these systems are the base of other applications, they are indirectly tested during the testing of other applications.

Category 2 – Smart Instruments: Category 2 systems shall be tested following the requirements and functions listed in the requirements documentation. All sections of the testing document defined above shall be included in the test document for Category 2 systems. Installation and Operational Testing shall be executed. The complexity of the test cases should be commensurate with the complexity of the system. Performance Testing is not required.

Category 3 – COTS Applications: All functions and operations embedded by the manufacturer of the COTS do not need to be tested. Only those functions and operations used by the applications developed in-house require testing.

Category 3 systems shall be tested following the requirements and functions listed in the requirements documentation. All sections of the testing document defined above shall be included in the test document for Category 3 systems. Installation and Operational Testing shall be executed. The complexity of the test cases should be commensurate with the complexity of the system. Performance Testing is not required.

Category 4 – Configurable Software Systems: Category 4 systems shall be tested following the requirements and functions listed in the requirements documentation. All sections of the testing document defined above shall be included in the test document for Category 4 systems. Installation and Operational Testing shall be executed. The complexity of the test cases should be commensurate with the complexity of the system. Performance Testing should be considered if the system shares data on a network.

Category 5 – Fully Custom Systems: Category 5 systems shall be tested following the requirements and functions listed in the requirements documentation. All sections of the testing document defined above shall be included in the test document for Category 5 systems. Installation, Operational Testing and Performance Testing shall be executed. The complexity of the test cases should be commensurate with the complexity of the system.


Following a System Life Cycle model, test documentation verifies that all requirements have been properly met in the design phase of software development. Future articles will follow the SLC model into the next phase of software validation: change control.

Electronic Records & Electronic Signatures

When 21 CFR Part 11 was released on March 20, 1997, it was given an effective date of August 20, 1997. By any measure, Part 11 was a surprise to the healthcare manufacturing industry. Many of us had waited for the Agency's approval to use electronic signatures, and the concerns of industry proponents about electronic signatures centered on the belief that the Agency would allow for their use only after the incorporation of various complicated security biometrics. We expected that the provisions for electronic signatures would potentially include requirements for retinal scans, thumb prints, voice identification, etc.

When Part 11 was released, the security control requirements for electronic signatures were fairly straightforward and benign. The requirements for electronic signature manifestations and the use of a dual user identification and password were very clear and reasonable. But the section of Part 11 that dealt with electronic records was anything but benign. That section required predicate-rule-mandated records that are created and maintained electronically to comply with the Part 11 requirements, i.e., audit trail, system security, system self-check, etc. There was no provision for grandfathering legacy systems into compliance with Part 11, a decision affecting literally thousands of legacy systems in the regulated industry. Furthermore, there was no provision for a grace period.

Part 11 was not widely reviewed or discussed prior to its effective date, and many quality and regulatory professionals stumbled onto the legacy-system impact of Part 11 only after they began to read and study the rule in anticipation of pursuing the application of electronic signatures. In the last two years, the industry has begun to understand more fully the implications and impact of the final rule on its computerized systems. The rule does not create any new record or signature requirements. The use of electronic records, as well as their submission to FDA, is voluntary. The agency can use regulatory discretion, and compliance expectations may be realized gradually.

The realities of Part 11 include the following facts: We are now more than four-and-a-half years past the effective date, and Part 11 is not going to go away. Our booming e-commerce industry will only strengthen the need for controls of electronic records and signatures. The FDA provided for only a five-month implementation period, so the industry has been trying to work its way out of a state of noncompliance ever since. We should be past grousing and complaining about Part 11 and well into trying to understand it and implementing remediation plans.

Definition and Field
An electronic record is defined as any combination of text, graphics, data, audio, pictorial or other information representation in digital form that is created, modified, maintained, archived, retrieved or distributed by a computer system. The rule applies to records required by any other FDA regulation and to records submitted to FDA under the Food, Drug & Cosmetic Act or the Public Health Service Act, even if not required by FDA. The goal of the regulation is to provide a framework and set of rules for developing sound business practices to ensure the trustworthiness and reliability of electronic data, documents and signatures that are transmitted to FDA. It requires that industry demonstrate its ability to develop and maintain reliable and secure computer systems and sound business processes around these systems. Specifically, the rule applies to data captured in a computer system (electronic records) and signatures or authorizations generated by a computer (electronic signatures), as well as the security controls and business processes associated with them.

Electronic Records Provisions
Closed Systems: A closed system is defined as an environment in which system access is controlled by persons who are responsible for the content of electronic records that are on the system. Controls for closed systems:

Establish minimum controls for all systems;
Are designed to assure authenticity, integrity and confidentiality (as appropriate);
Are designed to ensure that the signer cannot readily repudiate the signature as genuine;
Validate the systems to ensure accuracy, reliability, consistent intended performance and the ability to discern invalid or altered records;
Maintain the ability to generate accurate and complete records in human readable and electronic form so that FDA may inspect, review and copy the records;
Protect records so that they are readily retrievable throughout the retention period;
Limit system access to authorized individuals;
Use secure, computer-generated, time-stamped audit trails for operator entries and actions;
Do not obscure previous entries;
Use operational system checks to enforce sequencing steps;
Use authority checks to ensure that only authorized individuals can access and use the system;
Use device checks to determine the validity of data input or operational instructions;
Ensure appropriate training of users, developers and maintenance staff;
Establish and follow written policies that deter falsification of records and signatures;
Establish adequate controls over the distribution of, access to, and use of system documentation;
Establish adequate controls over revisions and changes and maintain audit trails of modifications to system documents.

Open Systems: An open system is defined as an environment in which system access is not controlled by persons who are responsible for the content of electronic records that are on the system. Controls for open systems:
Ensure authenticity, integrity and confidentiality (as appropriate) of records from point of creation to point of receipt;
Employ all of the controls required for closed systems;
Implement document encryption;
Implement digital signatures.

Hybrid System: A hybrid system is defined as a system for which handwritten signatures executed on paper and paper-based records (if applicable) are maintained in addition to electronic records. The controls for hybrid systems are a combination of the above two systems.

Signature/record linking:
Applies to electronic and handwritten signatures.
Must ensure that the signatures cannot be excised, copied or otherwise transferred to falsify an electronic record.

Signature Manifestations:
Signed electronic records must include:
- Printed name of the signer
- The date and time of the signature
- The meaning of the signature (e.g., review, approval, authorship)
Electronic signatures are subject to the same controls as electronic records.
The information required must be included in any human-readable copy of the record, as in the sketch below.
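A minimal sketch of the manifestation items, assuming a simple dictionary representation; the function and field names are illustrative, not mandated by the rule.

    # Illustrative signature block carrying the three required items:
    # printed name, date/time of signing, and meaning of the signature.
    from datetime import datetime, timezone

    def manifest_signature(printed_name: str, meaning: str) -> dict:
        """Return the human-readable signature block for a record."""
        return {
            "printed_name": printed_name,
            "signed_at": datetime.now(timezone.utc).isoformat(),
            "meaning": meaning,  # e.g. "review", "approval", "authorship"
        }

    record_signature = manifest_signature("Jane Doe", "approval")
    print(record_signature)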

Electronic Signatures Provisions
Part 11 defines specific requirements for the design, use and implementation of computer systems that create, modify, maintain, archive and retrieve electronic records with or without electronic signatures. These requirements can be achieved either by technical or procedural implementation. Some requirements may include both a technical solution in the design of the system and a procedural process. Procedural processes may also be used as interim solutions while technical solutions are being developed and implemented. The electronic signature must be unique to an individual and not reassigned, and the identity of the individual must be verified by the organization. The organization must also certify to FDA that its electronic signatures are intended to be the legally binding equivalent of handwritten signatures. The FDA example is given below:

"This is to certify that {Company X} intends that all electronic signatures executed by our employees, agents or representatives, located anywhere in the world, are the legally binding equivalent of traditional handwritten signatures."

Electronic signature components and controls (a sketch of the session signing logic follows this list):
Non-biometric signatures must consist of two distinct components (e.g., an identification code and a password).
In one continuous session, the first signing must use all components; subsequent signings may use just one component.
In a non-continuous session, each signing must use all components of the electronic signature.
Must be used only by their genuine owner.
Administered and executed to ensure that use by others is precluded and that any attempted use would require collaboration by two or more individuals.
Biometric signatures are a method of verifying identity based on measurement of an individual's physical feature(s) or repeatable action(s) where the features and/or actions are both unique to that individual and measurable.
Examples include voice prints, handprints and retinal scans.
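Here is a minimal sketch of the component-count rule for non-biometric signatures. The SigningSession class is hypothetical, and verification of the components against stored credentials is deliberately omitted; the sketch only shows when one component suffices and when both are required.

    from typing import Optional

    class SigningSession:
        """Hypothetical tracker for one continuous signing session."""
        def __init__(self, user_id: str):
            self.user_id = user_id
            self.first_signing_done = False

        def sign(self, user_id: str, password: Optional[str]) -> bool:
            # Credential verification against stored values is omitted;
            # this sketch shows only the component-count rule.
            if user_id != self.user_id:
                return False  # components must belong to the genuine owner
            if not self.first_signing_done:
                if password is None:
                    return False  # first signing requires both components
                self.first_signing_done = True
                return True
            return True  # later signings in the same continuous session

    session = SigningSession("jdoe")
    print(session.sign("jdoe", None))      # False: both components needed
    print(session.sign("jdoe", "secret"))  # True: first signing
    print(session.sign("jdoe", None))      # True: continuous session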

Controls for identification codes/passwords (a sketch of one transaction safeguard follows this list):
Ensure no two individuals have the same combination.
Ensure that identification codes and passwords are periodically checked, recalled or revised.
Electronically deauthorize lost, stolen, missing, or compromised tokens, cards and devices.
Subject replacements to rigorous controls.
Conduct initial and periodic tests of tokens and cards for function.
Use transaction safeguards to:
- Prevent unauthorized use of passwords and identification codes.
- Detect and report (in an immediate and urgent manner) attempts at unauthorized use.
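As one illustration of a transaction safeguard, the following sketch counts consecutive failed sign-on attempts and raises an immediate alert past a threshold. The threshold value and the alert mechanism are assumptions, not values mandated by Part 11.

    failed_attempts = {}  # user_id -> consecutive failed attempts
    MAX_ATTEMPTS = 3      # hypothetical policy value, not mandated by Part 11

    def record_login(user_id: str, success: bool) -> None:
        if success:
            failed_attempts[user_id] = 0
            return
        failed_attempts[user_id] = failed_attempts.get(user_id, 0) + 1
        if failed_attempts[user_id] >= MAX_ATTEMPTS:
            # Part 11 calls for immediate, urgent reporting of attempted
            # unauthorized use; in practice this might notify security.
            print(f"ALERT: repeated failed sign-ons for account {user_id}")

    record_login("jdoe", success=False)
    record_login("jdoe", success=False)
    record_login("jdoe", success=False)  # triggers the alert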

Audit Trail
One of the biggest concerns regarding Part 11 compliance is defining when the audit trail begins. Take a pragmatic approach, proceduralize it, adhere to it and be prepared to defend it. Audit trail initiation requirements for data should be different from audit trail initiation requirements for textual materials, such as operating procedures, reports or guidelines. If you are generating, retaining, importing or exporting any electronic data, the audit trail begins from the instant the data hits durable media. This should be recognized as an operational and regulatory imperative. It needs to be absolutely and demonstrably inviolate in this regard. But if the electronic record is textual and subject to review and approval, the audit trail begins upon the approval of the document.
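To illustrate the "from the instant the data hits durable media" position, here is a minimal sketch of an append-only audit trail in which every write to durable media is accompanied by a time-stamped, operator-attributed entry. The file name and record layout are illustrative only.

    import json
    from datetime import datetime, timezone

    AUDIT_LOG = "audit_trail.jsonl"  # hypothetical append-only file

    def write_with_audit(record_id: str, value: str, operator: str) -> None:
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "operator": operator,
            "record_id": record_id,
            "action": "create/modify",
            "new_value": value,  # previous entries are never obscured
        }
        with open(AUDIT_LOG, "a") as log:  # append-only: no rewrites
            log.write(json.dumps(entry) + "\n")

    write_with_audit("SAMPLE-042", "pH = 7.2", operator="jdoe")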

Retaining the pre-approval iterations in the audit trail adds no value. If an operating procedure, for example, is typed into a word processor (whether stored to durable media or not) and subsequently routed either in hard copy or electronically for review and approval, it is not versioned until it is approved by all required approvers. The following procedures are imperative:
The document is not used until it has been fully approved and released into the appropriate documentation system.
The document is not released for use until, in its final altered or amended state, it has all of the required approvals.
The document is maintained via appropriate version control and retention requirements.

With these procedural controls in place, the textual document is not complete and usable until it has been formally approved and released. At this point, the 21 CFR Part 11 required audit trail is applicable. Obviously, the predicate rule drives the need for a document and subsequently the document's approval, versioning and retention requirements. If the predicate rule does not require the retention of the document's draft versions, Part 11 does not apply to draft versions. Even so, during the document's iterative draft stages, I believe it is necessary to fully control the draft versions until the document has been approved for use. Upon approving and version controlling the final version, all electronic draft versions of the document can be deleted. An example of this is as follows:
1. An author writes a procedure/report/guideline/etc., and sends a draft copy to five different reviewers/approvers.
2. Each reviewer/approver makes a change to the draft copy and sends his/her comments back to the original author for incorporation into a new draft version of the document.
3. The author then consolidates the comments and sends the document back to the reviewers/approvers as a new and controlled draft version.
4. The new and controlled draft version is approved by the reviewers/approvers, and the document is released as a controlled final version.
5. After the document has been released as a controlled final version, all draft versions can be deleted.
6. If the released document is subsequently revised, the above process is repeated and only the various final approved and released versions are retained. The current approved version is retained in an active status, and previous approved versions are retained in an archival status.

The draft version document described in Step 3 is controlled and saved only until the final version, described in step 4, is approved and version controlled. After the approval of the final document, any versions or copies of the draft document can be deleted.
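The draft-to-approved lifecycle in steps 1 through 6 can be sketched as a simple state transition, assuming a hypothetical document store; the states and function names are illustrative only.

    # Illustrative lifecycle: drafts are controlled until the final
    # version is approved, after which drafts may be deleted and
    # superseded approved versions move to archival status.
    documents = {}  # doc_id -> list of (version, status)

    def release_final(doc_id: str, version: str) -> None:
        history = documents.setdefault(doc_id, [])
        # Previously approved versions move to archival status (step 6)
        history[:] = [(v, "ARCHIVED" if s == "ACTIVE" else s)
                      for v, s in history]
        # Drafts are deleted once the final version is approved (step 5)
        history[:] = [(v, s) for v, s in history if s != "DRAFT"]
        history.append((version, "ACTIVE"))

    documents["SOP-017"] = [("0.1", "DRAFT"), ("0.2", "DRAFT")]
    release_final("SOP-017", "1.0")
    release_final("SOP-017", "2.0")
    print(documents["SOP-017"])  # [('1.0', 'ARCHIVED'), ('2.0', 'ACTIVE')]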

Agency representatives have differed on the point at which the Part 11 audit trail becomes applicable. The perspectives within the agency have ranged from a very conservative umbrella statement of, "whenever anything is stored to durable media," to the more pragmatic approach previously described for audit trailing textual documents that are not available for use until approved, released, version controlled and retained per predicate rule requirements. With 21 CFR Part 11 requiring an audit trail for human-entered transactions, as opposed to those initiated by machine or computer, and not describing exactly when the audit trail begins, the industry and the FDA must develop a consistent and reasonable approach to resolving this issue.

Compliance Strategy

FDA References
Compliance Policy Guide 7153.17, May 1999
- Nature and extent of deviations
- Effect on product quality and data integrity
- Adequacy and timeliness of corrective actions
- Compliance history (especially data integrity)
Guidance - Computerized Systems in Clinical Trials - 1999

Systems Covered
Inventory all systems
- Proposed
- Current
All proposed systems should comply with Part 11
Determine threshold of risk the company is willing to accept

Plan
Develop plan for compliance of high risk systems with time frames
Demonstrate progress in implementing timetable
Determine what will be done with other systems—support or validate transcription
Document process

SOPs
System setup/installation
Data collection and handling
System maintenance
Data backup, recovery and contingency plans
Security
Change Control

Policies
Systems should clearly identify the electronic version of records as confidential
Any printout of records should be automatically marked as confidential
Establish e-mail and voice mail policies
Inform employees about the legal consequences of certification

Compliance Mission
Many companies have adopted the following Part 11 compliance approach, keeping in mind the following mission statement:

"To develop an action plan for addressing Part 11 requirements in existing systems and to support the preparation and training of business processes and procedures to assure the development, implementation and use of compliant systems in accordance with the FDA regulations."

Compliance Plan
Study and fully understand Part 11
Identify and inventory all of the Part 11-applicable electronic systems
Develop and apply a Part 11 compliance checklist in order to create a Part 11 compliance gap analysis for systems
Develop and apply a systems criticality matrix that can be used to prioritize systems for Part 11 remediation
Develop and execute against a comprehensive Part 11 remediation schedule

In order to determine a remediation path, it is necessary to project accurately the remediation cost of each system. This will include determining whether the most effective course of action is to upgrade the existing system, buy a new system that can be brought into compliance, buy a system that is scheduled to be in compliance, or buy a system that is already in compliance with Part 11.

Interdisciplinary Remediation Planning
When Part 11 remediation plans are being developed, it is essential that Quality Assurance, Regulatory Affairs/Compliance, Operations, and Information Systems personnel are all jointly involved in the planning. The software, equipment and intended use have to be considered at the very outset of planning. Is the record required by a predicate rule? What is the actual application and use of the equipment/software? What is the criticality of the system? What is the extent of the noncompliance? Can the program be brought into compliance? Is a compliant new system available? These questions are best answered from a multidisciplinary perspective.

Legacy Systems
Part 11 remediation is especially frustrating for older systems that have been validated to other standards and have been operating in an otherwise nonproblematic state. Legacy system remediation presents a unique dilemma because spending a significant amount of time and money to update an older system could appear to be of limited value. However, remaining in noncompliance while new and compliant systems are sought is fraught with regulatory peril and can't be taken lightly. It may be very costly to remediate these systems, but the fact remains that Part 11 does not provide for grandfathering legacy systems, and it does allow the industry to use electronic signatures.

Software and System Suppliers
Software and equipment suppliers have begun to understand that Part 11 represents a new set of expectations for their products, and many are trying to respond, but most are not there yet. It has become apparent that "buyer beware" is a term or concept that is very applicable to Part 11 compliance efforts. In a recent review of several well-known systems/software packages that were advertised as "Part 11 compliant," it was evident that some aspects of Part 11 were addressed, but others were not. It is imperative that manufacturers understand the requirements of the final rule and are in a position to ask the right questions of their suppliers.

Laboratory Equipment
The remediation approach of replace or upgrade will need to be looked at on a system-by-system basis or at least a system-type basis. Laboratory equipment will need to be assessed after a gap analysis has determined the level of noncompliance. If an analyzer is not designed to store data to durable media, and it holds the analysis in RAM, prints out the analysis results, and subsequently deletes the results from RAM to make way for the next analysis, it is generally interpreted that Part 11 does not apply. The electronic typewriter concept pertains, with the paper copy becoming your raw data, subject to appropriate predicate rule retention requirements.

If an analyzer stores analysis data to durable media, Part 11 applies. The raw data in this case is the electronic data, and any subsequent hard-copy printout of the data is ancillary. The printout must be demonstrated, as part of the systems validation, to be the same as the electronic raw data; the presence of a paper copy does not remove the Part 11 requirements. Upgrading such an analyzer probably represents the easiest and most direct compliance approach. If the analyzer can't be readily upgraded, the new purchase option exists, but the vast majority of new analyzers themselves are noncompliant.

FDA-regulated industry is just one player in the overall laboratory analyzer market, and demands from the industry to make new analyzers Part 11 compliant can be much like the tail trying to wag the dog. While such demands can eventually meet with positive results, they are more likely to be met with frustration in the short run.

Discuss your options and be creative and innovative in your remediation approach. If your analyzers can't be made Part 11 compliant, get a laboratory information management system (LIMS) or an external data control system that can. Treat your analyzers as second generation for your LIMS, and assure that your LIMS software is Part 11 compliant. The FDA is not prescriptive relative to where the data is retained, which file or database. You are required to validate your system and to be able to demonstrate that your system and its data acquisition, retention and Part 11 controls are solid and repeatable.

Solutions
Information systems professionals, when introduced to Part 11 requirements, have come up with innovative solutions to the remediation quandary. With Part 11 allowing audit trailing to be accomplished via the use of ancillary equipment or different databases, the industry has the opportunity to view entire interrelated and interconnected systems, looking for the most opportune mechanism to fulfill the various Part 11 requirements. Examples of this are the use of Documentum's underlying Oracle database to record time/date transactions, or the use of an NT server's security function to provide the required level of systems security for an application accessed online via that server.

Many commercially available software programs already have system self-checks and alert database administrators to attempted unauthorized entries. Instead of being dismayed by the complexity and all-encompassing nature of 21 CFR Part 11, we need to accept the likelihood that we will probably not find an answer that does it all for every system. We must begin to look opportunistically at the systems, equipment and processes that we already have in place for resolution.

The pharmaceutical industry is actively working to develop plans to address full compliance with Part 11. It has already taken several steps toward adherence to the rule in preparing standards for the development, validation and use of computer systems. The industry has begun to oversee the remediation of business systems, business processes and the development of new business systems used to generate, store and authorize information delivered to the FDA. The rule will also drive and support the use of good business practices around the development and use of computerized systems. Part 11 will remain with us, and organizations that have delayed remediation are falling further behind the compliance power curve. Investigators are trained on Part 11, FDA 483 citations are being issued, and Part 11 violations are being noted in warning letters. Part 11, whether you like it or not, whether you feel it's needed or not, is a released Final Rule in the Code of Federal Regulations governing our industry and must be adhered to.

It is "foolish" to try to wait it out. You will fall further behind your peers and your competition, and you will put your organization at risk. The industry, working with the FDA, must develop a consistent and reasonable approach to resolving the Part 11 issue. Understand the rule, understand your requirements, and by all means understand your opportunities. Keep track of your plan, your actions and accomplishments, your innovations and solutions, and your remediation expenses.

Pharmaceutical Validation Documentation Requirements

Pharmaceutical validation is a critical process that ensures that pharmaceutical products meet the desired quality standards and are safe fo...