Friday, July 30, 2010

Computer system validation – increasing supplier value (page 4)

As this was a batch production plant, the functional specification was structured in line with the S88 model for batch automation.3 In this way, the specifications related directly to the configured elements in the system. The functional specification was supported by design specifications and project standards, where the latter defined the design and implementation principles that would be used on the project. Together, the functional and design specifications were sufficient to configure the system and, perhaps more importantly, provide the documents against which the system could be tested.
The structure of the functional specification is shown in Figure 3. Each document had its own number and revision history so that some parts of the specification could be reviewed and approved before other parts were complete. Basing the structure on the control levels defined within the S88 model kept interaction between the documents to a minimum, making parallel document development practical.

Figure 4
Also, everything about a particular level was defined in a single document. For example, the human interface aspects of control modules were defined, along with other aspects of basic controls, in the basic control definition document. The human interface definition covered the generic aspects of the operator interface, together with alarm management and security access. An extract from the basic control definition document is shown in Figure 4, where a motor symbol is defined.

Table 3
The traceability of requirements illustrated in Figure 2 includes the test specification. The requirements trace process was made easier by generating a test case for each major section of the functional and design specifications so that there was a one-to-one correspondence between the FDS and the test cases. An extract from a test case document is shown in Table 3, where separate sign-off boxes are provided for internal tests and FATs. Where a particular test is not relevant to a group of tests, that box is shaded grey.
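To illustrate the idea (the section titles and field names below are hypothetical, not taken from the project), a minimal Python sketch of such a one-to-one test-case register might look as follows, with separate sign-off fields for the internal test and the FAT and an explicit not-applicable flag standing in for the grey shading:

    # Sketch only: one test case per FDS section, each carrying separate
    # sign-off boxes for the internal test and the FAT.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class SignOff:
        applicable: bool = True          # False = "shaded grey" (not relevant)
        initials: Optional[str] = None   # initials and date, never just a tick
        date: Optional[str] = None

    @dataclass
    class TestCase:
        fds_section: str                 # e.g. "4.2 Basic control definition"
        description: str
        internal_test: SignOff = field(default_factory=SignOff)
        fat: SignOff = field(default_factory=SignOff)

    # Hypothetical FDS sections; real titles would come from the FDS itself.
    fds_sections = [
        ("4.1", "Human interface definition"),
        ("4.2", "Basic control definition"),
        ("4.3", "Equipment module definition"),
    ]

    test_cases = [TestCase(fds_section=f"{num} {title}",
                           description=f"Verify functionality of section {num}")
                  for num, title in fds_sections]

    for tc in test_cases:
        print(tc.fds_section, "-> internal:", tc.internal_test.applicable,
              "FAT:", tc.fat.applicable)

In practice these were paper records, of course; the sketch simply shows the one-to-one structure that makes the later IQ/OQ cross-referencing straightforward.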
Reaping the benefits
Our approach has been to make use of the work normally performed by the supplier to reduce the work that would otherwise be done on site. By simply ensuring that all test records are kept in a form that provides documented evidence that the system satisfies the requirements within the URS, we have added very little work for the supplier and opened up the opportunity for significant savings during the validation process. On one control system for an API manufacturing plant, it was estimated that 11 weeks were removed from the site activities as a result of adopting this approach.
The validation master plan (VMP), IQ, OQ and PQ protocols still need to be produced, but parts of the IQ protocol and most of the OQ protocol can now be little more than lists that reference the test records from the FAT and SAT. It is often said that software does not break; it is already broken! The corollary is that the functionality provided by the software will not change when the system is installed on site unless the environment in which the software application runs changes; a simple justification for not repeating the tests.
The approach described in this article has been used on a number of pharmaceutical control system projects. Most purchasers, including their validation staff, have embraced the concept from the start. One major operating company actually provides document templates to its suppliers so that all the documentation fits seamlessly into its document system. However, some have not had the courage to change their existing practice and so have missed the opportunity to reduce costs and get their products to market earlier.
References
1. S. Haisman and P. So, "Factory Acceptance Testing", Proceedings of the ISPE 2000 Annual Meeting, San Diego, CA, USA (2000).
2. ISPE, GAMP 4: GAMP Guide for Validation of Automated Systems (ISPE, Tampa, FL, USA, 2001).
3. ANSI/ISA-S88.01, Batch Control, Part 1: Models and Terminology (ISA, Research Triangle Park, NC, USA, 1995).
Dr David James is principal engineer at Invensys Systems (UK) Ltd, UK.

Computer system validation – increasing supplier value (page 3)



Figure 2
Also noted in Figure 2 is the situation regarding requirements that are satisfied within product specifications. If the required functionality is provided in a commercial-off-the-shelf (COTS) product then this functionality has been tested by repeated use across many installations and so does not need to be tested as part of the system tests. Only where the supplier project team has configured standard functions do you need to test that the configuration is correct. As a purchaser, however, you cannot absolve yourself of responsibility during the requirements trace process. First, the URS should be structured so it helps produce a trace matrix by
  • limiting each clause to a single requirement
  • giving a unique number to each clause
  • indicating whether the clause is for information or is a requirement to be satisfied under the contract.

Second, you need to be satisfied that the references in the matrix do actually relate to each of the stated requirements. Including the trace matrix within the supplier's scope also encourages discussion on each of the clauses in the URS, and anything that promotes communication between purchaser and supplier is a good thing.
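As an illustration only (the clause numbers and wording are invented), a short Python sketch of a URS clause record that follows these structuring rules, together with the kind of basic check a purchaser or supplier might run before tracing:

    # Sketch: URS clauses with a unique number, a single requirement each,
    # and a flag distinguishing information from contractual requirements.
    from dataclasses import dataclass

    @dataclass
    class UrsClause:
        number: str            # unique clause number, e.g. "3.2.1"
        text: str              # one requirement (or one statement) per clause
        is_requirement: bool   # True = to be satisfied under the contract

    clauses = [
        UrsClause("3.2.1", "All motors shall have run-feedback alarms.", True),
        UrsClause("3.2.2", "The plant operates a three-shift pattern.", False),
    ]

    # Basic structural checks before the trace matrix is built.
    numbers = [c.number for c in clauses]
    duplicates = {n for n in numbers if numbers.count(n) > 1}
    assert not duplicates, f"Duplicate clause numbers: {duplicates}"

    requirements = [c for c in clauses if c.is_requirement]
    print(f"{len(requirements)} requirement clause(s) to be traced to the FDS and tests")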
Regarding supplier training, you must ensure that the test records will meet the needs of documented evidence. The guidance given to supplier staff should include:
  • The generation of test cases for each major section of the FDS, with sign-off points for each individual function rather than for whole sections or pages.
  • The use of initials and date to sign off tests, rather than simple tick marks. This also requires a separate list of names, responsibilities, signatures and initials.
  • The use of black or blue ball-point pen rather than red (which may not photocopy) or pencil.
  • The deletion of text by a single-line strike-through, so that the original text remains legible, with the addition of initials and date.
  • An instruction never to use correction fluid to alter a document.
Because the FATs will be witnessed by your own staff, they can help ensure the propriety of all test records. They will also be signatories to the tests; therefore, you will have evidence of who performed the tests and who witnessed not only their correct execution but also that the desired result was obtained.
Case study
The supply of an automation system for an API manufacturing plant can help illustrate the concepts discussed above. In this case, an existing pharmaceutical manufacturing plant was modified to produce a new product, while still retaining the capability to make the product for which it was originally designed.
Because the existing control system could not be further expanded, input/output modules for new parts of the plant, and new controllers and operator interface were added to the existing field interface equipment. Product was made for stock so as to maintain supplies during the plant shutdown.

Table 2
The URS, produced by the purchaser, was structured to aid traceability. A small extract from the URS is shown in Table 2, where numbered clauses can be seen along with a distinction between information and requirement clauses.

Computer system validation – increasing supplier value (page 2)

Remember that you are required to produce documented evidence that the computer system has been installed as specified and that the functionality provided by the system meets the stated requirements. Your validation staff must feel comfortable that this documented evidence will be produced and that it will withstand an inspection from one or more of the pharmaceutical regulators. Whether validation activities are performed by your control and instrument engineers, as occurs in at least one major pharmaceutical production plant, or entirely by a validation department, it is necessary to obtain buy-in from those who are responsible for validation.
They must also be involved in the entire process, if only as an approval signatory, so that they can be assured that the results will meet the intended aim. It is also necessary to consider the effects on the individual. After all, this proposal removes work from the validation department, a proposition that may require some 'selling' on the grounds of overall company benefits.
It may not only be the validation department staff who need to be convinced that a different approach can meet corporate goals as well as saving time and money. In 2000, I attended the International Society for Pharmaceutical Engineering (ISPE) Annual Meeting in San Diego (CA, USA). One of the sessions, presented jointly by a US operating company and one of its suppliers, promoted the use of factory acceptance tests (FATs) as part of the supply of packaged systems.1
It seemed strange to me that there should be any form of debate about whether or not FATs should be performed. In my 25 years as a supplier of control systems within the European market, FATs had always been an obligatory part of any contract. Additionally, the presentation made no reference to any form of validation activity. When I questioned this, the operating company presenter said that they would "never involve a supplier in validation". It was clear that the supply of a packaged system was seen as a completely separate activity to the validation of that system, perhaps because purchasing and validation had always been two separate areas of responsibility. To add value to the supplier's scope, it is therefore necessary to also involve your purchasing department.
When you consider what is tested as part of an FAT you soon realize that some of the tests that would be performed for IQ and most of the tests that form part of OQ will have already been completed and witnessed.
For example, the installation of equipment within cabinets and the installation of software packages within the control system will all be verified as meeting the project specification. Similarly, functionality associated with instrument and actuator interfaces, operator interfaces, equipment control and recipe actions will also be tested against specifications.
If we can ensure that these test records can be used as part of the documented evidence then these tests or their equivalent will not need to be performed on site, thereby saving significant time and money.
Those parts of the system that cannot be tested in the factory will be tested during site acceptance tests (SATs), which provide another opportunity to collect documented evidence. Similarly, where evidence of the functionality of a complete instrument loop is required, the information can be documented during commissioning activities.
Involving the supplier
Any supplier who has an accredited quality control system will have the basic controls in place during project execution. However, the scope of supply needs to be explicitly extended to include some basic training of supplier project staff and the production of suitable records. The training to be given to supplier staff must cover an overview of validation, including the objectives of IQ, OQ and PQ, and how to generate specifications and records.
Using the GAMP Guide as the basis of the training is a good idea because it describes the relationship between the user requirements specification (URS), functional and design specifications (FDS), system test specification (STS) and the need to trace requirements from URS, through the FDS to the STS.2 The arrows in Figure 2 illustrate the links that need to be recorded in the requirements trace matrix (RTM). It is beneficial for the supplier to generate the RTM, as many of the URS requirements will be covered by statements in the supplier's standard product specifications and manuals, and the supplier is in the best position to identify where these statements can be found.
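A minimal sketch, using invented clause and section identifiers, of how such an RTM might be represented and checked for coverage, distinguishing requirements met by standard product specifications (COTS functionality, already proven by market use) from those met by project configuration that must map to test cases:

    # Sketch: a requirements trace matrix (RTM) in which each URS requirement
    # points either at a standard product specification (COTS functionality,
    # no project test needed) or at an FDS section and its test case.
    rtm = {
        "3.2.1": {"satisfied_by": "FDS 4.2 / test case TC-04", "cots": False},
        "3.2.3": {"satisfied_by": "Product spec PS-101, section 7", "cots": True},
        # "3.2.4" deliberately missing to show the coverage check below.
    }

    urs_requirements = ["3.2.1", "3.2.3", "3.2.4"]

    untraced = [r for r in urs_requirements if r not in rtm]
    configured = [r for r, entry in rtm.items() if not entry["cots"]]

    print("Untraced requirements:", untraced)            # must be resolved before FAT
    print("Requirements needing project tests:", configured)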

Computer system validation – increasing supplier value (page 1)


All sectors of manufacturing are under continual pressure to bring new products to market quicker, stealing a march on the competition and maintaining their revenue stream. Manufacturing industries are also under pressure to reduce the cost of design, production and distribution processes to increase margins and maximize return on investment (ROI). Pharmaceutical manufacturing is no different; competition is fierce, and investors in drug development and manufacturing companies are looking for a healthy return on their money.
However, whereas most manufacturing organizations have seen health and safety regulations tighten over recent years, pharmaceutical manufacturers are also under pressure from the regulatory authorities in the regions where they wish to market their products. The Medicines and Healthcare products Regulatory Agency (MHRA) in the UK and FDA in the US, for example, regulate and monitor pharmaceutical manufacturers from the process plant design stage through to the day-to-day production of active pharmaceutical ingredients (APIs) and finished products. Whereas being first to market is usually a key objective for manufacturing organizations to maximize product returns, pharmaceutical manufacturers have the added pressure of patent lifetimes. If most of the patent duration is used to perform clinical trials and build a manufacturing plant, there is a reduced protection time in which to recoup drug development costs.

Figure 1
These various pressures work against each other as shown in Figure 1. Complying with health and safety and regulatory requirements inevitably increases costs, whereas shareholders want to see expenditure reduced. Patent protection periods limit the time over which high margins can be obtained, so reducing the time-to-market is paramount. It is the nature of control systems for process plants that they become an activity on the critical path of the manufacturing plant development. To complete the configuration of a control system, all details of the process, instrumentation and control devices must be finalized. Similarly, to finalize site tests and commission the system, all aspects of the plant, including the installation of the instruments and control devices, must be complete.
Consequently, the time pressures on the control and instrumentation engineer are very high, regardless of the industry. For the pharmaceutical industry, however, we have the additional requirement to conduct the initial stages of computer system validation (CSV) on the control system before performance qualification (PQ) of the whole process control system can be undertaken.

Table 1
When it comes to the site activities associated with those initial stages of CSV, things are not as bad as they appear. Operating companies can significantly reduce the time taken to perform installation qualification (IQ) and operational qualification (OQ) activities, enabling them to both reduce cost and get into production earlier, by capitalizing on the work performed by their control system supplier. Definitions of PQ, IQ and OQ are provided in Table 1. The principles discussed in the rest of this article could be applied to any packaged system, whether it includes controls or not. Therefore, the installation of, for example, a clean-in-place skid or a vacuum package that undergoes functional acceptance tests in the factory before delivery could also benefit from the proposition that follows.
Obtaining 'buy-in'
Our objective is to minimize site activities associated with validation by reducing IQ tasks wherever possible and only performing OQ activities where they are dependent upon the site environment; for example, where the control system interfaces to another piece of equipment that could not be installed or simulated within the supplier's works.
A prerequisite for this approach to be effective is to involve your site validation staff from the start.

The Validation and Implementation of a Chromatography Data System (page 3)

User acceptance testing (PQ)
The user acceptance testing is concerned with the assurance that the functionalities specified in the user requirements specification (URS) are available and operate as specified by the vendor. Table II summarizes the test scripts produced for this phase of the validation.

Table IV: Individual instruments tested for use with the CDS.
Instrument tests
In addition to the above functionality testing, qualification test logs were produced and executed for the individual instruments that would be operated in conjunction with Atlas. The instrument qualification was executed using a holistic approach. The test logs were designed to test that communication between the instruments, the Chromservers and Atlas could be established and maintained. Tables III and IV summarize the tests performed.
Implementation
Upon successful completion of the validation, release certificates were produced. The results of the validation effort at Covance, including the IQ/OQ results, were summarized in the Summary Report for Phase 1. This concluded that the application was found to meet the user requirements and was therefore fit for its intended purpose. Subsequently, Atlas was implemented into the Pharmaceutical Analysis Department. This implementation involved the following steps:
  • installation of the Atlas Client on each of the user PCs with IQ testing for this step and complete documentation
  • setting up the individual user accounts
  • production of the relevant standard operating procedures (SOPs)
  • training of the users
Validation documentation
A whole suite of validation documentation was generated by this project. It was prepared by the user representatives, IT or the vendor. The documents were reviewed by QA to ensure that regulatory and procedural requirements had been met. The complete SDLC validation package comprised:
  • a URS
  • acquisition documentation: quotation, capital justification and special purchase requisition
  • product documentation as deemed necessary
  • vendor release documentation
  • a completed supplier assessment questionnaire
  • a validation plan
  • IQ documentation
  • OQ documentation
  • additional OQ test scripts
  • PQ
  • a test plan (user acceptance plan)
  • acceptance test logs
  • a URS traceability matrix
  • a validation summary report
  • user manuals
  • training material
  • SOPs.

Conclusion
This article illustrates the procedures involved in the successful validation and implementation of a chromatography data system as a replacement for an old system. The project involved the use of SDLC principles and the processes were documented at each stage. Resources from various departments across the company were required and, along with the tight deadlines imposed, careful planning and teamwork were essential from the start to achieve a successful outcome.
The phased approach adopted for this validation meant that the system hardware could be fully qualified and the software validated for the functionalities required by the Pharmaceutical Analysis Department. In reality, these included the major components of the application and allowed the department-specific functions to be validated in Phase 2. This phased approach also meant that the implementation into the first user department could be achieved on schedule and with minimum disruption to project timelines. This made the subsequent deployment into the rest of the site a shorter and more manageable project.
Acknowledgements
The authors would like to thank the many users involved in the validation and implementation of the CDS and Thermo LabSystems' Pathfinder services group for the installation and operational qualification documentation. The authors would also like to thank Bob McDowall from R D McDowall Consultancy Services for QualifyPlus documentation, on which part of this project was based.
References
1. "21 CFR Part 11, Electronic Records; Electronic Signatures; Final Rule," Federal Register 62(54), 13429–13466 (1997).
2. "Technical Report No. 18, Validation of Computer-Related Systems," PDA J. Pharm. Sci. Tech. 49(S1), S1–S17 (1995).
3. "Technical Report No. 33, Validation and Qualification of Computerized Laboratory Data Acquisition Systems," PDA J. Pharm. Sci. Tech. 53(4), 1–12 (1999).
4. R.D. McDowall, "Chromatography Data Systems III: Prospective Validation of a CDS," LC-GC Europe 12(9), 568–576 (1999).
5. R. Chamberlain, Computer Systems Validation for the Pharmaceutical and Medical Device Industries (Alaren Press, Inc., Libertyville, Illinois, USA, 1991).
6. "FIPS PUB 101, Guide for Lifecycle Validation, Verification, and Testing of Computer Software" (June 1983).
7. T.D. Thompson et al., "The Prospective Validation of an MS Data System Used for Quantitative GLP Studies," LC-GC Europe 14(11), 687–694 (2001).
8. QualifyPlus Generic Chromatography Data System Validation Package, Version 3 (McDowall Consulting, Bromley, UK).

The Validation and Implementation of a Chromatography Data System (page 2)

In conjunction with the responses to the supplier assessment questionnaires, evaluations of each CDS were conducted. Each CDS was made available on-site for a period of a week. This meant that users were able to obtain hands-on experience of the application. Each application was objectively assessed by the users. This was done by completing a checklist, and scores were allocated for the following criteria:
  • general appearance
  • ease of use
  • amount of training required
  • ease of implementation
  • instrument control
  • handling of mass spectrometry data
  • electronic record and signature implementation (21 CFR Part 11)
  • connection to a laboratory information management system (LIMS)
  • exporting data
  • compatibility of data produced with Multichrom data.

The system user evaluations did not identify a clear 'winner', so the final decision was based on business criteria. These included compatibility with existing hardware and minimum potential disruption to the operating department during roll-out. On this basis, Atlas became the CDS of choice. The cost of implementation was low because the existing infrastructure hardware used for Multichrom could be reused and, because Atlas was compatible with Multichrom, data migration would be simpler. Another contributory factor in favour of Atlas was Covance's many years of experience with the vendor's CDS and LIMS products.

Table I: The key participants and responsibilities.
User acceptance testing (IQ/OQ)
After the choice and purchase of Atlas as the required CDS, the key members of the validation team received training from Thermo LabSystems in the use of the software. For the user acceptance testing, a development environment was set up with access restricted to the validation team. The development environment was used to assess the functionality of the application and to allow for the preparation of the test logs. The application was then installed into an environment in which IQ/OQ/PQ and test logs were executed. An overall validation plan was generated that outlined the activities to be completed during the user acceptance phase. This included an introduction to what the project was aiming to achieve, the validation scope, definitions, system overview, roles and responsibilities of the project team, validation strategy and methodology, reporting and administrative procedures.
The operating environment comprised a Dell PowerEdge 4400 server. The server uses two 866 MHz Pentium III processors and has 1 GB of memory and five 18 GB disk drives (configured as RAID 5, striping with parity), with 4 GB of disk space allocated to the system drive and 65 GB to the Atlas system and data. The system is supported by an uninterruptible power supply (UPS 2200) and is located in a secure computer room, with access restricted to authorized personnel.

Table II: Tests for functionality.
The operating system is Windows NT Version 4.0 with Service Pack 6a, and the following co-installed software is present: Norton AntiVirus, Internet Explorer 5.5 and NetWare Client Version 4.80. The ARCserve open file agent is installed to enable the backup of open files. Installation and operational qualification was performed by Thermo LabSystems personnel, who were also responsible for the IQ and some of the OQ procedures and documentation. The IQ phase was concerned with formal assurance that the Atlas system (including software, hardware and Chromservers - the chromatography data acquisition device for Atlas) had been installed correctly. The activities undertaken during this phase were fully documented and comprised
  • documenting the hardware (servers and clients)
  • documenting the Chromservers
  • executing the IQ tests

The IQ testing of the Atlas server and clients was successfully completed before the OQ and user acceptance testing (PQ) was started.

Table III: Tests for instruments.
The OQ phase ensures that the installed system works as specified by the vendor and that sufficient documentary evidence exists to support this. This phase was mainly undertaken by Thermo LabSystems using the Atlas Validation Toolkit. The activities undertaken were fully recorded according to vendor documentation. In addition to the standard OQ tests, Covance ran separate OQ tests to check the capacity of the Chromservers and the ability of Atlas to divert acquisition to another server (the Pick Up Server). This was required because, in the event of a failure of the live Atlas server, data recovery is essential. Capacity testing, which checks the ability to cope with samples containing a number of named analytes within a workbook containing a large number of injections, was also performed as part of the OQ. All the documentation produced by the vendor during the IQ and OQ phases was reviewed by a user representative and the IT group. It was also fully audited by the QA department.

The Validation and Implementation of a Chromatography Data System (page 1)

By: Terry Thompson, Nichola Stevens, Jayne Bradley, Marian Mutch
The Atlas 2000 chromatography data system (CDS), developed and supplied by Thermo LabSystems (Altrincham, UK), was implemented at Covance as an upgrade to the current CDS, Multichrom Version 2.0. Multichrom runs on a VAX minicomputer cluster using VMS as the operating system and, although it has proved a robust and reliable CDS for several years, it was thought necessary to upgrade the software. Multichrom was built using 'old technology' that was not adequate to meet the current and future expectations of Covance, and its validation was considered inadequate to meet modern standards. Also, because of the introduction of the US Food and Drug Administration (FDA) regulation 21 CFR Part 11 for electronic records and electronic signatures in August 1997,1 compliance with this regulation became a priority.
The introduction of a major new system into the company presented logistical challenges - the system needed to be implemented in a compliant manner following software development life cycle (SDLC) principles, whilst ensuring seamless integration into the operational areas to minimize the effect on ongoing projects (Figure 1). To deal with these issues, it was necessary to identify key personnel from across the site, to get high-level commitment to the entire validation process and to plan the validation by implementing in a phased approach by department.

Figure 1: Overview of the software development life cycle.
Phase 1 of the project involved implementation into the Pharmaceutical Analysis Department, which included validation of the major functionality of the application. This phase encompassed the installation qualification (IQ), operational qualification (OQ) and performance qualification (PQ). The documentation for the IQ and OQ was produced by the software supplier in conjunction with the company's own Atlas validation team. As Phase 1 of the validation was progressing, the company introduced its own system life cycle, the Covance System Life Cycle (CSLC). Phase 2 of the project involved the roll-out of the software into the remaining five user departments and was achieved using the CSLC templates. The CSLC was an adaptation of the traditional SDLC that met the specific requirements of Covance. Phase 2 of the project also served to capture the validation of additional functionality not addressed in Phase 1. Phase 3, which is ongoing at the time of writing this article, involves validating the migration of data from Multichrom into Atlas and the formal retirement of the Multichrom system.
Life cycle approach
Validation, implementation and management of computer systems using SDLC principles has been well documented.2-7 This approach has become standard practice for the validation of computerized systems for use in the pharmaceutical industry. Covance began to implement this life cycle approach with the validation and implementation of the Analyst software (PE Sciex; Applied Biosystems, Foster City, California, USA),7 using QualifyPlus (Version 3), a generic system of documentation specifically designed for the validation of CDSs.8 The project for the qualification and implementation of Atlas was based on the above validation. Figure 2 provides an overview of the overall validation process.

Figure 2: Validation process overview.
Identifying the key players and planning the project
The implementation project involved personnel from the vendor working alongside personnel from various departments at the implementing company. These were the user department (Pharmaceutical Analysis), information technology (IT) and quality assurance (QA). The demands of the business meant that it was necessary to have this application installed, validated and running in the live environment to meet tight deadlines, but also to have minimal impact on the progress of current studies. For these reasons it was necessary to ensure that management and all the key players were fully committed to the project and that resources would be available when required to complete the project on schedule. Careful planning was therefore essential and use was made of project plans. The roles and responsibilities of the key personnel are highlighted in Table I.
Concept
The starting point for all vendor-supplied software management projects involves the assessment of the available systems. This process involves the assessment of the application against the user requirements. This may include a vendor audit, which encompasses IT and regulatory requirements. With the advent of 21 CFR Part 11, this became a prerequisite before a final decision could be made.
Requirements gathering and system evaluation
For the replacement of Multichrom, three systems were evaluated: Atlas (Thermo LabSystems), Millennium32 (Waters Corporation, Milford, Massachusetts, USA) and Chromeleon (Dionex Corporation, Sunnyvale, California, USA). Supplier assessment questionnaires were sent out to each of the vendors as part of this process. The responses to the supplier assessment questionnaires were retained as part of the SDLC documentation.

Does 21 CFR Part 11 Provide Any Benefits?

Reading the good automated manufacturing practice (GAMP 4) guide acquaints you with the now classic and almost famous V-model.1 The V-model, originally used for describing a validation workflow of IT and automated systems, is easy to understand and very good at ensuring that the requirements and design are built into the final solution. It is also extremely versatile and can be used for almost any type of validation task you could meet in a development phase.
Imperfections?
However, its benefits are not unlimited. What problems could such a seemingly 'perfect' model have?
First, when performing validation, I do not begin by mapping out the user requirements, which are the result of large amounts of research and engineering; initially, I do not have any details or specifics. Also, writing the user requirements before the validation plan goes against the general requirement to work to predefined plans. Instead, I start a project based on a customer's idea and, through discussion and refinement, it becomes clear what the project is about. Up until this point, notes are made with no thought of good manufacturing practice, thereby encouraging free thinking without restrictions or limitations. Once the project objective is defined, it is recommended that the customer obtains a quote for its execution. If a serious supplier with knowledge of the pharmaceutical industry delivers a fixed-price offer, they will almost certainly want to see the validation plan because this is where more than 50% of the project's time will be spent; they will want to know what documents they will have to produce. From an economic perspective, this suggests that the validation tasks are more important than the user/customer requirements.
The second issue with the V-model (and Vs in general) is that they have an end; once the validation is complete, the papers will be confined to storage where they will sit and collect dust. Subsequent changes or additions to the existing system will result in creating new, little V-models rather than reusing existing documentation. This is primarily because the engineering department does not have access to the end users' document archive. This promotes the idea that the project is in a constant state of development. In truth, it actually means discarding the preceding user requirements, design and validation, as any new version of the V-model may add, delete or change the three cornerstones (requirements, design and test). I call this phenomenon 'fragmentation'; it is one of the most dangerous phenomena encountered in validation because it becomes increasingly difficult to preserve the full overview of both the system and the validation. With fragmented documentation you have to look at more than one document to get the full picture. For example, when the user requirements are spread over dozens of papers, nobody actually knows what the collected sum of the requirements is. A similar situation is created when design documentation is spread over different documents, and changes are included in other documents (modifications to an existing design or additions based on new requirements from new documents). During an all-important inspection, there is nothing worse than having to trawl through mountains of paperwork to locate the documents that support a specific system, not to mention the time it consumes. Lots of little V-models that describe a system ready for retirement create a third major problem. Not only is it difficult to find out what the system cornerstones are (because of fragmentation), but it is also time-consuming to find and retire all the old documents.

Figure 2. QAtor's real world picture.
When planning, life cycle considerations such as maintenance and retirement should be built in to ease later processes. Having lots of V-models, though, does not go hand-in-hand with the life cycle considerations described in the GAMP 4 guide. So what can we do?


If we turn the 'V' into an 'O', the life cycle consideration might actually work because there would be no additions, deletions or modifications, just the newest version of the documents (Figure 3). This would solve the problem of document fragmentation because the documents would always be updated and the latest version consistent with the real-life system. Instead of creating a new user requirement specification (URS) in a new V-model, modify the old one and save it as Version 2. Based on the modified URS, it is possible to modify the design so that it mirrors what is actually going to be implemented in the real world. The only problem with the reuse of documentation is the manually performed additions to documents; that is, manual comments and manually performed tests. However, FDA's 21 CFR Part 11 may have found a solution to this. Some years ago, FDA (and prior to that Eudralex) permitted electronic signatures.2,3 Rather than seeing these as a hindrance, perhaps we can actually use them to our benefit. These clauses allow us to move any type of electronic raw data (provided we do it in a validated manner). So, provided we make electronic signatures for any modifications manually added to documents, we could move signatures from one version of a document to another. There are, however, some 'minor' technical obstacles to overcome, such as proving that the signatures were not compromised and that they were moved in a validated manner. Once they are overcome, it should be possible to insert new tests of new requirements into the already established test scenarios. This is also possible with paper-based documentation.
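A minimal Python sketch of the idea, assuming a simple hash-based signature record rather than any particular e-signature product or standard, showing a signed test entry being carried forward from one document version to the next and re-verified:

    # Sketch: carry a signed test entry from document version 1 to version 2
    # and re-verify that the signed content has not changed.
    import hashlib

    def sign(entry_text: str, signer: str, date: str) -> dict:
        digest = hashlib.sha256(f"{entry_text}|{signer}|{date}".encode()).hexdigest()
        return {"text": entry_text, "signer": signer, "date": date, "digest": digest}

    def verify(entry: dict) -> bool:
        expected = hashlib.sha256(
            f"{entry['text']}|{entry['signer']}|{entry['date']}".encode()).hexdigest()
        return expected == entry["digest"]

    doc_v1 = {"version": 1, "entries": [sign("TC-04 passed", "CS", "2005-03-01")]}

    # New version: unchanged, already-signed entries are copied forward;
    # only new or modified entries need fresh signatures.
    doc_v2 = {"version": 2, "entries": list(doc_v1["entries"])}
    doc_v2["entries"].append(sign("TC-17 (new requirement) passed", "CS", "2005-06-12"))

    assert all(verify(e) for e in doc_v2["entries"])
    print(f"Version {doc_v2['version']} carries {len(doc_v2['entries'])} verified signatures")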
Using an O-model instead of a V-model would also be far more logical regarding the life cycle considerations because, unlike V-models, O-models constantly evolve. The O-model enables production to continue with little validation interruption because specific versions of validation documentation are available to support any point in an item's life cycle. The same effect can also be achieved using multiple V-models by reusing, in every new V-model, the documents developed in the previous one.
Finally, we must consider how to start and end an O-model. It seems that the right way to begin an O-model is via a change request or a deviation. In many cases the refinement process is part of the deviations or change requests, but putting it into the planned loop might actually allow the process to be right first time. The end of an O-model is controlled by end-of-life considerations: what are we going to do with raw data, and can we reproduce data or a production scenario throughout the full retention period? This is also logical, as the documentation is kept within the circle during normal life, and leaves the circle once the system expires. Knowing that the ways of the (validation) world do not change overnight, I am aware that the idea of using validation documentation for something other than inspections is provocative. However, the ultimate goal of validation is to produce documentation that actually raises quality instead of just producing more paper. Think it over!
Christian Stage is director of QA product development at QAtor, Denmark.
References
1. http://www.ispe.org/Template.cfm?Section=gamp
2. http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfCFR/CFRSearch.cfm
3. http://www.pharmacos.eudra.org/F2/eudralex/vol-4/pdfs-en/anx11en.pdf

FDA's Draft Guidance for Process Validation: Can It Be Applied Universally? (page 5)

Others made similar observations, and the general thrust of the comments on the proposed change was that the presently available tools to control aseptic processing do not constitute adequate confirmation to be considered validation. The arguments against the proposed revision notwithstanding, FDA made the change to the regulation. The arguments presented above were deemed inadequate, though no clear rationale or supportive scientific evidence to the contrary was offered. In the context of this larger document, the earlier objections to FDA's CGMP revision and its relevance to this draft guidance have perhaps been made clearer, and the specific difficulties more evident.
FDA indicated that media fills, environmental monitoring, routine processing controls, and satisfactory sterility tests constituted a validation in principle if not in fact (12). That position is inconsistent with the concepts presented in the draft validation guidance, which requires substantially more robust evidence of the link between input and results. The difference may seem on one level to be semantic, yet a gulf nevertheless exists between the industry's and regulators' positions with respect to validation of aseptic processing. Regardless of how one views this conundrum, one point should cause little or no confusion: the statistical component of the guidance really doesn't work with respect to linking any process parameters directly to performance.
Manual processes. Operators' skills and proficiency play a major role in the outcome of a substantial number of important pharmaceutical processes such as manual sanitization of equipment and environments and aseptic gowning. In addition, the operator's abilities may sometimes negate other controls that are present. Processes that rely heavily on operator proficiency may not be considered adequately validated, regardless of the outcome. These processes include manual cleaning, wet granulation, sugar coating, and manual inspection. It is hard to conceive that any of these could attain the level of control associated with validation. These processes are likely best considered as verified, given their heavy dependence on the operator.
A final perspective
A substantial amount of clarification, and perhaps broader interpretation, is needed to reconcile the draft guidance with the elements of pharmaceutical validation that do not directly result in the preparation of an intermediate material, drug substance, or drug product. In some situations, the draft guidance certainly can be used without change, but a degree of caution is necessary. Recommendations for a life-cycle approach are certainly acceptable; many pharmaceutical firms had already instituted comparable programs. It seems that the less the particular process results in or influences a material that can be analyzed or is subject to operator variability, the less useful the statistical elements of the guidance are. Without parameters and attributes that are readily quantifiable, using statistics is certainly inappropriate. The design and development experimental evidence works quite well in some instances but appears to be a force fit in other applications, largely according to whether the process outcome is numeric. Successfully linking design and development and initial and ongoing qualification for these processes also seems to depend on the extent to which process success is readily quantifiable.
The qualification and validation model provided in the draft guidance appears fully applicable to validation of products and processes but only partially applicable in other areas. Its usefulness for sterilization and aseptic processing, two of the more important pharmaceutical processes, is highly questionable. An extensive effort to provide a common level of expectations between FDA and industry (as well as within FDA and industry) is urgently needed to clarify how implementation is to be addressed. For sterile products, where FDA theory and industry practice are most divergent, an update of FDA's 1994 Guidance Submission for Sterile Products might be the best way to accommodate the desired approach with current methods for validation (13). 
Conclusion
The draft guidance about process validation is refreshingly simple and supports good science, yet it is demanding with respect to the level of effort required to properly validate a pharmaceutical production process. The variety of approaches that the pharmaceutical industry uses for process validation extends from the merely cosmetic, providing little if any real support to product quality, to overblown efforts that have nearly crippled firms with their complexity and restrictive approach. The draft guidance is not perfect, and improvement and clarification are certainly necessary, especially regarding the guidance's application to sterile products. The guidance requires a restructuring of validation programs to tie development science more closely to commercial-scale manufacturing. It clarifies FDA expectations for validation in a more coherent manner than previous documents. The expectation for life-cycle treatment with heavy statistical emphasis mandates that sound science be applied to validation more clearly than ever before.
For applications in the validation of processes and products that result from them, the author commends those who prepared the guidance for a job well done. It is essential, however, that FDA and industry proceed with special caution in the areas reviewed above because blind adherence to the concepts of the draft guidance will likely lead many astray.
James Agalloco is the president of Agalloco and Associates, PO Box 899, Belle Mead, NJ 08502, tel. 908.874.7558.
References
1. FDA, Draft Guidance for Industry—Process Validation: General Principles and Practices (Rockville, MD, Nov. 2008).
2. J. Agalloco, "The Validation Life Cycle," J. Parenter. Sci. Technol. 47 (3), 142–147 (1993).
3. K. Chapman, "The PAR Approach to Process Validation," Pharm. Technol. 8 (12), 24–36 (1984).
4. R.E. Madsen, "Real Compliance and How to Achieve It," PDA J. Pharm. Sci. Technol. 55 (2), 59–64 (2001).
5. Code of Federal Regulations, Title 21, Food and Drugs (General Services Administration, Washington, DC, September 2008), Part 211, pp. 51919–51933.
6. FDA, "Pharmaceutical CGMPs for the 21st Century—A Risk-Based Approach," Final Report (Rockville, MD, Sept. 2004).
7. ASTM, "E 2500-07 Standard Guide for Specification, Design, and Verification of Pharmaceutical and Biopharmaceutical Manufacturing Systems and Equipment," (ASTM, West Conshohocken, PA, 2007).
8. J. Agalloco, "Compliance Risk Management: Using a Top Down Validation Approach," Pharm. Technol. 32 (7), 70–78 (2008).
9. J. Harris et al., "Validation Concepts for Computer Systems Used in the Manufacture of Drug Products," in Proceedings: Concepts and Principles for the Validation of Computer Systems in the Manufacture and Control of Drug Products (Pharmaceutical Manufacturers Association, Chicago, 1986).
10. PDA, "Process Simulation Testing for Aseptically Filled Products, PDA Technical Report #22," PDA J. Pharm. Sci. Technol. 50 (6), supplement (1996).
11. J. Agalloco, Comments to FDA, submitted Feb. 5, 2008, Docket No. 2007N-0280.
12. Federal Register, 73 (174), pp. 51919–51933, Sept 8, 2008.
13. FDA, Guidance for Industry for the Submission Documentation for Sterilization Process Validation in Applications for Human and Veterinary Drug Products (Rockville, MD, Nov. 1994)

FDA's Draft Guidance for Process Validation: Can It Be Applied Universally? (page 4)

Equipment cleaning and component preparation. Containers and closures for parenterals require cleaning before use to remove trace particles and endotoxins. In addition, some of these items may be treated with silicone-based lubricants to facilitate assembly and use. Nondisposable equipment in contact with the product must be cleaned before use in the next lot and subsequent procedures. Cleaning and preparation processes are production processes in which the objective is to provide items or surfaces that are free of contamination. From that perspective, the design and development for cleaning and preparation procedures are well suited to the draft guidance. The initial and ongoing validation of these processes in the manner outlined for products and processes is possible as well, though the expectations defined in the guidance do seem somewhat excessive for these background processes.
Sterilization processes. All parenteral products include several sterilization processes for the formulation, equipment, and supportive elements. Depending upon the particular sterilization process, the number of parameters to be evaluated can be substantial, and the DOE or multivariate experimental designs mentioned in the design stage of the guidance may be appropriate. Included in most sterilization-cycle development efforts are activities such as component or container mapping, load mapping, and confirmation of material quality and stability postexposure. These activities are well established at many firms in a formal manner, and the application of more scientific approaches would be beneficial where such knowledge is not well structured. The draft guidance can be applied to the initial and ongoing qualification phases, because it would largely align with the statistical approaches for evaluating products exposed to the sterilization process.
The difficulty in applying the guidance to sterilization stems from the biological destruction aspects of these processes. Although statistics are commonly used to calculate sterility assurance levels and the probability of a nonsterile unit, these values are focused on biological-indicator results and usually do not consider the bioburden to a significant extent. The bioburden resistance and population in the commercial-scale process are of critical importance, but development knowledge cannot be translated to a larger scale easily. Sterilizing filtration, which is widely used, represents an even greater challenge because information about the bioburden is essential, and little can be done from a design or development perspective to support it more fully in a commercial setting.
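For reference, the biological-indicator arithmetic alluded to above usually rests on the standard D-value relationship (the figures below are purely illustrative and are not taken from the draft guidance):

    N(t) = N_0 \cdot 10^{-t/D}

so the exposure time needed to drive an initial indicator population N_0 down to a sterility assurance level of 10^{-6} is

    t = D\,(\log_{10} N_0 + 6)

For example, an indicator with N_0 = 10^6 spores and D = 1.5 min gives t = 1.5 \times (6 + 6) = 18 min. These values describe the resistant biological indicator under worst-case conditions; they say nothing, statistically, about the actual product bioburden, which is precisely the limitation described above.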
The rigors expected in development and the use of a life cycle from the guidance certainly fit sterilization, but it seems inappropriate to force statistics onto the actual qualification of the sterilization processes themselves, especially the biological destruction components of those processes. Most sterilization processes are validated using worst-case conditions and highly resistant biological indicators, which represent extreme challenges for the process. Does the imposition of statistical expectations add any real value?
Aseptic processing. Successful aseptic processing relies on various elements. Facility design, equipment design, numerous sterilization procedures, environmental decontamination, aseptic assembly, and aseptic technique are some of the more prominent components (10). Success with aseptic processing relies on attention to detail in these and other areas. Unlike formulation processes, where the influence of the independent processing parameters can be assessed by reviewing the dependent quality attributes, aseptic processing lacks any meaningful correlation potential.
On a basic level, there are simply too many independent—and interrelated—variables in aseptic processing, and the most meaningful one of all lacks metrics of any type. Personnel are an integral part of virtually all current aseptic processing systems. Their effect on the most important aspect of all, sterility, is also the greatest, and creates the greatest opportunity for failure. Aseptic processing performed by human operators is devoid of any measurable variable that could be used to predict the outcome.
Another concern regarding the adaptation of the draft guidance to aseptic processing is perhaps even greater. In 2008, FDA proposed a change to the CGMP regulations that required aseptic processing to be validated. To many in industry, this expectation could not reasonably be fulfilled with the available tools. The author provided the following comment to FDA:
Aseptic-processing simulations cannot validate an aseptic process. The results obtained demonstrate the capability of the facility, equipment and operational controls to provide a minimal microbial contamination rate in a single event. They cannot be utilized to predict the outcome of a similar process performed at a different time, and thus cannot 'validate' the aseptic process. Successful aseptic processing incorporates a myriad of necessary controls; however these controls, alone or in concert, cannot be relied upon to support the absence of microbial contamination as is routinely accomplished in sterilization validation.
There is sufficient support within the CGMP regulations [implied in 211.113(b)] and guidance documents for aseptic processing (explicit in the 1994 submission guide and 1986–2004 aseptic processing guidance) to support the periodic execution of process simulations. Considering these as validation overstates their utility, and implying that firms having successfully passed a process simulation have attained a 'validated' condition is inappropriate. Sterilization and depyrogenation processes are validated such that the routine controls utilized with them are adequate to support the efficacy of the process. Aseptic processing, because it relies on controls of limited sensitivity, poor robustness, and a substantially greater involvement of personnel with their attendant variability, is inadequately supported by the currently available in-process controls. To consider it 'validated' overstates our imperfect ability to measure and control it. In addition, it could be interpreted as license to relax the controls that are necessary for its success. Aseptic processing is perhaps the most difficult of all processes to control in this industry; suggestions that it can be 'validated' imply a level of control not yet attainable. (11)

FDA's Draft Guidance for Process Validation: Can It Be Applied Universally? (page 3)

Critical process utilities. These systems are largely mechanical and include multiple components assembled to deliver an essential process utility throughout the facility. The equipment components of these systems are easily addressed in a manner essentially identical to that defined for production equipment. The major difference is that the utility equipment requires substantially less development to place it into operational service. The processes to prepare the liquids or gases are largely unchanging, and thus the concepts such as design of experiments (DOE) and multivariate analysis are of limited value. The process utility is performance qualified using a sampling plan that may have statistical elements, though they are not necessary. These systems are customarily subjected to routine monitoring on a frequent basis; many water systems are monitored daily. These systems include stages of initial and ongoing performance qualification, but the attributes of greatest concern, microbial identity and population, are poor fits with most statistical tools. Thus the guidance is not readily applicable to these systems.
Classified and controlled environments. Similar in some ways to the utility systems described above, these systems provide a condition rather than a specific material at a well-defined state. The equipment-qualification side of these systems fits the guidance model reasonably well. Because the systems perform in essentially the same manner at all locations and for nearly all applications, development in the sense required in the guidance does not apply at all. A design phase certainly exists, but its elements are not at all like those related to formulation or synthesis processes.
The simplest of these systems (e.g., cold boxes, incubators, and similar items) require little performance qualification, and the use of statistics is hardly warranted. At the other extreme, aseptic processing areas, the evaluation of successful performance is subject to numerous external influences, so these systems can't be considered performance qualified at all, at least not in an independent manner. Acceptable results in an aseptic processing environment result more from diligence in manual routine decontamination, housekeeping practices, operator gowning, and the like. It would be inappropriate to suggest that once the environment is properly qualified, the activities performed inside it would always be successful. Success simply can't be guaranteed, at least not in any manned environment. The application of the guidance to classified environments without adjustment is not feasible.
Computerized systems. Computerized systems are well established in pharmaceutical manufacturing. The validation of these systems within the GMP environment was outlined by a Pharmaceutical Manufacturers Association committee many years ago (9). The original document described a life-cycle approach that was openly adapted from the software-development life cycle. The similarity between the computerized system and draft process validation guidance life cycles is primarily in their cradle-to-grave treatment of the system being validated. The similarity ends there. The methods for software and hardware development for computerized systems are substantially different from those recommended for process validation. The software and hardware elements of computerized systems roughly parallel process methods and process equipment in the production environment. The equipment components have some degree of similarity, but software development bears little resemblance to the development of a pharmaceutical process on a small scale. The performance qualification of computerized systems in the pharmaceutical industry should be considered a background activity and has neither a requirement for repetitive confirmation nor a statistical component. The real proof of a computerized system in the pharmaceutical industry is in the end-product, be it a completed product, test result, or data field that resulted from the operation of the system. The draft guidance is an extremely poor fit for computerized systems because the testing and documentation required is so different from that associated with process and product situations.
In-process inspection of product attributes. The inspection of pharmaceutical products is an important part of the overall process for their preparation. The purpose of the inspection is to identify and remove nonconforming units before further processing or final release. Inspection processes are quite varied. They include:
  • Inspection of filled parenteral containers for visible particulates
  • Inspection of uncoated and coated tablets
  • Lot-number and expiry-date inspection for printed materials.

The design and development of inspection systems can be a hybrid of process development and equipment design. DOE practices can be effective in establishing the inspection parameters for automated systems. When the inspection is performed by qualified personnel, a certain amount of that parameter evaluation is still possible, but because the operator's inherent diligence and visual acuity are perhaps more important than the controllable inspection parameters, statistical inferences are less useful. Initial and ongoing qualification of these systems reconfirms the optimized inspection parameters in commercial settings with real-world defects. The life-cycle model for process validation appears to fit inspection systems reasonably well.

FDA's Draft Guidance for Process Validation: Can It Be Applied Universally? 2

The draft guidance has a broad scope, from API production through finished dosage forms, and therefore must be applicable in various situations. The validation methodology outlined provides substantial flexibility to the industry, avoiding details that might fit well in one instance but be wholly inappropriate in another. Objectively speaking, the guidance's lack of definitions can be viewed both positively and negatively. Firms with extensive knowledge of their processes will appreciate that FDA believes they should be able to validate their own processes without having to accommodate arbitrary precepts. On the other hand, firms lacking the requisite process knowledge would likely have preferred substantially more detail regarding regulatory expectations for process validation. The draft describes FDA's process-validation expectations from a "what to" rather than a "how to" perspective. That approach is wholly consistent with the 21 CFR Part 211 current good manufacturing practices (CGMPs), where firms are given great latitude with respect to the methods used to meet the compliance requirements (5).
The document properly places process equipment in a secondary role. Equipment qualification is certainly expected, but only as a means to support the process and not as the focus of the effort or an end in itself. Providing clarity in this regard is an important step forward and is consistent with FDA's risk-based compliance objective, the recent American Society for Testing and Materials effort at refocusing equipment qualification activities, and industry publications that cite the need for focus on critical end-product quality concerns (6–8).
In summary, the usefulness of the draft guidance for validating pharmaceutical production processes and products is unquestionable. The life-cycle model will result in development and validation exercises that provide relevant and meaningful information. The link between process parameters and the critical quality attributes they influence will serve the industry well, and the use of statistical methods will add a rigor to validation efforts that has been sorely lacking.
Applying the guidance outside the process or product focus
Perhaps the only question that need be asked is how easily the draft guidance can be applied to processes and systems that are less clearly related to end-product quality attributes, or where the development model described might be substantially different. The most important of these systems and processes commonly used in the pharmaceutical industry include:
  • Critical process utilities (e.g., water systems and compressed gases)
  • Classified and controlled environments (e.g., ISO 5–8 rooms and cold boxes)
  • Computerized systems (e.g., process control and manufacturing resource planning)
  • Inspection of product attributes (e.g., visible particle and labeling)
  • Cleaning and preparation procedures (e.g., product-contact parts and surfaces)
  • Sterilization processes (e.g., steam, dry heat, and radiation)
  • Aseptic processing (e.g., filling and compounding)
  • Manual procedures (e.g., gowning and sanitization).

The ease with which the guidance can be applied varies substantially with the particular process, and each general category is best considered individually.

FDA's Draft Guidance for Process Validation: Can It Be Applied Universally? 1


In November 2008, the US Food and Drug Administration issued Draft Guidance for Industry—Process Validation: General Principles and Practices (1). The guidance outlines regulatory expectations for process validation following a "life-cycle concept" that describes a "cradle-to-grave" approach for validating pharmaceutical processes (2). The life-cycle approach builds upon the results of experimental activities during development to define operating parameters and product specifications that are used in initial and ongoing process qualification. The life-cycle concept provides a robust means for the development, manufacture, and control of pharmaceutical products.
The guidance document covers validation largely at a conceptual level and avoids narrow precepts and specific examples. This approach is appropriate because the document addresses the subject from active pharmaceutical ingredient (API) production (by either chemical synthesis or biological processes) through drug-product production for all pharmaceutical dosage forms. The intended breadth of coverage embraces a myriad of unit operations in the preparation and manufacture of these products. Unstated is whether the draft guidance is intended to be applied to supportive processes that are not an inherent part of the formulation process. Among the support processes are cleaning, inspection, sterilization, and aseptic processing. Each of these processes can be an essential part of pharmaceutical manufacture that requires validation.
Basics of the draft guidance
The draft guidance recommends a defined and structured approach for process-validation activities within an organization. During the design and development stage, experiments should define the relationship between the independent process parameters and the dependent product attributes. These studies should be conducted in a predefined manner, and the results should be documented for later reference. The goal of process development is the attainment of knowledge regarding the process–product relationships to support later commercial production. The greater the knowledge accumulated at this early period, the more assurance the firm will have that it can successfully launch and maintain the process at commercial scale. From a compliance perspective, this approach makes excellent sense; the knowledge gained during preclinical and clinical-stage experimentation provides a way to link clinical data obtained at the smaller scales with data from the later production process. A well-developed process is one for which the critical process parameters have been identified and control ranges for each have been established (3).
The development experiments should evaluate the interaction between the independent and dependent variables until the results of the process are predictable and routinely acceptable. Multifactorial experiments can assess the relationships between the variables and build knowledge about the process's limitations. The experiments performed at a smaller scale establish the acceptable ranges for the various independent process parameters. Thus, when the process is operated under the appropriate conditions, operators have substantially greater confidence that the desired quality attributes of the product will be realized. The acceptability of the end result is ascertained using samples of the completed materials.
Initial and ongoing qualification of production processes are the means for establishing and confirming the experimental experience at significantly larger scales of operation. Knowledge gleaned from development simplifies these later activities. Production processes are always operated within the defined operating ranges because there is no reason to experiment with conditions at the extreme ends of the ranges at this larger scale. The challenges during these stages lie primarily in the number of tests performed on the produced materials. Qualification lots are customarily sampled at a substantially higher rate than routine production lots, and testing of these expanded samples is the challenge of the commercial process.
The guidance recommends using appropriate statistical tools in the full-scale qualification efforts to provide the desired confidence in process and product acceptability and thus attain the desired validated state. The expectation of statistical evidence in process validation is well founded; sampling batches at the modest levels associated with pharmacopeial tests provides little, if any, proof of end-product quality. Although those tests may be legally binding, they have only limited value. Industry has largely ignored the levels of "real quality" needed to support its claims for patient welfare (4).
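The multifactorial experimentation described above can be made concrete with a brief sketch. The Python example below enumerates a two-level full-factorial design and estimates main effects from a set of responses; the factor names, levels, and response values are invented for illustration and are not drawn from the guidance.

# Minimal sketch of a two-level full-factorial DOE; the three factors,
# their levels, and the responses are hypothetical illustrations only.
from itertools import product

factors = {
    "binder_pct":   (2.0, 4.0),   # low, high (assumed)
    "water_ml_kg":  (80, 120),    # assumed
    "mix_time_min": (5, 10),      # assumed
}

# The 2^3 design matrix in coded (-1/+1) units.
runs = list(product((-1, +1), repeat=len(factors)))

# Hypothetical responses (e.g., dissolution %) in the same order as runs.
responses = [78, 81, 83, 88, 75, 79, 84, 90]

# Main effect of each factor = mean response at +1 minus mean at -1.
for i, name in enumerate(factors):
    hi = [y for run, y in zip(runs, responses) if run[i] == +1]
    lo = [y for run, y in zip(runs, responses) if run[i] == -1]
    print(f"{name}: main effect = {sum(hi)/len(hi) - sum(lo)/len(lo):+.2f}")

Reviewing estimated main effects in this way is one simple route to identifying which parameters most strongly influence a quality attribute before control ranges are set.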

What is Disinfectant Validation? 3

Here, we define sanitation process validation as "establishing documented evidence that a sanitation process will consistently reduce microorganism populations on inanimate surfaces to preestablished levels that are considered safe." The preestablished microbial bioburden levels must be based on public health codes, regulations, industry guidelines, or a scientifically sound rationale.
In the pharmaceutical and food industries, the sanitation process typically follows a cleaning stage. Cleaning with a detergent usually removes or kills more than 90% of vegetative bacteria present on surfaces. The surviving microorganisms are mostly surface-attached, and the sanitation process will inactivate them in-situ.
Disinfectant effectiveness tests (DETs) do not mirror in-service conditions. The test microorganism inoculum used in DETs does not mimic the behavior of environmental growth (e.g., biofilms and surface-adhered microorganisms). In addition, DETs fail to consider the effects of the preceding cleaning stage. Therefore, the testing conditions used by disinfectant and sanitizer manufacturers have little relationship to the sanitation processes actually used in the pharmaceutical industry. Manufacturers rely on use-dilution and carrier tests to provide data for registration in the United States and for recommended in-use concentrations.
Validation of a sanitation process should instead be based on empirical measurements of sanitizer effectiveness in working conditions, suggesting the logical equation: successful field tests under in-use conditions = validation.
For chemical disinfectants and sanitizing agents already registered under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) in the United States or the CEN TC 216 work program in the EU, there would be no need to perform in-house effectiveness tests; the DETs already performed by the manufacturer for registration should be sufficient. Abundant experience and USP draft chapter ‹1072› show that sanitizers and disinfectants are more effective in actual use than the DETs indicate. Furthermore, the high inoculum counts used in the laboratory studies represent a "worst-case" scenario. We should, therefore, be able to rely on a second logical equation:
registration = qualification.
This approach would make in-house DETs unnecessary and would greatly simplify the sanitation validation process by removing its most difficult step: performing DETs. Only sanitizing or disinfecting agents that have not been registered for the intended purpose would then require additional qualification with DETs.
Surface tests would still be required to develop procedures for sanitation processes. These tests would assess the effectiveness of the selected sanitizer against surface-adhered microorganisms. In these assays, microorganisms are dried onto surfaces, sanitized, and then removed for counting by conventional techniques or rapid microbiological methods (RMMs). Established surface tests are straightforward and inexpensive, and can thus be carried out by most microbiology laboratories. Surface tests can reflect in-use conditions like contact times, temperatures, use-dilutions, and surface properties. These tests can be modified to follow a representative cleaning stage, and so will better mimic real in-use conditions. The proposed sanitation process would be developed from the surface tests. Finally, the proposed sanitation process would be validated via challenge in field tests.
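One way to express the outcome of such a surface test is as a log10 reduction against the preestablished acceptance level. The Python sketch below uses invented plate counts and an assumed 3-log acceptance criterion; neither value comes from the draft chapter or the proposed methodology.

# Sketch of a surface-test evaluation; the counts and the 3-log10
# acceptance criterion are assumptions for illustration.
import math

def log_reduction(control_cfu, treated_cfu):
    """Log10 reduction of surface-adhered organisms after sanitization."""
    return math.log10(control_cfu) - math.log10(max(treated_cfu, 1))

control_cfu = 1.2e6   # recovered from an untreated carrier (assumed)
treated_cfu = 350     # recovered after the sanitizer contact time (assumed)

lr = log_reduction(control_cfu, treated_cfu)
print(f"log10 reduction = {lr:.2f}")
print("meets 3-log criterion" if lr >= 3.0 else "does not meet criterion")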
The sanitation process validation would then be performed following a simple methodology:
  • clean the equipment;
  • sanitize the equipment;
  • take microbial bioburden samples.

Summary
The validation of agents for sanitizing and disinfecting seems like a major undertaking, but does not need to be. These agents are not validated; they are qualified for the intended purpose, and then the sanitation process itself is validated. The most difficult part in sanitation process validation would be the execution of disinfectant effectiveness tests by the user. In-house DETs are superfluous, however, when the selected sanitizer or disinfectant has been registered in accordance with FIFRA or the CEN TC 216 work program. The sanitation process validation methodology proposed here includes cleaning as part of the sanitation process, and sanitation process validation becomes the successful execution of field tests under actual in-use conditions. The resulting methodology would be cost-effective, simple, and time-saving.
Acknowledgment
My gratitude to Daniel Y. C. Fung, professor of food science at Kansas State University, for his kindness in reviewing a draft of this article. And special thanks to Douglas McCormick, Advanstar Communications, for his assistance in the final editing of the manuscript.
José E. Martinez is a consultant for JEM Consulting Services Inc., PMB 652 Box 4956, Caguas, PR 00726, tel. 787.349.3857.
References
1. US Food and Drug Administration, "Guideline on General Principles of Process Validation," (FDA, Rockville, MD, 1987), http://www.fda.gov/cder/guidance/pv.htm, accessed Feb. 15, 2006.
2. Draft General Informational Chapter ‹1072›, "Disinfectants and Antiseptics," Pharmacopeial Forum (2002).
3. E.C. Cole, W.A. Rutala, and G.P. Samsa, "Disinfectant Testing Using A Modified Use-Dilution Method: Collaborative Study," J. Assoc. Off. Anal. Chem. 71 (6), 1187–1194 (1988).
4. E.C. Cole and W.A. Rutala, "Bacterial Numbers on Penicylinders Used in Disinfectant Testing: Use of 24 Hour Adjusted Broth Cultures," J. Assoc. Off. Anal. Chem. 71 (1), 9–11 (1988).
5. E.C. Cole, W.A. Rutala, and J.L. Carson, "Evaluation of Penicylinders Used in Disinfectant Testing: Bacterial Attachment and Surface Texture," J. Assoc. Off. Anal. Chem. 70 (5), 903–906 (1987).
6. S.F. Bloomfield et al., "Development of Reproducible Test Inocula for Disinfectant Testing," Int. Biodeterioration & Biodegradation 36 (3–4), 311–331 (1995).
7. M.D. Johnston, E.A. Simons, and R.J.W. Lambert, "One Explanation for the Variability of the Bacterial Suspension Test," J. Appl. Microbiology 88 (2), 237–250 (2000).
8. S.B.I. Luppens, F.M. Rombouts, and T. Abee, "Disinfectants in a Suspension Test," J. Food Prot. 65 (1), 124–129 (2002).
9. V.S. Springthorpe and S.A. Sattar, "Carrier Tests to Assess Microbicidal Activities of Chemical Disinfectants for Use on Medical Devices and Environmental Surfaces," J. Assoc. Off. Anal. Chem. Int. 88 (1), 182–201 (2005).
10. Center for Food Safety and Applied Nutrition, "Comprehensive List of Terms," (FDA, Rockville, MD, 2001), http://www.cfsan.fda.gov/~dms/a2z-s.html, accessed Feb. 15, 2006.
11. US Environmental Protection Agency (EPA), "Antimicrobial Pesticide Products," (EPA, Washington, DC, 2004), http://www.epa.gov/pesticides/factsheets/antimic.htm, accessed Feb. 15, 2006.
12. "Disinfectants," (Essential Industries, Inc., Merton, WI), http://www.essind.com/Disinfectants/DN-Glossary.htm#Disinfection, accessed Feb. 15, 2006.

What is Disinfectant Validation? 2

The CEN TC 216 methodology has three phases to its Disinfectant Effectiveness Test—European Tiered Test Approach (see Table II).
As a result of the CEN TC 216 work program, and with the concurrence of the EU Biocides Directive, several BS EN test methods (European Standards as adopted by the British Standards Institution) have been issued. These pass–fail test methods detail how disinfectants should be tested against selected cultures under controlled laboratory conditions. They also require an evaluation of the laboratory test results against field test results (see Table III).
Disinfectant effectiveness tests are not yet routine, so most drug-industry microbiologists are unfamiliar with them. The protocols take a long time to learn and to perform, which makes them expensive. Because they are difficult and performed infrequently, it is hard to maintain laboratory personnel proficiency and to produce reproducible results.

Table III: British Standards (BS) EN test methods.
Reproducibility is critical to proper interpretation of results. Variability in AOAC's tests is mostly caused by errors in preparing test solutions and media. Some studies have shown, however, that the AOAC testing technique (3, 4) and inoculum preparation (5) may also be at fault. Variations in conditions during cold storage (i.e., refrigeration, freezing, freeze-drying) can affect the predominant genotype in the stock culture or source culture (6). Thus, when pharmaceutical companies try to validate disinfection processes using AOAC protocols, they may be unable to reproduce the disinfectant manufacturers' bactericidal label claims. CEN TC 216 tests are also prone to reproducibility problems caused by variation in inoculum size and preparation (7, 8).
In the United States, the EPA shares regulatory authority with FDA, which has jurisdiction under 21 CFR 880.6890, General purpose disinfectants. This subpart, concerned with medical devices, defines a general purpose disinfectant as a "germicide intended to process noncritical medical devices and equipment surfaces." (Noncritical medical devices make only topical contact with intact skin.) Liquid chemical sterilizers intended for use on critical or semicritical medical devices are defined and regulated by FDA under 21 CFR 880.6885, Liquid chemical sterilants/high-level disinfectants.
Sanitation process validation

Definitions and distinctions
A review of regulatory definitions (see sidebar, "Definitions and Distinctions") quickly leads to the realization that "disinfectant validation" is actually equivalent to sanitation process validation. This analysis leads to the following proposal for improving the validity of the validation process while making it easier and less time-consuming to perform. This approach is based on the CEN TC 216 work program methodology for registering disinfectants in the European Union and on the quantitative carrier tests (QCT) developed by the University of Ottawa for environmental surfaces and medical devices (9). Like the CEN TC 216 method, the Ottawa QCTs first assess microbicidal performance under ideal conditions, and then move on to assess field performance with more stringent tests.

A Risk-Management Approach to Cleaning-Assay Validation 4

Analytical methods. The model and worst-case compound evaluation did not replace any analytical-method validation activities. Analytical recovery must be established for each compound in the portfolio, but not on every surface. If multiple limits are to be considered, or if a range of reporting is required, the lowest limit may be evaluated, and that recovery can be applied to all acceptance limits as a conservative estimate. In the analytical method, recovery factors were established for three groups: Group 1, Group 2, and Group 3. Methods could be validated for any surface within a group, and that surface could be considered representative of the group. The authors chose stainless steel 316L because it is the most prevalent surface, and cast iron because it is a common material on a tablet press. Type III hard anodized aluminum is the only surface in Group 3. This strategy did not ignore any uncommon surfaces: it grouped them appropriately, swabbed them, and applied a representative recovery factor.
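As a simple illustration of how a representative recovery factor might be applied once a surface has been grouped, the sketch below corrects a swab result for incomplete recovery; the recovery fractions and the measured value are hypothetical and are not the authors' figures.

# Hypothetical application of representative group recovery factors;
# all numbers are assumed for illustration.
group_recovery = {"Group 1": 0.70, "Group 2": 0.55, "Group 3": 0.40}

def corrected_residue(measured_ug, group):
    """Correct a measured swab amount (ug) for incomplete recovery."""
    return measured_ug / group_recovery[group]

measured = 0.8  # ug found on the swab (assumed)
print(f"Group 2 corrected residue: {corrected_residue(measured, 'Group 2'):.2f} ug")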
Variability. Method variability was evaluated by performing a control sample (Compound A, six replicates, 0.5-μg swab, stainless steel 316L, Ra = 3.5) each day. The mean recovery across the entire experiment was 52%, and the standard deviation within a day typically was less than 6; these data suggested that the analyst's swabbing performance did not change over time. The pooled within-run standard deviation over the course of the experiments was 3.99, and this value was used as a criterion for grouping new surfaces. The day-to-day standard deviation was 15.34.
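For clarity, the sketch below shows one standard way a within-run standard deviation can be pooled across daily control runs; the replicate recoveries are invented and do not reproduce the authors' raw data.

# Pooled within-run standard deviation from daily control replicates;
# the replicate recoveries below are invented for illustration.
import math
from statistics import stdev

daily_runs = [                     # % recovery of the control, per day (assumed)
    [49.8, 55.1, 52.0, 47.9, 56.3, 51.2],
    [50.5, 46.7, 53.9, 55.8, 49.1, 52.6],
]

num = sum((len(run) - 1) * stdev(run) ** 2 for run in daily_runs)
den = sum(len(run) - 1 for run in daily_runs)
print(f"pooled within-run SD = {math.sqrt(num / den):.2f}")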

Figure 5
Incorporating new materials of construction into the grouping strategy. Periodically, new equipment that incorporates a product-contact surface made of a material of construction not listed in Table II will be introduced into the CTM area. This situation often arises with alloys of metals that have already been evaluated and with polymers of proprietary composition. Because surface recovery must be evaluated, personnel need a way to incorporate new surfaces into the groupings outlined in Table II. When a new piece of equipment is purchased, CTM-engineering employees prepare the documentation needed to evaluate the equipment with regard to the cleaning program before use. If an identified sampling location is made of a new material of construction, the engineer asks the person responsible for the cleaning program and analytical development to perform the next steps, which are shown in Figure 5.
Suppose that new equipment incorporated three new materials: a crystalline thermoplastic polyester marketed under the trade name Ertalyte (Quadrant Engineering, Reading, PA), stainless steel 420, and stainless steel 630. Without the grouping strategy in place, method revalidation would have to occur for all compounds handled by this piece of equipment. With the strategy in place, these surfaces are placed into groups based on model-compound recovery, and no method revisions are required. The analytical recovery of Compound A was evaluated for Ertalyte, stainless steel 420, and stainless steel 630 at the 5.0-μg/in.2 level. The authors used the validated method to compare the recovery of the three new surfaces with that of a representative surface from Group 1 (stainless steel 316L), Group 2 (cast iron), and Group 3 (Type III hard anodized aluminum). Recovery was evaluated for both the group representatives and the new surfaces on the same two days, with three replicates on each day. As an alternative, six replicates may be performed on the same day as the controls because the comparison of recovery is relative.


The new surfaces are placed into one of the three groups, or define a new lower group, based on how close their average recovery is to that of the group control. The authors obtained the cutoff value of 3.0% in the following manner. Based upon the data for the control, the within-day standard deviation was calculated to be 3.99 and was used in Equation 1. A series of one-sided hypothesis tests with an error rate of α = 0.10 was performed to assess whether the new-surface mean was less than a specified control-surface mean. The confidence-limit half-width for the difference between the means of two surfaces was computed using the within-day standard deviation because the replicates for each surface had to be run on the same two days, with each day treated as a block. For these calculations, this standard deviation was assumed to be known. The resulting lower one-sided confidence limit for the difference in means did not indicate a difference between a control surface and the new surface under evaluation if the recovery differed by less than 3.0%. This approach was conservative because Eq. 1 categorized a new surface into Group 2 if it differed from stainless steel by more than 3.0%, which could be viewed as a strict criterion. Because the results were obtained on the same days and runs, the authors believed that this approach was reasonable. When evaluating the recovery of a new surface, this strategy helps personnel to place each material into the appropriate group. The grouping starts with a comparison of the new-surface average to that of Group 1 (stainless steel 316L) and continues sequentially. If the new surface recovery (NSR) is more than 3.0% below that of the group reference, it is compared with the reference surface in the next lower group until a group is found from which it does not differ by more than 3.0%. If no such group is found, the new surface forms a new, lower group. The procedure is as follows (a sketch of the cutoff calculation and grouping rule appears after the list):
  • If the NSR > mean stainless-steel recovery – 3.0%, the new surface belongs in Group 1.
  • If the NSR > mean cast-iron recovery – 3.0%, the new surface belongs in Group 2.
  • If the NSR > mean Type III anodized aluminum recovery – 3.0%, the new surface belongs in Group 3.
  • If the NSR < mean Type III anodized aluminum recovery – 3.0%, the new surface becomes a new group or becomes the worst-case surface for Group 3 and is used in all future method validations.
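A sketch of the cutoff calculation and the sequential grouping rule is given below. It assumes six replicates per surface (three on each of two days), treats the within-day standard deviation of 3.99 as known, and uses a one-sided normal quantile at α = 0.10, under which the confidence-limit half-width comes out near 3.0; the reference recoveries for cast iron and aluminum are invented, while the 52% figure is the control mean reported above.

# Sketch of the 3.0% cutoff and the sequential grouping rule; six
# replicates per surface and a known within-day SD are assumed, and the
# cast-iron and aluminum reference recoveries are invented.
import math
from statistics import NormalDist

s_within = 3.99            # within-day SD from the control data
alpha = 0.10               # one-sided error rate
n_new = n_ref = 6          # three replicates on each of two days

z = NormalDist().inv_cdf(1 - alpha)
half_width = z * s_within * math.sqrt(1 / n_new + 1 / n_ref)
print(f"confidence-limit half-width = {half_width:.2f}")   # about 3.0

# Mean recoveries (%) of the group reference surfaces.
reference = [("Group 1", 52.0),   # stainless steel 316L (control mean)
             ("Group 2", 40.0),   # cast iron (assumed)
             ("Group 3", 30.0)]   # Type III hard anodized aluminum (assumed)

def assign_group(new_surface_recovery, cutoff=3.0):
    """Step down the groups until the new surface is within the cutoff."""
    for group, ref_recovery in reference:
        if new_surface_recovery > ref_recovery - cutoff:
            return group
    return "new lower group (or new Group 3 worst case)"

print(assign_group(52.0 - 2.48))   # Ertalyte-like difference: Group 1
print(assign_group(52.0 - 8.0))    # larger difference: falls to Group 2

This is only one plausible reading of Equation 1, which is not reproduced in the article text, but it is consistent with the stated replicate scheme, the within-day standard deviation, and the 3.0% result.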


Table III: Swab recovery of three new materials of construction compared with controls from each representative group.
The data for the three new surfaces are outlined in Table III. Based on this approach, Ertalyte was placed into Group 1 because the difference between its recovery and that of stainless steel 316L (i.e., Group 1) was less than 3.0% (i.e., 2.48%). The recovery from stainless steel 420 and stainless steel 630 was more than 3.0% lower than that from stainless steel 316L, but greater than that from cast iron, so the authors placed stainless steel 420 and stainless steel 630 into Group 2. The placement of these two grades of stainless steel into Group 2 highlighted the conservative nature of this approach because their recoveries were only 4% and 8% less than that from stainless steel 316L. The recoveries of stainless steel 630 and 420 were 12% and 9% greater, respectively, than that from cast iron. Table II was updated to reflect this placement. Because the grouping strategy was conservative, it prevented the underestimation of recovery factors when assay values were reported and prevented the formation of additional groups for method validation unless the recovery value for a new surface was sufficiently low to warrant such an addition.
Conclusion
The authors' data-driven risk-management approach to cleaning verification methods uses analytical-recovery values for a model compound to place product-contact surfaces into groupings for analytical-method validation. The data generated during the studies supported the formation of three recovery groups to validate analytical swab methods. Groups 1–3 were represented by stainless steel 316L, cast iron, and Type III hard anodized aluminum, respectively. This approach allowed all surfaces to be considered during analytical-method validation and provided an objective mechanism to incorporate new surfaces into the strategy.
The benefits of this strategy are numerous. First, only three surfaces must be validated on each compound, which drastically minimizes the number of recovery values established to support the entire portfolio. Second, the strategy includes a way to add new materials of construction to the cleaning program if new equipment is purchased. Traditionally, all swab methods must be revalidated to incorporate the new surface. With this strategy in place, a model compound is evaluated, the new surface is grouped, and no changes to existing methods are required. Third, the strategy allows for a constant state of compliance. A relative recovery value is known for any material of construction for all equipment.
Because the grouping strategy is applied to a small fraction of the total surface area, no surface material of construction is ignored, each molecule undergoes a typical method validation, and the strategy places surfaces into groups conservatively. The authors believe that the strategy controls risks appropriately and that the data set given in this study scientifically supports the strategy of grouping materials of construction to support analytical methods within the cleaning program.
Acknowledgments
The authors would like to acknowledge the following colleagues at Eli Lilly: Gifford Fitzgerald, intern, for generating the swab-recovery data; Ron Iacocca, research advisor, for the SEM data; Sarah Davison, consultant chemist; Mike Ritchie, senior specialist; Mark Strege, senior research scientist; Matt Embry, associate consultant chemist; Kelly Hill, associate consultant for quality assurance; Bill Cleary, analytical chemist; and Laura Montgomery, senior technician, for their contributions and insightful suggestions throughout the project. In addition, Leo Manley, associate consultant engineer, provided the roughness measurements in support of this project.
Brian W. Pack* is a research advisor for analytical sciences research and development, and Jeffrey D. Hofer is a research advisor for statistics, discovery and development, both at Eli Lilly and Company, Indianapolis, IN, tel. 317.422.9043.

*To whom all correspondence should be addressed.
Submitted: Oct. 12, 2009. Accepted: Dec. 22, 2009.

Pharmaceutical Validation Documentation Requirements

Pharmaceutical validation is a critical process that ensures that pharmaceutical products meet the desired quality standards and are safe fo...