SAP Validation Environment for Pharma Companies

The pharmaceutical industry is responsible for the development, production, and supply of pharmaceutical products. These products save lives, prevent disease, and help maintain quality of life. Preparations must be safe and effective for patients and the general public, prepared so that they pose no risk to life and no avoidable side effects; they must protect the consumer. For this we need regulatory bodies that can control the processes by which these products are prepared.
There are internationally reputed agencies that act as regulatory bodies for the pharmaceutical industry:
  • Food & Drug Administration (FDA) – relevant mainly for the US market
  • European Medicines Evaluation Agency (EMEA) – for the European region
  • Drug Controller General of India – for India
Following are the regulations involved:
GxP: The term GxP is shorthand for "Good x Practices", where 'x' stands for the discipline concerned:
  • GMP (Good Manufacturing Practices)
  • GCP (Good Clinical Practices)
  • GLP (Good Laboratory Practices)
  • GDP (Good Distribution Practices)
What is Validation?
Validation is the process of establishing documented evidence which provides a high degree of assurance that a particular process, followed consistently, will produce a product meeting its predetermined specifications and quality attributes.
Validations in Pharmaceutical Company
  1. Process Validation: This involves validation of the manufacturing process, plant and machinery, manufacturing techniques, etc.
  2. Cleaning Validation: This involves validation of labs, production, packaging (proof of clean), equipment, machinery, instruments, cleaning techniques, etc.
  3. Method Validation: This involves validation of laboratory methods (R&D and others), quality verification methods, analytical equipment, etc.
  4. Computer System Validation: This involves validation of computer-related systems anywhere in the operation: computer systems and operating procedures, design, installation, performance, etc.
V Model Concept for System Validation
Below is the universally accepted model for validation, popularly known as the V model.
  • User Requirement Specification (URS): This step defines what the user wants from the system and ensures those requirements are captured.
  • Functional Specification (FS): This step produces a document, based on the URS, written in a format the user can understand. It is developed by the developer, vendor, or service provider.
  • Design Qualification (DQ): This step provides a document stating that the computer system requirements have been clearly defined and approved in the form of the URS, and that all additional system specifications have been developed, reviewed, and approved.
  • Installation Qualification (IQ): In this step, a document is produced which provides assurance that all hardware installation aspects conform to appropriate codes and the approved design; software installation is covered as well.
  • Operational Qualification (OQ): In this step, a document is produced which provides assurance that the installed system works as specified.
  • Performance Qualification (PQ): In this step, a document is produced which provides assurance that the entire system performs as expected across all operating ranges.
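The V shape pairs each specification on the left leg with the qualification on the right leg that verifies it. A minimal sketch of one common pairing follows; note that "DS" (design specification) is an assumption on my part and that exact pairings vary by methodology:

```python
# One common V-model pairing (illustrative only): each specification document
# is verified by the qualification stage opposite it on the V.
V_PAIRS = {
    "URS": "PQ",   # user requirements verified by Performance Qualification
    "FS":  "OQ",   # functional spec verified by Operational Qualification
    "DS":  "IQ",   # design specification verified by Installation Qualification
}

def verifying_stage(spec: str) -> str:
    """Return the qualification stage that verifies the given specification."""
    return V_PAIRS[spec]
```

In this reading, DQ is the review that approves the left-leg documents themselves before the right-leg qualifications begin.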
Nikita Chitre

ArchitectSAP Solutions
SAP. Streamlined.

SAP System Validation and Compliance Methodologies for Pharma Companies

The SAP system setup for pharmaceutical companies is the same as for any other industry or organization; the difference is that pharma companies are regulated by special government bodies. This is because pharma companies manufacture medicines and other products that are essential for maintaining human life, and hence are sensitive entities. Before any new pharmaceutical product is launched in the market, it has to be rigorously tested, the test results documented, and those results verified by the regulating bodies. To carry out SAP system validation for a pharmaceutical company that has implemented SAP, one should have basic knowledge of all SAP interfaces and of the relevant FDA guidelines, taking into account the interfaces to internal and external hardware and software.
There are different regulatory bodies for pharma companies in different countries like:
  • Drug Controller General of India – For India
  • FDA – Food and Drug Administration for US
  • EMEA – European Medicines Evaluation Agency
Following are the reasons that explain the need for regulation of the pharmaceutical industry:
  • Quality assurance – Pharmaceutical companies produce drugs, medicines, and medical equipment used by hospitals worldwide for treating patients. The manufacturing processes for these must therefore be hygienic, safe, and of top quality.
  • Government policies – The government has guidelines, rules, and procedures for pharmaceutical companies, and a dedicated team of representatives who conduct regular inspections along with surprise visits. This is to ensure that best practices are being followed while producing life-saving materials.
  • Upgradation – As the world and market scenarios change, the rules and procedures need to be analyzed, studied, and modified if required; new steps or rules may also be added.
Following are the general basic steps involved in all the Validation processes:
  1. Gap Analysis – A gap here means incompleteness or a missing link. Gap analysis means analyzing the SAP system, using the available documentation and the current status of the implementation, against the guidelines of the regulatory agencies, in order to identify what is missing from the implemented system.
  2. Plan of Action – Based on the gap analysis, we create an action plan together with the client and decide which gaps can be closed technically and which must be addressed by other procedures. Closing some gaps may involve trade-offs.
  3. Risk Assessment – After the plan of action, we carry out a detailed risk assessment of the relevant modules; if required, the Human Resources module and the distribution part are also analyzed. In this step we identify the important transaction codes and classify them into the three typical risk categories per GAMP guidelines. The further validation procedure is based on this classification.
  4. Validation – This step is the formal start of the validation process, beginning with the creation of the Validation Master Plan (VMP). Then comes protocol preparation.
Computer System Validation (CSV)
This method verifies whether the computer systems are performing their part; it covers hardware, software, and firmware. It requires detailed documentation of the software or applications used in the manufacturing processes of the pharma company.
The documentation involves:
  • List of hardware required
  • Minimum hardware required to maintain the SAP support system
  • Functional specification about the requirement
  • Applications and benefits
  • Test results with comments/feedback.
Following are some of the problems that may arise during a validation project:
  • Difficulty in understanding the requirements if the documentation, i.e. the functional specification, is not proper.
  • In most cases, the validation group is assigned its task only after SAP has been implemented and in operation for more than a year. It is then difficult to establish what output the system was originally supposed to produce. To overcome this, seek the guidance and expertise of experienced functional consultants across all modules.
  • We also need to define the present requirements. For this we build a central team consisting of all kinds of users as well as senior representatives from Quality Assurance, Quality Control, the IT department, Validation, and, in part, Finance.
Abhijeet Kapgate

ArchitectSAP Solutions
SAP. Streamlined.

Validation of Equipment – Pharma Industry

The concept of validation was first developed for equipment and processes. In one well-known case, the software for a large radiotherapy device was not designed and tested properly; software faults caused several devices to deliver radiation doses higher than intended, and as a result several patients died and others were permanently injured. This is why validation of equipment is required in the pharma industry. Following are the steps for validating equipment in SAP, including how to create a link between a maintenance order and a calibration order.
Validation of Equipment after Corrective Maintenance by QC in SAP.
  1. When an equipment breakdown is reported by Production, the Maintenance person receives a breakdown order and attends to the breakdown through it.
  2. After solving the issue, the Maintenance person submits the equipment to Production for a check/dry run.
  3. The Production person then raises a Request for Validation to the Quality department.
  4. The Quality person validates the equipment through a calibration order, checking all the parameters (room temperature, humidity, etc.) as well as the parts replaced by Maintenance, according to the details mentioned in the maintenance order.
  5. After completion, the Quality person reverts to Production, and Production gives approval to close the maintenance order.
  6. To create a link in SAP between the maintenance order and the calibration order, you can use the sub-order concept.
In SAP, create a maintenance order of type PM01 for the breakdown, then create a calibration order of type PM05 as a sub-order (transaction IW36). You can settle the sub-order to a different cost center, so that the calibration cost is settled to the Quality department.
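The relationship can be pictured with a small data model. This is not SAP/ABAP code; the class and cost-center names below are hypothetical, and the sketch only illustrates how the PM05 calibration order hangs under the PM01 breakdown order while settling to a different cost center:

```python
# Hypothetical model of the sub-order link described above (not SAP code).
class Order:
    def __init__(self, order_type, cost_center, parent=None):
        self.order_type = order_type    # e.g. "PM01" breakdown, "PM05" calibration
        self.cost_center = cost_center  # where the order's cost is settled
        self.parent = parent            # superior order, if this is a sub-order

breakdown = Order("PM01", cost_center="MAINTENANCE")
calibration = Order("PM05", cost_center="QUALITY", parent=breakdown)

# The calibration order is linked to the breakdown order, yet its cost
# settles to the Quality department's cost center.
assert calibration.parent is breakdown
assert calibration.cost_center != breakdown.cost_center
```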
Nikita Chitre

ArchitectSAP Solutions
SAP. Streamlined.

FDA’s Process Validation Guidance Calls for Technology Solutions

As the FDA gathers feedback and finalizes its “Guidance for Industry—Process Validation: General Principles and Practices” draft published in November, life sciences companies are looking carefully at the updated recommendations and how to apply changes to their own process development and manufacturing capabilities.
These updates to the 22-year-old guidelines are welcomed by those of us who are passionate about manufacturing process excellence. Leaps and bounds have been made in fundamental enabling technologies since the FDA first published its original guidance on process validation in 1987. For perspective, ’87 was the year Sun Microsystems introduced the SPARC processor and IBM launched its PS/2 personal computer with a 3.5-inch diskette drive. We were still recording on paper-based systems and many years away from cost-effective storage of large amounts of electronic data—let alone the ability to access and leverage such data as the basis for decision making.
Today, as life sciences manufacturers continue to catch up to technologies’ new capabilities, many have migrated to electronic systems and records, but they are still not able to provide self-service access and aggregation of data for end users in a context that allows them to easily understand their processes and capitalize on potential improvements.
They have to request and search manually through mountains of data from disparate sources—enduring the “spreadsheet madness” that makes problem-solving time consuming and error-prone. The FDA draft’s emphasis on using data continuously for validation makes the accessibility and analysis of process data more important than ever from a regulatory perspective. The updated guidelines define process validation as “the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality products.” Now, on-demand access to the right data in the right context for analysis becomes central to Performance Qualification.
To leverage knowledge and understanding as the basis for establishing an approach to control that is appropriate for the manufacturing process, the FDA guidance recommends that manufacturers:
• Understand the sources of variation
• Detect the presence and degree of variation
• Understand the impact of variation on the process and ultimately on product attributes
• Control the variation in a manner commensurate with the risk it represents to the process and product.
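Detecting the presence and degree of variation can be sketched with a simple 3-sigma check on a series of batch measurements. This is an illustration only, not the FDA's prescribed method; real statistical process control would use proper control-chart constants and rules:

```python
import statistics

def control_limits(values, sigmas=3):
    """Return (lower, upper) control limits around the mean of past batches."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return mean - sigmas * sd, mean + sigmas * sd

def out_of_control(values, new_value):
    """Flag a new measurement that falls outside the control limits."""
    lo, hi = control_limits(values)
    return not (lo <= new_value <= hi)
```

For example, with historical assay values clustered near 10.0, a new reading of 12.0 would be flagged as unexpected variation worth investigating.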
The document says, “Process validation involves a series of activities taking place over the lifecycle of the product and process.” The FDA describes in detail process validation activities in the following stages, saying, “In all stages of the product lifecycle, good project management and good archiving that capture scientific knowledge will make the process validation program more effective and efficient. These practices should ensure uniform collection and assessment of information about the process, reduce the chance for redundant information gathering and analysis, and enhance the accessibility of such information later in the product lifecycle.”
To fully apply the FDA’s current thinking in process development and manufacturing, you need the right technology tools. While PAT implementations have been encouraged by the FDA for years, especially in the agency’s risk-based approach initiative, this is probably the most important published guidance that ties technology to process validation practices in a way that spans process development, quality and manufacturing. You absolutely have to have appropriate software capabilities in place that let end users access, aggregate, analyze, understand and report data on demand in a meaningful context, such as one that allows correlation of upstream parameters with downstream process outcomes (automatically accounting for batch genealogy), during both process design and commercial operations.
The FDA is pointing manufacturers to practices that can continuously improve their process understanding and, therefore, make the process validation program more effective and meaningful. Like many of its regulatory efforts and mandates, what is good for the FDA and consumers is also beneficial for manufacturers. In this case, leveraging data in the right way can enable greater Process Intelligence, which helps control variability and improve quality, results in higher yields and speeds time to the commercial market.
The new document states, “In addition to being a fundamental tenet of following the scientific method, information transparency and accessibility are essential so that organizational units responsible and accountable for the process can make informed, science-based decisions that ultimately support the release of a product to commerce.”
This is certainly a progressive step forward for the FDA and a win-win situation for an industry ready to embrace technology tools for improvement.

About the Author
Justin O. Neway, Ph.D., is executive vice president and chief science officer at Aegis Analytical Corporation. He has more than 25 years of experience in pharmaceutical and biotechnology manufacturing, and in the development and application of software solutions to quality compliance and operational efficiency in life science manufacturing.

Computer System Validation: Are You Ready for an Inspection?

Is your organization ready for an inspection of your Computer System Validation program? In this article, I will offer some key tips on how to prepare for an inspection of your computer system validation (CSV) program. Oftentimes, the FDA comes to inspect your facility for reasons other than your CSV program. However, because so many of our business processes are governed by electronic systems, the topic of computer system validation inevitably comes up during the course of an inspection. With more federal investigators on staff, the agency is able to inspect more facilities and dig deeper into areas such as computer system validation.
The very first item you can prepare is a system inventory list. The list itself should be organized by GxP (GCP, GLP or GMP) so you can sort the list accordingly during an inspection or for analysis purposes. Next, the list should include a system owner for each system so you know who is responsible for the system and who to call upon during an inspection.
Another important piece to include is the location of the computer system validation documentation, as this matters not only from a storage standpoint but also from an access point of view. Under everyday operations, one should have easy access to this documentation; during an inspection, quick access is key, as investigators do not like to wait long for documents they request. The last attribute to include in your system inventory list is an indicator of whether or not a system is critical. To determine criticality, consider whether the system is used for regulated purposes or to make decisions pertaining to regulatory requirements. Flagging these systems not only helps you prepare for an inspection, but also lets you allocate your resources more effectively during inspection preparation activities.
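The inventory attributes described above can be sketched as a simple record. The system names, owners, and document paths below are hypothetical placeholders, and this is only an illustration of how such a list could be sorted and filtered for an inspection:

```python
from dataclasses import dataclass

# Hypothetical inventory entry carrying the attributes the article recommends:
# GxP category, system owner, documentation location, and criticality flag.
@dataclass
class SystemEntry:
    name: str
    gxp: str            # "GCP", "GLP", or "GMP"
    owner: str
    doc_location: str   # where the validation documentation lives
    critical: bool      # used for regulated purposes or regulatory decisions?

inventory = [
    SystemEntry("Adverse Event Reporting", "GCP", "J. Smith", "/val/aers", True),
    SystemEntry("LIMS", "GLP", "A. Jones", "/val/lims", True),
    SystemEntry("Warehouse Mgmt", "GMP", "K. Lee", "/val/wms", False),
]

# Sort by GxP category for an inspection, and pull out critical systems first.
by_gxp = sorted(inventory, key=lambda s: s.gxp)
critical = [s.name for s in inventory if s.critical]
```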
The second item you should ensure you have in place is governing Standard Operating Procedures (SOPs). You should have the following SOPs in place for compliance and operational purposes as well as inspection purposes. The first important SOP to have in place is the Computer System Validation SOP. This SOP should outline and detail the Validation Lifecycle. It should include requirements for key areas such as Intended Use, Validation Master Plan, User and Functional Requirements Specification, System Design Specification, Traceability Matrix, Installation Qualification, Operational Qualification, Performance Qualification, Validation Summary Report, and System Release Memo. Investigators will want to see that you detailed the requirements for all of your deliverables in the Validation Lifecycle.  Furthermore, they will look to ensure the deliverables have the right dependencies and are in the correct order.
The second SOP to have in place is the Software Development Lifecycle (SDLC) SOP. This SOP should outline the steps needed to perform the SDLC for custom applications and should handshake with the Computer System Validation SOP.
The third SOP to have in place is a Change Control SOP. The Change Control SOP should describe the process for controlling both software and hardware in the production environment. Furthermore, the SOP should require a Change Control Board that is responsible for reviewing and approving all changes. It is also important to include a classification of change. For example, there should be three classifications: Minor, Major, and Emergency. Classifying the changes helps manage the change control process more effectively.
Assembling a Change Control Board and classifying changes gives inspectors a good sense that your process is in control, as it shows you are well organized when assessing changes. It is worth noting that there is an increasing number of 483s and Warning Letters issued to companies failing to exercise change control and to perform computer system validation for their respective systems.
The fourth SOP to have in place is a Bug and Issue Tracking SOP. This SOP is becoming increasingly important, as regulators are asking more questions when it comes to how bugs and issues are handled. Therefore, it is important to have a mechanism in the Bug and Issue Tracking SOP to handle whether or not validation is required as a result of fixing a bug or an issue. Investigators often look for this mechanism.
A fifth SOP that is an important aspect of inspections is a Good Documentation Practices SOP. It is very important to have a structure in which to document your evidence of validation activities. Deploying good documentation practices helps ensure the integrity of your computer system validation and change control documentation. Investigators are quick to point out documentation errors when reviewing documentation; therefore, it would be prudent to have a sound Good Documentation Practices SOP in place. Remember, there is a good amount of documentation when it comes to computer system validation.
An important aspect of preparing for inspections is to prepare critical documentation for review by an investigator. How do you determine what is critical? A good first step is to determine which systems are used for regulatory purposes or to make decisions as they pertain to regulatory requirements. For example, an obvious application is your Adverse Event Reporting System, which handles adverse events that are reported to the FDA. A Document Management System is another example of a critical system, as it is often used to manage and store regulated documents. Which validation documents for your critical systems should you be reviewing? Some documents are prone to mistakes, so I would recommend those be reviewed first.
An example of an error-prone document is your test scripts. There are two important points to look for here. First, you want to ensure the test scripts are legible and free of errors. Second, you want to ensure the test scripts have the appropriate objective evidence attached to them. This is a very important point to bear in mind, as investigators often like to see objective evidence. What do I mean by objective evidence? Objective evidence is evidence that a test script has met its objectives. A great example of how to capture objective evidence is through screenshots. However, be careful where you require screenshots, as you only want screenshots that prove the test script met its objective; otherwise, you end up with too many screenshots that do not actually prove anything.
Another important document to review prior to an inspection is your Traceability Matrix. Your Traceability Matrix ensures the intended use of the system has been tested through test scripts that map to a set of approved requirements. Investigators are now reviewing Traceability Matrices to ensure the system has been tested effectively against its system requirements. I encourage a review of this document at the end of your validation effort, prior to approving the Traceability Matrix. It is also a good idea to double-check your Trace Matrix before an inspection, as an investigator who finds even one untested requirement can declare that your system has not been validated for its intended use. As you can see, the Trace Matrix is a very important document.
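The Trace Matrix double-check described above is mechanical enough to automate. The requirement and script IDs below are made up, but the sketch shows the kind of gap an investigator would flag: a requirement with no mapped test script.

```python
# Illustrative traceability check (hypothetical data): every approved
# requirement must map to at least one executed test script.
def untested_requirements(trace_matrix):
    """Return IDs of requirements that have no mapped test scripts."""
    return [req for req, scripts in trace_matrix.items() if not scripts]

matrix = {
    "REQ-001": ["TS-010", "TS-011"],
    "REQ-002": ["TS-012"],
    "REQ-003": [],   # the gap an investigator would flag
}
```

Running the check before an inspection surfaces REQ-003 as untested, exactly the kind of finding that would otherwise support a claim the system was not validated per its intended use.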
The last document to review before an inspection is your System Release Memo. The System Release Memo signifies the system has been released into production. Some important points to check here are the dates: ensure the signature date of your System Release Memo falls after the signature date of your Validation Summary Report. Small items like these are often overlooked by companies, as there is usually a rush to release systems into production, and sometimes release happens before the documentation is complete. There is no excuse for errors like these; they show you are not following your SOPs, and that will be the nature of the 483 observation if an investigator finds such an error.
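The date check on the release memo is a one-liner worth building into any pre-inspection review. The dates below are invented for illustration:

```python
from datetime import date

# Sanity check (illustrative): the System Release Memo must be signed
# after the Validation Summary Report it depends on.
def release_order_ok(summary_signed: date, release_signed: date) -> bool:
    """True if the release memo was signed after the summary report."""
    return release_signed > summary_signed
```

A memo signed before its summary report fails the check and would draw a 483 observation for not following your own SOPs.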
In conclusion, there is much to consider about your Computer System Validation program prior to an inspection. Remember to have a system inventory list in place, the proper SOPs in place, and to inspect your critical systems and their documentation prior to an FDA inspection. Again, just because the FDA may be inspecting your facility for other reasons, doesn’t discount the possibility they will want to audit your Computer System Validation Program. As stated, so many of our respective business processes are carried out by way of electronic systems in this age of technology. Therefore, it would be advantageous to assess your Computer Validation Program whether you foresee an inspection or not. Having an efficient Computer System Validation Program in place helps ensure the integrity of your electronic records, allocate resources more effectively and therefore can yield long term cost savings to your company.

Staffing, Training and Validation Are Critical to PAT

By Paul Thomas, Managing Editor

Many manufacturers are “doing” PAT these days, but few individuals truly have a grasp of what it means and what its potential is in the pharmaceutical industry — for process monitoring, and design and control as well.

One who knows PAT inside and out is Bruker Optics’ VP Dan Klevisha. Klevisha has been trying for years to convince pharma firms of the merits of PAT. Only since FDA put its full weight behind analytical technologies like those Bruker offers — such as process monitoring IR, NIR and Raman — was he able to make significant headway. Seventeen of the top 20 pharmaceutical firms are now using Bruker for some sort of PAT application, he says.

I sat down with Klevisha at last month’s Pittcon show in Orlando, and picked his brain about PAT and where it’s headed.

Bruker developed its Tandem on-line NIR device for monitoring tablet uniformity at a major pharma firm's request; there was no market for such an instrument before FDA began promoting PAT.
What is different about PAT these days? I asked. Unlike a few years ago, he said, every firm has committed substantial resources and has a defined organizational strategy to approach PAT. “The manufacturing segment of pharma has finally accepted PAT for real,” he said.

Some firms have been vocal about their PAT efforts, others not so. While he won’t name names, Klevisha says some of the firms leading the pack in terms of progressive PAT strategies are those we hear least from.

What’s the greatest risk firms face in implementing PAT? There are two, Klevisha said. First, “they have to make sure that they have a sufficient staffing commitment to support the deployment of a PAT system. This includes having people qualified to transfer that knowledge to the manufacturing staff.” There’s a very long learning curve for PAT applications, he added, and firms need to commit to thorough training programs for personnel.

Finding experts to train the manufacturing ranks is not easy, he said. “Process analysis experts from chemistry, defense, and telecom are in heavy demand,” Klevisha said. “The challenge is that these individuals don’t have experience in a pharmaceutical environment.”

The second greatest risk is validation of PAT-enabled processes. “The industry has a good sense of validation and equipment qualification,” he said. “But that has to be rethought for PAT. Traditional validation processes are too time-consuming and expensive.” Costs of PAT equipment, in fact, can be dwarfed by the cost of validating its use, Klevisha said. Understanding the science behind PAT will be key for firms to solve validation issues.

An understanding of processes grounded in science is, of course, what FDA is looking for. I asked Klevisha what FDA’s PAT initiative has meant to his and Bruker’s business. Bruker has focused on PAT instrumentation for more than 12 years, he noted. But for most of those years, few pharmaceutical firms expressed even the slightest interest in process analytical technologies and systems. As a result, Bruker had no need to pursue technologies that it had within its grasp.

That has changed dramatically since the initiative. Klevisha used the example of the company’s Tandem on-line NIR device for monitoring tablet uniformity. After FDA’s change of course, a major pharmaceutical firm asked Bruker to develop the device. “Without PAT, we would not have developed it,” he said. “There would have been no market.”

Where is PAT headed in the next two or three years? "In that time, we’ll have a firmer understanding of successful applications," Klevisha said. Right now, comprehensive PAT strategies are just getting under way and offer no real lessons to speak of. With trial and error will come experience. For Bruker, Klevisha believes, that will translate into better and less expensive tools and solutions, and easier and faster deployment.

When payback is realized, investment will increase, he continued. “Firms will demand robust, single-purpose, application-specific devices,” he predicts. One company in another industry has already implemented more than a hundred different Bruker devices, Klevisha noted. Eventually, pharmaceutical firms will have similar needs.

The “Intended” Consequences of Software Validation

With the growing importance of information management and electronic documentation within the pharma GMP world, it was only a matter of time until regulators made software validation a focal point of their work. “The FDA has increased its awareness of software validation and investigators are now more mindful of it during the inspection process,” notes Michael Gregor, CEO of Compliance Gurus (Boston, Mass.). “In fact, FDA investigators are after objective evidence of regulated applications requiring software validation.”
Most manufacturers have yet to adapt to this change, says Gregor, but they now have little choice. In this conversation, Gregor offers insight into the fundamentals, and pitfalls, of software validation efforts:
PhM: What’s behind FDA’s increased awareness of software validation, and has it caught manufacturers off-guard?
M.G.: An increase in federal investigators has allowed the FDA to concentrate on a broader range of regulations. Yes, many manufacturers have been caught off-guard.
PhM: What’s the status of software validation guidance, and are there updates on the horizon?
M.G.: There are no updates to software validation on the horizon. There was a pledge by the FDA to update 21 CFR Part 11, Electronic Records and Electronic Signatures, but they have yet to provide any further guidance or update the final rule itself.
PhM: What are some of the pitfalls you see with software validation projects?
M.G.: The biggest pitfall is the lack of good requirements. Requirements are the source of the system functionality and design and therefore should be properly written. Another pitfall is the lack of traceability from requirements to test scripts. Lastly, there is a lack of adherence to change control procedures, therefore resulting in systems no longer maintaining a validated state.
PhM: Why is this happening? Aren’t validation teams prepared or competent enough?
M.G.: Many teams are facing tight deadlines, which contributes to the lack of attention to planning and detail. Oftentimes validation teams are rushed through the process and therefore many key validation attributes are overlooked. There is a lack of a representative group to author requirements. Too many times requirements are written in a vacuum!
A good practice for establishing solid system requirements is to ensure you have a representative group involved in the author, review, and approval of system requirements.  Typical and recommended representatives are: Quality Assurance, Users, and IT.
PhM: How about change control? What’s the secret to continued validation?
M.G.: The trick to maintaining your validated systems in a validated state is to have a change control procedure that covers both software and hardware changes. The procedure should also require a representative group to reside on a change control board.  Again, recommended members are: Quality Assurance, IT, and the User community. It’s important to note that more than one user is okay and even encouraged, as it is not only important to assess the impact of the change to the system, it is also important to understand the impact of the change to the business.
PhM: Are COTS [commercial off-the-shelf] vendors providing better validation packages than in the past, and should manufacturers still be wary of software products that come with “complete” validation packages?
M.G.: Great question. Validation packages have been consistently poor, and they continue to fall short of what is needed to meet FDA expectations. COTS vendors fail to realize that objective evidence is needed to prove a system operates as intended. Furthermore, COTS applications are often highly configured and therefore require a level of validation that goes well beyond the standard "out of the box" test scripts provided by the vendor.
PhM: As more manufacturers are taking advantage of Software-as-a-Service and “hosted” options, are there special validation considerations they need to be aware of?
M.G.: Yes, there are several considerations where security is concerned. Both physical and network security are crucial when an application is hosted by a third party. The key validation considerations are security, change control, and installation.
PhM: What is “intended use” and why do manufacturers struggle to interpret its true essence?
M.G.: As the name implies, it describes how a software application is intended to be used. Intended use drives the need for validation and also leads the way for system requirements, which are a key component to any validation effort. Manufacturers struggle with the concept because they often lack the procedural controls to identify and define “intended use” in the first place.
PhM: How prevalent are warning letters related to “failure to validate computer software for its intended use,” and what are the underlying issues that the Agency is calling out?
M.G.: Warning letters related to the failure to validate software are not too common. However, in the last year this has changed, as the FDA has stepped up efforts and become more vigilant when it comes to software validation and intended use.
PhM: A lot of new products are coming on the market for analytical and modeling applications (to assist in Quality by Design efforts and other work). Might there be specific intended use issues (or general validation issues) for these types of products?
M.G.: Applications of these types are often complex. The more complex the application, the more difficult it is to properly validate for its intended use. In most cases, proper requirements are lacking for complex applications. This alone can cause a validation effort to fail, as system requirements are the root of system functionality and design.

QbD for Better Method Validation and Transfer

More companies in the pharmaceutical industry today are adopting the principles of Quality by Design (QbD) for pharmaceutical development and manufacturing. Described in ICH Q8, Q9 and Q10 guidance documents, QbD enables enhanced process understanding, and a more systematic and scientific approach to development, so that better controls may be implemented. The end goal is more robust manufacturing processes than those that typically result from traditional approaches to drug development.
The QbD framework has many implications for manufacturers and regulators alike. This article describes how analytical methods can be viewed as "processes" and QbD concepts applied, to improve both method validation and transfer. Its goal is to stimulate thinking and discussion on how analytical method validation and transfer could evolve as industry increasingly adopts Quality by Design concepts.
But QbD cannot be considered without examining validation within a product lifecycle framework. The U.S. Food and Drug Administration (FDA) attempted to do just that in a recent draft guidance [1] designed to help achieve the goals of its “Pharmaceutical cGMPs for the 21st Century – A Risk-Based Approach” [2].
This guidance addresses some of the problems with traditional approaches to process validation, which were articulated by Moheb Nasr nearly five years ago [3]. Too narrow a focus on a "three-batch" approach to validation has encouraged a "don't rock the boat" mindset within the industry that can fail to foster continuous improvements in quality or efficiency.

The Need for Real World Verification

The same problems can be seen in analytical method validation and transfer, where the focus often shifts from ensuring the robustness of the method itself to ensuring the robustness of the documentation, and how well it will withstand regulatory scrutiny.
Methods for pharmaceutical products are normally validated by experts who have developed the method in accordance with ICH Q2 guidance (Validation of Analytical Procedures: Text and Methodology) or USP <1225> (Validation of Compendial Procedures). However, validation can be treated as a one-off event, with little consideration given to verifying how well the method will perform at everyday, "real world" operating conditions. All too often, regulators and industry use ICH Q2 or USP <1225> in a “check box” manner without considering the intent of these guidances, or the philosophy of method validation.
Pharmaceutical QbD
These issues continue after the method has been validated, when it is transferred to another laboratory. This transfer step requires that the knowledge of how to operate the method be transferred to those who will use it routinely, and that an exercise be performed, documenting that similar results have been obtained by both parties.
The transfer exercise is usually performed by the most competent analytical staff in the receiving laboratory, yet it may have little relationship with how the method will be used in routine, day to day operations.
It may come as little surprise, then, that most methods fail to perform as intended in the receiving laboratory, triggering efforts to identify variables that are causing the discrepancies, and repeated, costly testing and documentation.
The roots of method transfer failures can usually be traced to insufficient consideration of the routine operating environment during the method validation exercise, and lack of an effective process for capturing and transferring the tacit knowledge of the development analysts.

A QbD Framework for Analytical Method Life Cycle Management

If an analytical method is viewed, simply, as a process, whose output is data of acceptable quality, QbD concepts designed for manufacturing processes can be applied to analytical methods [4]. Thus, the concepts of lifecycle validation that are being developed for manufacturing processes might also be applied to analytical methods.
QbD is defined as “a systematic approach to development that begins with predefined objectives and emphasizes product and process understanding based on sound science and quality risk management” [5] and FDA [2] has proposed a definition for process validation that is “the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality products.”
When considering a lifecycle approach to method validation, a similar definition could be adopted: “The collection and evaluation of data and knowledge from the method design stage throughout its lifecycle of use which establishes scientific evidence that a method is capable of consistently delivering quality data.”
From these definitions, it can be seen that there are a number of key factors important in a QbD/lifecycle approach. These include:
  • The importance of having predefined objectives
  • The need to understand the method, or being able to explain the method performance as a function of the method input variables
  • The need to ensure that controls on method inputs are designed such that the method will deliver quality data consistently in all the intended environments in which it is used
  • The need to evaluate method performance from the method design stage throughout its lifecycle of use.
In alignment with the approach proposed in the draft FDA guidance for process validation, a three-stage approach could be taken with method validation. (See chart.)
Stage 1 – Method Design: Define method requirements and conditions and identify critical controls
Stage 2 – Method Qualification: Confirm that the method is capable of meeting its design intent.
Stage 3 – Continued Method Verification: Gain ongoing assurance to ensure that the method remains in a state of control during routine use.

Stage 1 - Method Design

The Method Design stage includes establishing the method performance requirements, developing a method that will meet these requirements, and then performing appropriate studies to understand the critical method variables that must be controlled to assure the method is robust and rugged. The method design stage can be an iterative process, repeated as required in accordance with the lifecycle phase of the product.
Method Performance Requirements
Utilising a QbD approach, it is essential at this stage that sufficient thought be given to the intended use of the method and that the objectives or performance requirements of the method be fully documented. This represents the Analytical Target Profile (ATP) [6] for the method.
To draw a parallel to qualification of new analytical equipment, the ATP for the method is effectively the equivalent of a User Requirement Specification that would be produced to support qualification of new analytical equipment (or the Design Qualification as defined in USP <1058>).
To build the ATP, it is necessary to determine the characteristics that will be indicators of method performance. Typically these characteristics will be a subset of those described in ICH Q2, such as accuracy or precision.
It is important, however, not to use the ICH Q2 characteristics in a check box manner but to consider the method's intended use. For example, the characteristics that are critical to the performance of an HPLC assay method intended to be used to quantify the active ingredient within a tablet might be quite different from those of an NIR method intended to be used to measure the end point of a blending operation. In the second case, accuracy may not be a critical method performance requirement in determining homogeneity.
Once the important method characteristics are identified, the next step is to define the target criteria – in other words, how accurate or precise the method should be. A key factor in choosing the appropriate criteria is the impact of method variation on the overall manufacturing process capability. Knowledge of the proposed specification limits and the expected process mean and variation is helpful in setting meaningful criteria.
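To illustrate why method variation matters when setting ATP criteria, the sketch below shows how analytical variability combines with true process variability and erodes the observed process capability (Cpk). The specification limits and standard deviations are invented, purely illustrative numbers, not values from the article:

```python
import math

# Assumed illustrative numbers (not from the article):
spec_lower, spec_upper = 95.0, 105.0   # % label claim
process_mean = 100.0
process_sd = 1.0    # true process standard deviation
method_sd = 0.5     # analytical method standard deviation

# Observed variability combines process and method variation in quadrature.
observed_sd = math.sqrt(process_sd**2 + method_sd**2)

def cpk(mean, sd):
    """Process capability index against the fixed spec limits above."""
    return min(spec_upper - mean, mean - spec_lower) / (3 * sd)

print(f"Cpk without method variation: {cpk(process_mean, process_sd):.2f}")
print(f"Cpk as observed:              {cpk(process_mean, observed_sd):.2f}")
```

A method whose standard deviation is half that of the process already reduces the apparent Cpk noticeably; criteria for precision can therefore be chosen so that the method consumes only an acceptable fraction of the specification tolerance.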
Method Development
Once the ATP has been defined, an appropriate technique and method conditions must be selected in order to meet the requirements of the ATP. While method development is a very important part of the method lifecycle, it is not necessary to elaborate here since it has been extensively addressed in the literature.
Method Understanding
Based on an assessment of risk (i.e. the method complexity and the potential for robustness and ruggedness issues) one can perform an exercise focused on understanding the method to better understand what key input variables might have an impact on the method's performance characteristics. From this, one can identify a set of operational method controls.
Experiments can then be run to understand the functional relationship between method input variables and each of the method performance characteristics. Knowledge accumulated during the development and initial use of the method provides input into a risk assessment (using tools such as the Fishbone diagram and FMEA) which may be used to determine which variables need studying and which require controls.
Robustness experiments are typically performed on parametric variables using Design of Experiments (DoE) to ensure that maximum understanding is gained while minimizing the total number of experiments. Depending on the type of method, surrogate measures of characteristics such as accuracy or precision may be evaluated. An example might be peak resolution in a chromatographic method. It follows that well designed System Suitability requirements can be a valuable tool for assuring that a method is operating in a region where the ATP requirements will be met.
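As a minimal sketch of the DoE idea, assuming a hypothetical HPLC method with three robustness parameters, a two-level full factorial design can be enumerated as follows. Real studies would typically use fractional factorial or other optimized designs generated by statistical software to reduce the run count further:

```python
from itertools import product

# Hypothetical robustness study: each parameter is varied a small
# amount around its set point (low / high levels). Values are invented.
factors = {
    "column_temp_C": (28, 32),    # set point 30
    "flow_mL_min":   (0.9, 1.1),  # set point 1.0
    "pH":            (2.8, 3.2),  # set point 3.0
}

# Full factorial: every combination of levels (2^3 = 8 runs).
names = list(factors)
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]

for i, run in enumerate(runs, 1):
    print(f"Run {i}: {run}")
print(f"{len(runs)} experiments in total")
```

Each run's response (e.g., peak resolution or recovery) would then be modeled against the factor levels to identify which variables need formal controls.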
When developing an understanding of the method’s ruggedness, it is important to consider variables that the method is likely to encounter in routine use, such as in situations involving different analysts and different instruments. Tools such as measurement system analysis can be useful in providing a structured experimental approach to examining such variables.
Method Design Output
A set of method conditions will have been developed and defined which are expected to meet the ATP. Those conditions will have been optimized based on an understanding of their impact on method performance.

Stage 2 - Method Qualification

Once a set of operational method controls has been determined during the design phase, the next step is to qualify that the method will operate as intended. In a similar manner to equipment qualification, method qualification can be broken down into the stages of method installation qualification (MIQ), method operational qualification (MOQ), and method performance qualification (MPQ).
Method Installation Qualification
This stage focuses on ensuring that the facility where the method will be used is prepared to use it. This process includes ensuring that the analytical equipment is qualified and that appropriate knowledge transfer and training of analysts has been performed.
The method conditions and detailed operating controls, along with all the knowledge and understanding generated during the design phase, must be conveyed to staff at the facility in which the method will be used. Performing a “method walkthrough” exercise with both the development and routine analytical teams can be extremely valuable in ensuring all tacit knowledge about the method is communicated and understood.
The extent of the MIQ phase may vary depending on the routine testing laboratory's pre-existing knowledge of the product, method or technique. For example, the laboratory that originally developed the method may merely ensure that other personnel who will use the method are trained.
The Method Operational Qualification phase is focused on demonstrating that the method meets its design intent. It is similar to “traditional method validation.” In the lifecycle approach, however, the focus is not about performing the traditional check box exercise to demonstrate compliance with ICH Q2. Instead, it is about demonstrating that the method meets the specific requirements that have been pre-defined in the Analytical Target Profile. A fundamental principle of this stage is that it should build on information that may have been generated in experiments already performed; for example, evidence of the selectivity or limit of quantification of a method may already exist from studies performed as part of method design.
The final stage in the qualification phase is to perform an MPQ exercise. At this stage, actual manufactured supply samples are tested in the facility, on the equipment, and by the personnel that will be used for routine analysis. The method should be operated exactly in accordance with the defined method controls and local operating procedures. As with the MIQ stage above, the extent of the MPQ exercise could vary; it would include, for example, confirming that system suitability criteria are being routinely met.

Stage 3 - Continued Method Verification

The goal of this stage of the method lifecycle is to continually assure that the method remains in a state of control. This should include an ongoing program to collect and analyze data that relate to method performance – for instance, by setting criteria for replication of samples or standards during the analysis, by trending system suitability data or by trending the data from the regular analysis of a reference lot.
This activity aligns with the guidance in USP <1010> on system performance verification. Close attention should also be given to any out-of-specification (OOS) or out-of-trend (OOT) results generated by the method once it is operating in its routine environment. Ideally, using the new approach, laboratories should encounter fewer OOS results, and when they do occur, it will be easier for staff to determine the root cause.
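A simple way to operationalize such trending, sketched here with invented reference-lot data, is to derive 3-sigma control limits from historical results and flag any new result that falls outside them:

```python
import statistics

# Hypothetical historical results (e.g., % recovery of a reference lot)
# collected during routine use of the method. Values are illustrative.
history = [99.8, 100.1, 99.9, 100.3, 99.7, 100.0, 99.9, 100.2]

mean = statistics.mean(history)
sd = statistics.stdev(history)            # sample standard deviation
ucl, lcl = mean + 3 * sd, mean - 3 * sd   # simple 3-sigma control limits

def out_of_trend(value):
    """Flag a new result that falls outside the control limits."""
    return not (lcl <= value <= ucl)

print(f"Control limits: {lcl:.2f} to {ucl:.2f}")
print(out_of_trend(100.1), out_of_trend(101.5))
```

A real program would also apply run rules (e.g., consecutive results drifting to one side of the mean) and recompute limits periodically as more routine data accumulate.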
Change Control
During the lifecycle of a product, both the manufacturing process and the method are likely to experience a number of changes, brought about through unintentional deviations, continuous improvement activities, or the need to operate the method and/or process in a different environment. It is essential that all changes to the method operating conditions be considered in light of the existing knowledge and understanding of the method's performance. If the method operating conditions are modified such that they fall outside the known method operable region (determined as part of ‘method understanding’ during the original design phase), then the Method Qualification should be revisited. Likewise, if the process changes and samples contain analytes at levels outside those covered in the Method Operational Qualification exercise, or contain new interferences, a new ATP would need to be generated and a partial re-qualification exercise would be required to ensure the method is still capable of producing consistent and reliable results for the extended range or modified sample.
Where a method needs to be transferred to a new location, appropriate Method Installation Qualification activities (including knowledge transfer) will need to be performed in addition to a Method Qualification exercise.
Consistent with QbD principles, the extent of requalification should be driven by scientific principles, as described previously.

Other Scenarios

The approach to method qualification described above focuses on the activities that would typically be performed when a method is developed and used within a single company. Other scenarios exist where a laboratory may need to use a method for which it has no access to the original method design or qualification information, such as a contract testing laboratory. In these situations, it is important that an ATP is defined and documented. An appropriate qualification study is then performed to demonstrate that the method meets its ATP. This approach could also be applied to qualification of a pharmacopoeia method. By applying the lifecycle approach to pharmacopoeia methods, it is envisaged that there would be no need for a separate USP chapter on method verification (USP <1226> Verification of Compendial Procedures).
Another scenario that needs to be considered is when a method is used to support drug development rather than drug manufacturing activities. In this case the method will still need to produce data that is scientifically sound - particularly as it may be used to release material for clinical supplies, support stability studies or support the development of process understanding. At this point in the product lifecycle however the focus is on understanding the method performance rather than demonstrating that it meets predefined acceptance criteria.
A key advantage of this approach for the above scenarios is the flexibility to perform a qualification against the specific ATP defined for the intended use of the method. This eliminates the current desire to create a qualification document against ICH Q2 in a check box manner, which can lead to unnecessary headaches and additional work.
Since this approach could potentially be adopted by all users of analytical methods, it also offers the opportunity to standardise the inconsistent terminology that currently exists within the industry. It would result in a harmonised method qualification approach with standardised terminology, removing the current confusion between the terms method validation, transfer, and verification, and the many interpretations of what is actually required for each.
In short, adopting a QbD approach to analytical method lifecycle management would have some significant implications for analytical scientists in the pharmaceutical industry. First, ICH Q2 and pertinent USP chapters would need to be revised to align with the lifecycle validation concepts described above. (The need for a revision of ICH Q2 as a consequence of increasing adoption of QbD concepts and use of PAT has also been identified [7].)
The activities that were previously defined as “method transfer” (that is, knowledge transfer and confirmation of equivalence) would become a central component of the lifecycle validation approach (in other words, they would be described as method qualification activities) rather than being treated as distinct from validation.
Likewise, method validation as described in USP <1225>, method verification as described in USP <1226>, and the proposed USP chapter on Transfer of Analytical Procedures would be encompassed within the lifecycle approach, and it may be logical to combine those chapters.
The CMC section of regulatory submissions may also need to change. In a Quality by Design rather than “Quality by Testing” paradigm, it seems inappropriate that the validation details of the test methods should fill such a large proportion of a submission. With a lifecycle approach to method validation, the submission would see an increased focus on the method design aspects and a demonstration of the work performed to arrive at method understanding. This would parallel the increasing focus on process understanding in submissions, which has arisen since the introduction of ICH Q8.
Review of formal method qualification activity would become an inspection issue rather than a registration issue. Underpinning this approach is the need for effective lifecycle management of knowledge and understanding of the analytical method.
In summary, the switch to a QbD approach to method development is already beginning to bring improvements to the performance of analytical methods. It is now time to modernize and standardize industry’s approach to method validation and transfer. By aligning method validation concepts and terminology with those used for process and equipment qualification, there is an opportunity both to reduce confusion and complexity for analytical scientists and to ensure that efforts invested in method validation truly add value, rather than simply being a check box exercise.

Analytical Target Profile (ATP)

The combination of all method performance criteria that direct the method development process. An ATP would be developed for each of the attributes defined in the control strategy; it defines what the method has to ‘measure’ (i.e., the acceptance criteria) and the level of measurement performance required (i.e., performance characteristics such as precision, accuracy, working range and sensitivity, with the associated performance criteria).
Method Development The collection of activities performed to select an appropriate technique and method conditions that should have the capability to meet the ATP requirements.
Method Understanding (MU) The knowledge gained from the collection of activities performed to understand the relationship between variation in method parameters and the method performance characteristics.
Method Installation Qualification (MIQ) The collection of activities necessary to ensure a method is properly installed in its selected environment. This includes knowledge transfer activities such that the laboratory understands the critical control requirements of the method, and the activities required to ensure the laboratory can meet these requirements, e.g. purchasing appropriate consumables such as columns and reagents, ensuring reference standards are available, ensuring appropriate equipment is available, and providing appropriate training to personnel.
Method Operational Qualification (MOQ) The collection of activities necessary to demonstrate that a method can meet its ATP. Experiments performed as part of the OQ exercise must be appropriate for the specific intended use as defined in the ATP.  This may involve demonstrating the method has adequate precision and accuracy over the intended range of analyte concentrations for which it will be used.
Method Performance Qualification (MPQ)
The activities that demonstrate that a method performs as intended when used with the actual samples, facilities, equipment and personnel that will routinely operate the method.

About the Authors
Phil Nethercote is head of the Analytical Centre of Excellence in GSK’s manufacturing division, and chairman of the GSK Analytical Standards Group. Since joining GSK in 1987 he has held a number of roles associated with new product introduction. He has a degree in Chemistry from Heriot-Watt University and a PhD in HPLC from Stirling University.

Phil Borman is a manager in Analytical Sciences, Chemical Development within GSK and is currently accountable for the application of Statistics globally within his department. Prior to his current role, Phil worked in both Chemical and Pharmaceutical Development. He has an M.S. in chemistry from the University of Manchester and an M.S. in industrial data modeling from De Montfort University in Leicester.
Tony Bennett is Project Leader for Laboratory Support in Quality Analytical Sciences, Existing Product Support, Global Manufacturing Supply at GSK.
Gregory P. Martin is President of Complectors Consulting, LLC, in Pottstown, Pennsylvania.
Pauline McGregor is Analytical Method Support Specialist for PMcG Consulting of Oakville, Ontario. She has over 20 years experience in validation, R&D and quality systems in the pharmaceutical industry.
References
1. U.S. Food and Drug Administration (FDA). Draft Guidance for Industry – Process Validation: General Principles and Practices. 2008.
2. FDA. “Pharmaceutical cGMPs for the 21st Century – A Risk-Based Approach.” 2004.
3. Nasr, M., speaking at the 2005 AAPS Workshop on “Pharmaceutical Quality Assessment – A Science and Risk-based CMC Approach in the 21st Century.”
4. Borman, P., Nethercote, P., Chatfield, M., et al. Pharmaceutical Technology, 31(10). 2007.
5. ICH Q8 (R2) – Pharmaceutical Development, November 2005.
6. Schweitzer, M., Pohl, M., Hanna-Brown, M., et al. Pharmaceutical Technology, 34(2). 2010.
7. Ciurczak, E. ICH Guidances (Already) Show Their Age. www.