Tuesday, September 21, 2010

Evaluation of Validation Content Uniformity Test Results Using Lower Probability Bound Distribution Charts

By Pramote Cholayudth
 
 
One of the widely recognized references on process validation with regard to sampling and testing plans is the US Food and Drug Administration’s Draft Guidance for Industry: Powder Blends and Finished Dosage Units—Stratified In-Process Dosage Unit Sampling and Assessment (1), which is based on the Product Quality Research Institute (PQRI)’s Recommendation Report (2). However, it is a non-binding document (i.e., the industry may use alternative approaches even after it becomes official guidance). Another well-known reference is PDA Technical Report No. 25: Blend Uniformity Analysis: Validation and In-Process Testing (3). The sampling and testing plans and the corresponding acceptance criteria limits in a validation study will typically follow these two statistics-based documents. The validation test results for critical quality attributes (CQAs) need to be statistically evaluated (e.g., by estimating confidence limits as appropriate, demonstrating a high probability of passing the tests, etc.).
   
According to Torbeck (4), “… If statistical procedures were given in the USP, companies would have little incentive to develop better procedures.” His comment is borne out here: the industry might not have an approach as good as J.S. Bergum’s had the United States Pharmacopeia (USP) simply prescribed a set of statistical procedures. Bergum is one of the leading professionals who have proposed new statistical procedures; his methods (5, 6) cover both how to establish validation acceptance criteria and how to evaluate validation test results. In protocol development, validation practitioners may follow recognized documents, including Bergum’s, when establishing acceptance criteria limits. However, some of them may not be confident in using statistical tools to evaluate the test results, especially for multiple-stage tests (e.g., content uniformity and dissolution).
   
To provide an alternative approach to such statistical evaluation, this article introduces the use of lower probability bound distribution charts, constructed according to Bergum’s methods, for the evaluation of content uniformity test results.
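To fix ideas before turning to the charts, the short Python sketch below is offered as an illustration only. It is not Bergum’s published procedure: it simply runs a Monte Carlo simulation, under assumed true batch parameters and a simplified rendering of the harmonized USP <905> two-stage acceptance-value test, to estimate the probability of passing. Bergum’s methods go further and derive a lower confidence bound on that probability from the observed sample mean and standard deviation; the lower probability bound charts discussed here plot such bounds.

```python
# Hedged sketch: Monte Carlo estimate of the probability that a batch with an
# assumed true mean and standard deviation (in % of label claim) passes a
# SIMPLIFIED two-stage content uniformity test. The limits follow the
# harmonized USP <905> acceptance-value scheme in outline only (case T <= 101.5);
# this is an illustration, not Bergum's published procedure.
import numpy as np

def acceptance_value(results, k):
    # AV = |M - xbar| + k*s, with M clamped to the 98.5-101.5 reference interval.
    xbar = results.mean()
    s = results.std(ddof=1)
    m = min(max(xbar, 98.5), 101.5)
    return abs(m - xbar) + k * s

def passes_cu(results_30):
    # Apply the simplified two-stage test to 30 simulated unit results.
    stage1 = results_30[:10]
    if acceptance_value(stage1, k=2.4) <= 15.0:      # Stage 1: n = 10, L1 = 15
        return True
    if acceptance_value(results_30, k=2.0) > 15.0:   # Stage 2: n = 30
        return False
    m = min(max(results_30.mean(), 98.5), 101.5)     # no unit outside 0.75M-1.25M
    return bool(np.all((results_30 >= 0.75 * m) & (results_30 <= 1.25 * m)))

def prob_of_passing(true_mean, true_sd, n_sim=20000, seed=1):
    # Point estimate of the probability of passing for a normally distributed
    # batch with the assumed true parameters (not a lower confidence bound).
    rng = np.random.default_rng(seed)
    sims = rng.normal(true_mean, true_sd, size=(n_sim, 30))
    return float(np.mean([passes_cu(row) for row in sims]))

# Example: a batch centered at 100% of label claim with 4% unit-to-unit SD.
print(f"Estimated probability of passing: {prob_of_passing(100.0, 4.0):.3f}")
```

To produce a chart of the kind discussed in this article, one would replace the assumed true parameters with a grid of sample means and standard deviations and apply a confidence adjustment so that the plotted value is a lower bound rather than a point estimate.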

Computer System Validation

Prepare your systems for FDA inspection

By Michael J. Gregor



Is your organization ready for an inspection of your computer system validation (CSV) program? In this article, I will offer some key tips on how to prepare for such an inspection. Often, the FDA comes to inspect your facility for reasons other than your CSV program; however, because so many business processes are governed by electronic systems, the topic of computer system validation inevitably comes up during the course of an inspection. And with the increase in the number of federal investigators, the agency is able to inspect more facilities and dig deeper into areas such as computer system validation.

The first item to prepare is a system inventory list. The list should be organized by GxP category (GCP, GLP, or GMP) so you can sort it accordingly during an inspection or for analysis purposes. Next, the list should include a system owner for each system so you know who is responsible for it and whom to call upon during an inspection. Another important attribute is the location of the computer system validation documentation; this matters not only for storage but also for access. Under everyday operations you should have easy access to this documentation, and during an inspection quick access is key, as investigators do not like to wait long for documents they request. The last attribute to include in your system inventory list is an indicator of whether or not a system is critical. To determine criticality, consider whether the system is used for regulated purposes or to make decisions that pertain to regulatory requirements. Flagging critical systems not only helps you prepare for an inspection but also lets you allocate your resources more effectively during inspection preparation.
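As an illustration only (the field names, systems, owners, and locations below are hypothetical, not a prescribed format), such an inventory can be kept in a structured form so that it can be sorted by GxP category or filtered to critical systems on short notice. A minimal Python sketch:

```python
# Hedged sketch of a system inventory record; entries are hypothetical examples.
from dataclasses import dataclass

@dataclass
class SystemRecord:
    name: str            # system name
    gxp: str             # "GCP", "GLP", or "GMP"
    owner: str           # system owner to contact during an inspection
    doc_location: str    # where the validation documentation is stored
    critical: bool       # used for regulated purposes or regulatory decisions?

inventory = [
    SystemRecord("Adverse Event Reporting System", "GCP", "J. Smith",
                 "QA vault, cabinet 3", critical=True),
    SystemRecord("Chromatography Data System", "GMP", "A. Patel",
                 "Lab file server, /validation/cds", critical=True),
    SystemRecord("Training Tracker", "GMP", "L. Chen",
                 "Document management system, /quality/training", critical=False),
]

# Sort by GxP category, then list only the critical systems for the inspection.
for rec in sorted(inventory, key=lambda r: r.gxp):
    if rec.critical:
        print(f"{rec.gxp}: {rec.name} (owner: {rec.owner}; docs: {rec.doc_location})")
```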

The second item to have in place is a set of governing Standard Operating Procedures (SOPs). The following SOPs serve compliance and operational purposes as well as inspection purposes. The first important SOP is the Computer System Validation SOP. This SOP should outline and detail the Validation Lifecycle. It should include requirements for key deliverables such as Intended Use, the Validation Master Plan, User and Functional Requirements Specifications, the System Design Specification, the Traceability Matrix, Installation Qualification, Operational Qualification, Performance Qualification, the Validation Summary Report, and the System Release Memo. Investigators will want to see that you have detailed the requirements for all of your deliverables in the Validation Lifecycle. Furthermore, they will look to ensure the deliverables have the right dependencies and are in the correct order.

The second SOP to have in place is the one for Software Development Lifecycle (SDLC). This SOP should outline the steps needed to perform the SDLC for custom applications and should handshake with the Computer System Validation SOP.

The third SOP to have in place is the one for Change Control. The Change Control SOP should describe the process for controlling changes to both software and hardware in the production environment. Furthermore, the SOP should require a Change Control Board that is responsible for reviewing and approving all changes. It is also important to include a classification of change; for example, there should be three classifications: Minor, Major, and Emergency. Classifying changes helps manage the change control process more effectively. Assembling a Change Control Board and classifying changes gives inspectors a good sense that your process is in control, as it shows you are well organized when assessing changes. It is worth noting that there is an increasing number of 483s and Warning Letters issued to companies for failing to exercise change control and to perform computer system validation for their systems.
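For illustration only, the sketch below shows one way the three classifications might be represented and routed in code; the routing rules are assumptions made for this example, not a prescribed SOP.

```python
# Hedged sketch: mapping hypothetical change classifications to a review path.
from enum import Enum

class ChangeClass(Enum):
    MINOR = "Minor"
    MAJOR = "Major"
    EMERGENCY = "Emergency"

def review_path(change: ChangeClass) -> str:
    # Assumption for illustration: every change goes to the Change Control
    # Board; only the timing and depth of the review differ by class.
    if change is ChangeClass.EMERGENCY:
        return "Implement to restore service, then retrospective CCB review"
    if change is ChangeClass.MAJOR:
        return "Pre-implementation CCB review plus validation impact assessment"
    return "Pre-implementation CCB review (streamlined)"

print(review_path(ChangeClass.EMERGENCY))
```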

The fourth SOP to have in place is one for Bug and Issue Tracking. This SOP is becoming increasingly important, as regulators are asking more questions about how bugs and issues are handled. Therefore, the Bug and Issue Tracking SOP should include a mechanism for determining whether validation is required as a result of fixing a bug or an issue. Investigators often look for this mechanism.

A fifth SOP that is an important aspect of inspections is a Good Documentation Practices SOP. It is very important to have a structure in which to document the evidence of your validation activities. Deploying good documentation practices helps ensure the integrity of your computer system validation and change control documentation. Investigators are quick to point out errors when reviewing documentation; therefore, it would be prudent to have a sound Good Documentation Practices SOP in place. Remember that computer system validation generates a substantial amount of documentation.

An important aspect of preparing for inspections is to prepare critical documentation for review by an investigator. How do you determine what is critical? A good first step is to identify which systems are used for regulatory purposes or to make decisions that pertain to regulatory requirements. For example, an obvious application is your Adverse Event Reporting System, which handles adverse events that are reported to the FDA. A Documentation Management System is another example of a critical system, as it is often used to manage and store regulated documents.

Which validation documents for your critical systems should you be reviewing? Some documents are prone to mistakes, so I would recommend those be reviewed first. Test scripts are a prime example of error-prone documents. There are two important points to look for here. First, you want to ensure the test scripts are legible and free of errors. Second, you want to ensure the test scripts have the appropriate objective evidence attached to them. This is a very important point to bear in mind, as investigators often like to see objective evidence. What is “objective evidence” in this case? It is evidence that a test script has met its objectives. A great way to capture objective evidence is through screenshots. However, be careful where you require screenshots: you only want those that prove the test script met its objective; otherwise, you end up with a pile of screenshots that do not necessarily prove the test script has met its objectives.

Another important document to review prior to an inspection is your Traceability Matrix. The Traceability Matrix ensures that the intended use of the system has been tested through test scripts that map to a set of approved requirements. Investigators are now reviewing Traceability Matrices to ensure the system has been tested effectively against its system requirements. I encourage reviewing this document at the end of your validation efforts, before the Traceability Matrix is approved. It is also a good idea to double-check your Trace Matrix before an inspection: if an investigator finds even one requirement that was not tested, they can declare that the system has not been validated for its intended use. As you can see, the Trace Matrix is a very important document.
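As a hedged illustration (the requirement IDs and test script names below are hypothetical), even a small script can flag requirements that no test script covers before an investigator does:

```python
# Hedged sketch: checking a traceability matrix for untested requirements.
requirements = {"URS-001", "URS-002", "URS-003", "URS-004"}

trace_matrix = {
    "TS-01": {"URS-001", "URS-002"},   # test script -> requirements it covers
    "TS-02": {"URS-003"},
}

covered = set().union(*trace_matrix.values())
untested = requirements - covered
if untested:
    print("Requirements with no test coverage:", sorted(untested))  # URS-004 here
else:
    print("All requirements are traced to at least one test script.")
```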

The last document to review before an inspection is your System Release Memo. The System Release Memo signifies that the system has been released into production. The important points to check here are the dates: ensure that the signature date of your System Release Memo falls after the signature date of your Validation Summary Report. Small items like these are often overlooked, as there is usually a rush to release systems into production, and sometimes release happens before the documentation is complete. There is no excuse for errors like these; they show you are not following your SOPs, and that will be the nature of the 483 observation if an investigator finds such an error.
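A trivial but useful pre-inspection check, shown here with hypothetical dates, is to compare the two signature dates directly:

```python
# Hedged sketch: confirm the System Release Memo was signed on or after the
# Validation Summary Report. The dates are hypothetical.
from datetime import date

validation_summary_signed = date(2010, 8, 28)
release_memo_signed = date(2010, 9, 2)

if release_memo_signed < validation_summary_signed:
    print("Inconsistency: system released before the Validation Summary Report was signed.")
else:
    print("Release memo date is consistent with the Validation Summary Report.")
```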

There is much to consider about your Computer System Validation program prior to an inspection. Remember to have a system inventory list and the proper SOPs in place, and to review your critical systems and their documentation before an FDA inspection. Again, even though the FDA may be inspecting your facility for other reasons, don’t discount the possibility that they will want to audit your Computer System Validation program. Because so many of our business processes are carried out by way of electronic systems, it is advantageous to assess your Computer System Validation program whether you foresee an inspection or not. Having an efficient Computer System Validation program in place helps ensure the integrity of your electronic records, allows you to allocate resources more effectively, and can therefore yield long-term cost savings for your company.

ACR Image Metrix Presents Validation Methodology at DIA Europe

PHILADELPHIA (BUSINESS WIRE) -- Brenda K. Young, BA, CCRA, senior director of clinical operations at ACR Image Metrix, an imaging contract research organization (CRO) located in Philadelphia, Pennsylvania, presented her experience in computerized system validation at the Drug Information Association (DIA) 3rd Annual Clinical Forum held in Nice, France, on October 19, 2009.
Young's presentation, "A CRO's Challenge: Validating Leading Edge Software for Imaging Clinical Studies," discussed how conducting imaging clinical trials often involves leading edge software. Computerized systems used in clinical investigations, including imaging workstations and software, are required to be validated before data are collected in support of a regulatory submission. During her presentation, she presented two case studies that described approaches to computer system validation.
"Imaging, as a part of a clinical trial, can involve many different modalities for research; technical validation is a key component of high importance for its usage in a trial whose data will be used to support a regulatory submission. Understanding the work process simulated by testing is key to the performance qualification portion of computer software validation. Knowledge of the mathematics and physics underlying the software and workstation is not necessary to execute a good validation; however, it is critical to possess knowledge of the work process with the computerized system," Young explained.
"Brenda is an example of the high-caliber imaging expertise on staff at ACR Image Metrix. She brings over 25 years of clinical research experience to our team and is at the forefront of validation efforts for imaging clinical research in the industry," stated Michael Morales, general manager for ACR Image Metrix.
About ACR Image Metrix
ACR Image Metrix, an imaging contract research organization (CRO) located in the American College of Radiology Clinical Research Center in Philadelphia, applies imaging techniques to improve the efficiency of drug and medical device development programs. The world-class team of radiologists and imaging scientists at ACR Image Metrix works with pharmaceutical, biotechnology and medical device companies to integrate the appropriate imaging technologies, modalities and clinical design techniques into their imaging studies.
ACR Image Metrix
Shawn Farley, 703-648-8936
Public Relations Manager
sfarley@acr.org
Source: ACR Image Metrix

HCL Launches Computer Systems Validation

New Delhi: HCL Technologies has announced the introduction of ‘CrosSView,’ a framework-based computer systems validation (CSV) methodology for the development of robust software applications in the life sciences sector.
According to HCL Technologies officials, CrosSView is a template-driven, end-to-end methodology that is predictable, repeatable and verifiable, ensuring that software applications developed using this framework meet the exacting standards set for life sciences applications.
HCL Technologies director of life sciences practice, Pradep Nair said, “The introduction and application of CrosSView will enable HCL Technologies to deliver systems with verifiable quality to life sciences customers, thus improving transparency and responsiveness.”
Validation is the practice of ensuring that the software used in the manufacture of pharmaceuticals and medical devices, or in hospitals for patient data, is completely error free, reproducible, and access-secured.
It includes validation planning, mitigation, and system development, as well as testing for optimal and accurate hardware and software performance in highly exacting environments.

Proteomics and Validation

PFA-Proteomics Contract Research Services

The Proteome Factory specialises in contract research for differential proteome analysis (proteomics, differential protein display) studies and in the identification and validation of regulated proteins, biomarkers and targets from all kinds of protein-containing samples. Only proteome analysis gives new insights into the quantitative regulation of proteins caused by internal and external factors such as diseases and drugs. The identification of regulated protein species (name, amount and modification) is the prerequisite for understanding functional protein networks and their correlations. Biomarkers, biomarker profiles and targets can be identified. The proteomics information can be used to gain better insights into in-vivo and in-vitro pharmacology or to optimize fermentation processes and plant breeding.
For this purpose, Proteome Factory has developed a comprehensive PFA-Proteomics Platform which opens new opportunities in biomarker and protein target discovery, quantification, identification and validation, as well as in the pharmacokinetics of protein drugs. The proteomics platform consists of sophisticated gel-based and gel-free proteomics methods such as iTRAQ™* proteomics and proprietary proteomics technologies like MeCAT. In each proteomics study, Proteome Factory decides in close cooperation with its partners and clients which proteomics technologies of the PFA-Proteomics Platform should be used to meet the scientific requirements. The study design, optimized sampling of biological material, sample prefractionation (e.g., affinity depletion of abundant plasma/CSF proteins) and statistical requirements are very important to ensure a high-quality and valuable proteome study.
Picture: 2DE proteomics

Application Areas of the PFA-Proteomics Platform

  • Biomarker discovery and identification
  • Biomarker profile discovery and identification
  • Protein and peptide target discovery and identification
  • Identification of regulated proteins and protein species
  • Optimization of fermentation processes
  • Optimization of plant breeding
  • Pharmaco proteomics
  • Identification of vaccine candidates
  • Immuno proteomics
  • Phospho proteomics
  • Membrane proteomics

PFA-Proteomics Platform

Proteome Factory performs gel-based and gel-free proteomics methods, which provide complementary information about the proteome.
  • Biomarker and target discovery by extreme high-resolution 2DE proteomics
  • MudPIT (LC-ESI-MSMS based multi-dimensional protein identification technique)
  • Label-free LC-ESI-MSMS proteomics
  • iTRAQ-based LC-ESI-MSMS proteomics
  • Plasma Proteomics
  • Immuno Proteomics
  • Pharmaco Proteomics
  • Membrane Proteomics
  • Pharmacokinetics by MeCAT (absolute protein quantification down to attomol range)
  • Validation of biomarkers and target proteins by SRM / MRM analysis (mass western)
  • Validation of biomarkers and targets by 2DE western blot and 1D western blot

Extreme High Resolution 2D Electrophoresis

Proteome Factory’s extreme high-resolution two-dimensional gel electrophoresis (2DE), based on the 2DE development of Professor Klose (Klose and Kobalz, Electrophoresis, 1995, 16, 1034-59), is still the most powerful separation technique for protein species worldwide, allowing the best and most reliable proteome analysis. Several thousand protein spots (theoretically up to 10,000, depending on sample and gel size, up to 40x30 cm) can be separated and relatively quantified.
Even Proteome Factory’s 2DE gels with a size of 23x30 cm are much more powerful in protein separation than commercially available IPG-based 2DE because Proteome Factory’s 2DE technique has several advantages:
1. significantly better protein separation resolution,
2. higher sensitivity with less protein required,
3. fewer artificial streaks,
4. separation of basic proteins up to pH 10.5 (up to pH 11.5 in 60x30 cm gels), and
5. high compatibility with protein identification by MS.

Workflow of Extreme High Resolution 2DE

The 2DE proteomics platform of Proteome Factory is very robust and highly standardized, allowing reproducible analysis of all kinds of protein-containing samples using SOPs (standard operating procedures) without the need for time- and cost-intensive method development. Depending on the scientific problem and customer requirements, the 2DE procedure can be further optimized if required.
Proteome Factory’s 2DE approach comprises the following steps:
  • Protein extraction
  • Protein concentration determination
  • 2DE protein separation (high resolution IEF and SDS-PAGE)
  • Protein gel staining (MS compatible silver staining, Fluorescent staining)
  • Digitalisation of 2DE gels
  • Differential 2DE gel image analysis with differential protein display
  • Evaluation of regulated protein spots with intensities, regulation factors and statistics
  • Spot picking of differential protein spots by spotXpress
  • Automated protein identification analysis
  • Protein identification by database searching or de novo peptide sequencing

Advantages for Our Customer

  • High quality and challenging proteome analysis for research & development
  • Analysis results belong to customer
  • Confidentiality
  • Close collaboration
  • From pilot studies to multi-year research projects
  • Highly sensitive protein identification and quantification
  • Highly qualified experts for protein and proteome analysis at Proteome Factory
*iTRAQ™ is a trademark of Applera Corporation or its subsidiaries.

The Importance of Clean Analytical Instrumentation Systems in Pharmaceutical Process Validation

by Dave Simko, Swagelok (Full-length paper presented at Interphex)

Validated pharmaceutical processes are necessary to ensure quality of the products being manufactured, establish and maintain effective costs, and meet the regulatory requirements that are mandatory for approval and introduction of new pharmaceutical products.

The collection of data needed to document the effectiveness and repeatability of the pharmaceutical process requires the use of a variety of both process control and analytical chemistry instrumentation. Process control instrumentation monitors process variables, such as pressure, temperature, level, and flow in a manner typical of chemical processes. The main difference in process control instrumentation for pharmaceutical processes is in the interface between the sensing element or device and the clean process stream. Analytical chemistry instrumentation monitors the chemistry of the process throughout the manufacturing process. It is especially important during the purification, compounding, and formulating of the finished product.

Sample collection and conditioning—prior to introduction into the analytical instrument—is critical to reliable analysis, in both laboratory devices and in at-line and on-line analyzer systems. The cleanliness of the sampling system is an important element of reliability and repeatability. Cleanliness is impacted by the design, manufacture, and condition of the fluid control components used in the system. Consideration must be given to potential inboard, outboard, and internal leakage; effective sealing systems; entrapment; material selection; surface finishing and conditioning; and cleanability.

The result of these considerations is an evaluation process that will lead to selection of the analytical chemistry instrumentation system components that will optimize repeatability and make validation easier.
Component Selection Process
Proper selection of components for service in clean systems is the responsibility of the system designer. Following are some elements for consideration in the selection process:
Design

  • Choose components that have proven leak-tight performance capabilities, considering potential for inboard, outboard, and internal leakage.
  • Consider the seals used in the design. Select components with sealing systems that will meet the system’s service conditions.
  • Look at the potential for entrapment and select components designed to minimize entrapment areas.
  • Study the flow path through the component. Consider the component with the smoothest flow path and minimal internal volume.
  • Consider the materials of construction and of the seal members. Be certain that they will be compatible with the pressure, temperature, and media requirements of the system.
  • Check all ratings for compatibility with system requirements.
Manufacture

  • Check surface finishes. Components with the best surface finishes will be easier to clean and keep clean.
  • Consider how the components are assembled and the possibilities for entrapment.
  • Determine if the lubricants used will be compatible with the system.
Cleaning

  • Study the cleaning method used by the manufacturer and select components that will meet the system requirements initially and in service.
Verification

  • Determine what certifications will be required to verify the necessary elements of the specification or procurement document.
Packaging

  • Specify the packaging method that best meets the needs of the end user.



Updated guidance on requirements for paediatric 'compliance checks' during validation

We have previously published guidance on how the MHRA will handle the impact of the requirements of the Paediatric Regulation on the validation of applications for marketing authorisations (MAs). The guidance has been updated to reflect the need for compliance checks on certain line extensions and variations as well as initial applications for new medicinal products.
The same principles apply in these cases and the MHRA will continue to request an opinion from the European Paediatric Committee where one has not already been obtained by the applicant. The document has also been updated to refer to the final European Commission guideline on the content of applications for paediatric investigation plans and the operation of the compliance check and also to other European Medicines Agency (EMA) and Co-ordination Group for Mutual Recognition and Decentralised Procedures (CMDh) guidance.
Guidance on the MHRA’s Handling of the Requirement in the Paediatric Regulation to Undertake a ‘Compliance Check’ During Validation (PDF, 85Kb)

Pharmaceutical Validation Documentation Requirements

Pharmaceutical validation is a critical process that ensures that pharmaceutical products meet the desired quality standards and are safe fo...