Sunday, July 25, 2010

IT Infrastructure Qualification and System Validation: IT Vendor Perspectives 5

The risk-based approach emphasizes the need to focus on critical features and processes that affect product quality. The challenge is to apply knowledge and experience, and to base decisions on the intended use of the infrastructure and computer systems relative to regulatory and business needs. All rationale and justification for decisions must be documented and approved.
Scenarios on validation and qualification in an outsourcing environment
For example, consider a scenario in which a vendor provides IT services to a pharmaceutical company from its offshore development center. The offshore development center houses a data center, workstations, a network, servers, and so forth. Consider, in particular, a server used to store clinical and health data. This server would be an open system; that is, access is controlled by personnel who do not own the data the server holds. Because the server holds clinical and health-related data (which may include privacy data as well), it is a regulated server and must be managed and operated according to regulatory guidelines on validation. If the facilities were inspected by an external agency or a regulatory authority such as FDA, the vendor would need to prove that the server is qualified: that it was developed to meet its intended use by following the principles of software validation mandated by FDA, and that it is being managed and operated in a controlled environment. A detailed risk assessment, accompanied by relevant supporting qualification documentation, would be imperative to demonstrate that the server is qualified.
Now consider a scenario in which a vendor company is planning to support applications such as clinical trial management systems, laboratory information management systems, or manufacturing control systems. In these cases, the IT infrastructure that supports these applications must be qualified. If, on the other hand, the systems and applications supported by the IT infrastructure are nonregulated in nature (for instance, a simple payroll application that has no product quality or regulatory impact, has minimal safety implications, and poses no major risks), then the approach to qualifying and validating them will be entirely different. They may not require validation at all, provided there is a well-defined, approved, and documented rationale based on risk factors. Equipment, facilities, and processes that pose no risks to product quality will not come under regulatory scrutiny at all and require very little effort in risk-control methods and qualification.
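The decision logic in the two scenarios above (regulated systems require full qualification, while low-risk systems may be excluded given an approved rationale) can be sketched as a simple classifier. This is an illustrative sketch only: the `SystemProfile` fields and the decision rule are assumptions for demonstration, not regulatory criteria.

```python
# Illustrative sketch of a risk-based qualification decision.
# The profile fields and rule below are assumptions, not FDA criteria.
from dataclasses import dataclass

@dataclass
class SystemProfile:
    name: str
    holds_gxp_data: bool          # e.g., clinical, health, or manufacturing data
    product_quality_impact: bool  # can it directly affect drug product quality?
    safety_impact: bool           # does it carry patient-safety implications?

def qualification_approach(profile: SystemProfile) -> str:
    """Return a suggested qualification approach for a system."""
    if (profile.holds_gxp_data
            or profile.product_quality_impact
            or profile.safety_impact):
        return "full qualification and validation"
    # Low-risk systems may be excluded, but only with an approved,
    # documented rationale based on risk factors.
    return "documented, approved rationale; validation may not be required"

print(qualification_approach(
    SystemProfile("clinical-data server", True, True, True)))
# -> full qualification and validation
print(qualification_approach(
    SystemProfile("payroll application", False, False, False)))
# -> documented, approved rationale; validation may not be required
```

In practice the classification would rest on the documented risk assessment described below, not on a handful of boolean flags.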

Table I: Risk assessment factors.
Nonetheless, any exceptions and deviations from the risk-based approach must be documented and approved after performing adequate business impact and system criticality analysis. Some important elements to be considered in a typical risk assessment are listed in Table I. At a corporate level, vendors can publish computer-system validation and infrastructure-qualification policies as the governing documents, which will apply to GxP-critical projects. To demonstrate technical, procedural, and administrative controls within vendor infrastructure setup, the documentation for qualification and validation should be complete and comprehensive. Typically, this documentation would include:
  • a documented and approved risk-assessment report detailing what risks the infrastructure (e.g., a server) poses to systems and applications that can directly impact drug product quality and safety and data integrity;
  • a qualification and validation plan or protocol that contains the purpose and scope, roles and responsibilities, coverage, periodic review procedures, maintenance procedures, administration procedures, documentation, and so forth;
  • network requirements, design, and testing documentation;
  • mandatory documentation, including installation SOP, physical and logical security SOP, backup and restoration SOP, disaster recovery SOP, business continuity SOP, change control SOP, and document management SOP;
  • installation qualification and operational qualification documents;
  • installation verification records that prove that components of the IT infrastructure were installed and configured properly and in accordance with specifications and procedures and that there were no deviations from standard installation;
  • operating procedures and instructions for those who run the infrastructure;
  • administration procedures;
  • training SOP or manuals;
  • qualification reports;
  • infrastructure retirement and revalidation procedures;
  • data management, record retention, and archiving procedures;
  • audit trail procedures.
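The deliverables listed above lend themselves to a simple completeness check against a qualification package. The sketch below is illustrative: the document names paraphrase the list in the text, and the set-based check is an assumption about how a vendor might track the package, not an official checklist.

```python
# Sketch of a completeness check for a qualification package.
# Document names paraphrase the deliverables listed in the text;
# this is an illustrative tracking aid, not an official checklist.

REQUIRED_DOCUMENTS = {
    "risk assessment report",
    "qualification and validation plan",
    "network design and testing documentation",
    "installation SOP",
    "security SOP",
    "backup and restoration SOP",
    "IQ/OQ documents",
    "installation verification records",
    "training SOP",
    "qualification report",
}

def missing_documents(package: set) -> set:
    """Return the required deliverables absent from a package."""
    return REQUIRED_DOCUMENTS - package

package = {"risk assessment report", "installation SOP", "IQ/OQ documents"}
for doc in sorted(missing_documents(package)):
    print("missing:", doc)
```

A check like this supports the periodic documentation reviews the article calls for later, where gaps must be found before an inspector finds them.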

An approach to qualifying infrastructure based on the platform and operating system ensures a higher level of standardization throughout the life cycle and minimizes overlap in documentation.
For nonqualified infrastructure that already supports GxP applications, retrospective qualification is recommended. Retrospective qualification, however, may not always be equivalent to prospective qualification. When qualification is performed retrospectively, the best strategy is to divide it into four phases: system inventory, gap analysis and risk assessment, remediation, and implementation.
Develop a system inventory. The inventory covers networked software and hardware, with information about site locations and a list of the people responsible for developing and maintaining it. Identify the information systems used to create, modify, retrieve, archive, transmit, and submit data to regulatory authorities, and document the inventory.
Conduct gap analysis and risk assessment. Perform gap analysis and risk assessment on the basis of "as is" and "to be" scenarios. Review processes, procedures, and current documentation in accordance with GxP guidelines.
Have a remediation plan. Classify infrastructure systems according to business risks. Identify issues and corrective actions. Perform impact analysis and set the timelines for remediation of infrastructure systems and documentation.
Perform implementation. Implementation covers activities such as revising procedures; establishing the necessary security, administrative, and procedural controls and audit-trail capability; keeping documentation current; training personnel; and continued monitoring.
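The four phases above can be sketched as a small pipeline over the system inventory. This is a hedged illustration under stated assumptions: the inventory fields, the gap rule, and the corrective action are invented for the example and stand in for the detailed analysis the text describes.

```python
# Sketch of the four retrospective-qualification phases as a pipeline.
# Inventory fields and the gap rule are illustrative assumptions.

def build_inventory():
    """Phase 1: develop a system inventory (illustrative records)."""
    return [
        {"system": "LIMS server", "gxp": True, "documented": False},
        {"system": "payroll app", "gxp": False, "documented": True},
    ]

def gap_analysis(inventory):
    """Phase 2: flag GxP systems whose documentation has gaps."""
    return [s for s in inventory if s["gxp"] and not s["documented"]]

def remediation_plan(gaps):
    """Phase 3: assign a corrective action to each gap."""
    return [{"system": g["system"], "action": "create qualification docs"}
            for g in gaps]

def implement(plan):
    """Phase 4: execute the actions (here, simply report them)."""
    for item in plan:
        print(f"{item['system']}: {item['action']}")

implement(remediation_plan(gap_analysis(build_inventory())))
# -> LIMS server: create qualification docs
```

In a real remediation effort each phase produces approved documentation of its own, and the "actions" are revised procedures, controls, and training rather than print statements.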
Qualification must not be a fixed, one-time process. The challenge is to always keep the infrastructure and computer systems in a qualified and validated state. The SOPs that govern the usage of the systems must be kept current, understood by all the stakeholders, and adhered to at all times during the system lifecycle. Periodic reviews of systems and documentation must be carried out to ensure that there are no gaps and that the systems are always in a state of compliance and fit for business use.
Once a qualification package has been approved and documented, it can be referenced in all validation packages of individual software applications. This approach will offer the benefit of not having to redo validation for diverse applications that are supported by this IT infrastructure.
The cost of retrospective qualification is usually five to six times that of prospective qualification, and the average cost of remediating a single computer system can run to thousands of dollars. A risk-based approach can prove to be a cost-effective way to address compliance requirements: it optimizes expenditure and can decrease the average costs of qualification and validation significantly.
Best practices
Best practices emerge as vendors execute projects out of their offshore development centers or vendor sites and face compliance audits. Regular quality compliance audits help establish criteria to streamline and standardize qualification processes. Incorrect interpretation of regulatory requirements can impede the correct approach to qualification and validation. One must not lose sight of FDA policy mandated for implementation, or of frequent updates to regulations and draft guidance documents. IT vendors also should observe trends in the industry and learn how pharmaceutical companies are devising good practices in pharmaceutical engineering. In time, vendors will understand what is working and what must be discarded. Good documentation practices, if followed, will assist in implementing best practices across projects.
The frameworks for compliance are readily available to be leveraged, but awareness of the trends, best practices, and approaches is often inadequate. Most big vendors that are CMM and ISO certified already have the relevant quality processes and systems in place. After all, FDA guidance such as General Principles of Software Validation draws on established industry standards in software engineering. The perceptible differences are the scope, the perspective, and the increased emphasis and rigor regarding risk assessment, security, testing, traceability, configuration and change management, verification, audit trails, and so forth. Conformance and compliance are thus non-negotiable. Nonetheless, it is also important not to let innovation suffer in the name of regulations. Doing what the regulations prescribe is imperative, but being able to continuously innovate and cultivate best practices based on experience and knowledge will help reduce non-value-adding effort.
Vendors should focus on developing competency in working in regulatory compliance frameworks. Awareness of industry and FDA terminology is a must. Vendors must appoint infrastructure qualification and validation consultants to lead and drive the effort in installation qualification and computer system validation. Quality groups must be sensitized to the rigors and trends in quality systems and processes in the pharmaceutical industry. 
Setting up validation and qualification services teams will be fruitful. Such teams could include qualification and validation leads, information system personnel, quality control representatives, internal auditors, regulatory compliance experts, and technical writers.
Conclusion
Consequences of noncompliance can be disastrous for any pharmaceutical company. For vendors, noncompliance could erode business value, and noncompliance at vendor sites could lead to a loss of business opportunity. Vendors should be sensitive to regulatory requirements and flexible in aligning their information technology and quality processes with regulatory expectations.
Vendors must understand the expectations of the industry and regulatory bodies and have provisions within their company policies to tailor, customize, and even refine specific IT processes to address regulatory compliance requirements. Good preparedness for audits and inspections and a thorough understanding of risk-control mechanisms hold the key to success. Vendors and service providers must gear up for the future: regulations will become more complex and strict, and, given the continued vigilance of FDA and other regulatory agencies in monitoring compliance in IT systems, only compliance- and quality-focused vendors will lead. The best way forward is to assess gaps in compliance confidently, remediate systems and documentation, and adopt a risk-based approach to compliance. A continued state of compliance is a reward in itself, and the benefits are immeasurable.
To win IT outsourcing contracts from pharmaceutical companies, vendors must demonstrate control of IT infrastructure and systems and produce evidence in the form of current documentation. Demonstrating evidence of qualification and validation is an important criterion in vendor evaluation and assessment. Continued investment in building compliant frameworks and regimes for managing IT infrastructure and systems in a validated way will help vendors sustain a competitive business advantage.
IT vendors can create value for both their clients and themselves by defining a strategy for regulatory compliance, particularly for pharmaceutical IT services, aligning it with their overall IT strategy, understanding what is required, and embracing a culture of continued compliance and quality.
Siddhartha Gigoo is a manager at Tata Consultancy Services Limited (TCS) C-56, Phase 2, Dist. Gautam Budh Nagar, Noida-201305, Uttar Pradesh, India.

*To whom all correspondence should be addressed.
Submitted: May 15, 2006. Accepted: Sept. 7, 2006.
Keywords: GMP compliance, information technology, qualification, validation
References
1. US Food and Drug Administration, Compliance Policy Guides Manual, Human Drugs, Computerized Drug Processing; Vendor Responsibility, CPG 7132a.12, Sub Chapter 425, 1985.
2. FDA, Guidance for Industry: Computerized Systems Used in Clinical Trials, 1999.
3. FDA, General Principles of Software Validation; Final Guidance for Industry and FDA Staff, 2002.
4. Code of Federal Regulations, Title 21, Food and Drugs; FDA, Guidance for Industry: Part 11, Electronic Records; Electronic Signatures–Scope and Application.
5. FDA, Glossary of Computerized System and Software Development Terminology, 1995.
