Thursday, December 31, 2009

The road to free validation

At first glance, the title of this article may bring a wry smile to the face of many an astute practitioner, but I can provide 'documentary evidence' that free validation is not just a play on words, but a financial reality.

Flawed philosophies

As a seasoned lecturer in validation and compliance, I welcome novel ways of explaining what validation and, more generally, compliance are; how they affect a company; and how best to implement them. Fundamentally, as a pharmaceutical facility embarks on a validation initiative, or rather the overall pursuit of regulatory compliance, projects can easily become lost in the detail and, consequently, lose sight of their true objectives. All too often, personnel put their energies into 'doing things right' and, while this may seem commendable, on closer inspection it can be found wanting. I have found this embraced in companies as part of policy, but these flawed philosophies can manifest as problems almost anywhere in the business. Unless the company has the evidence to back up such claims, regulators will know exactly what to challenge and where to expose these shortcomings.

Doing things right is also often extolled as the rhetoric in training sessions and project meetings without consideration of the huge costs and burden of both the validation effort and the remedial tasks in the infrastructure required to support them. Often, investment in both is disproportionate to any returns, so it is small wonder that validation is still regarded by many with great suspicion. Practitioners have also opined at conferences that it adds no tangible value to their business, but why is this misconception still prevalent? Why are there still so many misunderstandings surrounding validation and what constitutes full compliance with, for example, FDA and the Code of Federal Regulations (CFR)? One answer is that we should all ponder less about 'doing things right' and think more objectively about 'doing the right things'; otherwise, we may find that we have elevated the entire operating parameters of the business to a level it cannot afford, is ill-equipped to develop and incapable of sustaining.

Conditions

There are conditions to achieve 'free validation'. First, financing a major validation project requires substantial capital outlay and is a daunting task for all concerned. When validation is placed alongside the daily running of the business, cultural resistance is inevitable and someone will invariably ask: "Why do we have to do this?" The point they should be making is that it is not part of their job description, contract of employment or day-to-day duties; in fact, the extra work is more often a substantial departure from the daily continuity that keeps the business going.

As the intensity of a project builds, so too does the pressure on those in key positions, which contorts the point of compliance from one of substantive inconvenience to a general feeling of upheaval. Add to this the barbed comments between colleagues during monthly project budget reviews, where disparities between project outlay (finance) and progress become all too apparent, and validation becomes even more controversial. Management has every right to feel prickly as the costs and business risks are enormous — get it wrong and you compromise product quality, patients' health and the company.1

As the funding dries up, corners are cut, strategies are reinterpreted, assumptions are made and the whole experience of striving for regulatory compliance begins to fall apart; if not overtly, then covertly in private memos and closed-door tête-à-têtes. In the end, the red ink is slashed all over what is left of the budgets, but what has really happened is that the business risk has risen and placed the company at the mercy of the thoroughness of the regulatory bodies. But maybe the business will get lucky? Maybe the regulators will not notice that policies and key procedures are not followed and that things have not always been documented. If there is a version of Russian roulette in the pharmaceutical business, this is it. Symptomatic of a business tumbling toward critical and major observations, the condition is known as 'quasi-validation.'2

Keeping it simple is key, but this is easier said than done. Keeping the costs of validation and compliance to a minimum would keep many happy as it would lessen the work load across the entire organization, freeing us to do what we do best — make the finest medicines and devices in the world.

But how does one go about 'doing the right things,' bringing down costs and providing validation for free? There are ways and means via a collection of informed perceptions on well-worn industry maxims, such as, there is little difference between 'process development' (PD) and 'process validation' (PV).3

With the right investment in PD, specifically the optimization of manufacture using a 'corrected business framework (CBF),' validation is achieved in unison and paid for out of the profits from increased yield and heightened efficiencies in manufacture and the infrastructure that supports it. Such a strategy places validation in a whole new light and makes pursuing it good business sense, but only when it is done well and fully integrated.

The terms used earlier in this article, such as 'remedial actions' and 'CBF', are those I prefer when citing the problems inherent within a company. 'Remedial actions' pertains to company infrastructure, namely the shortcomings in engineering, maintenance, calibration, analytical methods, utilities, manufacturing, personnel, training, the quality management system, IT and batch records. If there are problems in any of these business units (departments), they will manifest as failings within validation; not failings of validation, per se, but failings of infrastructure — shortcomings that support the undisputed truth that you cannot validate bad practice.

Conversely, a 'CBF' is one that has commercial terms and conditions, and regulatory compliance, at its core. Therefore, there must be a clear distinction and balance between business risk (losing money) and regulatory risk (compliance). For example, within any validation project, only systems, equipment, components and instrumentation that have a direct impact on product quality should have a place in the validation scope. The designation of anything as business risk rather than regulatory risk should be a function of informed judgement, never of ignorance.

It is time to abandon the ill-conceived quasi approaches to validation and compliance, not least because the subjects have moved on and become more complex and expensive. With the advent of electronic records and signatures, the demand for full process and product characterization, and the growing use of statistics to demonstrate compliance, validation concepts continue to challenge the best of the best. However, the fundamentals of the subject are as true today as they were when Theodore E. Byers introduced PV to the pharmaceutical industry on 11 October 1974.4 As validation continues to mature, it also seems to have acquired a quantum state whereby the more we look the more we see. With this comes a heightened awareness of companies' imperfections, shortcomings that make everybody nervous as there is always someone responsible.

The mechanics of validation are very simple. Validation is the pursuit of exactitudes (yes/no, pass/fail, critical/noncritical, compliant/noncompliant). Unfortunately, these are then applied to inexact sciences, meaning that a greater degree of realism and rationalization is no longer an option, but a serious business need, especially when one considers that business and compliance risk equate to degrees of business exposure that increase exponentially within the project. This is amplified when a company forms the aforementioned quasi approaches to validation and compliance from a platform of false economies.

How to achieve free validation

So, what are the means by which free validation can be achieved? The following ten categories, in no particular order, show the way and outline a more enlightened and lean approach to validation and compliance. These savings will, collectively, offset the capital outlay, if not immediately then in a realistic pay-back period.

Planning. It's all in the planning, as they say. Plan the work and work the plan — validation and the uniform state of compliance are no exceptions to this rule. Reduce the project to the basics and marry these to the existing business model. While it may seem like trying to get a square peg into a round hole at times, there is sound reasoning behind the need to straddle such obstacles. There is no sense in trying to change too much too soon as continuity is key. Maintain all the basic tenets of quality assurance (QA) and quality control (QC) at all times and ensure that the business units are configured collectively, aligned with regulators' GMP requirements. Also remember that regulators have little interest in business efficiency; they are only concerned with whether you have control over both the good and bad elements of manufacture and the facility that supports it.

Gap analysis. Perform a gap analysis against each business unit to determine in real time how healthy the infrastructure is in relation to compliance with regulations and use the findings (the gaps) as the outline scope of a remedial plan. Apply to this process a proceduralized approach to impact assessment, such as methods penned by ISPE, to rationalize and differentiate between business and regulatory risks (modular differentiation).

The process will, effectively, reduce the validation scope. A simple method of achieving this is to obtain the asset registers for all systems, equipment, components and instrumentation and then analyse them in a committee-type forum to designate their GMP impact (direct, indirect and no impact). For computer systems, we follow the risk-based approach. Remember, only those with direct impact should figure in the validation plan, but you should log all rationales and the reasons for the respective designations as these will later help explain any distinctions made. They also provide an important source of reference material that supports change control and batch records.5 Simply put, check you have all the pieces in each department and ensure that they are compliant with each respective part of the regulation.
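
The committee's designations and rationales lend themselves to a simple structured log. The sketch below is purely illustrative (the assets, designations and rationales are invented, not from the article): it keeps every rationale on record while deriving the validation scope from the direct-impact items alone.

```python
# Illustrative sketch of an impact-assessment log: each asset carries
# a GMP impact designation and a recorded rationale; only the
# direct-impact items enter the validation scope.

ASSET_REGISTER = [
    # (asset, impact designation, rationale) -- hypothetical entries
    ("Purified water system", "direct",   "Product-contact utility"),
    ("HVAC chiller",          "indirect", "Supports classified areas"),
    ("Office lighting",       "none",     "No product-quality impact"),
]

def validation_scope(register):
    """Return only direct-impact assets; the full register, including
    rationales, stays on file to support change control."""
    return [asset for asset, impact, _ in register if impact == "direct"]

print(validation_scope(ASSET_REGISTER))  # -> ['Purified water system']
```

The same log can later be queried during change control to justify why an indirect- or no-impact item sits outside the validation plan.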

Remedial plan. A remedial plan is a recent, but essential, concept. If a company does not address the compliance gap across departments then problems will later manifest in validation, along with all the problems evoked by having to open countless numbers of change controls and deviations. As suggested earlier, the infrastructure is the backbone of the organization, and must be sufficiently robust to support validation and compliance, otherwise there is no business.

In reality, the remedial plan is a collection of smaller plans that outline the patches and fixes required in each business unit. The plan will include a list of deliverables, such as those found in the matrix of a validation master plan (VMP), but in a remedial plan they are deliverables required to correct the infrastructure, and the necessary upgrades to plant and equipment. Once the business unit has been corrected and configured within itself, work can begin to configure it to each of the other departments, providing a robust and compliant infrastructure (modular integration).

Documentation. Simplify the validation suite of documents with lean approaches that rationalize procedures, guidelines and templates. Use these as the basis to form validation plans. The production of these is simplified further still by the prior development of a validation policy and strategy document. While this is not a mandatory document, it is one that includes the statements and details of the strategies pertaining to all aspects of validation and compliance, including cleaning and analytical methods; test methods validation; computer systems; utilities; equipment; and process facilities. By these means, VMPs can be written succinctly and concisely, and the 'policy and strategy' document becomes much easier to understand and implement. This then feeds into any programme Gantt charts that are required. Some of the more tried and trusted methods of streamlining validation include:

  • Family protocols — master protocols used to validate generic groupings of equipment.
  • Integrate Factory Acceptance Test (FAT)/Site Acceptance Test and commissioning with validation as far as possible, placing more emphasis on QC in the engineering sequence.
  • Design engineering with validation as part of the design criteria.
  • Macro-driven validation templates (these also support QA/Part 11).
  • Training — plan the work and work the plan!6

Resources. Commit resource and funding to optimize manufacture. As stated earlier, there is little difference between PD and PV. The commitment to PD calls into play the very mechanics found in PV, resulting in greater efficiency, visibility and control in manufacture, leading to less scrap, more yield and increased profit margins. Essentially, a full commitment to PD should provide PV for free.3

One of the primary functions in PD is 'process characterization' of the process steps: the system of documenting the capabilities of process steps all the way to the edge of failure to establish the 'worst-case scenario'. This provides a fuller understanding of the process and a juncture to document process capability. It also doubles as operational qualification and provides some of the obligatory documentary evidence. A secondary function of characterization for validation and compliance is in the use of statistics and statistical sampling. There has always been sampling, but the concept of statistical sampling is relatively new in validation and is now attracting greater interest in other areas within regulatory bodies. Statistics and statistical solutions (sampling plans) are the portal between PD and PV (Figure 1). By examining the process through statistics, using statistical analysis tools and programmes (e.g., Six Sigma methods and Minitab software), drawing on information garnered from process/product characterization, and making adjustments to process equipment, instruments and components, the process performance can be optimized. This will also provide the savings needed to soften the initial capital investment required to support both validation and compliance (Figure 1).
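
To make the statistical side concrete, here is a minimal sketch, using hypothetical fill-weight data and specification limits, of the kind of capability index (Cpk) that tools such as Minitab report during process characterization. It is not the article's method, only an illustration of the statistic.

```python
# A minimal sketch (hypothetical data, not the article's method) of a
# process capability index, the sort of figure reported during
# process characterization.
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: distance of the process mean from the
    nearer specification limit, in units of three standard deviations."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)  # sample standard deviation
    return min(usl - mean, mean - lsl) / (3 * sd)

# Hypothetical fill weights (g) against 98-102 g specification limits
weights = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.7, 100.3]
print(round(cpk(weights, lsl=98.0, usl=102.0), 2))  # -> 3.33
```

A Cpk well above the common 1.33 benchmark, as here, is the sort of documented evidence of capability that lets PD double as validation groundwork.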

Reviews and approvals. Simplifying the review and approvals process may not, initially, take long to develop. The real-time constraints are often found in the facilitation of the document through the review and approvals process, and this problem is exacerbated where electronic approval is used. Reluctance to let go of the 'hard copy' means that old and new ways of document approvals live alongside each other. The difficulties are complicated by the routing of documents for 'unofficial' preapproval to smooth their eventual passage through the electronic system. This route seems to have invited far more reviewers and approvers than is required to achieve compliance. All the above lengthen the time it takes to complete the cycle from blank paper to approved document.

Rationalization of procedures and the document system. Each business unit will have a plethora of dedicated standard operating procedures that have evolved with little or no regard to how they are best configured to the business sector they are designed to service, and with no regard to how they link (directly and indirectly) with procedures in other departments/sectors. In practice, the coverage of procedures carries overlaps, conflicts, misalignments, gaps and a degree of duplicated effort. The number of procedures should be reduced through rationalization, retirement, amalgamation and redesignation. However, there should never be a procedure when a guideline, work instruction or routine will suffice.

'Active' current good manufacturing practice (AcGMP). GMPs are required by law, whereas cGMPs are the current interpretations of those laws and AcGMPs are the 'active' real-time interpretations of GMPs. cGMPs are penned by peer-review groups, such as the Pharmaceutical Inspection Convention, ICH and the Parenteral Drug Association, but these take time to filter through industry as they require draft consultations, seminars, papers and, ultimately, publication. Therefore, GMPs and cGMPs are, by nature, retrospective when used by industry. Real-time interpretation of AcGMP is derived from a company's study of regulatory trends, citations and warning letters. These observations eventually find their way into the peer-review route and, ultimately, into amendments and additions to guidelines, before much later finding their way into actual GMP. But these current interpretations find their way into a company through policy, procedures and training.

There is also the pressure brought to bear on regulators by peer-review groups over elements of legislation, such as 21 CFR Part 11 compliance, where industry was not convinced or prepared to foot the bill that, in the case of Part 11, was once estimated to be in excess of US$100 billion (€64.4 billion) globally. Remembering too that this is all about interpretation, the GMPs are written ambiguously so that this interpretation resides with the manufacturer, seemingly leaving no avenue for litigation with the regulators. Additionally, there is no chance of litigation regarding cGMPs published by the peer-review groups as they are only guidelines. There is no escaping that culpability resides fairly and squarely on the shoulders of the manufacturer. Therefore, to service a validation project well, a company must remain vigilant, and nurture a real-time awareness of industry and regulatory trends. Essentially, this creates a ratcheting effect spawned in regulatory audits, whereby companies A, B and C are inspected and have all found similar approaches to solving complex compliance issues. The regulators embrace these solutions and use them in later inspections of companies that have failed to conform to such field standards. Thus, a real-time interpretation is essential; one that is serviced through events, such as the trend in warning letters, to gauge the mindset of the regulators in 'real time' and in an 'active sense'.

Training, culture and management. Excellent procedures, slick strategies and the best will in the world are ineffective without lots of training, cultural buy-in and strong management.

Akin to the property maxim "location, location, location", the pharmaceutical industry's is "training, training, training", which must ensure that everyone sees the same big picture and understands the prescriptive elements of procedures in the same way. Having it written down does not guarantee it will be understood and followed in the same way by all, so unless one is prepared to train, test the training and monitor the outcome thereafter, there is a risk of creating more trouble than we are trying to solve.

Cultural resistance is one of the biggest obstacles in the run-up to an inspection. In engineering, for example, validation is not part of the culture engineers have grown up with. On many occasions I have been asked by engineers: "Why try and fix something that is not broken?" It's a valid question. The answer I give is this: engineering has evolved primarily to satisfy commercial terms and conditions, but certain elements of engineering, by virtue of their regulatory significance, distinction and application (discussed earlier as impact assessment), are singled out as part of a validation scope; these must satisfy not commercial terms and conditions, but the regulatory exactitudes against which they are measured for success. This includes testing, traceability (audit trails) and other documentary evidence.


Figure 2
Management is another area of great concern as many managers frequently lack the skill, experience or aptitude to do what is required. Usually, managers are victims of the Peter principle:7 promoting someone to their level of incompetence. A good knowledge of validation, with a wide skill set, is essential for a validation and compliance manager if the job is to be managed on time, within budget and fit for purpose. There is no room for compromise regarding this latter point as 'quality is King!' However, even those in validation frequently do not understand the big picture, and those outside do not understand any of the correlations, causations and commonality that validation engenders. All too often a project is doomed, and what is at fault? A simple trend analysis of warning letters shows that most problems emanate from management, procedures, training or a combination of all three. Ultimately though, the project is won and lost in the planning and in the general assumption that the process already has capability, accuracy, precision and full characterization.

Figure 3
To keep the management of a validation and compliance project simple, it must be planned in four distinct phases:
  • Phase A is where project status is established, gaps identified, plans formed and project tools (including training) generated and communicated to all concerned.
  • Phase B takes the corrected modules and, first, configures them by department before linking each department to every other in a way that supports project objectives and compliance with the CFRs.
  • Phase C validates the various processes, systems and practices against the CFRs.
  • Phase D includes departmental auditing and project audits; FDA training for inspection; a dummy audit by an external company (to mimic an FDA audit) and, finally, FDA's proper inspection (Figures 2 and 3).

Commissioning material. Take a refined approach to the use of commissioning material in validation. The following statements should be considered when regarding both subjects in one context:

  • Commissioning is incomplete until validation has been successfully completed.
  • The criteria for commissioning and validation are different. The former satisfies commercial arrangements, the latter regulatory compliance.
  • The acceptance criteria for commissioning are often different from those for validation, and vice versa.
  • The type and methodology used to document both events (commissioning and validation) can differ considerably or, occasionally, be the same.
  • Test scripts for both events are usually a mixture of generic tests — those exclusively for validation and those exclusively for commissioning.
  • The parameters set against the acceptance criteria for test results can differ greatly for both disciplines.
  • Commissioning and validation can, theoretically, happen in any sequence. Moreover, the tests can seem the same, but success in validating pivots on subtle yet fundamental differences, such as those discussed earlier. While the two events have distinctly different agendas, they complement each other, as the above bullets emphasize.

Commissioning is undoubtedly a great precursor for validation. An inspection of the equipment is a necessary part of commissioning work as it exposes defects that were not detected at FAT or that arose in transit. Both types of defect will be corrected at the commissioning stage to prove conformity with the specification and conclude contractual arrangements. Ordinarily, this would be sufficient if it were not for subsequent validation. Even when commissioning is successful (modular differentiation), it does not guarantee that any associated qualifications will turn out likewise. The conclusion of a validation study (the configured performance of the equipment) may expose previously hidden flaws in:

  • The design.
  • Its application(s).
  • Inadequacies inherent within commissioning.

One of the most important criteria is management. It is essential that the whole experience of commissioning, validation and overall compliance is managed properly, and by someone with all the requisite experience in all the disciplines mentioned. Without this, the programme will wander, costs will rise and confidence, as well as expectation, will rapidly fall away.

The right things


With the costs and complexity of validation and compliance rising, it is high time to focus on 'doing the right things'. With the right investment, savings can be achieved that offset the capital cost of a compliance and validation project, and provide significant savings within the revenue budget thereafter.

Validation is not there to police or find fault with the business, rather, it is a great business tool that provides both compliance and the assembly of a robust diagnostic tool. Good validation practice should save a lot of money, not cost a lot of money. Moreover, it should diminish regulatory risk. Done well, validation safeguards the company.

Keith Powell-Evans is Compliance and Validation Consultant at Altran (France) and Founder member of the Institute of Validation.

References

1. K. Powell-Evans, Pharm. Technol. Eur., 10(1) (1998).

2. K. Powell-Evans, Pharm. Technol. Eur., 14(9), 60–65 (2002).

3. GHTF (Global Harmonization Task Force) SG3/N99610:2004.

4. W. Gibson and K. Powell-Evans, Validation Fundamentals (Interpharm, Illinois, USA, 1998).

5. ISPE Commissioning and Qualification Baseline Guide (2000). http://www.ispe.org/

6. K. Powell-Evans, Pharm. Technol. Eur., 10(12), 48–52 (1998).

7. P.M. Senge, The Fifth Discipline, (Currency/Doubleday, New York, New York, USA, 1990).

Setting Cleaning Validation Acceptance Limits for Topical Formulations

Pharmaceutical manufacturing requires the selection of residue acceptance levels for potential residues such as active pharmaceutical ingredients (APIs), excipients, degradation products, cleaning agents, bioburden substances, and endotoxins when carrying out cleaning validation studies. These levels are determined according to the potential pharmacological, safety, toxicity, stability, and contamination effects on the next product produced with the same surface or equipment. The US Food and Drug Administration's guidance for determining residue limits states that residue limits should be logical, practical, achievable, and verifiable (1). Limits are typically set for visual, chemical, and microbiological residues by taking into consideration the batch size, dosing, toxicology, and surface area of the equipment.

In contrast to solid and liquid formulations, topical semi-solid formulations such as creams and ointments pose far more difficulty in cleaning because they contain greasy ingredients such as waxes and oils. These ingredients may inhibit wetting by cleaning agents, thereby limiting the ability to clean–rinse the residual product away.

Topical formulations (TFs) generally are considered to be safe and less potent than oral or injectable formulations. Nonetheless, the APIs and excipients commonly used in TFs may produce significant adverse effects in the form of skin irritation, skin sensitization, hypersensitivity, and photosensitivity reactions. Therefore, one must determine the carryover of residues from one product to another in a scientifically justified manner to limit the chances of adverse reactions and possibility of synergistic pharmacological effects between products and their ingredients.

The available guidelines and literature on cleaning validation provide possible approaches for setting the residual acceptance levels for APIs and finished products but contain little or no guidance regarding how to use these approaches for TFs (2–5). Although the existing approaches are better suited to generating residual acceptance levels for solid and liquid formulations, they can also be applied to TFs with logical modifications. This article discusses possible ways to set the residual acceptance levels for APIs present in TFs as a prerequisite for conducting cleaning validation studies.

Background

The method widely used within the pharmaceutical industry for setting residual acceptance levels is the one provided by Fourman and Mullen, whose paper is listed in the reference section of the FDA cleaning validation guidance document (2). The method is based on the following criteria:

  • The dose criterion is based on the principle that an API should not be present in a subsequently produced product at levels higher than 1/1000 of the minimum daily dose of the API in a maximum daily dose of the subsequent product.
  • The 10-ppm criterion is based on the principle that any API should not be present in a subsequently produced product at levels higher than 10 ppm.
  • The visually clean criterion states that no residue should be visible on the equipment; this is commonly taken as no more than 100 μg per 2 × 2 in. swab area.
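
The first two criteria can be combined numerically: compute the carryover each criterion would permit in the next batch, then take the more stringent. The sketch below is one common way of applying them; the figures (50 mg minimum dose, 1 g/day maximum dose of the next product, 200 kg next batch) are invented for illustration.

```python
# Sketch of combining the dose and 10-ppm criteria (hypothetical
# numbers): each function returns the carryover it would allow in the
# whole next batch, and the smaller value is the working limit.

def dose_criterion_mg(min_daily_dose_mg, max_daily_dose_next_mg, batch_size_next_mg):
    """1/1000 of the previous API's minimum daily dose, per daily dose
    of the next product, scaled to the whole next batch."""
    doses_per_batch = batch_size_next_mg / max_daily_dose_next_mg
    return (min_daily_dose_mg / 1000.0) * doses_per_batch

def ppm_criterion_mg(batch_size_next_mg, limit_ppm=10):
    """No more than 10 ppm of the previous API in the next batch."""
    return batch_size_next_mg * limit_ppm / 1_000_000

dose_limit = dose_criterion_mg(50, 1000, 2e8)  # ~10,000 mg allowed
ppm_limit = ppm_criterion_mg(2e8)              # 2,000 mg allowed
print(min(dose_limit, ppm_limit))              # -> 2000.0 (most stringent)
```

Here the 10-ppm criterion is the binding one; with a more potent API (smaller minimum dose), the dose criterion would take over.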

This article uses the same principles to generate cleaning validation residue acceptance levels for TFs.

Criterion based on product strength

One basis for establishing limits is a mathematical calculation that allows a certain fraction of the therapeutic dose to carry over into the maximum daily dose of the following product. The dose fraction allowed to carry over is referred to as maximum allowable carryover (MACO) and is based on the acceptable daily intake (ADI) of the API being cleaned. Industry uses various approaches to determine ADI values for active ingredients; these involve using the minimum recommended therapeutic daily dose, the lowest marketed dose, or no observable effect level (NOEL)/LD50 (lethal dose 50%) values divided by a safety factor (5). Some manufacturers use an occupational exposure limit (OEL) value to calculate ADI.

If A refers to the product being cleaned, active A1 to the API present in product A, and product B to the subsequently produced formulation, then the MACO calculation based on ADI values can be expressed as

MACO = (ADI × BS × SA)/(MDD × ESA)

in which ADI is the acceptable daily intake of active A1 (mg), equal to the minimum therapeutic daily dose divided by the safety factor; MDD is the maximum daily dose of product B (mg); BS is the batch size of product B (mg); SA is the swab area (cm2); and ESA is the equipment surface area shared by product A and product B (cm2).
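
The MACO expression translates directly into a small function. Variable names mirror the article's symbols; the numbers in the example are hypothetical, chosen only to show the units working through.

```python
# MACO = (ADI x BS x SA) / (MDD x ESA), as defined above.
# The example figures are hypothetical, not from the article.

def maco_mg(adi_mg, bs_mg, sa_cm2, mdd_mg, esa_cm2):
    """Maximum allowable carryover of active A1 per swab area (mg)."""
    return (adi_mg * bs_mg * sa_cm2) / (mdd_mg * esa_cm2)

# Hypothetical: ADI 0.05 mg, 100 kg batch (1e8 mg), 25 cm2 swab,
# 1 g (1000 mg) maximum daily dose, 150,000 cm2 shared equipment surface
limit = maco_mg(adi_mg=0.05, bs_mg=1e8, sa_cm2=25, mdd_mg=1000, esa_cm2=150_000)
print(round(limit, 3))  # -> 0.833 (mg per 25 cm2 swab)
```

Because SA appears in the numerator and ESA in the denominator, the result is the share of the total allowable batch carryover attributed to one swabbed area.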


Table I: Safety factors for the determination of cleaning validation acceptance limits.
For the MACO calculation using therapeutic dose unit and safety factor approach, different safety factors have been suggested for various formulation types (see Table I). Nonetheless, a safety factor of 1000 is widely used because it can be thought of as comprising a factor of 10 for adjusting a therapeutically effective dose to a therapeutically noneffective dose, a factor of 10 to accommodate for individual variability in response, and a factor of 10 for making cleaning validation studies robust. The dose-reduction fraction is a measure of the risk involved and is assessed by the manufacturer depending on the actual manufacturing situation.

When it comes to TFs, applying the ADI or therapeutic dose–safety factor approach for MACO calculations is challenging for several reasons, including:

  • TFs are not divided into individual dosage units, unlike solid and liquid formulations.
  • Dose size per application varies depending upon the total lesion area. Hence, it becomes difficult to establish minimum or maximum daily doses.
  • No standardized approach exists for determining dose size. That is, the amount of product recommended to be applied per area of skin per treatment varies from manufacturer to manufacturer.
  • The pharmacological effects produced by systemic absorption of the API through oral ingestion are taken into consideration when determining ADI values. In the case of TFs, occasionally enough API is absorbed to cause systemic effects, and TFs can produce local toxicity even at low doses.

Therefore, a need arises for establishing a rational method to determine cleaning validation acceptance limits for TFs. This article establishes a method for the calculation of MACO for the API of a subsequently manufactured TF by taking into account possible worst cases.

Worst case I

Assume that active A1 is marketed in strengths of 2%, 3%, and 5% and that product B can be applied on the entire body two to four times per day. The criterion set by Long and Finlay can be used to determine the total amount of TF (e.g., cream and ointment) applied on the entire body in one application (6). The criterion describes the dosage of TFs in terms of fingertip units (FTUs). One adult FTU is the amount of ointment or cream expressed from a tube with a standard 5-mm diameter nozzle, applied from the distal crease to the tip of the index finger. One FTU contains approximately 0.5 g of cream, and it is assumed that approximately 20.25 g (~40.5 FTUs) of cream is needed to cover an adult body. On the basis of these assumptions, the maximum amount of product B that can be applied daily would be

20.25 g/application × 4 applications/day = 81 g/day

To determine the maximum amount of product applied per treatment for other TFs such as medicated shampoos, lotions, toothpaste, and mouthwash, one could consult the European Commission's Guidance for the Testing of Cosmetic Ingredients and their Safety Evaluation or the manufacturer's in-house criteria (7). For these formulations, the amount of product applied per treatment is fixed; therefore, the maximum amount of product applied per day depends on the maximum number of times the product is used daily.

The lowest available strength of active A1 in the TF is 2%. This means that approximately 1.62 g (2% of 81 g) of active A1, if present in 81 g of product B, will produce its pharmacological effects. Taking into account a safety factor of 1000, active A1 should not exceed 0.00162 g (1.62 g divided by 1000) in 81 g of product B.

Assuming that the batch size for product B is 300 kg, MACO of active A1 to product B can be calculated as

MACO = (0.00162 g × 300 kg × 10⁶ mg/kg)/81 g = 6000 mg

In other words, MACO of the active A1 to a batch of product B should not be more than 0.1% of the lowest marketed strength–concentration for active A1. For the example above, MACO can also be calculated by

MACO = 0.1% × 2% × 300 kg × 10⁶ mg/kg = 6000 mg

From the previous example, it is clear that to calculate MACO of the API in any subsequently manufactured TF, the only information needed is the lowest marketed strength of the API in the TF being cleaned and the batch size of subsequently manufactured finished product. The MACO calculation is independent of minimum and maximum daily dose for product A and product B, respectively.
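The 0.1% shortcut above reduces to one line of arithmetic. A hedged sketch in Python (variable names are mine; the values come from the worked example):

```python
# Worst case I: MACO depends only on the lowest marketed strength of
# active A1 and the batch size of product B.
lowest_strength = 0.02        # 2%, lowest marketed strength of active A1
batch_size_mg = 300 * 10**6   # 300-kg batch of product B, in mg
maco_mg = 0.001 * lowest_strength * batch_size_mg  # 0.1% criterion
print(round(maco_mg))  # 6000
```

Note that neither the minimum daily dose of product A nor the maximum daily dose of product B appears in the computation, which is the point made in the text.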

MACO in terms of amount of API per surface area of equipment (for swab sampling) can be determined by using the following equation:

MACO = (0.1% × C × BS × SA)/ESA

in which C is the lowest available strength of the API in product A (%); BS is the batch size of product B (mg); SA is the swab area (cm²); and ESA is the equipment surface area shared by product A and product B (cm²). This equation can be further modified to calculate the residue limits (mg/mL) in the rinse sample by using the total volume (TV) of the rinse or wash solvent portion (mL) and the volume (V) of rinse sample collected (mL) in place of ESA and SA, respectively.
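The swab-based form of the equation can likewise be sketched (an illustration under the same assumptions; the function name is mine):

```python
def maco_per_swab(c, bs_mg, sa_cm2, esa_cm2):
    """MACO (mg/swab) = (0.1% x C x BS x SA) / ESA.

    c       -- lowest available strength of the API in product A (fraction)
    bs_mg   -- batch size of product B (mg)
    sa_cm2  -- swab area (cm^2)
    esa_cm2 -- equipment surface area shared by products A and B (cm^2)
    """
    return (0.001 * c * bs_mg * sa_cm2) / esa_cm2

# Using the article's figures: 2% strength, 300-kg batch, 25-cm^2 swab,
# 6000 cm^2 of shared surface -> about 25 mg per swab.
print(round(maco_per_swab(0.02, 300 * 10**6, 25.0, 6000.0), 3))  # 25.0
```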

Worst case II

Following the same assumptions and examples described in worst case I, therapeutic doses for TFs may be established in mg/kg body weight/day or mg/cm²/day units. According to Long and Finlay, 1 FTU covers about 286 cm² of skin surface area. This implies that approximately 1.75 mg of the TF is applied per square centimeter of skin surface area. The dose size for a TF may be calculated using the following equation,

Daily dose (mg/cm²/day) = A × C × F

in which A is the amount of TF applied in one application per unit area of skin (mg/cm²), C is the concentration of the API in the TF (%), and F is the frequency of application per day (day⁻¹).

For calculating the minimum daily dose for product A, A = 1.75 mg/cm², C = the lowest available strength of the API in product A = 2%, and F = 1 day⁻¹. Therefore, the minimum daily dose for product A = 0.035 mg/cm²/day, and the maximum daily dose for product B = 7.0 mg/cm²/day (using A = 1.75 mg/cm², C = 100%, and F = the maximum number of times product B is applied per day = 4). One point to remember is that the calculation of the maximum daily dose for product B is independent of the concentration of API present. Therefore, for a 300-kg batch of product B, the MACO value may be calculated as

MACO = (minimum daily dose for product A × batch size for product B) / (safety factor × maximum daily dose for product B)

MACO = (0.035 mg/cm²/day × 300 kg × 10⁶ mg/kg) / (1000 × 7.0 mg/cm²/day) = 1500 mg

Comparing the previous cases, one can observe that the MACO value obtained from the second worst-case scenario is one-fourth of the value obtained from the first. After thorough analysis of the calculations, however, it can be concluded that the MACO value obtained from worst case II is equal to the MACO value from worst case I divided by the maximum number of applications per day for product B. Therefore, if MF is the maximum number of times product B can be applied daily, the MACO value from worst case II can be mathematically expressed as:

MACO = (0.1% × C × BS × SA) / (MF × ESA)

Therefore, it is the concentration (i.e., percentage) of active A1 and the number of doses per day for product B that make the difference in the calculation of MACO values.
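Both worst cases, and the division-by-MF relationship between them, can be verified numerically (a sketch using the example's values; variable names are mine):

```python
a = 1.75                # mg of TF applied per cm^2 of skin per application
c_low = 0.02            # lowest strength of active A1 (2%)
mf = 4                  # maximum applications of product B per day
batch_mg = 300 * 10**6  # 300-kg batch of product B, in mg

min_dose_a = a * c_low * 1   # minimum daily dose of product A, mg/cm^2/day
max_dose_b = a * 1.0 * mf    # maximum daily dose of product B, mg/cm^2/day
maco_ii = (min_dose_a * batch_mg) / (1000 * max_dose_b)
print(round(maco_ii))  # 1500

maco_i = 0.001 * c_low * batch_mg  # worst case I value, 6000 mg
print(round(maco_i / mf))          # 1500, i.e., worst case I divided by MF
```

The concentration term cancels out of the ratio between the two cases, leaving only the factor MF, which is why the two approaches differ by exactly the number of daily applications.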

Criterion based on permitted daily exposure

Another approach for determining MACO uses toxicity data. This strategy is generally used in the industry when dealing with contaminants for which therapeutic doses are not known (e.g., intermediates, precursors, and cleaning agents).

For many drugs, using a safety factor of 0.1% of the lowest recommended therapeutic dose may be reasonable and will produce MACO values at a safe level. This approach cannot be used indiscriminately, however, for several reasons. First, for topical products, the therapeutic doses are not well defined, as discussed previously. A second issue is the mechanism by which toxic effects are produced by the drug. For many drugs, the mechanism by which the toxic effects are produced might be unrelated to the mechanism of pharmacological action. For instance, a drug may cause developmental toxicity (e.g., birth defects) or cancer by a mechanism unrelated to its pharmacological effects. In these cases, using the 0.1% safety factor on the therapeutic dose may still be appropriate but would need to be used with greater caution because the toxic effects may be produced below the safe therapeutic levels.

For these reasons, many manufacturers also include an approach based on toxicity data to calculate MACO values. For many drugs, this approach may be reasonable and has produced MACO values similar to the traditional therapeutic dose/safety factor approach. The basic value of this approach is that a limit can be calculated for cleaning validation purposes based solely on the toxicity of the API present in the TF.

The method uses permitted daily exposure (PDE) values, the criteria commonly used for determining occupational and environmental health hazards. There are two ways to set PDE values. One of the approaches is based on the "no observed effect level/safety factor" (NOEL/SF) approach. In this approach, all of the pertinent animal and human studies are reviewed, and the highest dose that did not cause an effect in the most sensitive health endpoint (the NOEL) is identified. Once a NOEL has been identified, a set of uncertainty (or safety) factors are applied to this value to compensate for limitations in the data and ensure that safe MACO values are obtained. If a NOEL is not available, then a lowest observed effect level (LOEL) can be used. The LOEL value is the lowest dose that causes an effect in the most sensitive health endpoint. A safety factor from 1 to 10 may be considered for extrapolating a LOEL to a NOEL.

The no observed adverse effect level (NOAEL) and lowest observed adverse effect level (LOAEL) are often used interchangeably with the NOEL and LOEL, respectively. For TFs, using NOAEL and LOAEL values is more logical when dealing with local toxic effects caused by the active ingredient.

An equation to determine PDE value for a pharmaceutical can be represented as follows:

PDE = NOEL × HBW × SF

in which NOEL is typically in units of milligrams of active ingredient administered per kilogram of animal body weight per day (mg/kg-bw/day). NOEL values obtained from human toxicity data and reported in milligrams per day need not be multiplied by human body weight (HBW); for such data, a safety factor of 0.1 is used to account for human variability in response. HBW is typically assumed to be 60 kg for an adult. SF, the safety factor accommodating limitations in the data, is usually 0.01 for converting a NOEL to a PDE for topical products.

The second method of calculating PDE is to convert the LD50 value to a NOEL value by applying an empirical factor. This empirical factor is derived from animal models developed by Layton et al. and can range from 0.0005 to 0.001 (8). The NOEL thus obtained is converted to a PDE value by using the previous equation.

It is important that the NOEL and LD50 values be obtained from dermal toxicity studies. For NOEL/LD50 values reported in mg/m²/day, the PDE (mg/day) can be calculated by using the following equation:

PDE (mg/day) = NOEL (mg/m²/day) × HSA (m²) × SF

in which HSA is the average human body surface area, typically assumed to be 1.62 m².

In addition, a few drugs have NOEL/LD50 values reported in parts per million (ppm), which can be converted to mg/kg-bw/day on the basis that 1000 ppm equals 25 mg/kg-bw/day for an average 60-kg adult.
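The NOEL-to-PDE conversion described above can be sketched as follows (hedged: the function and its defaults are mine, encoding the factors stated in the text):

```python
def pde_from_noel(noel, hbw_kg=60.0, sf=0.01, human_data=False):
    """PDE (mg/day) = NOEL x HBW x SF.

    noel       -- mg/kg-bw/day from animal studies, or mg/day if taken
                  from human toxicity data (then set human_data=True)
    hbw_kg     -- human body weight, assumed 60 kg for an adult
    sf         -- safety factor for data limitations (0.01 for topicals)
    """
    if human_data:
        # Human-derived NOELs skip the body-weight step; only the 0.1
        # human-variability factor is applied.
        return noel * 0.1
    return noel * hbw_kg * sf

# Animal NOEL of 1 mg/kg-bw/day -> PDE of 0.6 mg/day for a 60-kg adult.
print(pde_from_noel(1.0))
```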

MACO (mg/swab area) values based on toxicity data may be calculated as

MACO = (PDE × BS × SA) / (MA × ESA)

in which PDE is the permitted daily exposure for active A1 (mg/day); BS is the number of fingertip units per batch of final mixture of product B (FTUs); SA is the swab area (cm²); MA is the maximum number of FTUs of product B applied on the skin per day (FTUs/day), as described under the criteria based on the strength of the product; and ESA is the equipment surface area shared by product A and product B (cm²).

For this example, if the PDE value for active A1 = 0.35 mg/day, and assuming BS = 200,000 FTUs, MA = 162 FTUs/day, SA = 25 cm², and ESA = 6000 cm², then the MACO (mg/swab area) value is

MACO = (0.35 mg/day × 200,000 FTUs × 25.0 cm²/swab) / (162.0 FTUs/day × 6000 cm²) = 1.80 mg/swab
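That PDE-based figure can be reproduced directly (values copied from the example above; variable names are mine):

```python
pde = 0.35        # permitted daily exposure for active A1, mg/day
bs_ftu = 200_000  # FTUs per batch of product B
sa_cm2 = 25.0     # swab area, cm^2
ma_ftu = 162.0    # maximum FTUs of product B applied per day
esa_cm2 = 6000.0  # shared equipment surface area, cm^2

maco = (pde * bs_ftu * sa_cm2) / (ma_ftu * esa_cm2)
print(round(maco, 2))  # 1.8
```

The 162 FTUs/day figure is consistent with worst case I: 81 g/day divided by 0.5 g per FTU.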

Other criteria

Other criteria typically used in the industry include the 10-ppm criterion and the visually clean criterion. The former is based on the assumption that not more than 10 ppm of any pharmaceutical ingredient should appear in any other product; the latter is not an assumption but rather a fixed value of 0.1 mg/25 cm² of swab area, obtained after performing spiking studies. The lowest value obtained from all of the MACO calculations based on the different criteria is then selected as the acceptance limit for active A1.
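Selecting the final limit is then a simple minimum over the candidate values. A sketch (the numbers are illustrative placeholders, apart from the 0.1 mg/25 cm² visually clean value quoted above):

```python
# Candidate MACO values in mg per 25-cm^2 swab area (illustrative).
candidates = {
    "therapeutic dose / 0.1% criterion": 25.0,
    "PDE (toxicity) criterion": 1.80,
    "10 ppm criterion": 12.5,
    "visually clean criterion": 0.1,
}
acceptance_limit = min(candidates.values())
print(acceptance_limit)  # 0.1
```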

Conclusion

The determination of MACO for a pharmaceutical agent in the subsequently manufactured product is an inexact science. Each approach has its own set of assumptions and limitations. Any firm that relies on MACO values for its cleaning validation studies must understand the assumptions used in deriving them. It is the responsibility of pharmaceutical manufacturers and cleaning validation scientists tasked with setting MACO values to estimate a value that is safe for consumers without being so demanding that resources are spent unnecessarily.

M. Ovais* is a pharmaceutical scientist at Xepa-Soul Pattinson (M) Sdn Bhd, 1-5, Cheng Industrial Estate, 75250 Melaka, Malaysia, tel. 006063351515, fax 006063355829.

Lai Yeo Lian is the innovation and development manager at Xepa-Soul Pattinson (M).

*To whom all correspondence should be addressed.

Submitted: June 5, 2007. Accepted: July 26, 2007.

References

1. FDA, Guide to Inspections of Validation of Cleaning Processes, Division of Investigations, Office of Regional Operations, Office of Regulatory Affairs (Rockville, MD), July 1993.

2. G.L. Fourman and M.V. Mullen, "Determining Cleaning Validation Acceptance Limits for Pharmaceutical Manufacturing Operations," Pharm. Technol. 17 (4), 54–60 (1993).

3. D.A. LeBlanc, "Establishing Scientifically Justified Acceptance Criteria for Cleaning Validation of Finished Drug Products," Pharm. Technol. 22 (10), 136–148 (1998).

4. R.J. Forsyth and D.V. Haynes, "Cleaning Validation in a Pharmaceutical Research Facility," Pharm. Technol. 22 (9), 104–112 (1998).

5. J. Agalloco, "Points to Consider in the Validation of Equipment Cleaning Procedures," J. Paren. Sci. Technol. 46 (5), 163–168 (1992).

6. C.C. Long and A.Y. Finlay, "The Finger-Tip Unit–A New Practical Measure," Clin. and Experim. Dermatol. 16 (6), 444–447 (1991).

7. European Commission Health and Consumer Protection Directorate-General, The SCCP's Notes of Guidance for the Testing of Cosmetic Ingredients and their Safety Evaluation, 6th revision, adopted by the SCCNFP during the 10th plenary meeting Dec. 19, 2006.

8. D.B. Layton et al., "Deriving Allowable Daily Intakes for Systemic Toxicants Lacking Chronic Toxicity Data," Regul. Toxicol. and Pharmacol. 7, 96–112 (1987).

Tuesday, December 29, 2009


What is GAMP4

Good Automated Manufacturing Practice (GAMP) came into being as a direct result of the increase in regulatory attention received by the pharmaceutical manufacturing industry during the late 1980s and 1990s.

Prior to this, although regulatory guidelines concerning the use and validation of automated systems existed, they had been subjected to less scrutiny than is the case today. In addition, as automated systems became more complex and more widely used, the need to improve the overall understanding and interpretation of the regulations also increased.

In response to this, the GAMP forum was set up to promote better understanding and interpretation of the regulations and to improve communication among the pharmaceutical industry, its suppliers, and the various regulators. The GAMP forum is now a technical subcommittee of the International Society for Pharmaceutical Engineering (ISPE).

Phase 1.

During the Planning and Definition phase, SMB Validation & Compliance Services Group Inc. is available to assist the equipment user in developing the following:

Validation Plan

SMB Group Inc. may be contracted to produce a validation plan, which summarizes the project, identifies the criteria for success and defines the criteria for project acceptance. Validation Plans are produced per Appendix M1 of the GAMP4 guidelines.

Once the project is completed a Validation Report can be produced to summarize the project, measure success and clearly signify acceptance. Validation Reports are produced per Appendix M7 of the GAMP4 guidelines.

User Requirement Specifications

Prior to accepting vendor bids on a project, SMB will put together a list of requirements for the planned system. The list incorporates the needs of the end users of the system as well as any regulatory requirements, division and local requirements. This document details what the customer wants the system to do. It does not provide design detail unless that detail represents a design requirement. Validation testing will be based partially on this document.

Supplier Assessment

For many projects, a supplier assessment or vendor evaluation (audit) utilizing a pre-approved supplier assessment procedure and criteria will be performed. SMB Group can develop and perform supplier assessments as required. Supplier Assessments are produced per Appendix M2 of the GAMP4 guidelines.

Risk Assessment + Critical Parameters

SMB Group is able to develop and manage the project Risk Assessment processes and help define the critical parameters of the project. Risk Assessments are produced per Appendix M3 of the GAMP4 guidelines.

Operation Qualification Protocols

SMB can develop an Operation Qualification Protocol, which provides a detailed written test protocol to verify the system operation against the various approved design specifications. All control functions are verified. E-stops and interlocks are checked for functionality. All machine functions are tested and all devices challenged to ensure that design criteria have been achieved.

Performance Qualification Protocols

In-process or performance qualification protocols are developed and performed. SMB can develop these protocols and offer assistance during this process.

PLANNING & DEFINITION PHASE 1



Phase 2. Supplier Quality Plan

Developing a Supplier Quality Plan is a key aspect of any project. SMB Group may be utilized to prepare and monitor a project quality plan. Supplier Quality Plans are produced per Appendix M6 of the GAMP4 guidelines.

Functional Design Specification:

The Functional Design Specification provides a written definition of what the system does, what functions it has and what facilities are provided. Functional Design Specifications are produced per Appendix D2 of the GAMP4 guidelines.

Acceptance Test Specification:

The acceptance criteria for this test are based on requirements originally agreed upon and documented in the Functional Requirements Specification and detailed in an approved written test protocol. Depending upon the complexity of the project, this activity may be divided into Factory Acceptance testing and Site Acceptance testing. Acceptance Test Specifications are produced per Appendix D6 of the GAMP4 guidelines.

Hardware Design Specification:

The Hardware Design Specification provides a detailed written definition of the system hardware and how it interfaces with other systems. Hardware Design Specifications are produced per Appendix D3 of the GAMP4 guidelines.

Hardware Test Specification:

The Hardware Test Specification provides a detailed written test protocol to verify the system hardware against the design specification. Hardware Test Specifications are produced per Appendix D6 of the GAMP4 guidelines.

Software Module Specifications:

The Software Module Specification provides a written definition of what each software module should do and of the interfaces between modules. Software Module Specifications are produced per Appendix D4 of the GAMP4 guidelines.

Software Module Test Specifications:

The Software Module Test Specification provides a detailed written test protocol to verify the individual software modules against the design specification. Software Module Test Specifications are produced per Appendix D6 of the GAMP4 guidelines.

Note: Depending upon the complexity of the project, the hardware and software test specifications may be consolidated into a single test specification. Similarly, the functional design specification, hardware design specification, and software design specification may be consolidated into a single detailed design specification.

Installation Qualification Protocols:

SMB can develop an Installation Qualification Protocol, which provides a detailed written test protocol to verify the system installation against the various approved design specifications.

Performance Qualification Protocols:

In-process or performance qualification protocols are developed and performed. SMB can develop these protocols and offer assistance during this process.


DESIGN & DEVELOPMENT PHASE 2



Phase 3. Hardware Acceptance Testing

Hardware Acceptance Testing provides a detailed written test protocol to verify the system operation against the various approved design specifications. Correct operation of the hardware and instrumentation as defined in the hardware design specification is verified. Hardware Acceptance Tests are produced per Appendix D6 of the GAMP4 guidelines.

Software Module Testing

Software Acceptance Testing provides a detailed written test protocol to verify the system operation against the various approved design specifications. Correct operation of the software as defined in the software design specification is verified. Software Module Tests are produced per Appendix D6 of the GAMP4 guidelines.

Integration testing

Integration tests can be developed and performed prior to execution of a factory acceptance test. Integration tests verify that the requirements for the system are met. The acceptance criteria for this test are based on requirements originally agreed upon and documented in the various design specifications.

Note: Each GAMP test stage is duly considered during validation and test planning and may, depending upon the customer's Quality Assurance function, meet the requirements of the IQ, OQ, and PQ as summarized above.

Note: Depending upon the complexity of the project these test phases may be consolidated into a single Site Acceptance test.


DEVELOPMENT TESTING & SYSTEM BUILD PHASE 3




Phase 4. Factory Acceptance Testing

Once the system is constructed and just prior to shipment, we will perform a factory acceptance test in which we verify that the requirements for the system are met. The acceptance criteria for this test are based on requirements originally agreed upon, documented in the various design specifications, and detailed in a previously approved written test protocol.

DESIGN REVIEW & ACCEPTANCE PHASE 4




Phase 5. Installation Qualification Implementation

All data pertaining to the identification of the equipment (model number, serial number, power rating, control system I.D. numbers, etc.) are recorded to ensure full traceability. PLC addresses are verified for correct wiring. Verification of compliance against the Functional and Design Specifications is carried out. Additional information, such as a Standard Operating Procedures listing, Purchase Order verification, and a Preventive Maintenance review, is also typically included.

Operation Qualification Implementation

All control functions are verified. E-stops and interlocks are checked for functionality. All machine functions are tested and all devices challenged to ensure that design criteria have been achieved.

Performance Qualification Implementation

Bracketing is used to qualify ranges of products. Performance Qualification is also used to validate a complete integrated process (e.g., a packaging line consisting of several pieces of equipment).

Validation Report

Once the system has been successfully qualified, a Validation Report will be prepared to confirm that the system is ready for its intended use.


COMMISSIONING & QUALIFICATION PHASE 5




Phase 6. Change Control

Change Control is fundamental to maintaining the validated status of a process and/or products and as such will be defined in the Quality and Project plans. The point of transfer from Project Change Control to Operational Change Control is typically defined in the Validation Plan. SMB Group can develop, detail and maintain Change Control throughout the system lifecycle. Operational Change Control Plans are produced per Appendix O4 of the GAMP4 guidelines.

Periodic Reviews

Periodic Reviews and evaluations of automated systems should be conducted jointly by the System Owner and the user's Quality Assurance group. Although the final responsibility for ensuring the compliance and correct operation of a system lies with the System Owner, SMB Group can provide compliance input and an independent perspective to the process.

Periodic Reviews are produced per Appendix O1 of the GAMP4 guidelines.

System Security

The user company's management is ultimately responsible for ensuring that automated systems, and the data they produce or contain, are adequately and securely protected against both willful and accidental loss, damage, or unauthorized change. SMB Group can develop and detail a System Security Plan. System Security Plans are produced per Appendix O3 of the GAMP4 guidelines.

Back Up & Disaster Recovery

SMB Group can develop and detail a Back Up and Disaster Recovery Plan. Back Up and Disaster Recovery Plans are produced per Appendix O7 of the GAMP4 guidelines.

Traceability Matrix

SMB Group provides a traceability matrix to establish the relationships between the various products of the development process. Traceability matrices detail design relationships and the verification of each. A traceability matrix is a live document used throughout the development process, beginning with the User Requirement Specification and maintained through to the approval of the final test specifications.

OPERATION & DECOMMISSIONING PHASE 6





Wednesday, December 23, 2009

Aseptic Processing: Validation

Abstract

Aseptic processing is a widely used methodology in the health care industry for the preparation of sterile materials. The term aseptic processing, as it is applied in the pharmaceutical industry, refers to the assembly of sterilized components and product in a specialized clean environment. The clean environment may be a conventional human-scale classified clean room or an environment engineered to further reduce the likelihood of contamination by reducing (or, as far as possible, eliminating) direct human contact with the product and components being assembled "aseptically." The idea of sterile products manufactured aseptically is inherently contradictory: a demonstrably sterile product cannot be produced aseptically using even the most advanced technology available today. Nevertheless, on any given day millions of putatively sterile dosage-form units are produced using aseptic techniques that in the literal sense are inadequate to achieve sterility. A sterile product is one that is free from all living organisms, whether in a vegetative or spore state. This is an absolute condition; something cannot be partially or nearly sterile, and the presence of a single viable organism represents a failure of the product and of the systems (environment, equipment, and procedures) used to produce it. Asepsis, the state in which all aseptically filled sterile products are manufactured, cannot be established as "sterile." Asepsis is commonly defined as a condition in which living pathogenic organisms are absent.

Putting aside the classical definitions, one must consider the real difficulty of establishing an aseptic environment, let alone a sterile one. The practitioner is left with an insurmountable task: to somehow create an environment free of any organisms, but also one (with the exception of isolators) in which personnel must be present to perform critical functions. The problem is further compounded when it is recalled that personnel are considered the single greatest source of microbial contamination in aseptic processing. Recent experiments have shown that personnel clothed in new, sterile clean-room garments slough viable contamination at a rate of roughly one viable particle per 10,000 non-viable particles. Even during slow, deliberate movements in the best possible clothing, operators will slough particulate and viable organisms. Therefore, the probability of human-borne microbial contamination being released in the conventional clean room approaches one over the course of any reasonably long operational shift. With this fact in mind, how then is one to accomplish a truly sterile or even aseptic environment, especially when many organisms that are normally non-pathogenic can, under certain circumstances, become opportunistically pathogenic? Among those circumstances are a debilitated state of general health in the patient or, as is increasingly common, immunological insufficiency due to age or a pre-existing condition.

Other than the obvious considerations of proper facility design, sterilization validation, and sanitization procedures (all of which are discussed elsewhere in this encyclopedia), the focus of attention must be on the personnel and the activities which they must perform. These actions are broadly termed, aseptic technique, and like any other human activity they can be accomplished in a variety of ways. In order to better understand aseptic technique, some general guidance and examples of good and bad technique can be used to delineate what should and should not be permitted.

The fundamental concept behind every aseptic processing activity is that non-sterile objects must never touch sterile objects. This is often accomplished by the establishment of a "sterile field" in which the core activities are performed. All of the surfaces of the gowned human operator must always be considered non-sterile. Non-sterile objects, including the operator's hands, must never be placed between the source of the air and a sterile object. The operator's hands and arms must always be kept at a level beneath that of open product containers. Sterile components should under no circumstances be touched directly with gloved hands; a sterilized tool should always be used for this purpose. Because gloved hands and arms will enter the sterile field, they must never touch walls, floors, doors, etc. Strenuous lifting and moving of tanks, trolleys, and the like must not be done by operators assigned to work within or near the sterile field, because the more strenuous the activity, the higher the level of particle generation, and at least some of the particulate released by the operator will surely be viable microorganisms.

Some of the techniques to avoid include: reaching over exposed sterile objects to make adjustments beyond them; correcting a stopper feed problem with a gloved hand; touching the face, eye shield, or any other non-sterile object with gloves; taking an air sample directly over open containers; continuously standing inside flexible partitions that mark the boundary of the sterile field; and breaking up clumps of components with gloved hands. Each of these actions exposes the sterile objects to undue risk of contamination from the personnel. Certainly there are more ways to contaminate the "sterile field" than we can imagine. For this reason, the procedures used in and around the "sterile field" must be carefully defined and followed closely by all personnel. These procedures should follow the general principles outlined above, be evaluated in a media-fill simulation, and be performed in an identical fashion during aseptic processing. It is beneficial to define in writing how each procedure is to be performed and to train the operators in these exact procedures.

Validation of Active Pharmaceutical Ingredients

Validation of Active Pharmaceutical Ingredients Book Description

Much has happened in the area of bulk pharmaceutical good manufacturing practice (GMP) and validation since the first publication of Validation of Active Pharmaceutical Ingredients. Revised, updated, and expanded, this second edition includes new chapters addressing postapproval changes, technology transfer, international cGMP guidelines/FDA guidance progress, and facility inspection issues. The basic philosophy and principles of GMP and validation have not changed, but new terminology has been introduced, and old terminology has been better defined, improving the understanding of related concepts and principles. The book gives you a working knowledge of the regulatory process that will facilitate your organization's compliance with regulations.

Robin Goldstein has contributed to Validation of Active Pharmaceutical Ingredients as an author.

Publisher: Informa Healthcare
Author: Santoro, Robin Goldstein, Berry R Berry
Edition Number: 2
Language: English
ISBN: 1574911198
EAN: 9781574911190
No. of Pages: 588
Publish Date: 2008-01-31

A Practical Guide to Microbial Limit Methodologies
In recent years, the field of pharmaceutical microbiology has experienced numerous technological advances, accompanied by the publication of new and harmonized compendial methods. It is therefore imperative for those who are responsible for monitoring the microbial quality of pharmaceutical/biopharmaceutical products to keep abreast of the latest changes. Microbial Limit and Bioburden Tests: Validation Approaches and Global Requirements guides readers through the various microbiological methods listed in the compendia with easy-to-follow diagrams and approaches to validations of such test methodologies.
Includes New and Updated Material
Now in its second edition, this work is the culmination of research and discussions with technical experts, as well as USP and FDA representatives, on various topics of interest to the pharmaceutical microbiologist and those responsible for the microbial quality of products, materials, equipment, and manufacturing facilities. New in this edition is an entire chapter dedicated to the topic of biofilms and their impact on pharmaceutical and biopharmaceutical operations. The subject of rapid methods in microbiology has been expanded and includes a discussion on the validation of alternative microbiological methods and a case study on microbial identification in support of a product contamination investigation.
Substantially updated and revised, this book assists readers in understanding the fundamental issues associated with pharmaceutical microbiology and provides them with tools to create effective microbial contamination control and microbial testing programs for the areas under their responsibility.

Validating Chromatographic Methods: A Practical Guide



Validating Chromatographic Methods brings order and Current Good Manufacturing Practices to the often chaotic process of chromatographic method validation. It provides readers with both the practical information and the tools necessary to successfully set up a new validation system or upgrade a current system to fully comply with government safety and quality regulations. The net results are validated and transferable analytical methods that will serve for extended periods of time with minimal or no complications.

This guide focuses on high-performance liquid chromatographic methods validation; however, the concepts are generally applicable to the validation of other analytical techniques as well. Following an overview of analytical method validation and a discussion of its various components, the author dedicates a complete chapter to each step of validation:

  • Method evaluation and further method development
  • Final method development and trial method validation
  • Formal method validation and report generation
  • Formal data review and report issuance

Templates and examples for Methods Validation Standard Operating Procedures, Standard Test Methods, Methods Validation Protocols, and Methods Validation Reports are all provided. Moreover, the guide features detailed flowcharts and checklists that lead readers through every stage of method validation to ensure success.

For scientists and technicians new to method validation, this guide provides all the information and tools needed to develop a top-quality system. For those experienced with method validation, the guide helps to upgrade and improve existing systems.

Method Validation in Pharmaceutical Analysis: A Guide to Best Practice


Hardcover: 418 pages
Publisher: Wiley-VCH (May 6, 2005)
Language: English
ISBN-10: 3527312552
ISBN-13: 978-3527312559
File type: PDF
File size: 2.9 MB
Book Description
Adopting a practical approach, the authors provide a detailed interpretation of the existing regulations (GMP, ICH), while also discussing the appropriate calculations, parameters and tests. The book thus allows readers to validate the analysis of pharmaceutical compounds while complying with both the regulations as well as the industry demands for robustness and cost effectiveness. Following an introduction to the basic parameters and tests in pharmaceutical validation, including specificity, linearity, range, precision, accuracy, detection and quantitation limits, the text focuses on a life-cycle approach to validation and the integration of validation into the whole analytical quality assurance system. The whole is rounded off with a look at future trends. With its first-hand knowledge of the industry as well as regulating bodies, this is an invaluable reference for analytical chemists, the pharmaceutical industry, pharmacists, QA officers, and public authorities.


Download link
http://www.filefactory.com/file/a4eb70/

Document creation and execution

  • Validation Master Plans
  • User Requirements
  • Functional Specifications
  • Design Documents
  • Factory Acceptance Tests
  • Impact Assessments
  • Installation Qualifications
  • Operational Qualifications
  • Controls / Automation Qualifications
  • Computer Systems Validation
  • Performance Qualifications
  • Disaster Recovery Procedures
  • SOP Development

Concept of Process Validation For Pharmaceutical Industry

Concept of validation
The GMP definition of validation is "establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes."
Appropriate and complete documentation is recognized as crucial to validation. Standard Operating Procedures (SOPs), production formulae, detailed batch documentation, change control records, experimental reporting systems, analytical documents, development reports, and validation protocols and reports are all integral to the validation philosophy. The validation documentation provides a source of information for the ongoing operation of the plant and is a resource used in subsequent process development or modification activities.
All validation activities should include a level of impact assessment to ensure that the systems, services and products directly affected have been identified.
An ongoing revalidation programme should be implemented, driven by the revalidation requirements for each item of equipment and by change control.
Types of Validation
Prospective validation
Establishing documented evidence that a piece of equipment, process or system does what it purports to do, based on a pre-planned series of scientific investigations as set out in a validation plan.
Concurrent validation
Concurrent validation is used when an existing process can be shown to be in a state of control by means of tests on samples taken at strategic points within the process and at the end of the process. All data are collected concurrently with the running of the process until sufficient information is available to demonstrate process reproducibility.
Retrospective Validation
Establishing documented evidence that a process does what it purports to do, based on review and analysis of historical data.
Design Qualification (DQ)
The intent of DQ is met during the planning, design and commissioning process by a number of mechanisms, including:
- Generation of User Requirement Specifications
- Verification that the design meets those user requirement specifications
- Supplier evaluation / audit
- Challenge of the design by GMP audits
- Product quality impact assessment
- Specification of the validation documentation required from suppliers
- Agreements with suppliers on performance targets
- Factory Acceptance Test (FAT), Site Acceptance Test (SAT) and commissioning procedures
- Definition of construction and installation documentation to assist with Installation Qualification (IQ)
Installation Qualification (IQ)
IQ provides documented evidence that the equipment or system has been developed, delivered and installed in accordance with design drawings, vendor recommendations and in-house requirements. In addition, IQ provides a record of the main features of the equipment or system as installed and ensures that it is supported by sufficient and appropriate documentation to permit satisfactory operation, maintenance and change control.
Operational Qualification (OQ)
OQ is documented proof that the facility operates as intended across the design, operating or approved acceptance range of the equipment, as applicable. Where process steps are simulated, an appropriate placebo batch is used to demonstrate equipment functionality.
All new equipment should be fully commissioned before the start of OQ to ensure that, as a minimum, the equipment is safe to use, all mechanical assembly and pre-qualification checks are complete, the equipment is fully functional and the documentation is complete.
Performance Qualification (PQ)
The goal of PQ is documented proof that the equipment can consistently produce a product of the specified quality while operating at a defined operating point over an extended period. The specification will make reference to process parameters and to in-process and product specifications. PQ requires three product batches meeting all acceptance criteria for in-process and product testing. For utilities, PQ requires that the supplied medium meet all specifications over an extended period of sampling.
The PQ documentation should be based on standard manufacturing procedures and batch records, and should describe the sampling and testing methodology to be used.
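The PQ acceptance logic described above can be sketched in a few lines of code. This is a minimal illustration only: the batch identifiers, parameters and acceptance ranges below are hypothetical, and real criteria must come from the approved specification.

```python
# Illustrative sketch: checking three PQ batches against acceptance
# criteria. Products, parameters and limits here are hypothetical.

SPEC = {
    "assay_pct": (95.0, 105.0),        # acceptance range, % of label claim
    "dissolution_pct": (80.0, 100.0),  # % released at the test time point
}

batches = {
    "B001": {"assay_pct": 99.2, "dissolution_pct": 92.0},
    "B002": {"assay_pct": 100.4, "dissolution_pct": 88.5},
    "B003": {"assay_pct": 98.7, "dissolution_pct": 90.1},
}

def batch_passes(results, spec):
    """True only if every tested parameter falls inside its range."""
    return all(lo <= results[p] <= hi for p, (lo, hi) in spec.items())

# PQ passes only if all three batches meet all acceptance criteria.
pq_passed = all(batch_passes(r, SPEC) for r in batches.values())
print("PQ acceptance criteria met:", pq_passed)
```

In practice the per-batch results and the pass/fail evaluation would be recorded in the PQ report rather than computed ad hoc.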
What is Validated
General
All process steps, production equipment, systems and environments directly relevant to the production of sterile and non-sterile products must be formally validated.
All major packaging equipment and processes should be validated, although this validation is less comprehensive.
All ancillary systems that have no direct effect on product quality should be qualified by technical documentation describing the extent of the system and how it works.
Facility
- Manufacturing area design
- Personnel and material flow, etc.
Process Equipment and Design
Process steps and equipment description, i.e. dosing, formulation, packaging, washing and cleaning equipment, etc.
Utility Systems Design
Raw and clean steam, purified water, compressed air, air conditioning, vacuum, power, lighting, cooling water, wastewater, etc.
Computerized Systems Design
Information systems, automated laboratory equipment, automated manufacturing equipment, electronic records, etc.
Cleaning validation (CV)
CV provides documented evidence that a cleaning procedure is effective in reducing all chemical and microbiological contamination on a piece of equipment or in a production area to pre-defined maximum allowable limits. The means of evaluating cleaning effectiveness include inspection of cleaned and disinfected surfaces and sampling for product residues, cleaning-agent residues and microbial contamination.
The term CV is used to describe the analytical investigation of a cleaning procedure or cleaning cycle. The validation protocol should include background documentation on the rationale for the proposed "worst case" testing. It should also set out the acceptance criteria, including chemical and microbiological specifications, limits of detection and the selection of sampling methods.
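One widely used way of deriving the pre-defined maximum allowable limits for chemical carryover is the 1/1000-dose (0.1% of therapeutic dose) criterion. The sketch below illustrates that calculation; the doses, batch size, surface areas and safety factor are hypothetical, and real limits must be scientifically justified in the validation protocol.

```python
# Illustrative sketch of the common 1/1000-dose carryover criterion
# used to set cleaning validation limits. All numbers are hypothetical.

def maco_mg(min_daily_dose_a_mg, batch_size_b_mg,
            max_daily_dose_b_mg, safety_factor=1000):
    """Maximum allowable carryover of product A into product B (mg)."""
    return (min_daily_dose_a_mg * batch_size_b_mg) / (
        safety_factor * max_daily_dose_b_mg)

# Hypothetical worst-case product pair:
maco = maco_mg(min_daily_dose_a_mg=50.0,
               batch_size_b_mg=200_000_000.0,   # 200 kg batch of product B
               max_daily_dose_b_mg=1_000.0)

# Convert the total carryover limit into a per-swab acceptance limit.
shared_surface_cm2 = 150_000.0   # total shared product-contact area
swab_area_cm2 = 25.0             # area sampled per swab
limit_per_swab_mg = maco / shared_surface_cm2 * swab_area_cm2
print(f"MACO: {maco:.1f} mg, limit per swab: {limit_per_swab_mg:.4f} mg")
```

Other criteria (e.g. 10 ppm in the next product, or visually clean) are often calculated in parallel, with the most stringent result adopted as the limit.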
Method Validation (MV)
MV provides documented evidence that internally developed test methods are accurate, robust, efficient, reproducible and repeatable. The validation protocol should include background documentation on the rationale for the method's detection limit and sensitivity.
Computer Validation
Computer validation provides documented evidence that computerised systems consistently perform in accordance with their predetermined specifications and quality attributes throughout their lifecycle. Important aspects of this approach are the validation of formal design management (through a specification process), system quality (through systematic review and testing), risk (through the identification and evaluation of novel and critical functions) and lifecycle (through sustained change control).
Where equipment incorporates embedded computer systems, the computer-related elements of validation can be carried out as part of the equipment IQ and OQ protocols.

Validation Findings in FDA Warning Letters 2008

The GMP news from 18 February 2009 comprised information on the FDA Warning Letters Report 2008, including the Top 5 deficiencies.

It did not, however, cover deficiencies regarding validation (validation/qualification/calibration). This is because the findings are listed according to the paragraphs of 21 CFR 210/211, which contain no separate paragraph addressing (process) validation. For that reason, the following provides a separate analysis of validation issues:

In the 22 Warning Letters issued in fiscal year 2008, validation issues were criticised 15 times. Top of the list were deficiencies relating to process validation (9 Warning Letters). Five of the letters referred to deficiencies concerning solid dosage forms, two concerned semi-solid forms, one addressed radiopharmaceuticals, and in one case the product classification remained unclear.

The authority issued two Warning Letters each on issues such as inappropriate validation of the sterilisation process, filter validation, "smoke studies" and cleaning validation.

Exemplary findings for the issues mentioned above are:

  • Exclusion of validation batches from a retrospective validation without providing reasons
  • Missing sampling details in the validation plan
  • "It seems like you did not understand the meaning of a cleaning validation"
  • The cleaning validation master plan does not contain any "scientific rationale" for specific products, sampling locations and acceptance criteria
  • Swab surfaces are too small
  • Not all loading patterns were mapped in the validation of the sterilisation process
  • Inadequate air flow patterns
Further deficiencies concerned issues such as:

  • An inadequate calibration of thermocouples
  • A media fill not representing the commercial process
  • Undocumented removal of filled vials within media fills
Conclusion: Although validation is not specifically listed in 21 CFR 210/211, it is still among the top deficiencies in the Warning Letters issued. Almost 70% of all letters contained one or more findings relating to this subject; 41% were related to process validation.
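The quoted percentages follow directly from the counts given above (15 of 22 letters with validation findings, 9 of 22 relating to process validation):

```python
# Quick check of the percentages quoted in the conclusion.
total_letters = 22
validation_findings = 15
process_validation = 9

print(f"{validation_findings / total_letters:.0%}")  # rounds to 68%, i.e. "almost 70%"
print(f"{process_validation / total_letters:.0%}")   # rounds to 41%
```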

Validation of USP Methods

In the first supplement of the USP 32, the revised general chapter <1225> - Validation of Compendial Methods - was published. This chapter describes the performance characteristics that should be considered when validating a method submitted to the Pharmacopoeia.

It is striking that terms coming from ISO standards have also been incorporated, although the wording in pharmaceutical surroundings was until now oriented towards the ICH Guidelines, especially ICH Q2(R1).

This was also the topic of the publication entitled "Making Sense of Trueness, Precision, Accuracy, and Uncertainty" in the Pharmacopoeial Forum of May-June 2008. This article reviews the differences between the terms when used in ICH and ISO. It also states that the terms "trueness" and "uncertainty" do not even exist in the ICH and the USP. The conclusions drawn in this article are as follows: The terms should be clarified in the USP. These clarifications could easily be added to the General Chapters <1010> and <1225>. In the longer term, the USP encourages continued harmonisation of terminology among the involved parties (ISO, ICH, VIM - International Vocabulary of Metrology) and other interested parties.

In the revised chapter <1225> of the first supplement to USP 32, these terms have now been incorporated from ISO 5725-1 and ISO 3534-1.

And the term "reportable value", established from the OOS discussions in recent years, is now also incorporated in this USP chapter.

The requisite performance characteristics to be considered in validation of the types of methods in order to prove their suitability for the USP (accuracy, precision, specificity, detection limit, quantitation limit, linearity, range and ruggedness) remained unchanged.
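For the detection and quantitation limits in that list, ICH Q2(R1) permits an estimate based on the standard deviation of the response (sigma) and the slope of the calibration curve (S): LOD = 3.3 sigma / S and LOQ = 10 sigma / S. A minimal sketch with hypothetical values:

```python
# Illustrative sketch of the ICH Q2(R1) sigma/slope approach to
# estimating detection (LOD) and quantitation (LOQ) limits.
# The sigma and slope values below are hypothetical.

def lod(sigma, slope):
    """Detection limit: LOD = 3.3 * sigma / S."""
    return 3.3 * sigma / slope

def loq(sigma, slope):
    """Quantitation limit: LOQ = 10 * sigma / S."""
    return 10.0 * sigma / slope

sigma, slope = 0.5, 10.0   # response units; response per µg/mL
print(f"LOD = {lod(sigma, slope):.3f} µg/mL")
print(f"LOQ = {loq(sigma, slope):.3f} µg/mL")
```

The guideline also allows LOD/LOQ to be established from signal-to-noise ratios or visual evaluation; whichever approach is chosen should be stated in the validation report.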

And when is it necessary to revalidate? Revalidation may become necessary when a revised analytical method is submitted to the USP or when an established, general method is to be used for a new product or for a new starting material.

Pharmaceutical Validation Documentation Requirements

Pharmaceutical validation is a critical process that ensures that pharmaceutical products meet the desired quality standards and are safe fo...