Predictive modelling: putting ICH guidelines to work in process validation


Through the International Conference on Harmonisation (ICH) process, regulatory bodies in the EU, US and Japan have been moving steadily towards a science-based approach to drug development that will revolutionize the way pharmaceutical companies validate processes and ensure product quality. Concepts such as PAT, quality by design (QbD) and design space (DS), which figure prominently in ICH Q8 and ICH Q9,1,2 encourage greater scientific understanding of processes and products, and hold out the promise of a lighter regulatory burden for companies that adopt such principles. Although the regulatory agencies have provided some helpful direction about how to put these principles into practice, they have not laid out a step-by-step guide. In the absence of such guidance, many companies have been slow to take advantage of the opportunities that these changes offer.

That hesitation is understandable. Consider the uncertainty surrounding validation of manufacturing processes — a key milestone in the drug approval process. The industry knows that the accepted approach to validation — the successful processing of three consecutive batches — is antiquated. Further, FDA, for example, now says that it never meant the three-batch guideline as a hard and fast rule. As recently as 2003, in the pages of this publication, a review of the literature on validation uncovered wide variations among experts in their understanding of the term and the regulatory requirements associated with it.3 However, by following a proven approach to science-based validation, forward-looking companies can cut through the uncertainty, move past outdated methods of validation and begin to realize the potential of recent ICH guidelines:
  • reduced compliance risk
  • greater regulatory flexibility
  • more robust processes
  • significant financial benefits.

The goal: managing variability
As anyone who has been involved in validating a pharmaceutical process knows, the apparently simple equation 'fixed raw materials + fixed process = quality' entails a high degree of complexity. Raw materials usually vary from batch to batch and those variations interact in complex ways with the many variable aspects of the manufacturing process. Current approaches to validation are premised on the notion of holding steady by trying to eliminate variability entirely — an endless and ultimately hopeless task. A more realistic and rewarding approach to validation recognizes the inescapable fact of variability. Instead of seeking to stamp out variability, such an approach seeks to manage it by developing a process that can accommodate the range of variables while still maintaining product quality. Such a process would operate within the DS, defined by ICH Q8 as "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality" (ICH, 2005). With an understanding of the DS, the manufacturing processes within that DS could be continuously improved without further regulatory review. The manufacturer would gain more regulatory room to operate, and regulators could be more flexible, using, for example, risk-based approaches to reviews and inspections set forth in ICH Q9 'Quality Risk Management'.
An effective tool: predictive modelling
An effective and practical way to achieve and demonstrate the requisite level of process understanding lies in developing predictive models of the form Y=f(X). Y is the process output that measures the performance of the process and the Xs are process inputs, controlled process variables and uncontrolled process variables.
In pharmaceutical manufacturing, the process output (Y) will be a function of raw material properties and process parameters (Xs). These models should identify critical raw material and process parameters, and reliably predict the behaviour of the process across the wide range of complex multivariate relations among those critical parameters and the outputs they generate.
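As a concrete illustration, an empirical model of the form Y = f(X) can be sketched as an ordinary least-squares fit of a process output to a set of inputs. The variable names and synthetic data below are purely illustrative, not taken from any of the case studies discussed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic process data: each row is one batch.
# X columns: three illustrative inputs (e.g., a raw-material property
# and two process parameters), sampled over plausible operating ranges.
n = 60
X = rng.uniform([1.0, 5.0, 10.0], [3.0, 15.0, 20.0], size=(n, 3))

# Assume a "true" underlying relationship Y = f(X) plus batch-to-batch noise.
true_coefs = np.array([2.0, -0.5, 0.8])
y = 40.0 + X @ true_coefs + rng.normal(0.0, 0.5, n)

# Fit the empirical model Y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares.
A = np.column_stack([np.ones(n), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)

print("intercept:", round(coefs[0], 2))
print("fitted effects:", np.round(coefs[1:], 2))
```

With enough well-distributed batches, the fitted coefficients recover the underlying input–output relationship despite the noise — which is exactly the property a predictive model needs before it can be used to reason about a process.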
Although we understand the first principles of kinetics, thermodynamics, heat and mass transfer, we don't have data about the possible behaviours of all the compounds we deal with. Our predictive models for the behaviour of any novel formulation must, therefore, be developed empirically. While validation has always entailed at least some basic empirical techniques, such as simply testing whether a given set of process parameters produces an in-specification result, the application of sophisticated statistical modelling has often lagged. Used with other techniques and bodies of knowledge — raw material science, formulation science and engineering — statistical modelling can help realize the potential of ICH Q8 and Q9.  

Two examples illustrate the power and value of statistical modelling for validation.

Figure 1: Modelling historical data.
Historical data. The first example focuses on the statistical modelling of historical data (existing information about a process) undertaken by a pharmaceutical manufacturer that was having trouble with tablet water content, which was varying unpredictably. The manufacturing process included blending, granulation, milling and tablet compression. As the water content was measured following compression, the problem showed up there, but the project team suspected that the root cause of the variability lay elsewhere in the process. So, working with historical data, the team performed a regression analysis to identify and quantify the drivers of variation in tablet water content. The results are displayed in a table of effects (Figure 1), which shows that the moisture problem was related to the blend water and to the presence of coarse particles after milling (screen 20). In other words, variability in particle size was a driver of variability in tablet water content. Those two critical parameters therefore needed to be monitored and controlled to keep tablet moisture within specification. In considering how to control particle size, the team noted that the mill was being fed by hand rather than in a uniform way. They installed an auger to solve the problem, thereby doing exactly what FDA is now encouraging: understanding the drivers of variability in the process and putting good controls in place.
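The kind of ranking shown in a table of effects can be sketched in a few lines: regress the output on standardized inputs so the fitted coefficients are directly comparable as effect sizes. The batch records, variable names and coefficients below are hypothetical stand-ins, not the manufacturer's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative historical batch records (hypothetical, not the case data):
# blend water (%), coarse-particle fraction after milling, compression force.
n = 80
blend_water = rng.uniform(1.5, 3.5, n)
coarse_frac = rng.uniform(0.05, 0.25, n)
comp_force = rng.uniform(10, 20, n)

# Assume tablet water content is driven mainly by blend water and coarse particles.
tablet_water = (0.8 * blend_water + 4.0 * coarse_frac + 0.01 * comp_force
                + rng.normal(0, 0.05, n))

# Standardize inputs so fitted coefficients are comparable effect sizes.
X = np.column_stack([blend_water, coarse_frac, comp_force])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
A = np.column_stack([np.ones(n), Xs])
coefs, *_ = np.linalg.lstsq(A, tablet_water, rcond=None)

# Rank the drivers of variation, as in a table of effects.
names = ["blend water", "coarse fraction", "compression force"]
effects = sorted(zip(names, np.abs(coefs[1:])), key=lambda t: -t[1])
for name, size in effects:
    print(f"{name:>18}: {size:.3f}")
```

In this toy data set the two assumed drivers dominate the ranking while compression force drops to the bottom — the same pattern the project team read off its table of effects.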
New data. Experiments can also be designed to generate new data regarding process parameters for regression analysis. For example, the maker of a pain management product had encountered wide variations in its dissolution rate.4 Dissolution sometimes occurred too rapidly, which could be lethal to patients, and sometimes too slowly, which could cause patients to suffer unnecessarily. A project team reviewed the available historical data and interviewed a large cross-section of relevant personnel. Using a systematic approach to root cause analysis, the team narrowed the range of possible causes of the unacceptable dissolution rate to nine potential variables — four properties of the raw material (RM), including three related to the API and one related to an excipient, and five process variables (PV), such as temperature and feed rate. To test each of the nine variables at three different values, the team undertook a design of experiments (DoE) to screen out irrelevant variables and find the proper values for critical variables. The design was an L27 orthogonal array constructed in nine blocks (one for each raw material blend), with the process parameters varied within each block.
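To make the idea of a three-level design concrete, the sketch below enumerates a full 3^3 factorial, which happens to have the same 27 runs as the team's L27 array. Note this is a simplification: the team's actual L27 orthogonal array screened nine variables in 27 runs, whereas a full factorial of three hypothetical variables is shown here only to illustrate how runs are laid out.

```python
from itertools import product

# Three hypothetical variables, each at three levels (names and values
# are illustrative, not the study's actual factors or settings).
levels = {
    "RM1": [0.2, 0.5, 0.8],   # a raw-material property
    "PV1": [40, 60, 80],      # e.g., a temperature (deg C)
    "PV2": [1.0, 2.0, 3.0],   # e.g., a feed rate (kg/min)
}

# Every combination of levels: a full 3^3 factorial = 27 runs.
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]

print("number of runs:", len(runs))
print("first run:", runs[0])
print("last run:", runs[-1])
```

An orthogonal array such as the L27 achieves the same balanced coverage of level combinations for many more factors by sacrificing some higher-order interaction information — which is exactly what a screening design is for.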

Figure 2: Quality of fit.
The multiple regression capabilities of a statistical software application were used to estimate the effects of the variables on dissolution, their interactions and their statistical significance. The number of significant variables having an impact on dissolution was narrowed to six, with one of those — a process variable — dominating. The final regression model explained approximately 95% of the variation in dissolution (Figure 2).
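The quality-of-fit figure quoted above is the coefficient of determination, R^2. The sketch below shows how it is computed from a fitted regression; the dominant-variable structure and all numbers are synthetic stand-ins for the case data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic screening data: one dominant process variable plus a weaker
# raw-material effect and noise (illustrative, not the actual case data).
n = 54
pv2 = rng.uniform(1.0, 3.0, n)
rm2 = rng.uniform(0.2, 0.8, n)
dissolution = 20.0 + 25.0 * pv2 + 8.0 * rm2 + rng.normal(0, 2.0, n)

# Fit the regression model and generate predictions.
A = np.column_stack([np.ones(n), pv2, rm2])
coefs, *_ = np.linalg.lstsq(A, dissolution, rcond=None)
pred = A @ coefs

# R^2: the fraction of the variation in dissolution explained by the model.
ss_res = np.sum((dissolution - pred) ** 2)
ss_tot = np.sum((dissolution - dissolution.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

An R^2 near 0.95 means only about 5% of the observed variation is left unexplained — strong evidence that the critical parameters have been identified, though confirmation runs (as described below) remain essential.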

Figure 3: Determining the design space.
Although that one process variable was found to have the greatest influence on dissolution, other process and raw material variables, and their interactions, clearly played a role. The various permutations of the settings for all of these variables that still result in an in-specification rate of dissolution (and other product properties) constitute the DS for the manufacture of the product (Figure 3).
This picture of the DS was created using the optimum settings for each of the six significant variables. It brings together 15 3D response surface plots, each of which was originally created in the modelling software. The X and Y axes are made up of the DoE variables, and the Z axis (the contour curves) represents dissolution (the response variable). In the red regions, dissolution is out of specification; in the green regions, it is within specification. By finding the region in which PV2 — by far the most influential variable — interacts with the other variables to produce in-specification dissolution, it is possible to optimize the path to the desired rate of dissolution. In this case, PV2 needs to be kept at a high level while RM2 should be maximized. RM1 should be kept in the centre of the range used, thereby maximizing the 'green space'.
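A two-variable slice of a design space like the one in Figure 3 can be mapped by scanning a fitted model over a grid and marking where the prediction satisfies the specification. The model coefficients, variable ranges and specification limits below are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

# Hypothetical fitted model of dissolution (% released at the test time)
# as a function of PV2 and RM2, including an interaction term.
def dissolution(pv2, rm2):
    return 30.0 + 15.0 * pv2 + 10.0 * rm2 + 4.0 * pv2 * rm2

SPEC_LO, SPEC_HI = 75.0, 95.0   # assumed specification window

# Scan a grid over the studied ranges and mark the in-specification region.
pv2_grid = np.linspace(1.0, 3.0, 41)
rm2_grid = np.linspace(0.2, 0.8, 41)
P, R = np.meshgrid(pv2_grid, rm2_grid)
D = dissolution(P, R)
in_spec = (D >= SPEC_LO) & (D <= SPEC_HI)

print(f"in-spec fraction of grid: {in_spec.mean():.2f}")
```

The boolean `in_spec` mask is the 'green space' for this pair of variables: the set of setting combinations the manufacturer may move between without leaving the design space. Repeating the scan for each pair of significant variables reproduces the matrix of contour plots in Figure 3.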
The optimum conditions were entered into the model, two confirmation batches were processed and the results conformed to those predicted by the model. Ultimate confirmation of the power of the technique came with the successful validation and launch of the product.
The results: scientific rigour
When carefully sequenced and applied, predictive modelling can provide the rigorous, scientifically based understanding of products and processes envisioned by ICH guidelines. As the examples illustrate, using predictive models:
  • Acknowledges that raw materials and processes entail some inherent variability.
  • Allows for the management of that variability.
  • Recognizes that not all variables are equally important, which, in turn, allows risk-management-based approaches to regulation.

On the go...
Predictive modelling also helps enable continuous process verification, defined by ICH Q8 as "an alternative approach to process validation in which manufacturing process performance is continuously monitored and evaluated" (ICH, 2005) and recognized by PAT guidelines and ICH Q8 and Q9 as more effective than the traditional three-consecutive-lots exercise. By identifying critical parameters and interactions, predictive modelling points to those areas that should be monitored — with PAT and other techniques — and controlled. Through continuous monitoring, a manufacturer can accumulate an understanding of the process, refine the characterization of it, and make adjustments within the DS that don't require re-filing with regulators. It is important to reiterate that although predictive modelling is an indispensable element of achieving the desired process knowledge, process characterization should not rely solely on quantitative statistical models. It should also employ scientific and engineering knowledge, and univariate experiments. Further, all of these scientific and quantitative techniques should be complemented by basic scientific knowledge and the insight, wisdom and experience of people in the organization who work with the process.
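At its simplest, continuous process verification amounts to routinely checking each new batch's monitored parameters against the established design-space ranges and flagging excursions for investigation. The parameter names and limits in this minimal sketch are hypothetical.

```python
# Established design-space ranges for the monitored parameters
# (hypothetical names and limits, for illustration only).
DESIGN_SPACE = {
    "blend_water_pct": (1.5, 3.0),
    "coarse_fraction": (0.05, 0.20),
    "feed_rate_kg_min": (1.0, 3.0),
}

def verify_batch(measurements):
    """Return the list of parameters that fall outside the design space."""
    return [name for name, value in measurements.items()
            if not (DESIGN_SPACE[name][0] <= value <= DESIGN_SPACE[name][1])]

# A new batch arrives with one parameter out of range.
batch = {"blend_water_pct": 2.1, "coarse_fraction": 0.23, "feed_rate_kg_min": 2.0}
excursions = verify_batch(batch)
print("excursions:", excursions)
```

In practice such checks would feed a trending system (control charts, capability indices) so that drift is caught before an excursion occurs, and the accumulated data in turn refines the model of the design space itself.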
Finally, although recent guidelines encourage frequent communication with regulatory authorities, some organizations, fearing they may become prematurely entangled in regulatory red tape, have been wary of such open communication. However, when armed with increased process understanding and the powerful data derived from predictive modelling, manufacturers can undertake such communication with confidence. By demonstrating the scientifically based understanding of processes that regulators are encouraging, manufacturers can establish the mutual trust that is necessary for a productive working relationship, and for realizing the full benefits of the revolution in validation.
Jason J. Kamm is a Managing Consultant with Tunnell Consulting (PA, USA).
Philippe Cini is a Vice President with Tunnell Consulting (PA, USA). http://www.tunnellconsulting.com
References
1. ICH Harmonised Tripartite Guideline: Pharmaceutical Development, Q8, 2005. http://www.ich.org
2. ICH Harmonised Tripartite Guideline: Quality Risk Management, Q9, 2005. http://www.ich.org
3. M. Helle, J. Yliruusi and J. Mannermaa, Pharm. Technol. Eur.,15(3), 52–57 (2003).
4. J. Kamm, Pharmaceutical Manufacturing, May (2007). http://www.pharmamanufacturing.com