These updates to the 22-year-old guidelines are welcomed by those of us who are passionate about manufacturing process excellence. Leaps and bounds have been made in fundamental enabling technologies since the FDA first published its original guidance on process validation in 1987. For perspective, ’87 was the year Sun Microsystems introduced the SPARC processor and IBM launched its PS/2 personal computer with a 3.5-inch diskette drive. We were still recording on paper-based systems and many years away from cost-effective storage of large amounts of electronic data—let alone the ability to access and leverage such data as the basis for decision making.
Today, as life sciences manufacturers continue to catch up with these new technology capabilities, many have migrated to electronic systems and records. Most, however, still cannot give end users self-service access to aggregated data in a context that lets them easily understand their processes and capitalize on potential improvements.
Instead, users must request and manually search through mountains of data from disparate sources—enduring the “spreadsheet madness” that makes problem-solving time-consuming and error-prone. The FDA draft’s emphasis on using data continuously for validation makes the accessibility and analysis of process data more important than ever from a regulatory perspective. The updated guidelines define process validation as “the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality products.” Now, on-demand access to the right data in the right context for analysis becomes central to Performance Qualification.
To leverage knowledge and understanding as the basis for establishing an approach to control that is appropriate for the manufacturing process, the FDA guidance recommends that manufacturers:
• Understand the sources of variation
• Detect the presence and degree of variation
• Understand the impact of variation on the process and ultimately on product attributes
• Control the variation in a manner commensurate with the risk it represents to the process and product.
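The second of those activities—detecting the presence and degree of variation—maps naturally onto routine statistical process control. As a minimal sketch only (the guidance does not prescribe any particular statistic or tool, and all batch values and names below are invented for illustration), a Shewhart-style check of commercial batches against limits derived from design-stage data might look like this:

```python
# Minimal sketch of "detect the presence and degree of variation": a
# Shewhart-style screen of new batches against control limits derived from
# design-stage baseline data. All numbers and names are illustrative
# assumptions, not part of the FDA guidance.
from statistics import mean, stdev

def control_limits(baseline, sigmas=3):
    """Return (lower, center, upper) control limits from a baseline data set."""
    center = mean(baseline)
    spread = stdev(baseline)
    return center - sigmas * spread, center, center + sigmas * spread

# Hypothetical critical quality attribute measured during process design.
baseline = [98.1, 97.9, 98.3, 98.0, 98.2, 98.1, 97.8, 98.4, 98.0, 98.2]
lo, center, hi = control_limits(baseline)

# Commercial batches are then screened against those limits; the degree of
# variation is how far a flagged value falls outside them.
new_batches = [98.1, 95.0, 98.3]
flags = [(i, v) for i, v in enumerate(new_batches) if v < lo or v > hi]
print(flags)  # the batch at index 1 falls well below the lower limit
```

Understanding the *impact* of a flagged excursion, and controlling it commensurate with risk, then requires the kind of contextualized, cross-source data access the guidance goes on to describe.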
The document says, “Process validation involves a series of activities taking place over the lifecycle of the product and process.” The FDA describes in detail process validation activities in the following stages, saying, “In all stages of the product lifecycle, good project management and good archiving that capture scientific knowledge will make the process validation program more effective and efficient. These practices should ensure uniform collection and assessment of information about the process, reduce the chance for redundant information gathering and analysis, and enhance the accessibility of such information later in the product lifecycle.”
To fully apply the FDA’s current thinking in process development and manufacturing, you need the right technology tools. While PAT implementations have been encouraged by the FDA for years—especially in the agency’s risk-based approach initiative—this is probably the most important published guidance that ties technology to process validation practices in a way that spans process development, quality and manufacturing. You absolutely have to have appropriate software capabilities in place that let end users access, aggregate, analyze, understand and report data on demand in a meaningful context—such as one that allows correlation of upstream parameters with downstream process outcomes (automatically accounting for batch genealogy)—during both process design and commercial operations.
The FDA is pointing manufacturers to practices that can continuously improve their process understanding and, therefore, make the process validation program more effective and meaningful. As with many of its regulatory efforts and mandates, what is good for the FDA and consumers is also beneficial for manufacturers. In this case, leveraging data in the right way enables greater Process Intelligence, which helps control variability, improve quality, raise yields and speed time to the commercial market.
The new document states, “In addition to being a fundamental tenet of following the scientific method, information transparency and accessibility are essential so that organizational units responsible and accountable for the process can make informed, science-based decisions that ultimately support the release of a product to commerce.”
This is certainly a progressive step forward for the FDA and a win-win situation for an industry ready to embrace technology tools for improvement.
About the Author
Justin O. Neway, Ph.D., is executive vice president and chief science officer at Aegis Analytical Corporation. He has more than 25 years of experience in pharmaceutical and biotechnology manufacturing, and in the development and application of software solutions to quality compliance and operational efficiency in life science manufacturing.