Justin O. Neway

December 1, 2013


Increasingly, life science manufacturing companies are applying technology to meet quality by design (QbD) goals. Organizations collect overflowing volumes of process data as part of programs designed to reduce manufacturing variability and improve outcomes. Collecting valuable data is now an everyday task thanks to available software and process analytical technology (PAT) tools.

The industry today, in fact, has focused so much on gathering data that it often has lost sight of an important fact: Data collection systems are valuable only if factual information and useful knowledge are gleaned from the data and applied to improve process understanding, quality risk management, and process life-cycle management.

Regulatory guidance is steering the industry to a new path that requires companies to focus on science-based process improvement. This shift in thinking holds the potential to lead the industry toward a much-heralded “desired state” in which quality, manufacturing, and process development organizations work synergistically like a well-oiled machine to develop replicable, predictable processes. To accomplish their collective goals related to better process understanding, manufacturers need to go beyond collecting mountains of data to find better methods for garnering institutionalized knowledge. That helps them “do it right the first time and better the next time” across all geographical and organizational boundaries.

PRODUCT FOCUS: BIOLOGICS

PROCESS FOCUS: PRODUCTION

WHO SHOULD READ: PROCESS DEVELOPMENT, MANUFACTURING, PROJECT MANAGERS, REGULATORY MANAGERS

KEYWORDS: DATA COLLECTION AND MANAGEMENT, QBD, PAT, FDA, ICH

LEVEL: INTERMEDIATE

The “How-to” Manual for Building a Better Machine

If a book were published about industry best practices, with instructions on how to build a well-oiled biopharmaceutical manufacturing machine, it would ideally open with a prologue written by regulatory guidance professionals, including authors from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). Those global experts share the goal of a “desired state” for the industry, along with the US Food and Drug Administration (FDA), which has given us the now-familiar process validation guidance for Stage 3: continued process verification (CPV) (1). Their guidance documents set the stage for the successes and opportunities that come from science-based process understanding and improvement.

The International Society for Pharmaceutical Engineering (ISPE) published a discussion paper on the FDA’s process validation guidance (2). The authors define typical Stage 3 CPV activities as “Ongoing programs to collect and analyze process data (monitoring plans) to assure the state of control of the process and verify impact of variability. Evaluating the performance of the process identifies potential issues and determines whether action must be taken to correct, anticipate, and prevent problems so that the process remains in control.”
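The Stage 3 monitoring the ISPE authors describe — trending process data to assure a state of control — can be sketched as a simple control-chart calculation. The following is a minimal illustration, not any vendor's implementation; the batch titer values are hypothetical, and the sigma estimate uses the average moving range, a standard method for individuals charts.

```python
# A minimal sketch of a Stage 3 (CPV) monitoring calculation: an
# individuals control chart that flags batches outside 3-sigma limits.
# Sigma is estimated from the average moving range (mean(MR) / d2).

def individuals_chart(values, d2=1.128):
    """Return center line, control limits, and flagged (batch, value) pairs."""
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / d2  # sigma ~= mean(MR)/d2
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    flagged = [(i + 1, v) for i, v in enumerate(values) if v > ucl or v < lcl]
    return center, ucl, lcl, flagged

# Hypothetical titer results (g/L) for ten consecutive batches
titers = [4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 4.3, 5.9, 4.2, 4.0]
center, ucl, lcl, flagged = individuals_chart(titers)
print(f"center={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
print("batches to investigate:", flagged)  # batch 8 exceeds the UCL
```

In a real CPV program such a chart would run automatically as each batch record arrives, with the flagged batches feeding the investigation workflow the ISPE paper describes.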

Chapter one of our hypothetical how-to manual should describe a solution space known as process management informatics: a relatively new term and an important trend that encompasses the goal and the best tools useful for CPV and other areas of process management. It is defined as the technology and systems needed to design, commercialize, and sustain robust, approvable manufacturing processes that give predictable, high-quality outcomes cost-effectively based on scientific process understanding (Figure 1).

Figure 1: (image not available)

Process management informatics answers the call from several of the ICH’s guidance documents suggesting that manufacturers go beyond collecting data simply for records’ sake to arrive at a state of true process understanding. We can look to the ICH Q8, Q9, Q10, and Q11 documents as well as the FDA’s validation guidance for more insight. Table 1 and the following sections highlight relevant points from each guidance document that can be addressed with process management informatics.

Table 1: (image not available)

ICH Q8 and Q11

Both ICH Q8 Pharmaceutical Development and ICH Q11 Development and Manufacture of Drug Substances call for better process understanding to define a design space that reduces process variability. The authors emphasize that what’s important is the level of understanding achieved — not the amount of data collected.

ICH Q8 states, “It should be recognized that the level of knowledge gained, and not the volume of data, provides the basis for science-based submissions and their regulatory evaluation” (3). In other words, combining all data collected by all systems available today will be useless unless you gain knowledge from that information. For example, data that can be analyzed to pinpoint a trend correlating with a particular ingredient in a bioprocess serves as knowledge that can be used to reduce the process variability and thereby improve the quality of process outcomes.

ICH Q9

ICH Q9 Quality Risk Management treats risk management as an ongoing activity rather than a one-time exercise. It states, “Once a quality risk management process has been initiated, that process should continue to be utilized for events that might impact the original quality risk management decision, whether these events are planned (e.g., results of product review, inspections, audits or change control) or unplanned (e.g., root cause from failure investigations or recalls)” (4).

That demonstrates an ongoing recognition among regulatory agencies that even though you can do everything possible with respect to risk analysis and implementation of systems to mitigate risk, unplanned events will occur that manufacturers must be able to handle effectively. The ability to learn from data associated with adverse events helps pinpoint root causes to further mitigate risks and prevent additional, similar problems in the future.

ICH Q10

For further insight about the role knowledge plays, we can look next to ICH Q10 Pharmaceutical Quality System, which includes the following important note:

Sources of knowledge include, but are not limited to, prior knowledge […], pharmaceutical development studies, technology transfer activities, process validation studies over the product lifecycle, manufacturing experience, continual improvement, and change management activities. (5)

A common thread among those guidance documents is that all the knowledge sources mentioned are based on data collection, which happens in several different parts of a company and, increasingly, in several different companies. That creates the need to work across functional areas, regardless of where data “reside” within geographical boundaries or organizational charts.

FDA’s Process Validation Guidance

The FDA updated its Process Validation: General Principles and Practices guidance in January 2011, referencing the ICH quality documents above. Nothing changed regarding the FDA’s regulations, but the revision highlights continued process verification (CPV) as a way of life: constantly producing evidence that a process remains in control.

The guidance says, “Process validation is defined as the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality products.” In the section titled “Stage 3: Process Validation Recommendations,” the guidance reiterates that “Ongoing assurance is gained during routine production that the process remains in a state of control” (1).

In other words, validation is not an event completed on a checklist. Previously, some manufacturers took the approach of “we produced three successful batches, so we’re finished,” which in some cases meant they just got lucky. The FDA emphasizes through Stage 3 of its guidance that our vigilance to support process validation is never complete, so it must be a way of life.

A final regulatory push worth noting is the FDA’s update on externalization earlier this year. Contract Manufacturing Arrangements for Drugs: Quality Agreements (6) offers guidance on the quality sections of agreements between sponsors and contract manufacturers, highlighting best practices across manufacturing teams that include contractors. That should be helpful for companies facing common challenges across global manufacturing networks. In particular, outsourced teams struggle all the more to become well-oiled machines because of the disparate nature of their parts.

Realizing the “Desired State”

Reviewing all the above regulatory notes leads us to the “page-turner” chapter in our hypothetical book: the “desired state.” It is defined in reference to quality, manufacturing, and process development, which should work together as a well-oiled machine to produce safe and efficacious products. Drugs are quickly approved and taken to market with high yield and quality, low process variability, acceptable risk profiles, and acceptable process economics — along with all the other good things that pharmaceutical manufacturers and their customers want.

We easily forget that to achieve those milestones, we also need supporting data and institutionalized knowledge — regardless of geographic and organizational barriers. Only then can we close the loop and do it better the next time in the name of continuous process improvement. The ISPE discussion paper (2) addresses knowledge management:

The ongoing review of process performance continually increases process understanding, both for those attributes and parameters that are demonstrated to be in control and through investigation of special and common cause variation. Knowledge management should therefore ensure that this information is used to update process understanding and the process design (if applicable). A firm’s knowledge management system should ideally support end to end (raw materials to impact on finished drug product) and network wide (e.g., site to site with the same product manufacture) product performance review and knowledge sharing. Knowledge management tools (e.g., documentation, sharing software, global product meetings, or review metrics) can be useful to facilitate such an overview. Periodic product performance reviews, whether conducted at local site level or network-wide, should include key stakeholders, as applicable, such as operations, quality, laboratory, engineering, and so on, to leverage increased process understanding for similar products and process platforms.

To live that reality, collaboration is critical. Interestingly, the ICH website homepage quotes Henry Ford as saying, “Coming together is a beginning. Keeping together is progress. Working together is success.” How fitting that words from Ford’s early era of manufacturing still ring true today.

Today in life sciences manufacturing, better collaboration is a solid path to improvement across global companies with externalized operations. Recent outsourcing shifts demand cohesive data management and communication strategies among all parties — especially in externalized relationships (e.g., with a contract manufacturing organization, or CMO). With the right approach in place to facilitate collaboration on product development, for example, knowledge can be leveraged across physical locations and throughout the entire scientific innovation life cycle — from research through late-stage quality control and manufacturing.

Traditional approaches to data sharing fall short of collaboration goals, and, therefore, do not belong in our how-to manual. We find many manufacturers in states of “spreadsheet madness”: storing data in disparate sources (including on paper) that lead to time-consuming, error-prone manual data gathering and reorganizing for analysis.

A better approach to gathering knowledge for process management informatics includes the following critical capabilities:

  • a self-service, on-demand data access platform for all process and quality data from multiple sources (including accurate capture of paper-record data), automated trending and alerts, and investigations of underlying causes of process trends and variability

  • automated data “contextualization” for domain-specific observational and investigational analysis and reporting, including batch reports, annual product reviews (APRs), product quality reviews (PQRs); context refers to the organization of related elements that enables analysis and interpretation

  • a single collaboration environment that spans organizational and geographic boundaries for access, aggregation, contextualization, analysis, and reporting of all data, both continuous and discrete

  • a validated system that delivers value for nonprogrammers and nonstatisticians.
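To make the “contextualization” item above concrete, the sketch below merges per-batch records from several source systems into single analysis-ready records keyed by batch ID. All system names, field names, and values here are invented for illustration; a real platform would pull from a process historian, a LIMS, transcribed paper records, and similar sources.

```python
# Sketch of automated data "contextualization": related measurements from
# separate source systems are organized around a shared batch identifier
# so they can be analyzed and reported together. All data are hypothetical.

historian = {"B001": {"peak_temp_C": 37.2}, "B002": {"peak_temp_C": 38.9}}
lims = {"B001": {"potency_pct": 98.5}, "B002": {"potency_pct": 91.0}}
paper_records = {"B001": {"media_lot": "ML-17"}, "B002": {"media_lot": "ML-18"}}

def contextualize(batch_ids, *sources):
    """Merge every source's data for each batch into one context record."""
    context = {}
    for bid in batch_ids:
        record = {}
        for source in sources:
            record.update(source.get(bid, {}))
        context[bid] = record
    return context

batch_context = contextualize(["B001", "B002"], historian, lims, paper_records)
print(batch_context["B002"])
```

Once data carry this batch context, the trending, batch reports, and APR/PQR reporting listed above can draw on one consistent record instead of manually reconciled spreadsheets.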

That type of technology-enabled approach creates a practical, collaborative environment in which process development, quality, and manufacturing networks can leverage the “process management informatics machine.” It can facilitate better understanding and control of the sources of process variability and support CPV. The nature of outsourcing calls for technologies that can facilitate data sharing and collaboration in the name of CPV — as the FDA guidance addresses.

For example, a sponsor company with externalized manufacturing operations in multiple countries for a particular bioprocess must communicate in real time with its contractors to mitigate the risks associated with process variability and potential quality failures. Live links to approved data for designated parameters can immediately enable automated trending, alerting, and troubleshooting across geographical boundaries and company firewalls with appropriate safeguards and agreements. Monitoring predetermined critical quality attributes (CQAs), key performance indicators (KPIs), and the critical process parameters (CPPs) that drive them provides data-fueled knowledge for all parties in a collaboration.

Teams can then work together for site-to-site or batch-to-batch comparisons that are important to CPV. Through univariate and multivariate statistical analysis, for example, the root cause of a potency problem might be quickly uncovered. That can be done faster than with traditional data-gathering methods and the typical spreadsheet-based data management limitations found in most CMO-sponsor arrangements today. Corrective action based on scientific decision-making can then be taken to improve future process outcomes.
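A univariate screen of the kind described above can be sketched in a few lines: rank each CPP by the strength of its Pearson correlation with the potency CQA across batches, so investigators know where to look first. The parameter names and values below are hypothetical, and a real investigation would follow up with multivariate methods and confirmatory experiments.

```python
# Sketch of a univariate root-cause screen for a potency problem:
# rank hypothetical CPPs by |Pearson r| against the potency CQA.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

potency = [98, 97, 92, 99, 90, 96]  # CQA result per batch (hypothetical)
cpps = {
    "feed_rate":   [1.0, 1.1, 1.0, 1.0, 1.1, 1.0],
    "hold_time_h": [2.0, 2.1, 6.5, 1.9, 7.0, 2.3],
}

# Strongest absolute correlation first: the leading candidate to investigate
ranked = sorted(cpps.items(), key=lambda kv: abs(pearson(kv[1], potency)), reverse=True)
for name, values in ranked:
    print(f"{name}: r = {pearson(values, potency):+.2f}")
```

In this invented data set, long hold times track the low-potency batches, so `hold_time_h` would surface at the top of the list as the parameter to investigate first.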

Better collaboration leads to development and production of safe and efficacious products with supporting data and institutionalized knowledge — regardless of organizational or geographic barriers. Once CPV is a way of life at one production site, then the right steps can be taken to ensure that this culture expands and develops across a whole network’s manufacturing organization. Then we find quality, manufacturing, and process development working together like the well-oiled machine Henry Ford imagined — only this time fueled by modern-day data-based science made possible by process management informatics.

About the Author

Author Details
Justin O. Neway, PhD, is chief science officer for the Accelrys ADQM Group, 1380 Forest Park Circle, Suite 200, Lafayette, CO 80026, 1-303-625-2102; [email protected].

REFERENCES

1.) US Food and Drug Administration. Process Validation: General Principles and Practices. Guidance for Industry, January 2011.

2.) Bika, D. ISPE Discussion Paper: Topic 2 — Stage 3 Process Validation: Applying Continued Process Verification Expectations to New and Existing Products. International Society for Pharmaceutical Engineering: Tampa, FL, 2012.

3.) ICH Q8: Pharmaceutical Development.

4.) ICH Q9: Quality Risk Management.

5.) ICH Q10: Pharmaceutical Quality System.

6.) US Food and Drug Administration. Contract Manufacturing Arrangements for Drugs: Quality Agreements. Draft Guidance for Industry, 2013.
