Process development and manufacturing for biopharmaceuticals are often disjointed activities. Disconnects between groups within an organization can be aggravated by a lack of common terminology and poor data-management practices. Implementing a simple data model based on the ISA-88 standard for batch control can help companies capture process and facility data throughout their product life cycle (1). The first half of this two-part article illustrates how translating a process description to a structured electronic format could transform the bioprocessing industry. Seamlessly linking development and manufacturing networks (both internally and externally) throughout a product life cycle not only substantially increases value, but also significantly reduces risk of failure. In part 2, we will examine the architecture in more detail, provide an example of implementing functional applications, and discuss a proof-of-concept prototype.

PRODUCT FOCUS: BIOPHARMACEUTICALS
PROCESS FOCUS: MANUFACTURING
WHO SHOULD READ: OPERATIONS, IT, REGULATORY/COMPLIANCE, AND PRODUCT AND PROCESS DEVELOPMENT MANAGERS
KEYWORDS: QUALITY BY DESIGN, SCALE-UP, DATA MANAGEMENT, PROCESS OPTIMIZATION, TECHNOLOGY TRANSFER
LEVEL: INTERMEDIATE


Background

The biopharmaceutical industry is trying to lower the high cost of developing successful new drugs. Manufacturers face significant pressure to reduce the cost of producing new, innovative biologics (2). However you measure it, developing a successful new drug is expensive: a bottom-up estimate from DiMasi in 2003 put the cost at US$800 million per successful candidate, and by 2012 it had risen to an estimated $2.16 billion (3, 4). Those rising costs — coupled with lengthening development time scales and the implicit need for process knowledge management — are forcing the industry to look very closely at its business models. Companies are seeking solutions that speed progress from laboratory to pilot scale and deliver cost-effective production processes.

Risk mitigation is another key area to address: reducing product attrition rates and the risk of problems associated with process transfer; scaling processes to pilot, clinical, and manufacturing operations; and, most important, reducing risk to patients based on product quality and supply. Many initiatives focus on reducing patient risk and product attrition rates.

Here we demonstrate that it is equally important to the future of the biopharmaceutical industry to develop a formal approach to process knowledge management. We contend that this effort will pay dividends by facilitating companies’ ability to leverage process knowledge.

What Does Process Knowledge Management Mean?

Knowledge is not data management: Its roots are in philosophy, and in a formal sense it is a collection of data, information, and/or skills acquired through experience or education — the theoretical or practical understanding of a subject. Knowledge can be implicit (as with practical skill or expertise), explicit (as with the theoretical understanding of a subject), and more or less formal or systematic.

Management of process knowledge is not a new concept to the pharmaceutical industry. From a regulatory perspective, it always has been seen as an important component of good manufacturing practice (GMP). Management of process knowledge has been implicit in practice, associated with the skills and knowledge of people involved in development and manufacturing. As the biopharmaceutical industry has evolved, with development times extending and products becoming ever more complex, that implicit approach to process knowledge management is breaking down.

At the international level, through work of the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH), regulators in the United States, European Union, and Japan have defined knowledge management in a guidance document. It is described as a “systematic approach to acquiring, analysing, storing, and disseminating information related to products, manufacturing processes, and components” (11). This is significant in that it clearly defines an expectation of a formal, explicit approach to process knowledge management. It is important to note that ICH Q10 sets out that expectation but does not define how to achieve it. Instead, it states, “Each company decides how to manage knowledge, including the depth and extent of information assessment based on their specific needs.”

Progress toward formalized process knowledge management is often hindered by the complexity and variability of biological product manufacturing. Current initiatives such as quality by design (QbD) are helping to define a framework for building process knowledge in development and manufacturing. Such initiatives generate large amounts of data, which must be managed appropriately to fully reap their benefits. Our focus is on facilitating development of a process used for product manufacture consistent with the QbD approach and reducing the impact of barriers that often inhibit the build-up of process knowledge.

Product development life cycles typically range from five to eight years for biopharmaceuticals. Long lag times between development and manufacture may make it difficult for companies to learn from past experience. Furthermore, investigational process-development activities usually take second priority to the main objective of making product as quickly as possible. Organizational divides that frequently separate development and manufacturing activities aggravate the situation, particularly when companies lack effective data management.

Data Management: A 2005 survey of pharmaceutical companies by Morris et al. revealed that few of the companies examined could track their data and decision-making processes during product development (5). Most were dissatisfied with the ability of their existing information technology (IT) systems to capture and manage drug development information. Survey results indicate that drug development specialists spend an average of five hours a week looking for data, and about two out of three respondents reported that they could not find 10–20% of what they needed. It is virtually impossible to achieve continuous improvement in such an environment, and the cost of lost time and rework can be significant.

High Product Attrition Rates: Biopharmaceuticals in early development have a relatively high attrition rate, so companies are reluctant to invest much effort in early process optimization. They prefer to optimize a process at a later stage in development, when the likelihood of candidate success is higher. Unfortunately, a process is more defined in its later stages, which significantly limits opportunities to optimize it.

Requirements

As the box above describes, regulatory expectations are contributing to an increasingly recognized need for companies to implement systems that leverage process knowledge throughout their products’ life cycles. Consequently, the industry wants to develop and deploy systematic, explicit approaches to developing and using such knowledge. It is important at the outset to define knowledge management in the context of biopharmaceutical manufacturing and to specify what is wanted of a system that manages process knowledge. These requirements can be considered in two parts: first, what is required from a product life-cycle approach; and second, what is expected of a system that will manage process information.

Users: The main objective is to provide a framework that facilitates development of an efficient manufacturing process for a product quickly and effectively using prior knowledge. Efficient processes can be defined as those that deliver a return on investment, are designed for the intended facilities, and are robust. A system must support development of manufacturing processes, enabling risk-based approaches that incorporate QbD and process analytical technology (PAT) concepts. Thus, for such a framework to have real value it must do the following:

  • Enable sharing and management of process information throughout a product’s life cycle in a structured database format across the enterprise.

  • Exploit and incorporate data generated through development of design space, control strategies, and technology transfer.

  • Track the evolution of a process in terms of operations, parameters, and resources.

  • Link knowledge management with quality risk management.

  • Leverage the use of prior knowledge (including data from other similar products).

  • Capture and use data gained throughout a product’s life cycle for the goal of continuous improvement.

System Architecture Overview: To meet those outlined user requirements, IT systems that support such a framework must be based upon an enterprise-level database that possesses a number of important attributes. A defined data structure should describe process information in a structured hierarchical way. A process description requires data integration and terminology standardization. The database should provide comprehensive, integrated data by linking data sources such as electronic laboratory notebooks (ELNs), lab information management systems (LIMSs), data historians (such as Historian software from Rockwell Automation), manufacturing execution systems (MESs), enterprise resource planning (ERP) systems, and so on.
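As a concrete (and purely illustrative) example of what such a structured, hierarchical process description might look like, the Python sketch below models a process as unit operations composed of standard actions and parameters, with links back to source systems such as an ELN or LIMS. The class names, fields, and example identifiers are assumptions made for illustration; they are not definitions from ISA-88 or from any particular system.

```python
# Minimal sketch (illustrative only) of a hierarchical process description
# in the spirit of ISA-88. All names and identifiers are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProcessParameter:
    name: str            # e.g., "temperature"
    value: float
    unit: str            # e.g., "degC"
    source: str = ""     # link back to the ELN/LIMS record that justifies the value

@dataclass
class ProcessAction:
    name: str                                  # e.g., "heat", "agitate"
    parameters: List[ProcessParameter] = field(default_factory=list)

@dataclass
class UnitOperation:
    name: str                                  # e.g., "cell culture", "capture chromatography"
    actions: List[ProcessAction] = field(default_factory=list)

@dataclass
class ProcessDescription:
    product: str
    operations: List[UnitOperation] = field(default_factory=list)
    data_links: Dict[str, str] = field(default_factory=dict)   # e.g., {"ELN": "...", "LIMS": "..."}

# Example fragment (hypothetical product and record identifiers)
desc = ProcessDescription(
    product="mAb-X",
    operations=[UnitOperation(
        name="cell culture",
        actions=[ProcessAction("heat", [ProcessParameter("temperature", 37.0, "degC")])],
    )],
    data_links={"ELN": "ELN-2011-0457"},
)
```

The point of such a structure is that every parameter remains traceable to its source record while the overall description stays readable as a hierarchy of operations and actions.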

Workflows should link data entry to users’ everyday needs, with outputs that provide fast and easy data analysis. Efficient electronic review and approval procedures should support an organization’s quality group. Making the system part of quality review requires risk assessment and compliance with the requirements of 21 CFR Part 11 (6). User interfaces and associated applications should be intuitive and platform independent.

A scalable and modular design allows integration with sophisticated IT systems, but it should be sufficiently simple to deploy with smaller stand-alone applications for companies with limited IT infrastructure. Software must be designed to comply with principles of good automated manufacturing practice (GAMP), with particular reference to configurable software packages (Category 4) and custom software (Category 5) (7).

The system must support and manage manufacturing process descriptions over an entire product life cycle — from early development through commercial manufacturing — with the goal of consistently delivering the intended performance. It must be able to transfer product and process knowledge between development and manufacturing groups, across manufacturing sites, and both within and outside the organization. For products brought into an organization (whether through acquisition or in-licensing), the data management system should be designed to rapidly import new process data sets. It must enable accumulation of manufacturing process performance information to facilitate continuous improvement as well as continued expansion of the body of process knowledge over time.

A Proposed Framework

To manufacturers from other industries, the answer to these challenges would seem obvious: implement product life-cycle management (PLM) tools. Yet even though many sophisticated IT solutions are available — ranging from ERP systems to ELNs — no such integrated solution currently exists for the biopharmaceutical industry. The basis for a PLM system must be its ability to represent process methods in an electronic, datacentric way. Formalizing a process description in that way allows process information to be tracked, compared, and used automatically.

In 2006, my company began working in this area after finding that hierarchical data models had been used successfully to represent process information in related industries (specifically the food and fine-chemical industries). Batch control standards used by those sectors are published by the International Society of Automation (ISA) and describe a hierarchical data model of an organization (Figure 1). We concluded that applying that data model would work for the biopharmaceutical industry (8).

Figure 1: The ISA hierarchical data model for describing an organization

In simple terms, the ISA-95 standard uses multiple models to explain the elements of enterprise control system integration, with the initial models being very abstract and the final models very detailed (9). ISA-88 provides a consistent, standard terminology for representing bioprocesses, which is essential to creating a data model with broad application (1). These standards are therefore a good starting point, given their widespread use in other industries and in MESs, control systems, and automation. Figure 2 outlines this data model, a formal datacentric view. Below, we discuss application of the model.

What Is a Recipe?

According to ISA-88, a recipe is a formal way of describing a process. In everyday terms, a general recipe equates to a process description and a master recipe to a standard operating procedure (SOP). Here are their formal definitions.

A general recipe is an enterprise-level recipe that serves as a basis for lower-level recipes. It is created without specific knowledge of process equipment that will be used to manufacture a given product. This recipe identifies raw materials, their relative quantities and required processing, without specific regard to a particular site or its equipment. A general recipe provides a means for communicating process requirements to multiple manufacturing locations. It may be used as a basis for enterprise-wide planning and investment decisions. And it is the output from development that describes the process required to manufacture a product.

A master recipe is targeted to a specific processing area, including associated process cell equipment. Some characteristics of master recipes include the following:

  • It has to be sufficiently adapted to properties of process equipment to ensure correct batch processing.

  • It must contain product-specific information required for detailed scheduling (e.g., process input information and equipment requirements).

Figure 2: The data model, a formal datacentric view

Two aspects of ISA-88 are especially useful for defining a bioprocess model: the separation of process requirements from equipment capabilities and an object-oriented, hierarchical data model. The concept of defining a process in dimensionless terms (without reference to scale) is a requirement for scale-up and facility-fit assessments. And defining bioprocesses as a sequence of unit operations facilitates use of the data model described by ISA-88.

As the box on this page describes, recipes are a focus of the knowledge-management model. A general recipe is the processing sequence required to make a given product defined in dimensionless terms. Regardless of scale, each product has one general recipe. When that recipe is carried out for a specific batch size, the resulting process sequence is a master recipe, for which the equipment required to make the product is defined in a physical model.

From the perspective of a biopharmaceutical company, key activities are creation of a general recipe during process development and transformation of that general recipe to a master recipe during technology transfer by mapping it to available resources and a physical model. The process of generating a first-pass technology transfer package can be automated using this approach. In addition, development scientists can assess the fit of the general recipe to any facility at any time during development. Figure 3 illustrates the relationship between general and master recipes.

Figure 3: The relationship between general and master recipes
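To make the relationship shown in Figure 3 more concrete, the minimal sketch below shows one way a dimensionless general recipe could be scaled to a facility-specific master recipe, including a simple facility-fit check. The linear scaling rule, class names, and vessel data are simplifying assumptions made for illustration rather than anything defined by ISA-88.

```python
# Illustrative sketch only: scaling a dimensionless general recipe to a
# facility-specific master recipe. The linear scaling rule and the capacity
# check are simplifying assumptions, not ISA-88 definitions.
from dataclasses import dataclass
from typing import List

@dataclass
class GeneralStep:
    operation: str
    material: str
    relative_quantity: float      # per unit of batch volume (dimensionless basis)

@dataclass
class Vessel:
    name: str
    working_volume_l: float

@dataclass
class MasterStep:
    operation: str
    material: str
    quantity_l: float
    vessel: str

def to_master_recipe(steps: List[GeneralStep], batch_volume_l: float,
                     vessels: List[Vessel]) -> List[MasterStep]:
    """Map each general-recipe step to site equipment and absolute quantities."""
    master = []
    for step in steps:
        quantity = step.relative_quantity * batch_volume_l
        # Facility-fit check: pick the smallest vessel that can hold the step.
        fitting = [v for v in vessels if v.working_volume_l >= quantity]
        if not fitting:
            raise ValueError(f"No vessel fits step '{step.operation}' ({quantity:.0f} L)")
        vessel = min(fitting, key=lambda v: v.working_volume_l)
        master.append(MasterStep(step.operation, step.material, quantity, vessel.name))
    return master

# Example: a 2,000 L batch mapped onto a hypothetical site with 500 L and 2,000 L vessels
general = [GeneralStep("dilute", "buffer A", 0.5), GeneralStep("load", "harvest", 1.0)]
site = [Vessel("V-500", 500), Vessel("V-2000", 2000)]
print(to_master_recipe(general, 2000, site))
```

Running the same mapping against different facility definitions is, in essence, what a facility-fit assessment does: the general recipe stays fixed while the physical model changes.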

A general recipe can be seen as the final result of a series of development studies. Information contained within a general recipe provides sufficient guidance for operations staff to generate manufacturing procedures and operate a process at any scale. In reality, process scheduling and performance are constrained by facility, resource, and equipment availability. Supporting information about the criticality of parameter ranges (particularly hold times and temperatures for process intermediates) is vitally important for troubleshooting during technology transfer and is therefore incorporated into the data model.

One challenge of defining bioprocesses in dimensionless terms is their wide range of unit operations and requirements for in-process monitoring and control. The approach taken for a knowledge-management model is to define a library of standard process actions that represent basic processing steps — e.g., heat, agitate, pressurize, and so on — and their associated parameters. Those defined process actions can then serve as building blocks to be assembled in different sequences to generate a general recipe with clearly identified control points. Figure 4 shows how, through rules and knowledge of available resources, an automated procedure can translate a general-recipe action into a detailed set of instructions for a master recipe.

Figure 4: Translating a general-recipe action into detailed master-recipe instructions using rules and knowledge of available resources
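In the same illustrative spirit as Figure 4, the sketch below expands one general-recipe action, drawn from a small library of standard actions, into equipment-specific instructions using simple rules. The action names, rule logic, and equipment tags are assumptions chosen for illustration only, not actions or rules defined by the standard.

```python
# Illustrative sketch: expanding general-recipe actions into detailed
# master-recipe instructions using simple rules and equipment knowledge.
# Action names, rules, and equipment tags are assumptions for illustration.

ACTION_LIBRARY = {"heat", "agitate", "pressurize"}   # standard building blocks

def expand_action(action: dict, equipment: dict) -> list:
    """Translate one general-recipe action into equipment-specific instructions."""
    if action["name"] not in ACTION_LIBRARY:
        raise ValueError(f"Unknown process action: {action['name']}")
    if action["name"] == "heat":
        return [
            f"Set {equipment['jacket']} setpoint to {action['target_c']} degC",
            f"Hold until probe {equipment['temp_probe']} reads {action['target_c']} degC",
        ]
    if action["name"] == "agitate":
        return [f"Start {equipment['agitator']} at {action['speed_rpm']} rpm"]
    return [f"Pressurize {equipment['vessel']} to {action['pressure_barg']} barg"]

# One general-recipe action mapped to a specific vessel's (hypothetical) equipment tags
instructions = expand_action(
    {"name": "heat", "target_c": 37},
    {"jacket": "TIC-101", "temp_probe": "TT-102", "agitator": "AG-100", "vessel": "V-2000"},
)
print("\n".join(instructions))
```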

Benefits

At the heart of the above-described framework is an ability to define and manage recipes. Although we have described only process recipes here, the same approach can be applied to support recipe segments such as buffer-preparation methods and analytical methods. By tracking key process information throughout a product’s life cycle, the model can enable scientists, engineers, and management to learn from their collective experiences.

A simple and intuitive approach to managing data involves separating key process information from routine experimental methods and general report information. Organized, accessible storage of data and methods supports rapid generation of developmental reports while making it easy to search for, retrieve, and interpret previous work. That reduces the amount of time scientists and engineers spend creating documentation while simplifying and accelerating acquisition of data from past studies. For example, a simple data-entry interface can allow development scientists to define process parameters, to reference supporting experiments, and to select buffer recipes and operating methods from a predefined standards list. Doing so saves them time when creating process documentation. Once a history of experiments and process definitions is built up over time, the resulting database can be used as a troubleshooting tool.
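As a rough illustration of such a data-entry workflow, the sketch below validates a process entry against predefined lists of standard buffer recipes and operating methods and stores only references to them, keeping key process information separate from free-text report content. All identifiers and the standards themselves are hypothetical.

```python
# Illustrative sketch: a structured data-entry record that references
# predefined standards rather than copying them. All names are hypothetical.

STANDARD_BUFFERS = {"B-014": "50 mM Tris, 150 mM NaCl, pH 7.4"}
STANDARD_METHODS = {"M-CHR-003": "Protein A capture, 5 min residence time"}

def new_process_entry(step: str, parameters: dict, buffer_id: str,
                      method_id: str, experiment_refs: list) -> dict:
    """Validate references against the standards lists and build an entry."""
    if buffer_id not in STANDARD_BUFFERS:
        raise ValueError(f"Unknown buffer recipe: {buffer_id}")
    if method_id not in STANDARD_METHODS:
        raise ValueError(f"Unknown operating method: {method_id}")
    return {
        "step": step,
        "parameters": parameters,                   # key process information only
        "buffer": buffer_id,                        # pointer to the standard, not a copy
        "method": method_id,
        "supporting_experiments": experiment_refs,  # links, not embedded reports
    }

entry = new_process_entry(
    "capture chromatography",
    {"load_g_per_l": 35, "flow_cm_per_h": 300},
    buffer_id="B-014", method_id="M-CHR-003",
    experiment_refs=["ELN-2011-0457"],
)
```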

The popularity of platform processes indicates the value of standardization for bioprocesses. A common data center that provides access to preferred materials and standard recipes can promote development of robust process designs and enable development scientists to focus on refining those process sections most likely to affect product quality or process performance. In addition, ensuring that standard representations and terminology are used throughout a process can reduce confusion during technology transfer and improve communication among groups.

Ultimately, the goal of a process knowledge management model is to enable feedback from many areas (including prior manufacturing performance) to guide process development and create a model that will support continuous improvement and better process understanding. This is only the beginning. Figure 5 provides a glimpse of the potential for such a platform. Potential applications include

  • Resource management supporting an enterprise, ensuring that all parts of an organization work from the correct information relating to a product’s manufacture

  • Standard reports instantly available to support chemistry, manufacturing, and controls (CMC) sections that detail the evolution of a process and the rationale for related decisions with back-up references

  • Facility-fit analysis throughout the development life cycle, reducing the risk of unexpected issues when a process is transferred to a particular facility (and it is possible to use this tool to select the best facility for a given product)

  • Automatic generation of technology transfer packages for use within a manufacturing network

  • Understanding of cost implications that come with process and technology choices throughout a product’s life cycle

  • Provision of a framework for managing and maintaining platform processes (together with their controls) to effectively communicate methods and manage their deployment

  • Better informed decision making through using the latest structured data delivered to the right people when they need it.

Figure 5: Potential applications of a process knowledge management platform

The overall benefit of systematic process knowledge management is a reduction in effort (and risk) in the development and manufacture of biopharmaceuticals. Systematic knowledge management will help companies develop more cost-effective processes faster and earlier in development without significant additional investment. The ability to bring more efficient processes into manufacturing will help companies realize necessary resource reductions throughout their products’ life cycles. In the second half of this article, our emphasis moves from theory to the development and application of this concept in a live environment, illustrated with a case study.

About the Author

Author Details
Corresponding author Andrew Sinclair is president and founder of Biopharm Services Ltd., Lancer House, East Street, Chesham, Bucks, HP5 1DG, United Kingdom; 44-1494-793243, fax 44-1494-785954; [email protected]; www.biopharmservices.com. Miriam Monge is vice president of sales and marketing, and Dr. Andrew Brown is head of the consultancy at Biopharm Services.

REFERENCES

1.) 2010. ANSI/ISA-88.01: Batch Control. International Society of Automation, Research Triangle Park, NC.

2.) Harris, G. 2008. The Evidence Gap: British Balance Benefit vs. Cost of Latest Drugs. The New York Times, 2 December 2008.

3.) DiMasi, JA, RW Hansen, and H. Grabowski. 2003a. The Price of Innovation: New Estimates of Drug Development Costs. J. Health Econ. 22:151-185.

4.) Light, DW, and A. Warburton. 2011. Demythologizing the High Costs of Pharmaceutical Research. BioSocieties (London School of Economics and Political Science):1–17; www.pharmamyths.net/files/Biosocieties_2011_Myths_of_High_Drug_Research_Costs.pdf.

5.) Morris, K, S Venugopal, and M. Eckstut. 2005. Making the Most of Drug Development Data. PharmaManufacturing 4:16-23.

6.) CBER/CDER/CDRH/CFSAN/CVM/ORA 2003. Guidance for Industry: Part 11, Electronic Records; Electronic Signatures — Scope and Application, US Food and Drug Administration, Rockville.

7.) 2008. GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems. International Society for Pharmaceutical Engineering, Tampa, FL.

8.) 2005. ANSI/ISA-95: Enterprise-Control System Integration. International Society of Automation, Research Triangle Park, NC.

9.) Sinclair, A, and C. Hill. 2007. Process Development: Maximizing Process Data from Development to Manufacturing. BioPharm Int. 20.

10.) 2000–2005. ANSI/ISA-95: Manufacturing Enterprise Systems Standards, Second Edition. International Society of Automation, Research Triangle Park, NC.

11.) ICH Q10 2009. Pharmaceutical Quality System. US Fed. Reg. 74:15990-15991.
