Streamlining process development has been the focus of the biotechnology industry over the past several years. To be financially viable in the current market, a company has to be competitive in all three of the following areas: quality, speed, and price (1). Attaining any two of the three attributes at a time is no longer sufficient. With new tools and technologies and an improved understanding of the cell-culture process, conducting high-quality process development while reducing both cycle time and cost is becoming a reality.
Effective process development must include accurate scale-down models of the large-scale manufacturing process. For most downstream unit operations, the important scale-up and scale-down considerations are well understood; normalizing flow rates and column geometries across scales is a straightforward concept to implement for purification operations. For bioreactor process development, however, important factors such as mass transfer and mixing are more difficult to ...
Typical serum-free culture media used in bioprocessing can have 60–90 components at differing concentrations to feed a single cell line. Media used to grow different cell lines for bioprocessing applications may each require unique optimal chemical formulations. Adding complexity, optimal process conditions such as pH and stirring rate may also differ from cell line to cell line depending on the unique characteristics of process performance.
To tackle all those variables, we at Invitrogen Corporation of Carlsbad, CA (www.invitrogen.com/pddirect) use design-of-experiments (DoE) statistical methods, which reveal the complicated array of multifactor interactions involved in bioprocess development. Some experiments use high-throughput tools such as a robotically controlled microbioreactor system capable of conducting hundreds of simultaneous bioreactor experiments (Figure 1).
We know that a sound DoE strategy combined with the right tools can be used to identify truly op...
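The multifactor screening described above can be sketched with a minimal two-level full-factorial design. The factor names, levels, and titer responses below are illustrative assumptions, not Invitrogen's actual protocol; a real study would use fractional designs and statistical software to handle dozens of media components.

```python
from itertools import product

# Hypothetical two-level bioreactor factors (illustrative, not from the article):
FACTORS = {
    "pH": (6.8, 7.2),
    "temp_C": (34.0, 37.0),
    "agitation_rpm": (80, 120),
}

def full_factorial(factors):
    """Return all 2^k low/high level combinations as a list of run dicts."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

def main_effect(design, responses, name, factors):
    """Average response at a factor's high level minus at its low level."""
    low, high = factors[name]
    hi_vals = [r for run, r in zip(design, responses) if run[name] == high]
    lo_vals = [r for run, r in zip(design, responses) if run[name] == low]
    return sum(hi_vals) / len(hi_vals) - sum(lo_vals) / len(lo_vals)

design = full_factorial(FACTORS)  # 2^3 = 8 runs for 3 factors
# Hypothetical product titers (g/L), one per run, in design order:
titers = [1.1, 1.3, 1.0, 1.2, 1.6, 1.9, 1.5, 1.8]
effect_pH = main_effect(design, titers, "pH", FACTORS)
print(f"pH main effect: {effect_pH:+.2f} g/L")
```

With all level combinations run, the same response data also support estimating two-factor interaction effects, which is what makes DoE more informative than one-factor-at-a-time experiments.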
Most people in the industry are struggling with quality by design (QbD) and how it relates to the acceleration of process development. Many are confused by the new FDA approach to bioprocess development, unsure of the specific implications of QbD for the CMC section of their marketing applications, and unclear how the risk-based approach applies to their particular operations. Some have trouble understanding the precise link between CQAs and CPPs under a life-cycle approach and are stuck considering the exact definitions of such terms as "critical" and "variable."
But help is coming on many fronts. FDA reviewers are being specifically trained to expect and assist in the incorporation of QbD principles in regulatory filings. Results from the QbD pilot program are emerging as a valuable resource for guidance. Industry leaders are publishing and providing case studies about their experiences, as seen in this special issue.
Independent associations such as the ISPE, IEEE, and IFPAC are providing tools for comprehen...
Process development for large-scale bioproduction is generally more labor-intensive, time-consuming, and expensive than for comparable nonbiological processes because of the large number of individual process steps and potential variables involved. To ensure the future commercial viability of biological manufacturing processes and prevent bottlenecks, it is essential to accelerate development of both upstream and downstream processing, as well as to improve process analytics. This not only reduces the time and cost of designing robust bioprocessing protocols, but also shortens time to market for new products, offering better returns on research and development investments before patent expiry. The large number of variables and complex processing requirements of biological products are especially challenging for early-phase process design, requiring a variety of strategies to achieve rapid bioprocess optimization.
Miniaturization of bioprocess unit operations to the microliter scale offers a co...
As an updated US FDA guidance document emphasizes, the life-sciences industry needs to use data to better understand manufacturing processes and sources of variation, both to minimize product risk and to achieve better process control in future batches (1). Lessons learned through such efforts also can be applied to future process design, extending the value of data analysis. Bioprocess manufacturers typically rely on lot traceability to determine the composition of their final manufactured products. Lot traceability requires knowing all the upstream components that made up a final batch — and, therefore, which product lots need to be recalled when there is a defect in an upstream material or process condition. But lot traceability is only one aspect of the required capability, and it alone cannot meet all needs, especially when process streams undergo multiple splits and recombinations during the course of production.
A better solution is to use an appropriately designed enterprise manufacturing intelligence (EMI) ...
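The splits-and-recombinations problem above can be illustrated as a traversal of a lot-genealogy graph: given a defective upstream lot, find every downstream lot it touched. The lot IDs and edge records below are hypothetical, and a production EMI system would query this from batch records rather than an in-memory list.

```python
from collections import defaultdict, deque

# Hypothetical genealogy records as (source_lot, destination_lot) edges.
# "int-10" is a recombination of two raw lots; it then splits into two streams.
EDGES = [
    ("raw-A1", "int-10"), ("raw-B2", "int-10"),    # recombination
    ("int-10", "int-11a"), ("int-10", "int-11b"),  # split
    ("int-11a", "final-100"), ("int-11b", "final-101"),
    ("raw-C3", "final-101"),
]

def downstream_lots(edges, defective):
    """Breadth-first search over the genealogy graph, returning every
    lot reachable downstream of the defective one."""
    children = defaultdict(list)
    for src, dst in edges:
        children[src].append(dst)
    seen, queue = set(), deque([defective])
    while queue:
        lot = queue.popleft()
        for nxt in children[lot]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(downstream_lots(EDGES, "raw-A1")))
# → ['final-100', 'final-101', 'int-10', 'int-11a', 'int-11b']
```

Because the genealogy is a directed graph rather than a simple chain, a defect in one raw lot can implicate final lots on several branches at once, which is exactly the case that one-dimensional lot tracking misses.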
Surveying BPI readers’ experiences
Better, faster, safer: The current drug-development “paradigm” emerging from the FDA is pushing for innovations that reduce process inefficiency and cost. The plethora of new risk-based methodologies includes process analytical technology (PAT) tools being developed within the boundaries of a process design space. All this parallels (and drives) predictions that the biotechnology industry has seen the last of its blockbuster models, as predictive genomic tools enable personalized approaches to therapeutic development.
Robert Goldberg wrote the following in the DrugWonks blog (www.fiercepharma.com/forward/emailref/9481):
The key to better drug development is not more bureaucrats or lawsuits, but a stronger scientific foundation for risk assessment, which is at the foundation of everything the FDA does. And genomics should play a central role in building that scientific platform…. How well will the future crop of drug ...