The ability to monitor unit operations in biomanufacturing is essential because it enables early fault detection and effective root-cause analysis. Below, we present a case study on the development of a stand-alone, data-driven, process-monitoring application for a biomanufacturing purification process. We review the application’s functionality and highlight its utility using a few examples from commercial manufacturing of a therapeutic protein. Lessons learned from the development of that application also are presented. The progress and performance of a purification process have been monitored through trends in UV absorbance (for determining protein content), conductivity (for determining buffer salt content), and pressure (for detecting blockage in a system).

Purification Process and Data
In a purification process, a recombinant therapeutic protein synthesized during a cell-culture process is isolated and purified from the pool of other proteins that are simultaneously produced by mamm...
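As a rough illustration of the kind of data-driven trend monitoring described above, the sketch below derives per-time-point limits for UV absorbance, conductivity, and pressure traces from historical batches and flags excursions in a current batch. The file layout, column names (time_min, uv_280, conductivity, pressure), and the simple mean ± 3σ profile limits are assumptions made for illustration, not details of the application discussed here.

```python
# Minimal sketch of time-aligned trend monitoring for a purification step.
# Assumes one CSV per historical batch with a shared "time_min" axis and
# hypothetical columns uv_280, conductivity, and pressure; limits are
# per-time-point mean +/- 3*sigma across the historical batches.
import glob
import pandas as pd

SIGNALS = ["uv_280", "conductivity", "pressure"]

def load_batches(pattern: str) -> list[pd.DataFrame]:
    return [pd.read_csv(p).set_index("time_min") for p in glob.glob(pattern)]

def profile_limits(batches: list[pd.DataFrame], signal: str) -> pd.DataFrame:
    """Per-time-point control limits for one signal across historical batches."""
    profiles = pd.concat([b[signal] for b in batches], axis=1)
    mean, std = profiles.mean(axis=1), profiles.std(axis=1)
    return pd.DataFrame({"lcl": mean - 3 * std, "ucl": mean + 3 * std})

def flag_excursions(current: pd.DataFrame, signal: str, limits: pd.DataFrame) -> pd.DataFrame:
    """Time points in the current batch where the signal leaves its limits."""
    joined = current[[signal]].join(limits, how="inner")
    mask = (joined[signal] < joined["lcl"]) | (joined[signal] > joined["ucl"])
    return joined[mask]

if __name__ == "__main__":
    history = load_batches("historical_batches/*.csv")      # hypothetical path
    current = pd.read_csv("current_batch.csv").set_index("time_min")
    for sig in SIGNALS:
        limits = profile_limits(history, sig)
        excursions = flag_excursions(current, sig, limits)
        print(f"{sig}: {len(excursions)} out-of-limit time points")
```

A production application would add batch alignment, multivariate limits, and reporting, but the same pattern of learning limits from historical runs and screening new runs against them underlies most data-driven monitoring tools.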
Bioprocess models and simulations are the basis for digital twins, which are virtual representations of physical processes and enabling methods of biopharma 4.0. Early adopters in the industry have shown the potential of this approach in nearly all stages of a product development life cycle. Experts in academia and the biopharmaceutical industry have studied mechanistic modeling as the main approach to chromatography modeling. Mechanistic models are mathematical descriptions of physicochemical phenomena. Because they are based on first principles, they can provide a full description of a system. Compared with purely data-driven models, mechanistic models offer higher predictive power, the potential to extrapolate well outside of a calibration space, and high fidelity for scale-up prediction. Studies have focused on applications of chromatography mechanistic modeling to process development and optimization (1–3), robustness (4–6), and scale-up (7, 8). Other studies on application to quality ...
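To make the mechanistic-modeling idea concrete, here is a minimal sketch of a lumped kinetic (transport-dispersive) column model with a Langmuir isotherm, one common first-principles formulation for chromatography. All parameter values are illustrative placeholders chosen so the simulation produces an eluting peak; they are not taken from the cited studies or calibrated to any real system.

```python
# Minimal sketch of a lumped kinetic column model with a Langmuir isotherm,
# discretized in the axial direction and integrated with SciPy.
# All parameters below are assumed, illustrative values.
import numpy as np
from scipy.integrate import solve_ivp

L, N = 0.1, 100                   # column length [m], axial grid cells
u, D_ax = 1.0e-3, 1.0e-7          # interstitial velocity [m/s], axial dispersion [m^2/s]
eps = 0.4                         # total porosity
F = (1 - eps) / eps               # phase ratio
q_max, K, k_kin = 5.0, 1.0, 0.5   # Langmuir capacity, equilibrium constant, LDF rate [1/s]
c_feed, t_load = 1.0, 120.0       # feed concentration, injection duration [s]
dz = L / N

def langmuir(c):
    """Equilibrium bound concentration for mobile-phase concentration c."""
    return q_max * K * c / (1.0 + K * c)

def rhs(t, y):
    c, q = y[:N], y[N:]
    c_in = c_feed if t < t_load else 0.0          # rectangular injection pulse

    c_up = np.concatenate(([c_in], c))            # ghost cell at the inlet
    dcdz = (c_up[1:] - c_up[:-1]) / dz            # upwind first derivative
    c_pad = np.concatenate(([c_in], c, [c[-1]]))  # zero-gradient outlet
    d2cdz2 = (c_pad[2:] - 2 * c_pad[1:-1] + c_pad[:-2]) / dz**2

    dqdt = k_kin * (langmuir(c) - q)              # linear driving force kinetics
    dcdt = -u * dcdz + D_ax * d2cdz2 - F * dqdt
    return np.concatenate([dcdt, dqdt])

y0 = np.zeros(2 * N)
sol = solve_ivp(rhs, (0.0, 1200.0), y0, method="BDF",
                t_eval=np.linspace(0.0, 1200.0, 400))
outlet = sol.y[N - 1, :]                          # chromatogram at the column outlet
print(f"outlet peak: {outlet.max():.3f} at t = {sol.t[outlet.argmax()]:.0f} s")
```

More detailed formulations (general rate models, competitive isotherms, pH and salt dependence) follow the same structure: mass balances from first principles, plus an adsorption model, solved numerically.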
Numeric results from quality-attribute testing of drug product and drug substance lots can be used for different statistical analyses. One such analysis is the calculation of statistical tolerance intervals from lot-release data to assist in the determination of specification acceptance criteria (1). Data from manufactured batches placed on stability at the recommended storage condition (RSC) also can provide useful information for estimating long-term variation. Below, I address potential concerns associated with pooling disparate data sources and illustrate a technique for performing the appropriate calculations using statistical software.

Modeling Data from Two Sources
Often, only limited data are available for a biopharmaceutical product before large-scale manufacturing. My analysis considered the pooling of reportable values from two distinct data sources to estimate long-term manufacturing variability: lot release and lots that have been placed on a stability study and are stored at the RSC. T...
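As a simple illustration of the tolerance-interval calculation mentioned above, the sketch below computes a two-sided normal tolerance interval from pooled reportable values using Howe's k-factor approximation. The data values and the coverage/confidence settings are hypothetical, and the sketch treats the pooled values as a single normal sample; it does not reproduce the variance-component treatment of the two data sources discussed here.

```python
# Minimal sketch of a two-sided normal tolerance interval from reportable
# values, using Howe's (1969) k-factor approximation. Data are hypothetical.
import numpy as np
from scipy.stats import norm, chi2

def howe_k_factor(n: int, coverage: float = 0.99, confidence: float = 0.95) -> float:
    """Approximate two-sided tolerance-interval k factor."""
    nu = n - 1
    z = norm.ppf((1 + coverage) / 2)
    chi2_crit = chi2.ppf(1 - confidence, nu)   # lower-tail chi-square quantile
    return z * np.sqrt(nu * (1 + 1 / n) / chi2_crit)

def tolerance_interval(values, coverage=0.99, confidence=0.95):
    x = np.asarray(values, dtype=float)
    k = howe_k_factor(len(x), coverage, confidence)
    mean, sd = x.mean(), x.std(ddof=1)
    return mean - k * sd, mean + k * sd

if __name__ == "__main__":
    # hypothetical reportable values pooled from lot release and RSC stability
    reportable = [98.1, 99.4, 100.2, 97.8, 99.0, 100.5, 98.7, 99.9, 98.4, 99.6]
    low, high = tolerance_interval(reportable)
    print(f"99%/95% tolerance interval: ({low:.2f}, {high:.2f})")
```

When the two data sources have different variance structures, a mixed-effects or variance-components model is more appropriate than a naive pooled calculation; the interval above only shows the basic k-factor mechanics.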
Using parallel systems such as the DASbox minibioreactor system helps to increase reproducibility between bioprocess runs. Reproducible cell growth and reliable development of a desired product are ideal outcomes for a bioprocessing engineer. If reproducibility is poor, the risk of having to discard a batch and repeat an entire bioprocess is high, resulting in a great loss of time and resources. Cells, culture media, and a bioprocess control system are required components of an upstream bioprocess. Each of those can be a source of variability that affects cell growth and viability as well as product formation and quality. Below, I ask experts at Eppendorf to describe the factors that can contribute to inconsistent results and explain how to increase the reproducibility of cell-culture bioprocesses. Amanda Suttle is a bioprocess senior research scientist in the Eppendorf applications laboratory in Enfield, CT. Robert Glaser is an applications laboratory manager at the Eppendorf Bioprocess Center in Jülic...