Next-Generation Biotechnology Product Development, Manufacturing, and Control Strategies, Part 2: Process Modeling and Analytics
Part 1 of this article focused on the first two sessions of a CASSS chemistry, manufacturing, and controls (CMC) forum entitled “Next-Generation Biotechnology Product Development, Manufacturing, and Control Strategies,” which took place on 16–17 July 2018 in Gaithersburg, MD. Those sessions focused on upstream and downstream process technologies and strategies (1). Part 2 highlights the final two conference sessions on process modeling, control, and analytics.
Process Modeling and Control
The third conference session was “Modeling and Control Strategies.” Advancements in analytics and modeling and their application to biotechnology products are driving a resurgence in endeavors to transform process development and control strategies. The strategic use of analytical methods, data, and models to drive rapid advancements in process understanding, adjustments to manufacturing operations in real time, and approaches to real-time release testing (RTRT) can save resources, speed up development cycles, and increase process robustness and reliability. However, the appropriateness of such mechanisms with respect to assurance of acceptable product quality and to meeting health authority requirements must be aligned between regulatory agencies and the industry. The session provided a discussion of new modeling applications and control strategies that are being developed and implemented for biotechnology products.
CMC Issues

The CMC Strategy Forum series provides a venue for biotechnology and biological product discussion. These meetings focus on relevant chemistry, manufacturing, and controls (CMC) issues throughout the lifecycle of such products and thereby foster collaborative technical and regulatory interaction. The Forum strives to share information with regulatory agencies to assist them in merging good scientific and regulatory practices. Outcomes of the Forum meetings are published in this peer-reviewed journal to help assure that biopharmaceutical products manufactured in a regulated environment will continue to be safe and efficacious. The CMC Strategy Forum is organized by CASSS–Sharing Science Solutions and is supported by the US Food and Drug Administration (FDA).
The first presentation was “Leveraging an Integrated Data Platform to Advance Process Understanding and Enhance Continued Process Verification (CPV),” by Patrick Gammell (Amgen Inc.). He highlighted that biomanufacturing processes are highly monitored and thus data rich. However, the ability to access significant amounts of data and then maximize their value to make actionable analytical assessments requires an integrated data platform.
Currently, engineers at many biopharmaceutical companies spend too much time formatting data, and those data are becoming more complicated. Amgen leveraged partnerships with data scientists in academia to better understand mechanisms for using more of the available data and the types of process and product knowledge that could be identified. Data scientists were brought into Amgen, and a data lake was established with all systems flowing into one source; that integration has been transformational. A data lake allows a systematic approach to integrating and leveraging data from across a validation lifecycle to understand sources of variation and maximize process capability.
The data infrastructure has had a significant effect on CPV through near–real-time access to process and product data from across the entire portfolio. That enables near–real-time monitoring with a depth that facilitates detection of weak signals and provides product and process insights that can be fed back into a system for continual improvement. Putting such a system in place and realizing its full potential regarding raw-materials variability and emerging technologies can be challenging. But improved knowledge management and data integration can provide resource relief and powerful, actionable insights, especially for cases in which there is limited prior knowledge.
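To make the monitoring idea concrete, below is a minimal sketch in Python (with hypothetical column names and simulated data, not Amgen’s actual platform) of how consolidated batch data could feed a near–real-time job that flags weak signals with an exponentially weighted moving-average (EWMA) chart.

```python
# Minimal illustrative sketch, not Amgen's system: once batch data are
# consolidated in a data lake, a lightweight job can flag weak signals
# with an EWMA control chart. Column names and data are hypothetical.
import numpy as np
import pandas as pd

def ewma_alerts(df, value_col="titer_g_per_L", lam=0.2, L=3.0):
    """Flag batches whose EWMA drifts outside +/- L*sigma_EWMA limits."""
    x = df[value_col].astype(float)
    mu, sigma = x.mean(), x.std(ddof=1)        # baseline from historical batches
    z = x.ewm(alpha=lam, adjust=False).mean()  # exponentially weighted average
    half_width = L * sigma * np.sqrt(lam / (2 - lam))  # asymptotic EWMA limits
    out = df.copy()
    out["ewma"] = z
    out["alert"] = (z - mu).abs() > half_width
    return out

# Simulated batch records with a small upward drift an EWMA chart can catch:
rng = np.random.default_rng(0)
batches = pd.DataFrame({
    "batch_id": range(60),
    "titer_g_per_L": rng.normal(5.0, 0.2, 60) + np.linspace(0.0, 0.3, 60),
})
print(ewma_alerts(batches).tail())
```

An EWMA chart is only one option, but the design point stands: integrated, metadata-rich data make this kind of routine, automated surveillance feasible across an entire portfolio.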
Michalle Adkins (Emerson Automation Solutions) presented “Paving the Road Toward Real-Time Release Testing” and described the BioPhorum Biomanufacturing Technology Roadmap and In-Line Monitoring/Real Time Release (ILM/RTR) working group. The roadmap was developed as a collaborative effort of the biopharmaceutical industry, including manufacturers and suppliers (2). The ILM/RTR team was established as part of the roadmap process, with the goal of decreasing product-release times while improving quality, efficiency, and supply. The ILM/RTR team has been developing a prioritized list of tests that are required for a typical monoclonal antibody (MAb) batch-mode process and that could be moved in-line, on-line, or at-line. The team expects many aspects of this type of process to be applicable to other product and manufacturing modalities. The group is assessing the current state, future goals, drivers for change, prioritization, and best opportunities for improvement. Next steps for the ILM/RTR working group are obtaining additional industry feedback, developing business cases, and creating user requirement specifications (URS) for high-priority cases.
The third presentation was “Advanced Process Controls and/or Process Modeling for Biopharmaceutical Manufacturing in Practice in a New Manufacturing Plant,” by Saly Romero-Torres (Biogen). Her talk was based on a primary objective of the process analytical technology (PAT) framework: the promotion of process understanding and predictability gained from deep process knowledge derived from first principles and experience. The goal is for that to evolve into process intelligence with vertical and horizontal control. Modeling (such as that being implemented at Biogen) can enable different levels of process intelligence (prediction and prescription) depending on model robustness and the available data package. Evaluating a model’s impact and its risk to patients is important. Biogen is adopting a modeling culture, which requires a standardized language and a shared understanding of the process and overarching control strategy.
The company is creating maturity models similar to those used by high-tech business sectors such as the semiconductor industry. The maturity model requires that a plan for model evolution (and evolving intended use) be conceived and socialized among subject matter experts (SMEs) and regulatory agencies early in process development. Currently, only minimal regulatory guidance is available on modeling requirements and expectations, especially for high-impact models, which entail more risk to patients and thus are highly scrutinized. The plan for model evolution is crucial, particularly when implementing data-driven models that rely on process experience, which is not available in early development or even at early commercial manufacturing stages.
Modeling could support supply-chain predictability and reliability as well as advanced process control (from fault detection to model-based optimization). Specific intended uses define the requirements. A well-planned modeling continuum should allow the benefits of modeling activities to be realized early on while evolving into more mature prescriptive controllers and release-by-exception activities. More knowledge should allow for a more flexible control strategy with less regulatory oversight.
The final talk of the session was “DSP Meets Data Science: Applying Modeling and Machine Learning for Bioprocess Development,” by Ferdinand Stueckler (Roche Diagnostics GmbH). He covered computational modeling, simulation, and machine learning techniques that can be used to summarize experimental data and offer a framework for inclusion of process knowledge from historical data to ensure efficient and robust downstream processing (DSP) development and design.
Model-based techniques and predictions improve process development by guiding experimental design. However, their success depends on the prediction accuracy of in silico simulations, which can be influenced by general model assumptions, data quality, and model parameter-estimation results. Stueckler discussed strategies that consider the influence of those factors on the prediction accuracy required for specific process development stages. He highlighted applications of those strategies and modeling approaches by presenting case studies in which data generated in silico and experimental data from multivariate studies were used, and computational methods were applied to evaluate model prediction quality.
During early stage development, available data sets might be limited, and establishing high accuracy can be difficult. However, semiquantitative model predictions have been shown to satisfy requirements to guide process development and optimize protein purification processes by supporting smart experimental designs and decision-making by SMEs. Models should be living documents that can be updated over time and used to support early and late development and commercial manufacturing process control.
A model does not need to be perfect as long as limitations of the predictions are assessed and understood. Challenges and gaps remain to be addressed for successful implementation of data-based technologies as process development tools. Acceleration and streamlining of process development efforts for speed to clinic, commercialization, and improved commercial control strategies will make such efforts worthwhile.
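As a simple illustration of the parameter-estimation and prediction-quality checks described above (a generic sketch, not the Roche workflow), the following Python snippet fits a Langmuir adsorption isotherm, a common building block of mechanistic chromatography models, to hypothetical batch-uptake data and scores its predictions on held-out points.

```python
# Generic sketch, not the presented workflow: estimate parameters of a
# simple Langmuir adsorption isotherm and check prediction quality on
# held-out data. All measurement values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k):
    """Bound protein q as a function of liquid-phase concentration c."""
    return q_max * c / (k + c)

# Hypothetical uptake data: concentration (g/L) vs. bound protein (mg/mL resin)
c = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0])
q = np.array([8.0, 17.5, 29.0, 42.0, 54.0, 62.0, 67.0])

train, test = slice(0, 5), slice(5, None)   # hold out the two highest loads
params, _ = curve_fit(langmuir, c[train], q[train], p0=[70.0, 1.0])
pred = langmuir(c[test], *params)

rmse = np.sqrt(np.mean((pred - q[test]) ** 2))
print(f"q_max = {params[0]:.1f}, k = {params[1]:.2f}, held-out RMSE = {rmse:.2f}")
# A large held-out error would signal that model assumptions, data quality,
# or parameter estimates limit accuracy for the current development stage.
```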
CMC Strategy Forum July 2018 Scientific Organizing Committee

Anthony Mire-Sluis (AstraZeneca), Sarah Kennett (Genentech, a member of the Roche Group), Siddharth Advant (Celgene Corporation), Cristina Ausin-Moreno (CDER, FDA), Barry Cherney (Amgen Inc.), Steven Falcone (Sanofi), Jie He (CBER, FDA), Alexey Khrenov (CBER, FDA), Michael Tarlov (NIST-National Institute of Standards and Technology), and Kimberly Wolfram (Biogen)
Panel Discussion
Following the presentations, Aikaterini Alexaki (Center for Biologics Evaluation and Research, CBER, US Food and Drug Administration, FDA), Katia Ethier (Health Canada), and Thomas O’Connor (Center for Drug Evaluation and Research, CDER, FDA) joined session speakers for a panel discussion of key points regarding data gathering and analysis and data-driven modeling.
Driver and Utility for Large Data Analysis: A key driver for changing the methods and tools used for data management has been the underuse of data being generated. In general, CPV still is reactive, and the biopharmaceutical industry is not translating process data into process knowledge. New data-handling processes will enable better use of data and real-time analysis that can drive continuous improvement for more reliable and productive processes. Big data and data lakes are not just about gathering data. By itself, a data lake does not solve problems; you could end up with a “data swamp.”
The biopharmaceutical industry needs to invest in the right tools and learn how to analyze and exploit collected data. Other industries accumulate more data than the biopharmaceutical industry does and use those data better. Standard multivariate approaches cannot handle the volume and complexity of biomanufacturing data, so the industry needs new approaches and can learn from other industries.
Gathering Data: Data systems can take several years to build, and they have a major resource impact. It is best to have information technology personnel and end users work together to design such systems and ensure that they are fit for purpose. The systems should be built to enable gathering and analysis of process and product characterization data from the initiation of product development. It is critical to have metadata attached to the data to enable accurate analysis. Many individual systems (e.g., manufacturing execution systems (MES) and systems applications and products (SAP)) do include those data, but discipline is needed to ensure continued proper storage. Some data collection is likely to be less valuable than others. But when analysts apply better techniques for assessing data, they can better understand what the data are telling them, and what might seem like minimal data could have a major long-term impact.
Sources of data variability need to be identified (e.g., variability from analytical methods). Techniques can be established to separate method variability from other types of variability. Looking at signal-to-noise ratios for all measurements and applying fitting (e.g., to smooth signals) can be valuable. Ensuring that equipment is fit for use is critical.
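The snippet below illustrates that idea in Python (a hedged sketch with simulated sensor data, not any company’s method): a Savitzky–Golay filter separates a smoothed trend from residual noise, giving a crude signal-to-noise estimate that helps attribute variability to the measurement method rather than the process.

```python
# Hedged illustration with simulated data: estimate signal-to-noise for an
# in-line measurement by smoothing it and treating residuals as method noise.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 500)                  # hypothetical culture time (h)
trend = 5 + 2 * np.tanh(t - 5)               # made-up underlying process trend
raw = trend + rng.normal(0, 0.3, t.size)     # sensor readings with noise

smoothed = savgol_filter(raw, window_length=51, polyorder=3)
residuals = raw - smoothed                   # approximates method variability

snr = np.var(smoothed) / np.var(residuals)   # simple variance-ratio estimate
print(f"estimated signal-to-noise ratio: {snr:.1f}")
```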
Developing Models: Model development requires a good team of experts working together. Data scientists are influencing the thinking of engineers and scientists. Encouraging process engineers and statisticians to gain more knowledge of each other’s disciplines will positively affect model development, and data scientists are teaching industry experts how to be better about collecting and using data.
Using data-driven modeling is not enough for control and release of complex processes. Process understanding always is needed. Although in silico data are unlikely to replace wet experimentation, the amount of testing required can be reduced by focusing experiments on aspects that models cannot yet fully describe. The use of actual values instead of process setpoints (e.g., for pH) can provide significantly better models.
The ultimate success of a model-based process-control strategy will depend on the level of confidence in the overarching control strategy and the ability to justify variables in a data-driven model. For example, a data-driven model is easier to use for a chromatography unit than it is for a bioreactor because the chromatography unit operation has fewer associated variables (degrees of freedom), but the use of first principles can help support models for complex systems. The overall pharmaceutical industry must convince itself of the appropriateness of modeling. For small molecules, “chemistry is chemistry,” but the living organisms used in biotechnology lead to higher levels of uncertainty. The ability to integrate increasing amounts of relevant data increases the industry’s confidence in using models and model-based approaches.
Advanced model-based process-control strategies can be expensive and resource intensive, and they usually are suitable for high-volume products that are manufactured with high frequencies. Data from fewer batches might be needed if product-specific knowledge can be supplemented with platform information and/or small- and intermediate-scale information (e.g., base a model on reality but use relevant prior knowledge). Understanding differences between molecules can enable leveraging of knowledge from other products.
Integrating clinical, preclinical, and molecular data into models can further enhance model confidence and utility, but gathering those data can be challenging. One potential use is combining stability data and clinical data to determine how exposure to product attributes that accumulate in older material could affect patients, which would provide an opportunity to adjust specifications.
Using Models: Modeling allows data to be applied in better ways. A model can be as simple as an automated analysis of data. Such a model requires less time and fewer staff to reach conclusions than a sophisticated model would and still can provide a return on investment.
Models are not a magical toolbox, and there is no escape from the need for expertise, especially to understand impact. Models do not always fit data perfectly, and the outcomes of in silico experimentation do not always agree with traditional approaches to process and risk assessments. Modeling is an iterative process in which you predict, validate, and refit a model, then begin the cycle again, updating the model with additional data to obtain better fits. When models have been developed based on a reasonable level of product and process understanding (e.g., for standard chromatography), they rarely have been totally inaccurate. However, new and more complex process steps are expected to be more difficult, so more automated methods for data generation and model building will be important. The risk of “overmodeling” data or having too much data is low: statistical tools can be used to identify overfitting, although the need remains for statistical and process experts who can check for overinterpretation and flawed assumptions.
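One such statistical tool is a comparison of training scores against cross-validated scores, sketched below in Python with simulated data (an illustration of the general technique, not any presenter’s model).

```python
# Simulated-data sketch of an overfitting check: a model that fits noise
# shows a high training score but a much lower cross-validated score.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, (30, 1))               # e.g., one scaled process parameter
y = 2 * X.ravel() + rng.normal(0, 0.2, 30)   # true relationship is linear

for degree in (1, 9):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    train_r2 = model.fit(X, y).score(X, y)
    cv_r2 = cross_val_score(model, X, y, cv=5).mean()
    print(f"degree {degree}: train R^2 = {train_r2:.2f}, CV R^2 = {cv_r2:.2f}")
# The degree-9 fit scores well on training data but poorly under
# cross-validation, flagging it as overfit.
```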
In some cases, process understanding gained through models has supported adjustments to a supply chain and has solved nonobvious problems. Raw-material variations, even in small subcomponents of raw materials, have been linked to product variability. That knowledge has enabled companies to better collaborate with raw-materials suppliers (e.g., regarding change control). Models have provided a mechanism for understanding the tolerances to process and material variabilities. Such models have helped manufacturers identify the best place to run their processes and have enabled efficient transfers of processes to new manufacturing sites.
Models should be used carefully, and extrapolation should be avoided. If a model doesn’t fit, a previously unrecognized source of variation may be at play, and that new source of variability would need to be identified. Data-driven models can be invalidated by process changes (intentional and unintentional). The iterative process should be considered, including the possible need to start a modeling process from scratch. In addition, if a process is not performing under control, soft sensors should not be trusted.
Replacing Experiments with Models and Getting Them Approved: High-impact models might replace traditional analytics and decrease quality lead time as part of PAT and RTRT. That requires understanding how and how well a process is operating. To trust a predictive model, a process must operate in control. Unless it is in good control, the risk level of real-time release models is high. Regulators attending this discussion agreed that their agencies are willing to listen to what the industry wants to do with its models and that it is possible to replace some experiments with modeling. Regulatory submissions need to support models and their proposed use, and testing and modeling may need to go hand in hand until there is sufficient experience and confidence in a given model.
Data, including model validation, are required to support fitness for use, and information regarding lifecycle maintenance also is required. Session attendees noted that automation systems and validation would be the focus of inspections. “Data dumps” are not likely to be beneficial, and regulators will request additional relevant data if needed. Regulators want to know that there is appropriate monitoring, with a system in place to investigate when things are not as expected, and that corrective measures will be taken. The product lifecycle management document discussed in ICH Q12 could be a tool for including the description of model monitoring throughout a product’s lifecycle. Although replacing testing can be a difficult target for some quality attributes, other types of models and model uses (e.g., providing enough understanding to promote process improvements) also can have a significant impact. For some types of proposals, meetings with regulatory agencies are encouraged.
CMC Strategy Forum Global Steering Committee

(as of July 2018) Siddharth Advant (Celgene Corporation), Daniela Cerqueira (ANVISA-Brazilian National Health Surveillance Agency), Yasuhiro Kishioka (PMDA-Pharmaceuticals and Medical Devices Agency), Junichi Koga (Daiichi Sankyo Co., Ltd.), Steven Kozlowski (CDER, FDA), Rohin Mhatre (Biogen), Anthony Mire-Sluis (AstraZeneca), Wassim Nashabeh (F. Hoffmann-La Roche Ltd.), Ilona Reischl (BASG-Federal Office for Safety in Health Care), Anthony Ridgway (Health Canada), Nadine Ritter (Global Biotech Experts, LLC), Mark Schenerman (CMC Biotech-MAS Consulting), Thomas Schreitmüller (F. Hoffmann-La Roche Ltd.), and Karin Sewerin (BioTech Development AB)
Analytical Technologies
The final session of the conference was on “Emerging Analytical Technologies.” Presenters and attendees discussed how advances in such technologies help drive progress in biopharmaceutical development and manufacturing. Improved analytical technologies also can produce new general knowledge about biopharmaceuticals, which can inform regulatory decisions and provide the basis for designing better products. This session included the FDA’s perspective on implementing state-of-the-art analytical methods for developing therapeutic proteins. Analytical technologies for characterizing a novel modality class, messenger RNA (mRNA) products, also were discussed. Speakers presented biophysical methods to examine heterogeneous protein–protein interactions in formulated therapeutic protein solutions. The last presenter in this session described a novel sensing technology based on carbon nanotubes for assessing glycosylation.
The first talk in the session was “A Regulatory Perspective on Emerging Analytical Technologies,” by Yan Wang (CDER, FDA). She pointed out that analytical methods are key for product and process understanding and control. Emerging analytical technologies can extend product and process knowledge greatly and help to establish better control strategies for biotechnology products. However, it is critical to understand the limitations and potential uses of new analytical technologies during method development.
Wang provided an overview of analytical method lifecycle management and described the regulatory expectations for different uses of emerging analytical technologies. She also presented a case study that demonstrated some concerns about using emerging analytical technologies in quality control environments.
The second presentation was “Development and Characterization of mRNA Therapeutics,” by Charles Bowerman (Moderna Therapeutics). He pointed out that using mRNA to create new therapeutics is complex and requires overcoming novel scientific and technical challenges. The mRNA production process relies heavily on in vitro enzymatic synthesis rather than chemical synthesis because mRNAs are much longer than oligonucleotides. Recent advances in mRNA production and delivery prompt a need for robust analytical methods capable of characterizing this new class of drugs. mRNA product-related impurities include short mRNAs resulting from either premature termination of transcription or in-process degradation, uncapped mRNAs, and point mutations, insertions, and deletions. Bowerman discussed several case studies to highlight how a combination of biochemical and biophysical methods for characterizing mRNA product-related impurities and variants can lead to successful development of mRNA therapeutics.
The third presentation was “Analytical Strategies to Measure Heterogeneous Protein–Protein Interactions in Formulated Therapeutic Protein Solutions,” by George Svitel (Merck Research Laboratories). He presented data showing that coformulating multiple MAbs into a single drug product is an emerging strategy for delivering biologics to patients. This approach can provide multiple benefits, including combined therapeutic effect, streamlined manufacturing and distribution, and improved convenience for patients. However, coformulated products also bring additional challenges to product characterization. Analytical methods originally developed for individual products need to be developed further for coformulated products. Additional questions to consider for coformulated products focus on mechanisms of degradation, aggregation pathways, and the possibility of creating mixed aggregated species.
The final presentation was “Next-Generation Nanoscale Biosensors Using Single-Walled Carbon Nanotubes Corona-Phase Molecular Recognition,” by Xun Gong (Massachusetts Institute of Technology, MIT). Researchers in the MIT laboratory have studied how electronic structures of carbon can be used to advance molecular detection. The researchers have pioneered corona-phase molecular recognition (CoPhMoRe) for discovering synthetic, heteropolymer corona phases that form molecular recognition sites at nanoparticle interfaces. By screening libraries of synthetic heteropolymers chemically adsorbed onto single-walled carbon nanotubes (SWCNTs), the team has engineered optical biosensors that exhibit highly selective recognition of biomolecules such as riboflavin, l-thyroxine, dopamine, nitric oxide, sugar alcohols, estradiol, insulin, and fibrinogen.
The researchers also have extended those sensor capabilities to small molecules and heavy metals for food- and water-safety applications. The recognition sites can be designed with both high sensitivity in serum-like environments and specificity sufficient to differentiate d- and l-arabinose. In addition to CoPhMoRe-based sensor design, existing recognition elements can be tethered with histidine tags to Ni2+ complexes that act as fluorescence quenchers for SWCNTs. In that way, an array using recombinant lectins can be implemented for glycan detection with micromolar dissociation constants (KD). The team also developed a mathematical model of glycan-binding dynamics linking the matrix of observed dissociation constants, binding kinetics, and occupancy to distinct glycoforms for identification. That formulation allows for straightforward calculation of the minimum array size necessary to distinguish a given set of glycans.
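As a rough illustration of how such an occupancy matrix could work (a generic equilibrium-binding sketch with invented KD values, not the MIT group’s published model), fractional occupancy for each lectin–glycan pair follows a Langmuir form, theta = c/(c + KD), and each glycan’s column of occupancies becomes a fingerprint for identification.

```python
# Generic equilibrium-binding sketch with invented numbers, not the published
# model: Langmuir occupancy theta = c / (c + KD) for each lectin-glycan pair.
import numpy as np

# Hypothetical dissociation constants (uM): rows = lectins, columns = glycans
KD = np.array([[ 5.0,  50.0, 200.0],
               [80.0,  10.0, 150.0],
               [60.0, 120.0,  15.0]])
conc = 25.0  # glycan concentration (uM)

occupancy = conc / (conc + KD)

# Each glycan's column is its array "fingerprint"; glycans are
# distinguishable when fingerprints differ by more than the assay noise,
# which is what bounds the minimum array size needed.
for j in range(KD.shape[1]):
    print(f"glycan {j}: " + ", ".join(f"{theta:.2f}" for theta in occupancy[:, j]))
```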
CMC Strategy Forum North America Scientific Organizing Committee

(as of July 2018) Siddharth Advant (Celgene Corporation), Kristopher Barnthouse (Janssen Pharmaceuticals R&D LLC), Barry Cherney (Amgen Inc.), Fiona Cornel (Health Canada), John Dobbins (Eli Lilly and Company), Taro Fujimori (AbbVie Bioresearch Center, Inc.), Carmilia Jiménez Ramírez (Gilead Sciences, Inc.), Michael Kennedy (CBER, FDA), Sarah Kennett (Genentech, a member of the Roche Group), Joseph Kutza (MedImmune, a member of the AstraZeneca Group), Emanuela Lacana (CDER, FDA), Kimberly May (Merck & Co., Inc.), Anthony Mire-Sluis (AstraZeneca), Stefanie Pluschkell (Pfizer, Inc.), Nadine Ritter (Global Biotech Experts, LLC), Timothy Schofield (GlaxoSmithKline), Zahra Shahrokh (ZDev Consulting), Jason Starkey (Pfizer, Inc.), Andrew Weiskopf (Biogen), and Heidi Zhang (Juno Therapeutics, a Celgene Company)
Panel Discussion
A panel discussion followed the presentations. Panelists included Charles Bowerman, Anil Choudhary (CBER, FDA), Xun Gong, Manju Joshi (CBER, FDA), George Svitel, and Yan Wang. Panel topics, questions, and discussion follow.
Manufacturing and Characterization of mRNA Nanoparticles: mRNA nanoparticles can be formed by precipitation techniques or by using microfluidic approaches with two-phase solvents (e.g., lipids–mRNA solution). However, large polydispersities must be controlled. Particle size is characterized by using scattering methods (e.g., dynamic or static light scattering).
mRNAs can be manufactured by using a series of both synthetic and biologic methods. Particles are designed from initial sequences to have functions relative to their route of administration. During particle purification, both product- and process-related impurities must be removed to reduce potential immune responses. Currently, 10 programs are in clinical trials and 10 are in development (e.g., vascular endothelial growth factor mRNA for cardiac tissue regeneration).
What is the status of nanotube-based analytical technologies? Nanosensors are being developed, and their performance in the field is being evaluated. Factors affecting binding and signal readout depend on the type of ligand used, but highly sensitive signal detection can be achieved. Work is underway to increase the affinity of nanosensor coatings. The polymer coating is the most expensive part of such a sensor; the nanotube itself is not as expensive.
Adoption of Novel Assays for Characterization: The general requirements for “validation” tend to be fewer than those for lot-release assays, but all assays used should be qualified as fit for purpose. Assay characterization should be sufficient to provide understanding of the strength of the data that the assay provides. For example, mass spectrometry has been used for multiattribute testing in process characterization, and the qualification requirements for that use were less extensive than the validation requirements for its use as a lot-release assay.
What is critical to the approval of new analytical technologies? Regarding regulatory issues in implementing a novel analytical technology, appropriate method development, characterization/qualification, and validation should be performed. They should be justified and documented in a regulatory filing. Complicated technologies typically require large amounts of data to be filed.
Biodevelopers should show comparability or bridging to traditional methods as applicable. Whether multiple methods are required for a single attribute depends on whether a method is orthogonal to a new assay (e.g., measuring fragments and high–molecular-weight species). Phase-appropriate qualification/validation might be conducted, but even at early stages, more detail might be needed for a new method than for a familiar one (e.g., describing intermediate precision).
Assay qualification/validation requires an understanding of sensitivity, accuracy, and precision to ensure the absence of sample preparation and assay artifacts.
A molecule’s mechanism of action can dictate the amount of information required, as will the level of detail for specific attributes depending on their criticality and the ability to control them through the process. Knowing a product is key. Understanding what is and is not a critical quality attribute (CQA) enables biomanufacturers to determine whether noncritical attributes should be tested routinely after characterization studies. New technology often comes with complex software that must be validated properly and understood according to regulatory requirements (e.g., data integrity).
Are new product modalities also driving new analytics? Complex product modalities require methods beyond traditional ones to characterize and measure structure and function.
How can we make regulators confident about new technologies? Regulators have required that biomanufacturers show comparability of results with original technologies and the analysis of materials used in clinical studies or other relevant reference standards. Often the industry uses a new technology for characterization to provide familiarity and confidence in a methodology before moving it to quality control.
There was a recommendation that companies interact with regulatory agencies ahead of submission. In the United States, biomanufacturers can engage the Emerging Technologies Team (ETT). Speaking at conferences attended by both regulators and industry also helps make such subjects more familiar. For multiattribute methods, the FDA Office of Testing and Research (OTR) is working to help speed their development, and an OTR expert sits on the ETT.
After approval for one product, any relaxing of regulatory stringency for new technology depends on how similar a second molecule is to the first.
What are the requirements for cases in which new species are found in existing products? If a new technology reveals a new impurity or species in an existing product, you should go back to samples of product used in clinical development to check for patient exposure. If no clinical samples are available, postmarketing patient exposure might be sufficient, but you should test samples as far back as possible. It is necessary to determine whether a process change created the new species, the method itself generated an artifact, or the method simply allows for detection of a previously present species. If it is determined to be a new species, it should be well characterized. One risk when using new technologies is that you could be measuring quality attributes different from those measured with historical methods. However, the level of risk depends on the attribute, its criticality, patient exposure, and what the assay actually is measuring.
Use of Reference Materials to Help with New Technology Adoption: If different species are found with a new method, good reference materials can build a bridge back to clinical materials. A reference material forms a foundation or benchmark for testing (rather than needing to use different lots specifically during method development). Use of reference materials can help analysts understand the variability of a method (rather than lot-to-lot variability). The US National Institute of Standards and Technology’s NISTmAb reference material is a class-specific public reference material useful for MAb method development. Such materials would be useful to have for other biological modalities.
References
1 Mire-Sluis A, et al. Next-Generation Biotechnology Product Development, Manufacturing, and Control Strategies, Part 1: Upstream and Downstream Strategies. BioProcess Int. 18(10) 2020: 16–24; https://bioprocessintl.com/business/cmc-forums/cmc-forum-next-generation-biotechnology-product-development-manufacturing-and-control-strategies-part-1-upstream-and-downstream-strategies.
2 BioPhorum Biomanufacturing Technology Roadmap. BioPhorum Operations Group: Sheffield, England, July 2017; https://www.biophorum.com/biomanufacturing-technology-roadmap.
Disclaimer
The content of this manuscript reflects discussions that occurred during the CMC Strategy Forum. This document does not represent officially sanctioned FDA policy or opinions and should not be used in lieu of published FDA guidance documents, points-to-consider documents, or direct discussions with the agency.
Corresponding author Anthony Mire-Sluis is head of global quality at AstraZeneca ([email protected]). Sarah Kennett is principal regulatory program director for biologics at Genentech, a member of the Roche Group. Siddharth Advant is executive director of biologic manufacturing at Celgene Corporation. Cristina Ausin-Moreno is a senior staff fellow at CDER, FDA. Barry Cherney is executive director of product quality at Amgen, Inc. Steven Falcone is vice president of quality at Sanofi. Jie He is a consumer safety officer at CBER, FDA. Alexey Khrenov is a biologist at CBER, FDA. Michael Tarlov is division chief at the US National Institute of Standards and Technology. Kimberly Wolfram is director of regulatory CMC at Biogen.