Surveying BPI readers’ experiences
Better, faster, safer: The current drug-development “paradigm” emerging from the FDA is pushing for innovations that reduce process inefficiency and cost. The plethora of new risk-based methodologies includes process-analytical-technology (PAT) tools being developed within the encircling parameters of a process design space. All this parallels (and drives) predictions that the biotechnology industry has seen the last of its blockbuster model, as predictive genomic tools enable personalized approaches to therapeutic development.
Robert Goldberg wrote the following in the DrugWonks blog (www.fiercepharma.com/forward/emailref/9481):
The key to better drug development is not more bureaucrats or lawsuits, but a stronger scientific foundation for risk assessment, which is at the foundation of everything the FDA does. And genomics should play a central role in building that scientific platform…. How well will the future crop of drug candidates be validated? Will these drugs target specific groups of individuals based on biological markers of response and disease? Will the success rate increase — rather than decrease — over the next five years? Will time to market for those candidates decrease?
In previous special issues we’ve looked at steps specific to process stages and emerging technologies: cell culture optimization, implementing single-use technologies, protein engineering, and so forth. But late last year, one reader suggested that we take a different approach: exploring what today’s pharmaceutical process design/development actually looks like. How are the perspectives mentioned above on science- and risk-based approaches to drug development changing the way bioprocesses are designed these days? If (he suggested) creating an encircling “garment” of a design space is the new goal, how are related tools and risk-based concepts changing how the “tailors” design and/or adapt the “pattern” of processes to be developed and scaled up? What tools are affecting process design in these early, critical stages? What does “high-throughput screening” refer to during manufacturing and scale-up, outside of discovery applications? How does a modern process development team set design parameters and design a process for a specific molecule? What information is the team generally given at the outset that may not have been available in previous years?
Our plan has been to go beyond the classical diagrams to focus on the process development exercise itself. How does a team work to make a molecule scalable? How are steps optimized for a given molecule? What changes in technology are enhancing/changing/accelerating these processes? Through interview-based staff-written features and selected, invited articles, therefore, we have sought to profile current process design activities.

Surveying the Field
We often begin a special-issue project by polling our readers about the topic. So our first step was to send out a questionnaire asking them what they knew of their companies’ views of process design. That effort brought us mixed results. The main lessons we learned from the responses were that each company approaches process design in a way specific to its needs and those of a given molecular entity (no surprise there); and that a number of the respondents appeared to be somewhat uncertain about why we were asking those particular questions in the first place.
Nevertheless, a few significant trends can be identified in the answers, and perhaps more so (as is often the case) in the open-ended comments. The following section is not intended to offer authoritative conclusions, but to sketch a picture of the current state of the industry regarding process design.
Who Is Doing the Work?: The very first question may reveal something of the clash of the old and new mindsets: Our respondents are from process development (37.1%), research and development (24.7%), and corporate management (19.1%), with the next closest group being production/manufacturing at 7.9%. But these broad categories don’t tell us the entire story of an industry that is increasingly promoting use of cross-functional teams in process design; we had hoped to find some indication that the traditional “silo” mentality is falling by the wayside. We assume at least that in a modern cross-functional team, the 37% from “process development” could include members from upstream, downstream, and formulation groups; and the 24% in R&D could include product development people from discovery straight through to clinical testing. Production/manufacturing can be both upstream and downstream, and the others mentioned fit right into the mix (3.4% engineering, 2.2% QA/QC/validation, 2.2% project management, 2.2% tech transfer/information technology, and 1.1% regulatory affairs).
Outsourcing Plans: The nature of the work determines the approach, and as usual, the survey revealed that no simple answer will do. But we wanted to find out what prior information is given to these design teams at the outset. To our question, “Are your process design decisions based on the potential for a future need to transfer a process to a contract manufacturer?” — 52.6% said yes, 47.4% said no. Each of the four phases we asked about (preclinical through phase 3) appears to be outsourced at an average of 31%, but 43.1% of outsourcing activities are reserved for transfer to commercial scale. Process design is not outsourced by 46.2% of the respondents, whereas 39.7% do outsource up to 25% of process design.
Quality by Design: We then asked, “To what degree has the emphasis of QbD and design space affected your company’s development and manufacturing strategy?” 44% said “some impact,” whereas 21.3% said “dramatic impact” (18% said “little impact,” 16% “no impact”). So from that we can gather that for around 65% of companies in the biotechnology industry, QbD has indeed made a noticeable, if not significant, impact in operations and planning. But because we did not correlate responses to company size, it is hard to know what to make of those who registered “little” to “no” impact.
More substantive information was offered in answer to “How has the ‘better–cheaper–faster’ mandate changed your process design activities?” With respondents asked to check all that apply, the answers supported what we’ve been hearing — with the top two answers again confirming the practice of using broad-based interdisciplinary teams in process design:
70.8% perform more development activities in parallel
61.1% involve cross-functional teams
45.8% start with higher-producing clones
43.1% incorporate single-use technologies
43.1% implement PAT/QbD
12.5% outsource more process development
The “other” strategies included use of new technologies, front-end loading, using platform approaches, developing higher capacity resins, incorporating a “unique rapid in-process analytical approach,” and accepting risk-taking scenarios to accelerate timelines.
Important Technologies: The next question was, “Which of the following technologies have had the greatest positive impact on your product development and manufacturing processes?” Because we at BPI have devoted quite a lot of editorial space to discussion of single-use technologies, we rather expected those to head the list. Not so! Table 1 shows that design-of-experiments (DoE) and platform development ranked first, but the “check all that apply” responses show strong interest in the other tactics being used — and no doubt, in parallel. Significant technologies listed under “other” included platform analytical methods and characterization strategies, modeling, better understanding of metabolism through “omics,” and development of an upstream platform. And the logical question to follow (Table 2) was about automation and for which operations it may be most appropriate.
Table 1: We asked, “Which of the following technologies have had the greatest positive impact on your product development and manufacturing processes (check all that apply)?”
Table 2: We asked, “Which of the following operations do you feel are most appropriate for automated approaches (check all that apply)?”
Of the 4.4% “other” in Table 2, one person listed “continuous processing (perfusion, continuous chromatography)” — confirming another strategy that we’ve seen in the literature.
Titer Trends, Platform Plans: To the question of whether downstream constraints restrict or limit efforts to increase production titers, slightly more than a third responded “no.” But responses of “yes” and “I don’t know” averaged 31% each, so the only conclusion is probably “it depends.” The response to platform technology development was more revealing: 38.5% of respondents said their companies use multiple platform technologies, and 24.6% use a single platform (therefore, 63% of companies may have one or more platform technologies in use); and 18.5% have platforms in development. Others are still evaluating options.
That leaves 13.8% with no plans to implement platforms. Because we do not know the size and nature of those particular companies, that answer cannot be clearly evaluated. But responses to the follow-up question asked of that 13.8% (Why?) indicated that platform technologies are expensive and/or too restrictive (25% each), and 62.5% (again, out of that 13.8%) believe that a platform approach will not work for their existing product portfolio.
According to our survey, 53.3% of new projects are not selected based upon their ability to fit into a company’s platform technology, whereas 46.7% are indeed selected on that basis — a surprisingly even split reflecting the newness of platform approaches to many companies. One respondent writes that his or her company is “trying to develop new platforms to fit novel technologies in the pipeline,” and another mentions that “projects not fitting in the MAb platform are also developed, but cell-line selection for MAbs takes place in the platform.”
PAT Tools: We asked about the use of tools specifically referred to as “PAT tools,” to which 64.1% said that they do not use them; 35.9% said that they do (but another 37 people skipped the question). We wondered, then, whether everyone knows what we meant by PAT tools; certainly we editors have been hearing the phrase at conferences this past year. Of the 35.9% who answered “yes” to using such tools, we asked for examples and received what we think is a fairly representative list (presented alphabetically):
automated column fractionation
biomass probes in development
cell culture monitoring
fermentation off-gas analysis
FTIR, Lasentec particle system characterization, NIR, UV, mass spec, pH
monitoring of temperature, pH, and DO
NIR, Raman, fluorescence, PCA/PLS
on-line cell-density probes
on-line HPLC with automated peak cutting
on-line monitoring of pH and cell density
on-line TOC analyzers
rapid microbiology testing
screens and use of all analytical and control methods that provide economic value.
Testing and Screening Activities: In answer to “What types of testing do you develop and perform in-house? (Please check all that apply)” we see (as indicated in a previous question) that much of this work appears to be done in-house:
86.8% product characterization
72.1% product stability
54.4% lot release
48.5% cell line stability
4.4% outsource all.
Survey respondents listed the following issues that are either promoting or impeding implementation of new approaches to process design.
Accepting risk-management approaches
Demonstrating process robustness and consistency to meet product specifications
Demonstrating product purity
Demonstrating removal of process-related impurities
Making changes (or inability to make changes) in licensed product manufacturing processes
Ensuring consistency, reproducibility, process controllability
Ensuring/demonstrating viral clearance for mammalian products; defining viral safety, virus inactivation procedures
Implementing QbD, because it forces one to explore through experimental design the range(s) of all quality-affecting process parameters to develop the extremes of the CPP ranges properly; but QbD and control strategies are currently unclear worldwide
Lacking new regulations for new industry sectors and new technologies
Learning about/following PSM guidelines (procurement and supply management issues)
Maintaining GMP and GLP compliance; interpreting regulations (and the FDA’s perspective)
Performing toxicology studies
Proving quality and accuracy
Optimizing sterile filtration at the end of the process
Still waiting for FDA QbD to have an effect on immediate management
Understanding ICH Guidelines Q8 and Q9
Understanding processing and methodology guidelines
Understanding the evolving regulation of cellular therapies (stem cells)
Working within existing license limits
We asked for the sources of screening technologies and related libraries and found quite a lot more activity than in previous years, when only a handful of groups made such information available to the industry at large:
38.5% all/some (developed in-house, purchased, outsourced)
30.8% developed in-house
20.0% do not use such technologies.
Cost Parameters, Equipment Guidelines: Looking at process design parameters, we found that material cost parameters/guidelines are provided initially to 41.5% of designers but not to 58.5% of them. Similarly divided were the responses to “Does your company provide you with overall product cost guidelines?” (52.4% no, 47.6% yes). Of course, a cross-functional team is likely to include those who have expertise in developing and assessing such models, so whether that information is provided at the outset or developed by/within the team itself is not clearly addressed by the question.
Not surprisingly, 85.9% of our respondents plan to use both fixed and single-use equipment, depending on the unit operation; 4.7% use single-use equipment for everything; and 9.4% say they use no disposables at all.
Regulatory Issues: Finally, we asked for help identifying the regulatory issues most affecting current approaches to process design. The resulting list in the “Regulatory Concerns” box provides a useful summary of the questions people have and of the issues addressed by the current risk-based approaches to product and process development.

Staff Analyses and Case Studies
Armed with the above results as well as copious notes from presentations given at recent conferences — namely the AAPS Biotechnology and ACS National Meeting (BIOT division) — technical editor Cheryl Scott begins this issue by looking at how new approaches to process design are furthering developments on all “fronts,” especially regarding automation of systems for high-content and/or high-throughput analytics. Building on companies’ “need for speed” in reducing overall product development, manufacturing, and approval times, she examines how they are going back to the drawing board to redesign existing processes and create new scalable ones using technologies that can significantly reduce cost and time while improving quality and success rates. Her comments and interpretations are enhanced by hours of transcribed interviews conducted by our contributing editor, Lorna D. McLeod.
The discussion then turns to putting the pieces together, looking for confirmation and analysis of the trends alluded to in our survey and further pursued in the interviews.
I commissioned several articles for this issue using some of the more telling survey results as my guide. In “Shrinking the Costs of Bioprocess Development,” authors from Tecan, Atoll GmbH, and University College London describe how automation in different areas can contribute to accelerating process development. They profile applications currently in use for cell line development, cell culture upscaling, antibody development, downstream process development, and PAT.
DoE came out at the top of our survey of new methods in use for process design. So in “Design of Experiments Helps Optimize Cell Culture Bioproduction System,” Steven Peppers, formerly a principal scientist with Invitrogen in Grand Island, NY, offers an experiment combining a robotically controlled microbioreactor system with DoE methods to optimize cell-culture media and feeding strategies. He shows how this new process is rich in information and provides a solid understanding of the most influential factors affecting performance of specific cell lines.
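The main-effect arithmetic underlying a two-level full-factorial DoE of this kind can be sketched in a few lines. The factor names and titer values below are hypothetical, chosen only to illustrate how such a design ranks factors by influence; this is not the experiment Peppers describes.

```python
# Minimal sketch of a two-level full-factorial DoE analysis of cell-culture
# conditions. Factor names and titer values are hypothetical illustrations.
from itertools import product

factors = ["glucose_feed", "glutamine", "temperature_shift"]

# Coded design matrix: -1 = low setting, +1 = high setting (2^3 = 8 runs)
design = list(product([-1, 1], repeat=len(factors)))

# Hypothetical measured titers (g/L), one per run, in design order
titers = [1.0, 1.4, 1.1, 1.6, 0.9, 1.3, 1.0, 1.5]

def main_effect(factor_index):
    """Average titer at the high setting minus average at the low setting."""
    high = [t for run, t in zip(design, titers) if run[factor_index] == 1]
    low = [t for run, t in zip(design, titers) if run[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

effects = {name: round(main_effect(i), 3) for i, name in enumerate(factors)}

# Ranking factors by absolute effect focuses development effort on the
# settings that matter most; interactions can be estimated the same way.
ranked = sorted(effects, key=lambda k: abs(effects[k]), reverse=True)
print(effects, ranked)
```

With these illustrative numbers, the temperature shift emerges as the dominant factor, which is the kind of ranking a microbioreactor-plus-DoE study delivers across many conditions in parallel.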
We next offer a detailed case study, “Creation of a Well-Characterized Small-Scale Model for High-Throughput Process Development,” by David Zhang and colleagues at Diosynth Biotechnology. With streamlining of process development a focus of the biotechnology industry over the past several years, they discuss a characterization strategy for 2-L, 15-L, and 110-L bioreactors. They show how characterization information, in conjunction with specific oxygen uptake rate (OUR), successfully predicted large-scale reactor performance.
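The prediction they describe rests on a simple relationship: specific OUR is the oxygen uptake rate divided by viable-cell density, and to the extent that it holds across scales, a small-scale measurement can predict large-scale oxygen demand. The sketch below uses hypothetical numbers, not Diosynth’s data.

```python
# Illustrative use of specific oxygen uptake rate (qO2) as a scale-up
# predictor: qO2 = OUR / viable-cell density. All numbers are hypothetical.
def specific_our(our_mmol_l_h, vcd_cells_ml):
    """Specific OUR in mmol O2 per 1e9 cells per hour."""
    cells_per_l = vcd_cells_ml * 1000.0  # cells/mL -> cells/L
    return our_mmol_l_h / (cells_per_l / 1e9)

# Small-scale (e.g., 2-L) measurement: OUR 3.0 mmol/L/h at 6e6 cells/mL
q_small = specific_our(3.0, 6e6)

# If qO2 holds across scales, predicted OUR at large scale (e.g., 110 L)
# at a viable-cell density of 5e6 cells/mL:
predicted_our_large = q_small * (5e6 * 1000.0 / 1e9)
print(q_small, predicted_our_large)
```

Comparing such predictions against measured large-scale OUR is one simple way to check whether a small-scale model is behaving as a faithful scale-down of the production reactor.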
The fourth case study is “Software Simplifies Accounting for Batch Genealogy: Improving Process Knowledge and Design Through Upstream/Downstream Data Analysis,” by Justin Neway of Aegis Analytical Corp. In it, he creates a picture of next-generation data analysis. His anonymized case study (the names were changed to protect patent holders) demonstrates how to use data-access and contextualizing software to make meaningful, quantitative correlations between parameters from upstream and downstream operations.
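The core of such contextualization (pairing upstream and downstream parameters by batch before correlating them) can be illustrated simply. The batch IDs and values below are hypothetical, and this sketch is the underlying idea, not Aegis’s software.

```python
# Sketch of batch-genealogy correlation: align upstream and downstream
# parameters by batch ID, then quantify their relationship.
# All batch IDs and values are hypothetical.

# Per-batch upstream parameter (e.g., harvest titer, g/L)
upstream = {"B01": 1.2, "B02": 1.5, "B03": 1.1, "B04": 1.8, "B05": 1.4}
# Matching downstream parameter (e.g., capture-step yield, %)
downstream = {"B01": 88.0, "B02": 91.0, "B03": 86.5, "B04": 93.5, "B05": 90.0}

# "Contextualize": pair values only for batches present in both data sets
batches = sorted(set(upstream) & set(downstream))
x = [upstream[b] for b in batches]
y = [downstream[b] for b in batches]

def pearson(xs, ys):
    """Pearson correlation coefficient across paired batch values."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(x, y)
print(round(r, 3))
```

A strong correlation found this way is a starting point for investigation, not proof of causation; the value of the contextualizing software is in making such cross-operation comparisons routine rather than a manual data-wrangling exercise.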
Through these examples, and through the many voices from the industry reflected in this issue, we hope that you will come away from your reading with an enhanced understanding of where and how all the pieces fit together under current “risk-based” approaches to accelerate process design and drug manufacturing. As always, the interpretations in this issue come from our interviewees and whatever conclusions we felt we could safely draw as editors. But it is you who are out there doing this work, so we count on you to help us continue this discussion by submitting manuscripts, letters to the editor, and emails. Help us know what you want to see explored in greater depth. In the meantime, we welcome you to the presentations within these pages.