Recovery and Purification

Downstream

What’s on the minds of groups involved with downstream processing in 2014? They want to capitalize on innovations in methodologies, materials science, and technologies to help optimize efficiencies. They’re using process design and flexibility in downstream processing to purify an emerging wave of antibodies and novel product modalities. They’re interested in disruptive harvest-step technologies, new chromatography chemistries, continuous processing and the interface between upstream and downstream, high-throughput technologies and modeling, sequence variant analysis, and improving flexibility and facility fit.

Innovative Techniques: Advances in materials science are offering new opportunities in downstream bioprocessing. Innovations in chemistries, supports, and instrumentation are taking chromatography to new heights. Some new methods don’t even use columns! And several new technologies — such as precipitation, acoustic-wave clarification, and chromatin removal methods that will be discussed this year — aren’t even chromatographic.

Flexibility and Single Use: New high-producing cell culture processes require “new and improved” purification processes to handle challenging high-mass demand products. New single-use solutions are making it possible to harvest high-density cell-culture supernatants at cubic-meter scales. Other single-use technologies of interest in this year’s program include advanced hydrogel membranes.

“The challenges of working with traditional chromatography columns and beads have been well documented,” explains James Stout (vice president of process sciences at Natrix Separations Inc.), “as the bioprocess industry drives toward a more flexible manufacturing paradigm.” His company promises to “enable greater flexibility and economic effectiveness” with its single-use chromatography platform.

Also, Benoit Mothes (senior scientist in downstream process development for bioprocess science and technologies at Sanofi) will report on progress toward a fully disposable continuous process. Through its accelerated seamless antibody purification (ASAP) process, the company uses disposables to run three continuous chromatography steps to obtain a pure MAb batch without human intervention, thereby decreasing resin and buffer costs.

Blurring the Lines: Speaking of continuous bioprocessing, the lines between upstream and downstream are blurring somewhat. Although expression and harvest of a protein product stream will always mark the line between “production” and purification, the relative “batch” concepts of up- and downstream are changing. As a result, the two programs share a session on this topic.

“Decoupling cell culture and purification processes offers a solution to accommodate increased product masses by adding flexibility to capacity and inventory management,” says Alex Brinkmann (engineer II in biopharmaceutical development at Biogen Idec). He will present a case study demonstrating the potential benefits of doing so. David Pollard (executive director of bioprocess development at Merck & Co Inc.) will also report on work toward an automated continuous processing method enabled by single-use technology.

BPI’s marketing and digital content strategist, Leah Rosin, conducted the following interviews as the conference program came together this summer. Participants addressed high-throughput approaches to downstream process development. Here, in Q&A format, is what they had to say.

Srinivas Chollangi (Bristol-Myers Squibb)

Srinivas Chollangi (biologics development scientist for Bristol-Myers Squibb) will be joining us for the “Rapid, High-Throughput Process Development” session on Tuesday morning, 21 October 2014. Chollangi’s case study is titled “Development of Robust Wash Conditions for Protein A Capture Chromatography Using High-Throughput Automation Technologies,” and it features new, unpublished data.

Abstract: Host-cell protein (HCP), DNA, and high–molecular-weight aggregate contaminant clearance represents a significant challenge during downstream process development for biopharmaceuticals. We have implemented high-throughput automation technologies to screen a wide array of excipients at different pH ranges in combination with salt species. This helped us evaluate their effectiveness in removing impurities from the product pool during protein A capture chromatography.

What automation technologies did you use to conduct this screening? We have used high-throughput automation technologies primarily assisted by a Tecan Freedom Evo system and other liquid-handling devices. These help us perform a range of process-development operations and analytical characterizations: plate-based batch chromatographies, plate-based buffer exchanges, enzyme-linked immunosorbent assays (ELISAs) for host-cell protein and residual protein A characterization, and ultraperformance liquid chromatography (UPLC) for rapid size-exclusion chromatography (SEC) analysis. Traditionally, regular SEC takes several days to run such a large number of samples, but UPLC allows for very rapid screening of those samples.

Finally, we used spectrophotometry and quantitative polymerase chain reaction (QPCR) for protein and DNA analysis. Again, these are all in a plate-based format that allows for screening a large number of samples very quickly.
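The combinatorial screen described above (excipients at different pH levels in combination with salt species) amounts to a full-factorial grid of wash buffers mapped onto plates. The sketch below illustrates the idea only; the factor names, levels, and plate layout are assumptions for the example, not the conditions used in the study:

```python
from itertools import product

# Illustrative factor levels (assumptions, not the study's actual conditions)
ph_levels = [5.0, 6.0, 7.0, 8.0]
salts = ["NaCl", "CaCl2", "arginine-HCl"]
salt_conc_mM = [100, 250, 500]
excipients = ["none", "urea", "propylene glycol"]

# Full-factorial screening grid: every combination of the four factors
conditions = [
    {"pH": ph, "salt": s, "conc_mM": c, "excipient": e}
    for ph, s, c, e in product(ph_levels, salts, salt_conc_mM, excipients)
]
print(len(conditions))  # 4 * 3 * 3 * 3 = 108 wash buffers

# Assign conditions to 96-well plates for plate-based batch chromatography
def to_wells(conds, rows="ABCDEFGH", cols=12):
    plates = []
    for i, cond in enumerate(conds):
        plate, well = divmod(i, len(rows) * cols)
        r, c = divmod(well, cols)
        plates.append({"plate": plate + 1, "well": f"{rows[r]}{c + 1}", **cond})
    return plates

layout = to_wells(conditions)
print(layout[0]["well"], layout[-1]["plate"])  # A1 ... plate 2
```

Even this modest hypothetical grid of 108 buffers would be impractical on a conventional chromatography instrument but fits comfortably on two plates for a liquid handler.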

What was the general set-up of the protein A capture step? As the first step in our process development, we wanted to screen a broad range of process parameters and increase our product and process knowledge before performing process-scale chromatography experiments. The high-throughput automation platform was ideal for that, allowing for a good balance between speed and accuracy if operated carefully.

As a result of adopting this technique, our demands for resources and consumables were very low. We used only a few liters of clarified harvest for conducting all these experiments — as opposed to tens of liters of harvest, which is usually required by traditional process development experiments.

By defining an optimal operational space up front, we minimized the number of small-scale chromatography experiments required for scale-up compared with traditional process development strategies, which rely primarily on a trial-and-error approach using small-scale chromatography experiments. That approach is both time- and resource-intensive and can significantly increase the costs associated with process development. It also restricts the number of drug molecules that can be evaluated in the pipeline. Using high-throughput technologies definitely helped us move forward quickly.

What is the level of improvement over previous methods gained through doing this process development exercise? Integration of high-throughput technologies into laboratory operations and purification development allows for screening a very broad range of parameters. To give you perspective, a similar study on the removal of HCPs during protein A chromatography was performed in 2008, I believe, by Shukla and Hinckley. It was a landmark study. Back when HT technologies were not as prevalent, those scientists screened about 35 different buffer conditions. On a normal chromatography instrument, completing all those runs usually would take several weeks. By implementing HT technologies, we were able to screen more than 300 buffer conditions within two weeks. Everything was done at a very fast pace.

Also, if you look at the A-MAb case study (a collaborative project among highly reputable biopharmaceutical companies including Genentech, Eli Lilly, and Pfizer; associations/9165/files/A-Mab_Case_Study_Version_2-1.pdf), the general levels of host-cell proteins obtained after protein A purification and viral inactivation are in the range of 3,000–7,000 ppm. But by using these new techniques to identify the right conditions, we have now optimized the protein wash conditions such that we can get numbers like <100 ppm, which is a dramatic improvement.

That could reduce the number of columns required downstream of protein A. If you look at the A-MAb case study, you see at least two or sometimes three additional columns required to purify the product to its final specifications. But we could potentially lower the number of columns required by adopting these new wash techniques.

Have you considered using other methods of purification besides protein A? Why didn’t you choose to go that route? So far, protein A has been the most reliable workhorse for cleaning a majority of the HCP and DNA contaminants from antibody products. Although several groups are looking at alternative techniques to clear those contaminants, none of them have really measured up to the robustness of protein A. Cation-exchange (CEX) chromatography has been explored widely by many other scientists, as have precipitation techniques to remove HCPs and DNA. But because of associated manufacturing difficulties and a lack of robustness, neither of those techniques has measured up to the performance of protein A yet.

That does not mean that we have explored all conditions yet. So maybe some time in the future we might have a good method that will work as robustly as protein A — or even better. At this moment, however, we have not yet explored many methods ourselves. But we are in the process of evaluating some.

Why are you attending the BPI Conference this year? I attended the BPI Conference last year, and it was a fabulous conference. It provides the platform for sharing new findings by biotech scientists across the world. It also invites so many pioneers in this field that it provides an excellent environment for meaningful discussions among the leaders and emerging scientists who will define the future course of biopharma. I loved interacting with all these wonderful scientists and leaders last year, and I’m eagerly looking forward to meeting them again and even more people this October.

Amitabha Deb (Novartis)

Amitabha Deb (fellow in the integrated biologics profiling group at Novartis Pharma) will be joining us for the “Rapid, High-Throughput Process Development” session on Tuesday morning, 21 October 2014. Deb’s case study is titled “High-Throughput Screening for Profiling of Non-Antibodies Using Disposable Technologies,” and it features new, unpublished data.

Abstract: Single-use technologies provide better economies and accelerate development timelines. Although a number of downstream process approaches using such technologies are under serious consideration for process development, limited information is available about their use for candidate selection in late discovery and early development phases. For nonantibodies, developability parameters are poorly understood, so prediction of process performance based on such parameters is difficult. Case studies will be presented in which a traditional purification approach is compared side-by-side with a disposable approach. In addition to rapidly assembling processes, single-use technology helps users identify resource-intensive molecules and brings “process research” into the area of early candidate development.

Can you explain why developability parameters for nonantibodies are poorly understood? What makes prediction of their process performance difficult? As a process developer, my impression is that from the late 1980s to the early 2000s, biopharmaceutical industry experts mainly focused on monoclonal antibodies. We heavily invested in developing and refining a path for their successful technical development. All that process research brought us to a position in which we are quite confident about the developability parameters of MAbs.

In sharp contrast, we have increased our understanding of the non-MAb area, but not to that extent. Clearly, the recent “postantibody” landscape demands a better understanding of this class of molecules. We are seeing a significant increase of these biologics with diverse modalities. So we need to know more about them to understand their developability.

We know how to produce antibodies, how to purify them, and how to formulate them. But this is a totally unknown area for nonantibodies, which don’t comply with any platform process. Prediction of their process performance is challenging, and at present the industry is addressing them case by case. The challenge spans every area of process development for non-MAbs.

Can you share how single-use technologies enable accelerated process development? Single-use technologies are very helpful for early process development, and I represent a group that is deeply involved in these activities.

For downstream process development, we explored different polishing chromatographic steps and used membrane adsorbers for purification. Those technologies provide insights into flow properties (mainly convective flow), which helps users run their processes much faster. Overall, there can be a significant gain in process time. I’ll be talking about this early process development.

If you consider late process development — or the full development, per se — then you can use membranes even for viral clearance studies. That approach has already been implemented in the MAb area for commercial production. Such disposable products (bags, connectors, probes, and sensors) were listed among the areas of innovation most in demand in one 2013 biopharmaceutical industry bioplant survey.

The number of complaints about disposables has decreased over the past few years. Clearly we like to use them more and more in process development. Less cleaning validation is required, you can run a process much faster, and even in early development you can generate material faster for preclinical studies.

Here’s another area we always overlook: Single-use technologies allow us to produce better-quality material. That material increases patient safety because you have less and less chance of cross-contamination in multiproduct facilities.

What is the advantage to accelerating process development in this way? How does it affect product development in clinical trials timing? We would like to see the fast-to-clinic approach for our pipeline molecules, knowing that at least some of those molecules have a great chance for success. Single-use technology can help you to achieve that not only by improving process time, but also by helping you to achieve quality goals without significant validation.

In early development, our focus is on generating materials for preclinical studies. A number of companies are evaluating single-use technologies to produce material with high quality. By contrast, the conventional technologies with which we are all very familiar can be very low yielding, with high depreciation and operational expenditures. There is a clear paradigm shift in the biopharmaceutical industry to move the entire manufacturing process toward single-use. New facilities can have multiple products introduced in a much safer way.

In the case studies you’ll be presenting, generally what methods and equipment were used to optimize process development? We are in early development, with limited process optimization geared toward material production for preclinical studies. We are using high-throughput screening for non-MAb development. This is also geared toward developing a high-yielding purification process — and, of course, this type of process is targeted to be developed in a short time.

For this analysis we depend on statistical modeling using standard software packages such as MODDE, JMP, and so on. But the main concern is to integrate high-throughput process development with an analytics package. That way, you can get a good idea on the process side very early in research. In this area, we are also working with different vendors to better understand the use of statistical modeling with proper visualization of large sets of analytical data for readout.
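MODDE and JMP are commercial packages, but the underlying idea of fitting a response model to screening data can be sketched generically with an ordinary least-squares fit. The factor names, coded levels, and simulated response below are invented for illustration only:

```python
import numpy as np

# Hypothetical screening data: two factors (pH and salt concentration),
# coded to the range -1..+1, versus a simulated process response.
rng = np.random.default_rng(0)
ph = rng.uniform(-1, 1, 40)
salt = rng.uniform(-1, 1, 40)
# "True" surface used to simulate measurements (invented coefficients)
response = 3.0 + 1.5 * ph - 0.8 * salt + 0.5 * ph * salt \
    + rng.normal(0, 0.05, 40)

# Design matrix for a linear model with an interaction term:
#   response ≈ b0 + b1*pH + b2*salt + b3*pH*salt
X = np.column_stack([np.ones_like(ph), ph, salt, ph * salt])
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)
print(np.round(coeffs, 2))  # close to [3.0, 1.5, -0.8, 0.5]
```

Dedicated DoE software adds the experiment design, diagnostics, and the visualization the speaker mentions, but the fitted model at its core is of this form.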

Finally, why are you attending the BPI Conference? I think it’s a great platform to meet process developers from all across the world. Having attended a number of conferences over the past few years, I think we can get a very good idea about the direction in which the industry is moving as a whole. In addition, it provides me with the opportunity for very focused discussion.

I also think that it is a very good conference. And I’m based in Boston, so it’s a great local conference to attend.
