Analytical and Quality

BPI Staff

September 30, 2014

What’s on the minds of analytical and quality laboratories in 2014? They want to stay abreast of both emerging analytical tools for biotherapeutics and increasing quality expectations. They need to optimize their analytical strategies and reduce quality risk across every product’s life cycle. They’re focusing on physicochemical characterization methods, impurities and product variants, biosimilars and complex glycoproteins, high-throughput methods for predicting stability and developability of early-stage products — and of course, quality risk management.

Characterization, Structure, and Function: As is often the case, analytical and formulation laboratories have a lot in common. Especially at early stages of product development, the data they gather are complementary, and the methods they use are often very similar, if not identical. So this year, you’ll see quite a bit of overlap between the quality and analytical track and the formulation and delivery track.

Mass spectrometry and spectroscopic techniques such as circular dichroism (CD) and Fourier-transform infrared (FTIR) spectroscopy are widely used to characterize protein structure. And nuclear magnetic resonance (NMR) “fingerprint spectra” are becoming a viable option for comparing samples. But the sensitivity of such techniques must be evaluated before they are relied upon for comparability testing.

Three-dimensional structure is vitally important to the in vivo function of therapeutic proteins. And structure–function relationships are not just a concern for antibodies and glycoproteins. Two Wednesday afternoon presenters will look at different types of enzymes in this light.

Process Analytical Technology and QbD: Strategies for continual process monitoring are important to quality by design (QbD), both for new products in development and in updating legacy processes. Data and analyses involved in quality risk assessments need to conform to standards of measurement validity and sound statistical inference. But, as one presenter cautions, companies need to guard against treating the risk assessment and decision-making that follow too casually.

Joseph Horvath (senior director of quality systems at Takeda Pharmaceuticals International) points to the importance of framing risk assessment: defining and delineating the terms of assessment to address the right question precisely and coherently. “A common error is to rush through these activities too quickly in an effort to ‘get started,’” he says. “But a risk assessment can only be as good, and as valid, as its framing allows.” Horvath will moderate a session to address the challenges teams often face in these efforts.

High-throughput screening is helping companies assess drug-like properties and manufacturability of candidate molecules at early stages. For example, differential scanning fluorimetry (DSF), dynamic light scattering (DLS), and size-exclusion chromatography with multiangle static light scattering (SEC-MALS) can help some companies assess thermal and colloidal stability. And others use SEC with DLS and UV spectroscopy to analyze protein aggregation.
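
As a generic illustration of the kind of analysis behind such screens, the sketch below estimates an apparent melting temperature (Tm) from a simulated DSF-style melt curve by fitting a simple two-state Boltzmann model. The data, model, and parameter values are assumptions for illustration only, not any presenter’s method.

    # Illustrative only: estimate an apparent Tm from a DSF-style melt curve
    # by fitting a two-state Boltzmann sigmoid (an assumed, simplified model).
    import numpy as np
    from scipy.optimize import curve_fit

    def boltzmann(T, F_low, F_high, Tm, slope):
        # Fluorescence as a function of temperature for two-state unfolding
        return F_low + (F_high - F_low) / (1.0 + np.exp((Tm - T) / slope))

    # Simulated melt curve (placeholder numbers, not real instrument output)
    temps = np.linspace(25, 95, 71)                      # degrees C
    signal = boltzmann(temps, 1000.0, 9000.0, 68.0, 2.5)
    signal = signal + np.random.default_rng(0).normal(0, 150, temps.size)

    # Fit and report the apparent melting temperature
    p0 = [signal.min(), signal.max(), 65.0, 2.0]
    params, _ = curve_fit(boltzmann, temps, signal, p0=p0)
    print(f"Apparent Tm ~ {params[2]:.1f} degC")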

Biosimilars are no longer just a possibility, but a reality. Former FDA biochemistry laboratory chief Emily Shacter (now a consultant with ThinkFDA) will present an overview of recent developments in the United States on Tuesday afternoon. And following her, Hugh Conlon (principal scientist in analytical research and development at Pfizer) will elaborate on related analytical challenges.

BPI’s marketing and digital content strategist, Leah Rosin, conducted the following interviews as the conference program came together this summer. Participants address process monitoring and control, quality considerations, and structure–function relationships. Here, in Q&A format, is what they had to say.

Pawel Drapala (Alexion)

Pawel Drapala (senior process engineer in technical manufacturing services at Alexion Pharmaceuticals) will be joining us for the “Quality Considerations in Bioprocess Development and Manufacturing” session on Thursday morning, 23 October 2014. His presentation is titled “Strategies for Continued Process Monitoring of Commercial Cell Culture and Purification at Large Scale.”

Abstract: This presentation outlines a framework for continued process monitoring of commercial manufacturing at large scale (>10,000-L bioreactors). Continued process monitoring, or ongoing evaluation of a validated process, has been performed to demonstrate continuous control of the commercial eculizumab manufacturing process at Alexion’s Rhode Island manufacturing facility (ARIMF) so that safety, identity, strength, purity, and quality of the product are ensured. Data trending has been performed for selected process-monitoring parameters over time to demonstrate process control or identify trends before a deviation occurs. Those parameters serve as a historical database that may be used for investigations. The process data also have been used to evaluate and develop, as necessary, control limits for process parameters to ensure the consistency of the process. The underlying goal of this presentation will be to outline a commercial strategy for routine manufacturing process monitoring with quality assurance (QA) oversight for data evaluations against the established control limits.

Why is continued process monitoring important? How does it help companies in a tightly regulated environment? Process monitoring — typically what is called an “ongoing evaluation of your validated process” — in many cases is a regulatory requirement. If you have a product on the market (with a commercial, validated process), and you undergo a regulatory inspection, very frequently you are going to be asked, “Can we see your process monitoring program?”

There are several benefits to a process monitoring program. For example, a process monitoring program could be used to predict a deviation before one occurs. You could identify trends, and if you interpret those trends correctly, you can actually predict a deviation and potentially prevent it.

Other benefits of a process monitoring program include root-cause analysis and product-impact assessment, for example. If you have a historical database of all of your data, that’s going to be a great tool for any and all investigations that you do as part of your manufacturing campaign.

What types of tools and equipment are you using? The type of data that we analyze comes from many different sources. We have our inline instruments, things that are essentially fed directly into what we call our plant control system (PCS). They monitor temperature, pH, osmolality, and conductivity: instantaneous data that are typically housed in some kind of database. This would be your automation software: DeltaV or Rockwell FactoryTalk, and so on. But that’s only one source of data used in process monitoring.
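
The plant control system and historian he describes are commercial packages. Purely as a hypothetical illustration of the underlying idea (timestamped inline readings housed in a database for later trending), here is a minimal sketch using SQLite; the table layout, batch names, and values are assumptions and do not represent the DeltaV or FactoryTalk interfaces.

    # Hypothetical sketch: timestamp inline readings (pH, temperature, etc.)
    # and store them for later trending. Not a vendor (DeltaV/FactoryTalk) API.
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect("process_history.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS readings (
        ts TEXT, batch_id TEXT, parameter TEXT, value REAL, unit TEXT)""")

    def log_reading(batch_id, parameter, value, unit):
        # Append one timestamped inline measurement to the historian table
        conn.execute("INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
                     (datetime.now(timezone.utc).isoformat(), batch_id,
                      parameter, value, unit))
        conn.commit()

    # Values a bioreactor probe might report (illustrative numbers only)
    log_reading("LOT-001", "pH", 7.05, "")
    log_reading("LOT-001", "temperature", 36.8, "degC")
    log_reading("LOT-001", "osmolality", 310.0, "mOsm/kg")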

Then you also have data that come from your quality control department’s analytical assays. Typically they take time to obtain, sometimes hours, sometimes days, and sometimes even weeks. That’s another data source. The final data source is the manufacturing floor itself — your manufacturing operators. Many things are still done manually: for example, cell counts and titration duration. All these things are also part of your process monitoring program.

Now you have all these data that come from various sources. The real question is how to integrate them into a cohesive process monitoring program. And we have various software packages that do that. Obviously, we need some kind of quality management software, and typically what’s used is the TrackWise program. That’s what we use in the industry. We also have process automation. As I mentioned before, we have Rockwell FactoryTalk, which is common; DeltaV is common too. You can even use software specific to certain unit operations (for example, the GE Unicorn program).
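
To show that integration step in the most generic terms, the sketch below joins per-batch summaries from three hypothetical sources (automation data, QC assay results, and manually recorded floor data) on batch number. The column names and values are made up; they are not TrackWise, FactoryTalk, or UNICORN exports.

    # Hypothetical sketch: join per-batch summaries from three data sources
    # into one table for trending. Column names and numbers are made up.
    import pandas as pd

    automation = pd.DataFrame({"batch": ["LOT-001", "LOT-002"],
                               "final_pH": [7.02, 7.08],
                               "peak_temp_degC": [37.1, 36.9]})
    qc_assays = pd.DataFrame({"batch": ["LOT-001", "LOT-002"],
                              "purity_pct": [99.1, 98.8],
                              "protein_conc_g_per_L": [10.2, 10.4]})
    floor_data = pd.DataFrame({"batch": ["LOT-001", "LOT-002"],
                               "viable_cells_e6_per_mL": [12.5, 11.9]})

    # One row per batch, ready for control charting and investigations
    monitoring = (automation.merge(qc_assays, on="batch")
                            .merge(floor_data, on="batch"))
    print(monitoring)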

Finally, what is probably the most powerful tool for process monitoring is called “control charting.” Typically you need some kind of statistical software package for this. Control charting is powerful because it allows you to distinguish between “common-cause” and “special-cause” variation. The former describes normal fluctuations that you expect within a manufacturing process; the latter describes things that are out of the ordinary. If you pick up a special-cause variation using process monitoring, you can take corrective action before it causes trouble, or you can make changes to ensure that you have a stable, in-control process.

Your abstract mentions using data trending to set process parameters. Can you explain how much data were sufficient for that? From a statistical point of view, you need at least n = 3 to do anything. That’s the minimum you need to get a standard deviation. For process monitoring, we recommend that you have at least 10 good manufacturing practice (GMP) batches. Ideally these 10 batches should include your process validation runs. You want at least 10 batches with no changes at all to the process, because that’s the minimum needed to make any kind of prediction in terms of process monitoring.

To establish control limits and use them to make these kinds of predictions (in terms of distinguishing between common-cause and special-cause variation), you need at least 10 data points. As far as frequency, that really depends on how frequently you process. You may be able to generate those 10 batches relatively quickly, or it may take you a while depending on your processing time.
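
A minimal sketch of the approach described above, under the common assumption of control limits set at three standard deviations around the mean of at least 10 baseline batches: new results inside the limits are treated as common-cause variation, and results outside them are flagged as possible special causes. The yield numbers and the single detection rule used here are illustrative only.

    # Minimal sketch: establish control limits from >= 10 baseline GMP batches,
    # then classify new results against them. Numbers are illustrative only.
    import statistics

    baseline_yields = [82, 85, 84, 83, 86, 84, 85, 83, 84, 85]  # 10 historical batches
    assert len(baseline_yields) >= 10, "need at least 10 batches to set limits"

    center = statistics.mean(baseline_yields)
    sigma = statistics.stdev(baseline_yields)
    lcl, ucl = center - 3 * sigma, center + 3 * sigma
    print(f"center {center:.1f}, control limits [{lcl:.1f}, {ucl:.1f}]")

    for batch, value in [("LOT-011", 84.5), ("LOT-012", 78.0)]:
        if lcl <= value <= ucl:
            verdict = "common-cause variation"
        else:
            verdict = "possible special cause: investigate"
        print(f"{batch}: {value:.1f} -> {verdict}")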

What are the most important process parameters in terms of quality assurance? Typically we have critical process parameters (CPPs) and critical quality attributes (CQAs). For release, the latter involve a series of assays intended to prove that your product is what you have approved on the market. There is a direct relationship between CPPs and CQAs. Your CPPs are actually a must-have for a process-monitoring program. If you have an FDA-approved product on the market, then typically your biologics license application (BLA) has a section listing those critical parameters. So by definition, they have to be part of your process-monitoring program.

In addition, we have key process parameters (KPPs). They don’t necessarily have to be a part of your filing, but we also use them as part of our process-monitoring program. They are in addition to CPPs. A CPP is something that you must have (e.g., product concentration, final formulation buffer pH, excipient concentration). They are must-haves. And your KPPs are things that are very beneficial (e.g., the overall process yield). Ideally, you would like your manufacturing process to yield as much product as possible. So you might want to have a KPP that monitors total yield, but that’s not an absolute requirement in terms of releasing a batch onto the market. Those are typically the parameters that we look at.
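
As a small illustration of how such a parameter list might be organized for monitoring, the sketch below tags each parameter as a CPP or a KPP and checks measured values against established limits. The categories and example parameters mirror those mentioned above; the numeric limits and the data structure itself are assumptions for the sketch, not Alexion’s program.

    # Illustrative sketch: a simple registry of monitored parameters tagged
    # as CPP (must-have, filed) or KPP (beneficial to trend). Limits made up.
    from dataclasses import dataclass

    @dataclass
    class MonitoredParameter:
        name: str
        category: str          # "CPP" or "KPP"
        low: float
        high: float
        unit: str

    parameters = [
        MonitoredParameter("product concentration", "CPP", 9.0, 11.0, "g/L"),
        MonitoredParameter("final formulation buffer pH", "CPP", 6.9, 7.3, ""),
        MonitoredParameter("excipient concentration", "CPP", 0.01, 0.03, "%"),
        MonitoredParameter("overall process yield", "KPP", 70.0, 100.0, "%"),
    ]

    def within_limits(param, value):
        # True if a measured value falls inside the established limits
        return param.low <= value <= param.high

    for p in parameters:
        print(f"{p.category}: {p.name} ({p.low}-{p.high} {p.unit})")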

Finally, why are you attending the BPI Conference? The BPI Conference sounds like a great place to meet both driven and dynamic people. It’s really where you get some of the best ideas because this is a very rapidly changing industry. I’m always interested to meet people who are undertaking the same developments as we are.

Sarah Thomas (GlaxoSmithKline)

Sarah Thomas (site quality director at GlaxoSmithKline) will be joining us for the “Quality Considerations in Bioprocess Development and Manufacturing” session on Thursday morning, 23 October 2014. Her presentation is titled “Microbial Control: What’s Bugging You?”

Abstract: Expectations for “low bioburden” manufacturing of biopharmaceutical products have become increasingly stringent over time. Without adjustments in approach, the risk of failure is increased, and time is wasted on lengthy investigations and cycles of insufficient corrective and preventive actions (CAPAs). A proactive, comprehensive strategy is needed to help companies identify and control avenues for microbial contamination.

You mention increasingly stringent expectations of low bioburden manufacturing for biopharmaceuticals. Can you elaborate? We’ve seen a trend in the industry over the past five or so years of regulatory inspectors looking for better and better control over microbial contamination — particularly in downstream processing for biopharmaceuticals. There hasn’t been a specific change in regulations or guidelines, but I believe that over time, regulators have come to the opinion that with modern technology, we should be able to have better control and more consistency over microbial contamination levels. Thus, they are expecting us to use the tools that are available to make our processes cleaner and safer.

Can you briefly describe potential avenues for microbial contamination in biopharmaceutical manufacturing? Biopharmaceutical manufacturing processes are long and complicated, with a number of places where microbial organisms can enter.

With upstream processes, the expectation for many years has been that they run essentially sterile or free from foreign contaminants. People have spent a lot of time and effort developing closed systems and validating steam-in-place processes, pressure holds, and so on, to make sure that those systems are truly closed.

Downstream processes, on the other hand, have traditionally been more open. At a number of places in the system there are small opportunities for microorganisms to enter the process. Over time, we see companies looking for more opportunities to close up connections and transfers from one vessel to another — any place where there is a break in an otherwise closed system. Fortunately now we have more technology coming into place with disposables, sterile connectors, and so on, that allow us to close up some of those openings to microbial organisms.

What types of process monitoring tools can be used to detect microbial contamination? Typical biopharmaceutical processes involve a number of tools to monitor microbial contamination. With bioburden and endotoxin testing, for example, it’s critical that samples be taken at the right places in the process. You really need to consider the point at which you have a real opportunity for microbes to enter.

Look at where samples are taken in regard to filters. For example, it is pretty common in the industry to have a filter before a pool tank (e.g., following a chromatography step). However, if you’re sampling right after that filter, you’re not going to get a true picture of the potential contamination at that point in the process. It’s important to consider where samples are taken to ensure that you are truly monitoring those potential places where microbes can hide.

Another point is not to neglect the importance of endotoxin testing. Although you may not have active bioburden in a sample, an endotoxin increase can indicate a biofilm hiding somewhere in a piece of equipment. So it’s important to monitor both bioburden and endotoxin results.

Can companies use the same basic platform to monitor and control microbial contamination across all manufacturing sites and products — or do they need approaches that are individualized to expression platform and product type? Basic microbial contamination test methods can be applied across different types of products and platforms. However, you need to consider for a given manufacturing process where those potential contamination entry points might be and what types of contamination might be possible.

If, for example, you’ve got a process that runs under anaerobic or certain aerobic conditions, then you need to consider whether you need to look specifically for those types of organisms. It’s the same if you’ve got a process that would be particularly susceptible to a particular type of bacteria, yeast, or mold. You need to make sure that you’ve adapted your methods to be specific for those particular cases.

So you’ve got standard methods that can be used for most cases, but you can’t neglect the specifics of a particular process.

Why are you attending the BPI Conference? I’ve been to the BPI Conference several times now, and what I find valuable is that it’s a good opportunity to learn about what other companies are doing. The same problems will tend to come up at different facilities in different companies over time. If you take advantage of learning either from projects that are new or problems that have occurred at other companies, then you can apply that knowledge at your own organization — and hopefully, reap benefits that you might not otherwise see. This is also a great networking opportunity to mix and mingle and have discussions with personnel from other organizations.

Johnson Varghese (Shire)

Johnson Varghese (senior director and head of the analytical development group at Shire Pharmaceuticals) will be joining us for the “Elucidating the Structure/Function Relationships of Complex Glycoproteins” session on Wednesday afternoon, 22 October 2014. As moderator, he will be leading a panel discussion at the end of the session titled “Strategies and Challenges of Establishing an Analytical Control Strategy for Complex Biomolecules.” Discussion points will include

  • establishing criticality of product attributes

  • process control for critical quality attributes (CQAs)

  • defining the overall analytical testing strategy for QbD

  • the state and suitability of in vitro bioassays for measuring product quality.

Can you explain generally what goes into establishing criticality of product attributes? It starts with a risk-ranking filtering process to distinguish between critical and noncritical quality attributes. As we begin to go through the stages of product development, we refine that filtering by getting more and more understanding of our molecule — either from structure/function studies or from our clinical and nonclinical studies — with the ultimate goal of establishing a control strategy.

By the time we commercialize, hopefully we’ve gone through that risk-ranking process multiple times and redefined our CQAs based on all the new information we’ve gathered. In later stages, we want to be very product specific. At an early stage, we can leverage a lot of platform information or prior knowledge. Certainly, that is something I would continuously monitor and improve on as we go along, irrespective of development stage.
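
One common way to run such a risk-ranking exercise, for example in published QbD case studies, is to score each attribute’s potential impact and the uncertainty around that impact, multiply the two, and treat attributes above a threshold as critical. The sketch below shows that generic pattern; the attributes, scores, and threshold are illustrative assumptions, not Shire’s assessment.

    # Illustrative risk-ranking sketch: score = impact x uncertainty, and any
    # attribute above a threshold is flagged as a potential CQA. Made-up scores.
    attributes = {
        # attribute: (impact 1-10, uncertainty 1-5)
        "glycan occupancy":   (8, 4),
        "aggregate level":    (9, 2),
        "C-terminal lysine":  (2, 3),
        "deamidation site X": (6, 5),
    }
    THRESHOLD = 20   # assumed cutoff for criticality

    for name, (impact, uncertainty) in attributes.items():
        score = impact * uncertainty
        tag = "potential CQA" if score >= THRESHOLD else "noncritical (for now)"
        print(f"{name:20s} score={score:3d}  {tag}")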

How much more of a challenge are analytics for complex molecules than for monoclonal antibodies (MAbs)? In a nutshell, complexity is relative. What I mean by complex molecules is that they exhibit more complex primary and higher-order structure than MAbs. And there’s little or no prior knowledge regarding the impact of those attributes on safety or efficacy. Ultimately, that complex structure and lack of information will place more importance on very early characterization and make key structure–function studies necessary to assess important aspects of each molecule and define an early analytical control strategy for GMP manufacturing.

Let me go into a little more detail regarding that. From a structural characterization perspective, complexity for non-MAbs comes from their primary structure. They may have multiple posttranslational modifications and so on. We don’t know the sequence liabilities — the hot spots — whereas with MAbs (especially if a company has some experience with a given molecule) we can already focus in on the complementarity-determining regions (CDRs) or variable regions. We can focus on the Fc carbohydrate region, and we can focus in on the effector functions for IgG1 isotypes. For other isotypes (e.g., IgG2s and IgG4s), we know that disulfide linkage shuffling occurs, and so on. Those things are all documented in literature and also may come from information that a company has gained internally. You can build on that knowledge with new MAbs, whereas with non-MAb proteins, you’re starting from scratch.

From a structural perspective, a lot of these molecules depend on their higher-order structures (e.g., dimers, tetramers, even octamers) for activity. Combine that with additional posttranslational modifications on multiple sites, and you have a broad range of size and charge distribution, which again makes for complex molecules. This places a lot more challenge on the analytical aspects. These multiple physicochemical attributes can influence in vivo and in vitro biological function, solubility, stability, degradation pathways, and so on.

Ultimately, for an analytical control strategy with specification setting, it’s possible to have a platform specification approach, especially for early-stage MAbs. For new non-MAbs you need to justify a lot of those ranges and show how you selected each assay, what to place on release and stability, and so on. In defining the analytical control strategy, it is definitely more difficult and will take longer for a non-MAb than a MAb because we need to understand structure–function relationships. And that’s going to take time.

Would you describe the current state of in vitro bioassays for measuring product quality? What needs to be improved? In vitro assays are increasingly using biological materials or cells with physiological relevance. For example, assays are moving from surrogate cell types to disease-associated cells or tissues. Now we prefer to use, for example, cells from patients with a given disease for some newer assays.

Whether we do a cell-based or an enzyme-activity assay, going with more relevant substrates seen in patients gives us more assurance that we are measuring accurately. This is another advantage of using physiologically relevant substrates: We can also now probe conformational aspects of an enzyme by looking at the kinetics of the associated reaction. I think that’s why regulatory agencies want us to use physiologically relevant cells or substrates more.
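
As a generic illustration of the kind of kinetic readout described here, the sketch below fits simulated initial-rate data to the Michaelis–Menten equation to recover Km and Vmax. The substrate concentrations, rates, and noise are placeholders, not assay data.

    # Illustrative only: fit Michaelis-Menten kinetics, v = Vmax*S / (Km + S),
    # to initial-rate data from an enzyme activity assay (simulated here).
    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(S, Vmax, Km):
        return Vmax * S / (Km + S)

    S = np.array([0.5, 1, 2, 5, 10, 20, 50], dtype=float)  # substrate, arbitrary units
    v = michaelis_menten(S, 100.0, 4.0)                     # simulated "true" parameters
    v = v + np.random.default_rng(1).normal(0, 2, S.size)   # assay noise

    (Vmax_fit, Km_fit), _ = curve_fit(michaelis_menten, S, v, p0=[max(v), 5.0])
    print(f"Vmax ~ {Vmax_fit:.1f}, Km ~ {Km_fit:.2f}")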

As for what needs improvement, often (especially early on) only one functional assay is used to assess product quality. However, we know that multiple events are happening; that is, there are multiple pathways, and they all need to be characterized. For example, if a product needs to bind and then affect signaling, then ideally we should have assays that measure both the binding and the signaling process.

Among other aspects that could be improved, we prefer to have a true in vitro model that can predict a clinical outcome. That would be ideal. In addition, assay precision is an issue for us right now. We are finding plate effects with some functional cell-based assays. So we are studying how we place the samples on plates and investing in automation.
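
One simple mitigation for the plate effects mentioned here is to randomize where samples, standards, and controls land on each plate and, often, to avoid edge wells. The sketch below shows that idea for a hypothetical 96-well plate; the sample names and layout rules are assumptions, not Shire’s design.

    # Hypothetical sketch: randomize sample placement on a 96-well plate and
    # skip edge wells, a common way to reduce positional (plate) effects.
    import random

    rows, cols = "ABCDEFGH", range(1, 13)
    inner_wells = [f"{r}{c}" for r in rows[1:-1] for c in cols if c not in (1, 12)]

    samples = ([f"sample_{i}" for i in range(1, 21)]
               + ["reference_std"] * 6 + ["assay_control"] * 6)
    random.seed(42)                  # fixed seed so the layout is reproducible
    random.shuffle(samples)

    layout = dict(zip(inner_wells, samples))   # well -> contents
    for well in list(layout)[:8]:              # show a few assignments
        print(well, "->", layout[well])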

Why are you attending the BPI Conference? Well, I’m in Boston. So, first thing, it’s local and very convenient for us to get to. But really, it’s an excellent forum for learning about innovation in the areas of bioprocess development. It’s a great place for us to network and meet colleagues from other companies. So I think it’s a very relevant forum for folks involved in bioprocess development.

Listen Online! These interviews have been edited from transcripts for space and style. You can access the original conversations at www.bpiroundtables.com/bpi14.
