For many years the pharmaceutical industry was dominated by small (usually synthetic) molecules, mixed with a number of nonactive materials and encapsulated or (in the really old days) rolled into pills or pressed into tablets. Although synthesizing the active pharmaceutical ingredients (APIs), formulating the dosage forms, and analyzing the materials at every stage of a product life cycle were not always trivial activities, they were relatively straightforward. Most of the tools needed for analyzing and controlling each step of the manufacturing process were prevalent in laboratories across the world. Because early commercial production tools were very slow by today’s standards, in-process tests did not need to be fast or sophisticated. Indeed, the vast majority of early solid-dosage forms were “immediate-release” tablets or capsules that depended on gelatin solubility for API release. Later, time-release dosage forms were subjected to the same in-process tests as immediate-release forms: e.g., hardness, friability, disintegration, and weight variation.
All that was fine when single-punch presses (and later, somewhat larger units) were producing hundreds of tablets per hour. Final testing was sufficient for monitoring safety and efficacy, so the 20–30 final test doses and composite assays were considered to be acceptable. After all, in batch production it took weeks for a single lot to be made, so who cared if it took (several) days to analyze it?
As production methods grew faster and faster later in the 20th century, the industry found itself saddled with 1950s-era in-process and final lot analysis techniques. The best impetus to “modernizing” the way we monitor and analyze — and more important, control — drug production was the process analytical technology (PAT) guidance of 2004 (1). That guidance, along with successive guidances from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), and other agencies, supported improved control of manufacturing processes through modern technology. Extending analysis and control to process applications required advanced computers, sufficiently sophisticated software, and measurement devices that could deliver speed and accuracy in relatively small footprints. Around 1990, some companies began developing those needed tools. One example was a cooperative effort between Pfizer in the United Kingdom and Zeiss in Switzerland to develop the first wireless, in-place, near-infrared (NIR) spectrometer for real-time blend-uniformity measurements. FDA acceptance of that tool opened the floodgates for new equipment and peripherals.
The Biological Paradigm
Traditionally, manufacturing small-molecule dosage forms has been a two-step process: synthesizing APIs and incorporating them into solid dosage forms. The former is essentially organic chemistry; the latter is (or should be) based on materials science (e.g., mixing and tableting). Wireless spectrometers brought continuous monitoring and feedback (control) under the FDA’s PAT initiative. Over a decade after the introduction of PAT and quality by design (QbD), we see more real-time release of final dosage forms and a growing presence of continuous manufacturing (CM). It would appear that solid dosage forms are well on their way to true QbD — and eventually, continuous manufacturing when it is warranted.
Such tools have been used with organic synthesis for longer than in tablet production simply because synthesis reactions take place in nonaqueous solutions that are amenable to spectroscopic controls. Physical parameters such as viscosity, temperature, and refractive index are easy to measure in organic solutions.
But expecting us to apply those same control technologies to biopharmaceutical products would be naïve. A number of fundamental differences separate the small-molecule and biological worlds. Instead of a controlled, synthetic, organic reaction in a chemical reactor, biomanufacturing relies on complex cellular biosystems with high sensitivity to their environment and feeding regimen in an aqueous matrix that cannot be controlled simply by established principles of organic chemistry.
Production of large molecules by microbes and animal cells requires control of numerous processing parameters, including nutrient concentration, temperature, pH, gases, and agitation. Host cells, product(s), byproducts (e.g., lactate, ammonium, and CO2), and the growth medium constitute a complex mixture, with many chemical species present in a bioreactor at levels that are undetectable by many analytical tools, including NIR spectroscopy. Many of those materials are not strong absorbers of IR or NIR radiation, but their effects on other molecules and on water can be followed by using chemometric methods.
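As an illustration of what such chemometric methods involve, below is a minimal sketch of a partial least squares (PLS) calibration that relates NIR spectra to an analyte concentration. The simulated spectra, wavelength grid, and number of PLS components are assumptions made purely for the example; a real calibration would pair measured spectra with off-line reference assays and would be validated far more rigorously.

```python
# Minimal sketch: PLS calibration relating NIR spectra to an analyte
# concentration (e.g., a nutrient in a bioreactor). All data here are
# simulated for illustration only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 120, 200            # hypothetical spectral grid

# Simulate spectra: a broad water background plus a weak analyte band
# whose intensity tracks the (simulated) reference concentration.
concentration = rng.uniform(0.5, 10.0, n_samples)          # e.g., g/L
wavelengths = np.linspace(0.0, 1.0, n_wavelengths)
water_band = np.exp(-((wavelengths - 0.70) ** 2) / 0.02)
analyte_band = np.exp(-((wavelengths - 0.35) ** 2) / 0.005)
spectra = (
    1.0 * water_band
    + 0.02 * concentration[:, None] * analyte_band          # weak absorber
    + 0.005 * rng.standard_normal((n_samples, n_wavelengths))  # noise
)

# Calibrate on part of the data, then predict held-out samples.
X_train, X_test, y_train, y_test = train_test_split(
    spectra, concentration, test_size=0.25, random_state=0
)
model = PLSRegression(n_components=5)
model.fit(X_train, y_train)
predicted = model.predict(X_test).ravel()

rmsep = np.sqrt(np.mean((predicted - y_test) ** 2))
print(f"Root-mean-square error of prediction: {rmsep:.2f} g/L")
```

The point is the workflow rather than the specific numbers: many weak, overlapping spectral features are correlated with a reference measurement so that a concentration can later be estimated from a spectrum alone.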
Production of large biomolecules typically follows a two-step process: First, microbial or animal cells produce the molecules of interest in “upstream” bioprocesses; then the API is purified through removal of growth medium, cells, viruses, and other impurities in “downstream” processing.
Upstream production routinely relies on in-line, real-time measurement and control of parameters such as pH, dissolved oxygen, and CO2 (both dissolved and in the bioreactor headspace) that affect cell viability. Nutrients (e.g., glucose) need to be measured and controlled throughout a production batch, and byproducts (e.g., lactate and ammonia) need to be monitored. Until recently, manual sampling and off-line measurement with fundamental primary analytical methods were the predominant control procedures. However, use of in-line spectroscopy as a PAT for monitoring and controlling bioreactors has increased significantly over the past decade.
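To make the idea of in-line measurement and control concrete, here is a deliberately simplified sketch of a proportional controller that nudges a glucose feed pump toward a setpoint. The probe and pump interfaces, setpoint, and gain are hypothetical placeholders rather than a description of any actual bioreactor control system, which would typically use vendor software and more sophisticated control logic.

```python
# Hypothetical sketch of a proportional feed controller that uses an
# in-line glucose reading to keep a bioreactor near its setpoint.
# read_glucose() and set_feed_rate() stand in for real instrument and
# pump interfaces, which vary by vendor.
import random
import time

SETPOINT_G_PER_L = 4.0        # target glucose concentration (assumed)
GAIN = 0.5                    # proportional gain, mL/min per g/L of error
MAX_FEED_ML_PER_MIN = 10.0

def read_glucose() -> float:
    """Placeholder for an in-line probe or spectroscopy-based soft sensor."""
    return random.uniform(2.0, 6.0)

def set_feed_rate(ml_per_min: float) -> None:
    """Placeholder for a pump command."""
    print(f"feed rate set to {ml_per_min:.2f} mL/min")

def control_step() -> None:
    glucose = read_glucose()
    error = SETPOINT_G_PER_L - glucose          # positive when glucose is low
    feed = min(max(GAIN * error, 0.0), MAX_FEED_ML_PER_MIN)
    set_feed_rate(feed)

if __name__ == "__main__":
    for _ in range(3):                          # a few illustrative cycles
        control_step()
        time.sleep(1)                           # real loops run on a schedule
```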
All impurities in APIs are of critical concern, but the stakes are potentially higher with biological impurities, which often are proteins, nucleic acids, and adventitious agents not seen in synthetic processes. They can pose long-term carcinogenicity and mutagenicity dangers as well as the potential for immediate allergic reactions in patients. In the absence of immediate reactions, the risk of long-term harmful effects depends on the type of therapy. If a biologic drug is used in a one-time application (e.g., heparin in a cardiac event), there is little chance for minor impurities to do much harm. However, long-term use of a biologic such as insulin could allow even the smallest impurity plenty of time to harm a patient.
Evaluating Biosimilars and Generics
You might assume that a company developing new biopharmaceuticals will spend years assessing their potential harmful effects. From development of the API, through all clinical trials, to the stability studies associated with commercial manufacturing, innovator companies have time to accumulate a large portfolio of data on potential byproducts and breakdown products of a drug substance and its production. As with generic competition for small molecules, however, secondary biologics companies have arisen to produce the “same” active molecules from different bioprocesses.
By definition, and often with abbreviated clinical trials, the resulting biosimilars allow less time for evaluation of potential byproducts before marketing. A biosimilar API may be “identical” in structure to that in a patented drug, but biological processes express numerous proteins and nucleic acids, each particular to the mode of expression. When all is said and done (excluding potential patent-infringement lawsuits and so on), the most problematic feature of every biosimilar will be those exotic byproducts and their potential side effects. Differing guidances and policies of regulators around the world add to the complexity of making, selling, and evaluating biosimilars.
Consider, for example, the “question-based review” (QbR) of generic drugs (2). One of its main provisions requires abbreviated new drug applications (ANDAs), and by extension analogous biosimilar applications, from disparate companies to follow a common format. Previously, each of the many generic-drug companies submitted documents using its own internal style. FDA reviewers had to navigate dozens of different types of applications, causing long wait times for those companies to get a yes-or-no answer on the fate of their products. Imagine the chaos that would result if an English teacher allowed each student to write a term paper in his or her own way. The style requirements alone made this guidance an excellent idea. Like a class receiving a term-paper assignment, everyone understood what was needed and in what order it should be presented. That did, indeed, speed up review times.
Unfortunately, the guidance also gave generics companies some new responsibilities, extending their responsibility for product purity both earlier and later in the product life cycle than before. The existing responsibility was “simply” to produce a drug (often covered by a pharmacopeial monograph) that met requirements for purity, assay, disintegration or dissolution times, and so on. Before the QbR guidance, it was sufficient to depend on a certificate of analysis (CoA) for APIs. But with biosimilars, a mere CoA never could have been a good idea — or even possible.
Generic-drug companies — including contract manufacturing organizations (CMOs) — now need to be familiar enough with the synthesis route (or biomanufacturing process) for an API that they can prove (validate) that incoming raw-materials testing and stability-indicating assays can both identify and quantify breakdown products from all APIs, no matter how they are made. That extends to stability programs: Each analysis method must be capable of finding and quantifying materials from the breakdown of dosage-form APIs, however they are produced.
That creates a constant feedback loop between suppliers and drug-company laboratories, requiring analytical methods that can separate all potential byproducts (from manufacturing) and all breakdown products (from stability samples). At a “normal” or traditional generic company or CMO, you’ll find a number of trained analytical chemists who can adapt methods to the specifications needed. When small molecules are involved, that simply adds a small amount of labor and time to their existing workloads.
But when it comes to biological or biosimilar production and sales, all bets are off. Whether a CMO is producing a biological drug product that’s “original” (under contract to a patent holder) or generating a product that is “similar,” the bioprocess will be far more complicated than merely mixing powders and compressing a tablet or encapsulating the mix. How to understand and control the quality attributes of such an API is addressed in ICH Q11 (3):
The identification of CQAs (critical quality attributes) for complex products can be challenging. Biotechnological/biological products, for example, typically possess such a large number of quality attributes that it might not be possible to fully evaluate the impact on safety and efficacy of each one. Risk assessments can be performed to rank or prioritize quality attributes. Prior knowledge can be used at the beginning of development and assessments can be iteratively updated with development data (including data from nonclinical and clinical studies) during the life cycle. Knowledge regarding mechanism of action and biological characterization, such as studies evaluating structure–function relationships, can contribute to the assessment of risk for some product attributes.
Such control and understanding of biologics are difficult enough for the companies that developed the drugs, even with large numbers of biochemists, molecular biologists, and quality control (QC) analysts. For smaller companies (both producers of bioproducts and the CMOs that package them as dosage forms) and those largely accustomed to performing small-molecule analyses, the task is even more difficult. Clearly, any company producing a biosimilar would need facilities comparable to those of the major company that discovered and developed the originator bioproduct.
A Step Ahead
Since the 1980s, biologics have been seen as the next great step for the pharmaceutical industry. But as biomolecules have become more complex, our need for control and understanding of their development and manufacturing processes has increased as well. The potential for curing rare diseases and improving many people’s lives has expanded greatly, but as Spider-Man’s Uncle Ben said, “With great power comes great responsibility.” Our quality programs will need to be designed much more stringently and carefully. A future with biopharmaceuticals, however, is far brighter than one without them — and that makes them worth the trouble.
References
1 CDER/CVM/ORA. Guidance for Industry: PAT — A Framework for Innovative Pharmaceutical Development, Manufacturing, and Quality Assurance. US Food and Drug Administration: Rockville, MD, September 2004; www.fda.gov/media/71012/download.
2 OGD. Question-Based Review (QbR) for Generic Drugs: An Enhanced Pharmaceutical Quality Assessment System. US Food and Drug Administration: Rockville, MD, September 2016; www.fda.gov/drugs/abbreviated-new-drug-application-anda/question-based-review-qbr-generic-drugs-enhanced-pharmaceutical-quality-assessment-system.
3 ICH Q11: Development and Manufacture of Drug Substances (Chemical Entities and Biotechnological/Biological Entities). US Fed. Reg. 77(224) 2012: 69634–69635; https://database.ich.org/sites/default/files/Q11_Guideline.pdf.
Emil W. Ciurczak is president of Doramaxx Consulting in Westchester, NY; 1-914-767-0720; [email protected]; www.thenirprof.com.
Having first appeared in the CPhI annual report from Informa (BPI’s parent company), this article is reprinted here with permission and edited for BPI style. Download the full report at https://www.cphi.com/europe/visit/news-and-updates/annual-industry-report-2019-final.