Introduction: Technologies Converge in Biopharmaceutical Laboratories


Bioassay development is foundational to the well-characterized biotechnology product paradigm. Bioassays are the best tools for drug developers to use in determining the biological activity (potency) of their products, which has been a biopharmaceutical critical quality attribute (CQA) since long before that concept had a name. Thus, these assays are vital to quality assurance and quality control (QA/QC), preclinical studies, and clinical testing — and by extension to process development and monitoring.

Because of their complex nature, bioassays are among the most challenging experiments to perform reliably with dependably accurate results. Consistency requires a controlled environment and qualified reagents, skilled analysts, and assay protocols that are intelligently developed, characterized, and validated. Depending on how a method is used, it may need to comply with good laboratory practice, good clinical practice, or good manufacturing practice (GxP) regulations as well as electronic data-security requirements.

New technologies are posing new challenges while offering solutions for streamlining development and obtaining the best possible data for analysis. Laboratory automation can increase throughput and accuracy of results, but it often comes with challenges of its own — as Roche’s Hermann Beck reveals elsewhere in this special supplement. Powerful instruments such as mass spectrometers make it possible to understand biological products better now than ever before. Making those systems work together can be a specialty in itself.

Questions and Answers with a Laboratory Automation Expert
Milena Stanković-Brunner is a scientific development consultant at Synthace Limited in London, UK. The company’s Antha software platform creates reproducible and scalable biology workflows that can be edited and shared through the cloud for laboratory automation. She kindly answered my questions below around the end of 2020.

What are the most important trends in modern assay development for bioprocess development and QA/QC? Everyone wants to take the next step within bioprocess development and implement something transformational instead of making small incremental changes. In assay development, this can manifest as developing robust assays as early as possible. To do so, multifactorial design and execution play a pivotal role in ensuring that an assay is developed across the full range of conditions under which it ultimately must perform.

Speed is important, but equally (or perhaps more) important is developing the most successful assay possible. A common obstacle has been the trade-off among the time needed to optimize complex assays, the need for speed, and the increasing scope of modalities that bioassays are expected to address (e.g., cell/gene therapy products).
In the QA/QC space, there is a slightly different set of priorities. Whereas higher levels of automation and throughput have been implemented successfully in upstream and downstream processing, quality laboratories seem to be lagging. That could be attributed to the high burden of regulatory compliance, but QA/QC groups need to consider the approaches of others and figure out how to catch up. Rapid turnaround of off-line assays is a critical driver to ensure that improvements made in other areas of process development can be realized. That makes automation and integration key deliverables for quality teams.

Many organizations are moving toward multiple assays to triangulate CQAs. Examples include potency (assessed with immunoassays, mass spectrometry, and/or cell-based assays) and capsid packing ratios (assessed using quantitative polymerase chain reaction, qPCR, or next-generation sequencing, NGS). Those methods often get used in isolation, but digital infrastructures need to tie this rapidly evolving analytical landscape to bioprocesses in real time. That requires control of analytics and a unified data infrastructure. Such an approach is used to speed up process development and biomanufacturing (e.g., with real-time release testing).

How important are biostatistics to modern biopharmaceutical laboratories? At Synthace, we believe that biostatistics are crucial to modern biopharmaceutical laboratories. They come into play at every step, from experimental design powered by robust statistical methods such as design of experiments (DoE) to data analysis, structuring, and visualization.

However, implementing powerful statistical analyses in laboratory workflows is difficult. Automation brings precision and high throughput, but programming automation equipment (especially for complex DoE experiments) is highly inflexible and requires users to cultivate coding knowledge. We have experienced such problems first-hand: In our beginnings as a bioprocess-optimization company, we created Antha software to facilitate execution of automated DoE.
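As a rough illustration of the multifactorial approach described above, a full-factorial DoE design can be enumerated in a few lines of Python. The factor names and levels below are hypothetical examples, not taken from Antha or any specific protocol:

```python
from itertools import product

# Hypothetical two-level factors for an assay-optimization DoE;
# names and levels are illustrative only.
factors = {
    "coating_conc_ug_ml": [1.0, 4.0],
    "blocking_time_min": [30, 60],
    "detection_dilution": [1000, 4000],
}

# Full-factorial design: one run for every combination of factor levels
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

# Each run becomes a worklist entry that a liquid handler could execute
for run_id, run in enumerate(design, start=1):
    print(f"run {run_id}: {run}")
```

With three two-level factors, the design expands to 2³ = 8 runs; real DoE campaigns typically use fractional or optimal designs to keep run counts manageable as factors multiply.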

Data structuring and analysis empower and facilitate downstream assays and inform future ones. Structuring data from different sources remains a manual and error-prone process in most biopharmaceutical labs, and Antha software relieves scientists of that burden. Automatically structured, analyzed, and visualized data give them more time to do what they do best: make sense of it, learn from it, and exchange their knowledge.

What distinguishes your Antha platform? Antha software was created to address the limitations that accompany most other options for automating assays. It ensures smooth communication between scientists and their automation, lowering the barrier to entry.

For example, a number of options are available for scientists who want to execute fully automated enzyme-linked immunosorbent assays (ELISAs). But such a multidevice protocol usually requires users to interact with multiple different programs, increasing their cognitive burden overall. A scientist must program a liquid handler, a plate washer, and a plate reader and ensure the connections among those three devices. And the data analysis that follows (involving manual transfers) can be error prone.

Antha software provides a single point of contact that enables users to design, modify, and simulate experiments in silico flexibly without the need for programming knowledge. It integrates execution through a liquid handler, plate washer, and plate reader — and all required movements among them. Upon completing an assay, a scientist gets an automatically structured dataset with standard curve visualization and calculated sample concentration.
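A minimal sketch of that single-point-of-contact idea, in Python: one workflow object records every device step and can preview the whole sequence in silico before anything runs on hardware. The class, device, and step names here are hypothetical and do not reflect Antha's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    device: str
    action: str

@dataclass
class ElisaWorkflow:
    steps: list = field(default_factory=list)

    def add(self, device, action):
        self.steps.append(Step(device, action))
        return self  # allow chaining

    def simulate(self):
        # In-silico preview: report each step without touching hardware
        return [f"{i + 1}. {s.device}: {s.action}" for i, s in enumerate(self.steps)]

# Hypothetical sandwich-ELISA sequence spanning three devices
wf = (ElisaWorkflow()
      .add("liquid handler", "dispense samples and standards to coated plate")
      .add("plate washer", "wash 3x with PBS-T")
      .add("liquid handler", "add detection antibody")
      .add("plate washer", "wash 3x with PBS-T")
      .add("plate reader", "read absorbance at 450 nm"))

preview = wf.simulate()
```

The design point is that the scientist composes and inspects one workflow rather than programming three instruments separately and managing the hand-offs among them.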

Going back to some of the trends you identified, how does your technology help? Users can transform the hours spent learning to program laboratory automation into minutes, designing experiments and modifying parameters on the fly. One top-five pharmaceutical company used Antha software to orchestrate automation of ELISA protocols and reported a 70% reduction in method programming time.

Scientists can design and simulate their experiments in silico before entering the laboratory. They can modify experimental parameters and retest experiments in silico multiple times. The Antha platform provides a step-by-step preview of an entire experimental process, including automatically programmed liquid-handling steps, plate movements, and washer–reader integration, with full tracking of samples, data, and metadata.

To go from a simulated outcome to execution, users set up plates that contain samples or other solutions following Antha-generated layouts, then initiate the automated protocol. The process entirely bypasses coding because the software automatically programs the automation.

Many automated systems include “wizard” presets to help users set up common workflows. Our Antha platform is like a “wizard” itself. It enables users to run in silico simulations of their protocols before physical execution. Thus, scientists can create protocols in the cloud regardless of where they are located physically, facilitating knowledge exchange and protocol traceability around the world. That saves time and resources by eliminating the need for “trial and error” strategies.

Another important functionality is saving methods that are used commonly and executed repeatedly. Saved methods can become template protocols (workflows). Users can select all or some template input parameters to modify, and a workflow can be resimulated in silico and reexecuted a number of times. This provides traceability, enables seamless exchange of assay metadata and knowledge, and saves time in experimental design and setup.

Can you provide some insight into how these regulatory requirements are met with cloud systems such as yours? Some of the most significant gaps in meeting regulatory requirements for laboratories are related to data integrity. Data records are often disparate and difficult to collate with all their associated metadata. Digital data solutions such as ours bridge such gaps by collating data to create complete records in the context of experiments that have been run, thereby moving toward a regulated environment.

Currently we are focused on the research and development parts of the process, supporting FAIR data standards (findability, accessibility, interoperability, and reusability) to ensure that complete and reusable data records are available (1). The next steps in moving to support GxP environments will include 21 CFR Part 11 enablement using our current authorization technology to enable audit trails and digital signatures. When that has been layered over the already extensive data records compiled within the platform, cloud-based solutions will address some profound data-integrity challenges in GxP laboratories.

Can users automate familiar assay kits with your software? How does the system aid in validating in-house–developed methods? Synthace developed this platform to support intuitive automation of both kit-ready and in-house–developed protocols. When implementing an ELISA kit in Antha workflows, first an in silico representation of the coating plate is generated. That enables metadata tracking and matching of antibody–antigen pairs during the automated run, enabling seamless analysis at the end of the experiment. The toolkit is expansive and can support different ELISA formats (direct, indirect, competitive, and cell based).
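For illustration only, an in silico plate representation of the kind described above can be as simple as a dictionary keyed by well position. The layout below (a standard curve in duplicate in columns 1–2, samples elsewhere on a 96-well plate) is a hypothetical example, not Antha's internal data model:

```python
rows = "ABCDEFGH"
cols = range(1, 13)

# Hypothetical in-silico plate map: each well carries metadata that an
# automated run can match against during execution and analysis.
standards = [f"STD{i}" for i in range(1, 9)]  # one standard level per row
plate = {}
for r, row in enumerate(rows):
    for col in cols:
        well = f"{row}{col}"
        if col <= 2:  # columns 1-2: standard curve in duplicate
            plate[well] = {"content": standards[r], "role": "standard"}
        else:         # remaining wells: test samples
            plate[well] = {"content": f"S{r * 10 + col - 2}", "role": "sample"}
```

Keeping this map alongside the run means every measured signal can be traced back to its well contents automatically, which is what makes end-of-experiment analysis seamless.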

Antha software is designed to orchestrate and record metadata generated over the course of an experiment — including liquid locations and concentrations — to perform instant analysis of samples through either a four-parameter logistic (4PL) or linear fit. Provided statistics include intraplate coefficient of variation (CV) for technical replicates, model-fit parameters, and associated errors to help researchers validate their data. The cloud provides easy access to and sharing of results that are associated automatically with each experimental run.
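The 4PL fit and replicate %CV mentioned here are standard immunoassay calculations. A sketch with NumPy and SciPy, using synthetic data and hypothetical parameter values (this is a generic illustration, not Antha's implementation):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL model: a = response at zero dose, d = response at infinite dose,
    c = inflection point (EC50), b = Hill slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Synthetic standard curve generated from known (hypothetical) parameters
true_params = (0.05, 1.2, 50.0, 2.0)
conc = np.array([1, 5, 10, 25, 50, 100, 250, 500], dtype=float)
signal = four_pl(conc, *true_params)

# Fit the 4PL model back to the standard-curve data
popt, _ = curve_fit(four_pl, conc, signal, p0=(0.1, 1.0, 60.0, 2.5), maxfev=10000)

def inverse_4pl(y, a, b, c, d):
    """Back-calculate concentration from a measured signal."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Interpolate an "unknown" sample against the fitted curve
sample_conc = inverse_4pl(four_pl(75.0, *true_params), *popt)

# Intraplate %CV for three technical replicates of one sample
replicates = np.array([0.91, 0.95, 0.93])
cv_percent = 100.0 * replicates.std(ddof=1) / replicates.mean()
```

On noiseless synthetic data the fit recovers the generating parameters, so the back-calculated concentration lands at the true value (75); real assay data would also carry fit errors and confidence intervals, as the answer notes.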

Does the technology support automating other assay formats? Being cloud based, the Antha platform integrates laboratory-automation machines into connected workflows and structures data from different sources, facilitating information exchange with experiment traceability and reproducibility. It supports device-agnostic assay execution on liquid-handling robots and integrates analytical devices (e.g., plate readers and bioreactors) to collect and structure data automatically. The technology can support a number of protocols, including DoE for assay, media, and buffer optimization; miniaturized purification; qPCR; and construct assembly (2–5).

More Conversations Ahead
Whether measuring potency, stability, or host-cell protein (HCP) content, a bioassayist’s ultimate goal is to develop a robust assay — or a complementary set of tests — validated to be trustworthy even under changing conditions (6). As my conversations in this special supplement show, different types of assays can pose different types of obstacles on the pathway to that final destination. First, Hermann Beck provides a user’s perspective on the topic of laboratory automation for potency assays. Then consultant Denise Krawitz highlights the pros and cons of mass spectrometry for HCPs. And finally, consultant Nadine Ritter explains some major concerns in developing and transferring stability test methods. I’m grateful to all of them for making time to chat with me over the past few weeks.

References
1 Experimental Data Generation As a Foundation for a FAIR Data Strategy. Synthace: London, UK, 3 November 2020; https://www.synthace.com/experimental-data-generation-as-a-foundation-for-a-fair-data-strategy.

2 Assay Development Using DOE and Antha. Synthace: London, UK, 2020; https://www.synthace.com/our-protocols/assay-development-doe/assay-development-doe-detail.

3 Miniaturized Purification with Antha and Te‑Chrom™. Synthace: London, UK, 2020; https://www.synthace.com/our-protocols/miniaturized-purification.

4 Automated qPCR Setup, Execution and Analysis with Antha. Synthace: London, UK, 2020; https://www.synthace.com/our-protocols/qpcr/qpcr-detail.

5 DNA Construct Assembly and Antha. Synthace: London, UK, 2020; https://www.synthace.com/our-protocols/dna-assembly/dna-assembly-detail.

6 Scott C, et al. Bioassay Evolution: Finding Best Practices for Biopharmaceutical Quality Systems. BioProcess Int. 18(1–2) 2020: 20–30; https://bioprocessintl.com/analytical/qa-qc/best-practices-for-bioassay-development.

Cheryl Scott is cofounder and senior technical editor of BioProcess International, PO Box 70, Dexter, OR 97431; 1-646-957-8879; cheryl.scott@informa.com.
