As early as 1997, automation was ready to offer potential benefits to the bioprocess industry (1). Professor Bernhard Sonnleitner of the Zürich University of Applied Sciences’ Institute for Chemistry and Biological Chemistry suggested a “standard operating procedure” and pointed to the opportunities, requirements, and potential pitfalls of applying the principles of automation to bioprocess development and operations. If “boring and less interesting routine tasks” could “more efficiently and reliably be handed down to machines,” he explained, then personnel could “engage in more useful work.”
His proposal might be seen as a genesis of quality by design: First, measure everything that can be measured early on; then, determine which variables are most relevant and identify those that must be controlled/documented; and finally, collect data from the process and organize it according to those determinations. In just a few pages, Sonnleitner showed the way to developing what many practitioners now call profound knowledge (2). However, even he warned that “it will take some time and efforts” to make this state of the art into “state of routine as well.”
What Took So Long?
Over a decade later, we are finally well on the way. What took so long? According to Sonnleitner himself in a recent email, “developments have proceeded rather slowly” over the past 10 years. When I asked him why there were so few related papers published after his paper (until very recently, anyway), he pointed out that in general, development work does not find its way into journal publication as much or as easily as research does. “Development of ‘problem solutions’ does not really pay back very soon,” he said. “We need problem solutions rather than instruments.” And up to this point, he added, the “robustness of many methods, algorithms, and hardware is insufficient for industrial application. However, the PAT initiative seems to be very helpful and supporting. Obviously, such developments need much time.”
BPI contributing editor Lorna McLeod recently spoke with industry consultant Larry West (formerly of Finesse Solutions and Broadley-James Bioprocess Technologies) about the FDA’s process analytical technology (PAT) initiative. He reminded her that it “came out with a firestorm of attention.” Early on, he said, some people were thinking “If you didn’t do it right the first time, PAT would give you an opportunity to fix it.” But, he continued, PAT has “found itself usurped by the role of quality by design (QbD), operational excellence (OpEx), and all these other acronyms. PAT by itself was more or less compromised because many wondered what it did and did not entail. In combination with these other emerging solutions, it is giving everybody a sense of comfort that automation won’t equal vengeful regulation.”
In another discussion, Norbert Hentschel of Boehringer Ingelheim Pharma in Germany told Lorna, “In our new cell culture facility we have a manufacturing execution system (MES), but we actually don’t use it for a formal PAT program. However, if processes are developed applying PAT concepts, I believe that some measurements can certainly be used to apply it in a manufacturing environment. But you can’t look at PAT alone. It’s part of quality by design in process development. This is the foundation of process understanding, and that is the foundation for process control with a PAT program. What’s needed is linking the controls in a facility to an understanding of product, process, and quality. But it’s hard to implement a PAT program with established processes.”
In a separate interview, Lorna asked Peter Watler of Hyde Engineering and Consulting (formerly of VaxGen, Amgen, and Allelix Biopharmaceuticals) why he thinks automation has taken so long to find inroads with bioprocessors. “We’re waiting for additional sensors that are robust enough to stand up to a manufacturing environment,” he told her, “and that’s coming along.” Ideally, something like online high-performance liquid chromatography (HPLC) would help determine chromatography pooling criteria and selection during processing. “This could also help control purification unit operations such as tangential flow filtration,” Watler explained. “It could be used to monitor contaminant profiles during diafiltration. Analytical sampling, analysis, and feedback information could all be packaged together. Naturally, it will take some time to get those technologies together.”
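The pooling decision Watler describes can be sketched in a few lines of code. This is a hypothetical illustration only (the fraction data, purity threshold, and `select_pool` function are invented for this example, not drawn from any vendor's system): an online HPLC reports per-fraction purity, and the logic selects the contiguous run of fractions whose mass-weighted pooled purity stays above a cutoff while maximizing recovered product.

```python
# Hypothetical sketch of online-HPLC-driven pooling: pick the contiguous
# run of chromatography fractions whose pooled purity meets a cutoff
# while recovering the most product mass. All numbers are illustrative.

def select_pool(fractions, min_purity=0.95):
    """Each fraction is (product_mass, purity). Returns (start, end)
    slice bounds of the best qualifying contiguous run."""
    best, best_mass = (0, 0), 0.0
    n = len(fractions)
    for start in range(n):
        mass = impure = 0.0
        for end in range(start, n):
            m, p = fractions[end]
            mass += m
            impure += m * (1.0 - p)          # mass of impurities added
            pooled_purity = 1.0 - impure / mass
            if pooled_purity >= min_purity and mass > best_mass:
                best, best_mass = (start, end + 1), mass
    return best

# Dirty shoulder fractions on both ends, clean peak in the middle:
fractions = [(0.2, 0.60), (1.0, 0.97), (1.5, 0.99), (0.8, 0.96), (0.3, 0.70)]
start, end = select_pool(fractions)
```

In this toy data set the leading shoulder fraction is too impure to keep, so the selected pool runs from the second fraction through the trailing edge.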
Watler described feedback control — wherein information obtained by, for example, such sensors triggers control actions — as an essential goal of PAT. He pointed out that the new FDA guidance on process validation (3) emphasizes feedback as a way to learn about a process through monitoring. “I think the regulatory guidance will help to stimulate the industry to keep moving in the direction of feedback control,” he said. Traditionally, process validation has translated to processes that are “locked in” and tightly controlled without deviation. So-called out-of-specification (OOS) results require investigation and corrective/preventive actions (CAPAs). Combining PAT with QbD should allow for normal fluctuations in temperature, for example, or material variabilities — especially with feedback controls set within design space parameters.
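The distinction between normal fluctuation, corrective action, and a true OOS excursion can be made concrete with a small sketch. The limits, setpoint, and `feedback_step` function below are assumptions invented for illustration, not any regulator's or manufacturer's actual scheme: readings inside a tight control band need no action, drift toward the edge of the design space triggers a proportional correction, and only an excursion outside the design space is flagged for investigation.

```python
# Hypothetical sketch of feedback control within a design space.
# Limits and gain are illustrative, not from any real process.

DESIGN_SPACE = (36.0, 38.0)   # acceptable temperature range, degC
CONTROL_BAND = (36.5, 37.5)   # tighter band where no action is needed
SETPOINT = 37.0

def feedback_step(measured, gain=0.5):
    """One control cycle: return (status, adjustment)."""
    lo_ds, hi_ds = DESIGN_SPACE
    lo_cb, hi_cb = CONTROL_BAND
    if measured < lo_ds or measured > hi_ds:
        return "OOS: investigate", 0.0    # outside design space -> CAPA path
    if lo_cb <= measured <= hi_cb:
        return "in control", 0.0          # normal fluctuation, no action
    # still inside the design space, but drifting: correct toward setpoint
    return "correcting", gain * (SETPOINT - measured)
```

The point of the sketch is the middle case: a reading of 37.8 °C is not a deviation to be investigated but a drift to be corrected, which is exactly the allowance Watler says the validation guidance now makes room for.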
The process validation guidance, Watler says, “recognizes that there will be variability. It’s asking manufacturers, ‘How are you going to deal with this variability? What kind of feedback control are you going to have?’ All that then ties back to PAT. Ultimately, we’re controlling product quality, which is really the target of our manufacturing processes.”
Controllers and Communications: Looking back at Sonnleitner’s article, West remarked, “You know, 1997 seems like a lifetime ago. In defense of our industry, the reason we couldn’t do what was inherently obvious to the author was the lack of core technologies: the controllers and the handshakes between them and bioprocess equipment such as reactors and chromatography skids. A 1997 bioprocess controller was about the equivalent of a 2002 videocassette recorder. They were built to function rudimentarily because that was what they were tasked to be. What we realized fully five years after that article was that if we continue to run our processes on glorified VCRs, we could never achieve the efficiencies and models needed to make this a business and not just science.
“So the controllers started to evolve, and with that evolution the associated pumps, mass-flow gas measurement devices, everything evolved with them to elevate our capability to where we are today. Full-blown automation didn’t even really occur until 2005. It took that long for all the elements to finally come together and move forward. It wasn’t a lack of vision; it was a lack of technology.”
Christopher Procyshyn is CEO of Vanrx PharmaSystems, Inc., an emerging player in isolation technologies and aseptic processing. He told Lorna, “Where a lot of difficulties happen is at the level of communication between those with the automation capabilities and those with the process needs. If we look at almost any other industry, automated systems and process control are frankly becoming quite ubiquitous. And the regulators are seeing successful projects and saying that obviously it can be done.”
He pointed to his area, aseptic processing, as an example. “Recent statements both in Europe and the United States call automated technologies and isolator systems the ‘new standard in aseptic filling.’ This suggests that they are ready to start pushing the envelope of what people have been comfortable with. But the real difficulty I’ve seen overall is a lack of understanding of the underlying processes. To automate a process, you have to understand it completely. People need to understand what’s required and what’s possible. I think more can be done on the vendor side to gain insight and knowledge of what customers need, then create solutions and problem-solve at a deeper level. The communication could be better. [Respected consultant] James Agalloco has recently been rather vocal about this issue. It’s really up to everybody to understand how far behind we are in this industry.”
In the same conference call, Lorna also spoke with Martin Rhiel of Novartis (formerly of WAG/Schering Plough and Cytos Biotechnology), who knew Professor Sonnleitner at the ETH in Zürich, Switzerland, some years ago. “It was always difficult,” Rhiel reported, “to find the right sensors for your bioprocess. It’s not like in other industries, where everything is chemically defined. If everything worked as well as a temperature probe, then of course one could use feedback controls for critical process parameters.”
Rhiel agrees that the real difficulty is a lack of profound process understanding. “Sensing does help us a lot,” he said, “but it is difficult to completely automate. Getting a reliable signal, having a reliable sensor with valid readings, that’s a huge challenge. I think that’s one of the reasons why we’re not where we would like to be.” Because of the complex nature of bioprocesses, Rhiel says that even the best sensors currently available aren’t yet reliably plug-and-play. “They’re good, but not perfect yet. We use classical probes in our GMP environments, and for measuring biomass we have a sensor. This works quite well; it reliably monitors cell density. But for many other signals it can be difficult. Novartis actually started a PAT laboratory with online HPLC and other online probes about 10 years ago; it took quite a while to get reliable measurements. Luckily, our company could work in close collaboration with a Swiss vendor. With an integrated approach, we can solve a lot of problems with feedback and new testing approaches, but not yet within the GMP environment. It will be a major effort to qualify all this equipment.”
Hentschel told Lorna, “I know a lot of facilities with a very high degree of automation and control. In our newest cell culture facility (which started up in 2003), we have MES directly connected to our distributed control system (DCS). And the facility automation is directly controlled by electronic batch records. So I believe we’ve moved far beyond the degree of automation we had in 1997.”
Major companies have introduced automation in the form of communication between traditionally standalone devices such as blood gas analyzers and nutrient monitors. These are tied into automated sampling solutions, allowing such devices to run all day, every day without significant human intervention. The next step is tying that into bioreactors and chromatography skids to create an automated loop whereby a process is monitored continuously by a controller, which communicates its findings to technicians who can then make decisions and act upon them. That, West says, will be the next generation of automated bioprocesses.
The initial problem companies can face at that point, he admits, “is fear. Turning around and walking away from a system that’s fully automated is a little bit intimidating.” Will that system perform to its promise? “There’s a lot of potential to overpromise the technology,” he warns, which then fails to meet those expectations if the bar is set too high.
But are those fears grounded in reality? Or do most companies find success? West told Lorna, “Mostly they’re reporting significant yield and performance improvements. Originally, the idea was that there would be cost benefits — and that was heightened with the recent economic turmoil — but the deliverable has proven to be not just the relative cost of putting someone into a more productive role doing something else.” Companies are seeing actual performance improvements, and those achievements are feeding back into their systems.
Not long ago, when a company sought to buy a closed-loop automation solution, only a handful of vendors could deliver that level of automation. But in the past two years, their numbers have greatly increased. Some users are already saying to the vendors, ‘If you want our order, then you’d better offer this.’ “It’s gone from being technology for its own sake,” says Larry West, “to technology for improving bioprocess management.” Vendors have begun investing in automation technology as not simply another feature to add to their systems, but rather, a strategic advantage to offer their customers.
West mentioned Nova Biomedical as one company that’s particularly active in this area. “They put a significant amount of money into automating their nutrient monitors to give them OPC capability. This capability, coupled with similar efforts from firms such as Groton Biosystems in the realm of automated sampling, has in effect permitted the building of bridges between normally stand-alone processes.” (See the OPC box on the next page for more information.) “This inherently makes these devices more capable of communicating with associated next-generation controllers that are fast becoming dominant in the market.” Other active companies he mentioned include New Brunswick Scientific, SciLog, and Sartorius. Others have highlighted Aber Instruments and BioSpectra. On the controller side, West highlighted DASGIP as a company that’s made interacting with other equipment part of its mandate.
“When I say who we’re working with,” Hentschel pointed out, “that doesn’t mean there’s no one else. But we worked together with Werum Software to develop the MES we have in place. And we have different suppliers for the process equipment automation: e.g., DCSs from Honeywell or Siemens. MESs offer great advantages. For example, they can detect out-of-range events and automatically alarm you, which is a big benefit during your batch record review — so you don’t have to go through everything by hand. The very high traceability of processes also allows you to use process automation and data collection as a tool for process optimization.”
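The review-by-exception benefit Hentschel describes can be illustrated with a minimal sketch. The parameter names, limits, and `out_of_range_events` function are invented for this example and do not represent Werum's or anyone's actual MES logic: instead of a reviewer reading every entry of a batch record by hand, the system surfaces only the values that left their allowed ranges.

```python
# Hypothetical sketch of MES review-by-exception: flag only the
# batch-record entries that went out of range. Names and limits
# are illustrative.

LIMITS = {"temp_C": (36.0, 38.0), "pH": (6.8, 7.4), "DO_pct": (30.0, 60.0)}

def out_of_range_events(batch_record):
    """Scan a list of timestamped readings; return (parameter, value)
    pairs that fall outside their defined limits."""
    events = []
    for entry in batch_record:
        for param, value in entry.items():
            lo, hi = LIMITS[param]
            if not lo <= value <= hi:
                events.append((param, value))
    return events

record = [
    {"temp_C": 37.1, "pH": 7.0, "DO_pct": 45.0},   # all in range
    {"temp_C": 38.6, "pH": 7.1, "DO_pct": 28.0},   # two excursions
]
alarms = out_of_range_events(record)
```

A reviewer then investigates the two flagged excursions rather than re-reading the whole record, which is the batch-record-review saving Hentschel points to.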
Procyshyn points out that vendor advancement depends somewhat on communication with end users. “But I think it’s got less to do with the vendors themselves,” he said, “than with the projects being done. People aren’t willing to work on developing new sensor technology and new installations in the middle of a project. So when there isn’t a solution that’s been developed ahead of time and proven and used, people will choose the path of least resistance. During the past few years, it’s really the vendors with more process knowledge that can propose those solutions with some process understanding to back them up. And that allows people to take the leap.”
WHAT IS OPC?
Object linking and embedding (OLE) for process control (OPC) was the original name for a standards specification developed in 1996 by an industrial automation industry task force. It’s how real-time data is communicated among control devices from different manufacturers. The OPC Foundation maintains this standard, saying that officially OPC is no longer an acronym, so the technology is simply known as OPC. Although it has been heavily used in process industries, it is also widely used in discrete manufacturing, thus the deemphasis of the phrase process control.
Electronic Records, Electronic Signatures: For those of us who have been covering the bioprocess industry over the past decade or more, the first thing we think of when we hear someone talk about process control data is 21 CFR Part 11. We all remember the trauma that came along with its 1997 introduction — and guidance documents published in 1999 and 2003 didn’t help much. Although they narrowed the scope of the regulation, they also managed to contradict it in some areas. A final guidance appeared in May 2007, but the original regulation had since been withdrawn for consideration. A new version of Part 11 was expected in 2006, but it has yet to appear. At least one member of the FDA’s Part 11 working group has publicly stated that the timetable for that release is “flexible.” (To say the least!)
West told us that 21 CFR 11 “put us all through the drill of self-analysis. We did a lot of things to accommodate it, but we also went kind of silly on some things (chart recorders, for example). Everything got hit so hard with that broad-stroke brush that it really ended up intimidating some people.” Now such memories may be translating into a fear of automation.
“The pendulum has swung back a bit,” West said, “to where we appreciate 21 CFR 11, but we also have clearly defined where it does and doesn’t go. Now we’re able to use the equipment and associated technologies to take us to the next level.” Interestingly, he pointed out, “now the drivers aren’t as much regulatory requirements as business necessity. ‘We’re running out of money. How are we going to run this plant with 30% fewer people?’”
Companies such as Emerson Process Management make automated plant management systems, another concept that’s finally making headway in the bioprocess industry. Procyshyn said, “In my experience, that’s becoming the forefront, if not the standard, for most systems, particularly in North America. We’re seeing its migration into aseptic and fill–finish facilities. We built ours without electronic batch records, but with 100% supervisory control and data acquisition (SCADA) control and MES planning for data historians and process operations. Almost every process in a modern pharmaceutical system is now electronically controlled in some form. In fact, some of the earlier work that Emerson and others did was in the early days of biotech with upstream production. So it definitely always has been and always will be part of it.”
West is the named inventor on the patent for the DeltaV-based BioNet bioprocessing control system from Broadley-James. “So my answer will be a bit biased,” he admitted. “If you take a step back, Emerson is really just DeltaV when it comes to bioprocess control and management. Are DCSs the future of bioprocessing? Are we going to see hybrids of DCSs for plant networking and benchtop PCs for research? Four years ago, we were all using programmable logic controllers (PLCs). Three years ago, there was a renaissance with DCSs that just ran its course due to cost factors. Today, emerging controllers from companies such as Applikon, New Brunswick, and Finesse have moved away from DCSs to platforms such as the Intel Atom processor much like that found in a high-end personal computer. So we’ve almost come full circle. Ten years ago, you couldn’t give a PC to a bioprocess person because the ‘blue screen of death’ scared them out of their scientific socks! Today, with the evolution of the Microsoft Windows operating system and several years of comfort, PCs are reemerging as a viable platform on which to do bioprocess work. Not to mention, they’re about one-tenth the price.”
Upstream, Downstream, Fill and Finish: One thing we wondered, having seen some discussion of perfusion cell culture in recent meetings and articles (4), was whether the nature of this cell culture mode makes it more amenable to automated control. Larry West, too, has noticed this resurgence of interest. “I was at a meeting not too long ago,” he told Lorna, “where people were asked if they had an opportunity to use perfusion and just walk away from the batch, whether they would switch to perfusion culture. The vast majority of people in the room raised their hands.” When the same room was asked who was actually using it, however, only two people answered in the affirmative.
“From an automation standpoint,” West continued, “perfusion has its own challenges. Some of its associated failures, automation can’t fix (for example, central failure due to drift over time). In some instances, very highly informed process management people are addressing some of these shortcomings with technology, and the case in point would be pH probe drift due to their exposure to the process stream. In perfusion, that’s significant because that exposure timeline can be so long. Five years ago, you bought a $2,000 retraction assembly, and you pulled out your pH probe and changed it, then put it back and hoped you didn’t get contamination. All this associated infrastructure, it wasn’t very practical. Today, you’ve got the same probe in line, but now you’ve also got a nutrient monitor and associated in situ sensors providing improved insight into the bioprocess. However, nutrient monitors and associated automated sampling functionality have really redefined the process. Today, an in situ pH sensor can indicate a reading of 6.2, but a quick check of the nutrient monitor can determine that to be 6.5, indicating drift on the part of the pH sensor. This allows for the controller to adjust its control strategy for the correct pH while initiating a standardization of the pH sensor to correct for the drift. That’s using the technology as a resource, and it makes perfusion more practical.”
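The drift check West walks through (an in situ probe reading 6.2 while the nutrient monitor reports 6.5) can be sketched as a simple cross-check. The `check_probe` function and its tolerance are invented for illustration; no specific controller implements exactly this logic. When the probe and the reference disagree beyond tolerance, the controller uses the reference value for control and schedules a restandardization of the probe.

```python
# Hypothetical sketch of cross-checking an in situ pH probe against a
# nutrient-monitor reference to detect drift. Tolerance is illustrative.

def check_probe(in_situ_ph, monitor_ph, tolerance=0.1):
    """Compare probe to reference. Returns the pH value to use for
    control, the offset to apply at restandardization, and whether
    restandardization should be initiated."""
    offset = monitor_ph - in_situ_ph
    drifted = abs(offset) > tolerance
    control_ph = monitor_ph if drifted else in_situ_ph
    return control_ph, (offset if drifted else 0.0), drifted

# West's example: the probe reads 6.2, but the nutrient monitor says 6.5.
ph, offset, restandardize = check_probe(6.2, 6.5)
```

Here the controller would regulate to 6.5 and flag the probe for a 0.3-unit standardization, rather than chasing the drifted reading — using the technology as a resource, as West puts it.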
Procyshyn doesn’t think perfusion is any more amenable to automation than other culture modes. He said they all have “some level of continuous process monitoring adjustment.” And Hentschel agreed. “We’ve automated a lot of different processes in cell culture, batch fermentation processes, chromatography columns, filtration systems, and so on. For example, our weighing processes for raw materials are highly automated. We also have perfusion culture established, and I wouldn’t say that it is necessarily a prominent example for automation.”
“Personally,” Rhiel put in, “I do like perfusion a lot. But only a few companies are using perfusion culture. If it runs well, and it’s well automated and under control, it can have the highest productivity you can get. But it’s not so commonly used.” Maybe that will change if the process can truly offer the walk-away solution West mentioned. Clearly, many upstream process engineers would be interested in learning more.
Another topic many people are talking about these days, particularly in relation to monoclonal antibody production, is platform purification technologies. And it’s another advancement that we thought might help automation find a place in biomanufacturing. “Absolutely,” Larry West agreed. “That’s where the money’s at. The downstream guys know they’ll have to break up some bottlenecks and invest in the necessary technical infrastructure to do so. If you look at nutrient monitors five years ago, they would give you pH and dissolved oxygen (DO) readings. Three years ago, they would give you pH, DO, amino acids, and gas analysis. Then in 2009, they started measuring IgG. So they’re moving downstream as well, and that can correlate directly to automation and tying in real-time measurement feedback to optimize chromatography.”
“I believe platform technologies make automation in downstream processing easier,” said Hentschel. “Copying an existing automation program for downstream operations requires less adaptation if you can apply most of the same subroutines. Processes are also automated without platform technologies; it just takes more effort.”
Aseptic processing is an area where automation seems to be really taking hold. Is this as true for biopharmaceuticals as for classical drugs? Yes, Hentschel told Lorna. In fact, “aseptic processing, in some aspects, was very highly automated from the beginning. For biotech products, you typically cannot sterilize your aseptically filled product, so you want to prevent any human intervention whenever possible. So a high degree of automation is desirable. There was a very interesting talk on the future of aseptic processing by Martin Van Trieste (vice president of quality at Amgen) during the annual PDA meeting, in which he showed a very highly automated aseptic processing operation (5).”
West countered that “automation is making a strong move in inspections, especially. Traditionally, it was associated mostly with low-risk APIs. But when Genentech went all DeltaV at their Oregon facility, they embraced the idea that classic automation solutions could be applied to biotherapeutics. They kind of validated the concept of fill–finish running on DCS. Bosch and other contract packagers have used PLCs, which have been considered faster than DCSs for on–off applications such as fill and finish. But over the past several years we have seen companies such as GE and Emerson partner to create a chromatography skid that allows discrete (on/off) functions to be handled by the GE Unicorn platform while analog functionality is managed by Emerson’s DeltaV system. This model is rapidly evolving: In 2009, we saw a SciLog TFF skid controlled by the Emerson DeltaV platform, which manages associated discrete I/O. Such application progressions ensure the continued evolution of automation throughout a product’s life cycle, beginning in research and ending in fill–finish.”
Procyshyn pointed out that the pick-and-place and visual analysis functions in fill and finish are ideal for automation. “One thing we’re focusing on is developing tools and techniques for use in areas where isolator technology and peroxide sterilization are possible. In that case, once you’ve achieved that level of operation in a clean and sterile environment, then there is a lot that can be automated. I would suggest that nothing can’t be automated. The problem I’ve seen with traditional approaches to aseptic automation is that typically it’s an add-on, an added complication or level to a process, whereas in other industries it is integrated from the bottom level. Adding extra levels of complexity and failure points has been part of the challenge.”
He explained that his company’s focus is “actually looking at processes from a design standpoint, how automation and microprocessors can give us new tools and opportunities in redesigning processes. I think the future is turning very much toward fully aseptic operations. It’s very clear that humans and human-borne contamination are the largest problem. And we’re seeing that those facilities that embrace automation are having excellent success. We’re also seeing regulators in the field getting a lot more stringent in their expectations for contamination control, so I think the general trend is for this to come about a lot sooner than what some people might have thought in the past.”
Where to, Now?
If his approach sounds familiar, it should. It fits automation very well into the context of quality by design. Procyshyn said the key is “focusing on what it should be, the right design and designing it right, then working on engineering it to a level of reliability and efficiency. I think the level of risk that comes with one-offs on the customer side has really pushed automation to the background. What we focus on is doing that work ahead of time in prototyping and development, so that when we’re working with a customer on a project, we have a solution that’s been through its paces already. When that’s done, people embrace automation.”
He pointed out that people in bioprocessing have used numerous forms of automation electronics (for example, in QA/QC laboratories and process optimization/validation studies), but not in the most critical areas because of risk. That was until modern solutions became available. “I think this is changing because,” Procyshyn said, “let’s face it, the average person in this industry has a very complicated job. And it takes up a lot of their time. So you add another level of change, and there are only a chosen few who have the luxury of time for that. It’s going to take a level of integration that we see in other industries such as food processing, electronics, and automotive to really do this. Car companies didn’t just do this on their own when they went to full automation; they worked with their vendors and supply chains, and that’s how it’s going to be in this industry.”
Often the point is made that making drugs (especially biologics) is too complicated to automate, that the regulatory requirements get in the way. Procyshyn counters: “We can’t make cars that kill people, either.” The larger difference, he says, is in cost pressures and efficiencies. “With more pressures from global healthcare payers now,” he said, “we are seeing automation take hold. People want to put more into their R&D, and automation saves money. It doesn’t cost more if it’s done properly.”
Rhiel agreed. “We can take proven technologies from other fields and adapt them to the field we’re in. We’ve had good experiences with our high-throughput screening and other automation projects. Of course, there are always challenges. It also took a while for robots to become standard in the automotive industry, and the first ones looked very different from those today. Novartis started from scratch with its automation projects; there was nothing on the market to buy, so we developed it with a vendor. It just takes time to work out a routine that’s robust.”
“It shouldn’t be strictly up to our customers to drive innovation,” Procyshyn added from the vendor perspective. “A lot of our market has said to Novartis and Genentech and others, ‘You be our guinea pig, and try the new technology, and we’ll see how it goes.’ Sometimes it works great, but sometimes it doesn’t (particularly in isolator technology). We need to be doing more development before it’s on the critical path of drug production. There are heavy regulations in the aerospace industry, and Boeing does testing and product development. They work with their customers, but it’s a tested and proven product before United and Lufthansa are flying that plane.”
Rhiel countered, “It would be really nice to just buy it and implement it, but this doesn’t always work. In my personal opinion, a big hurdle is the regulatory environment. If you have a registered process, it’s difficult to change it, so that’s a big hurdle. Of course, there are cost pressures. Changing and revalidating a process and getting approval again, that’s a huge cost. So it was easier to use common technology that works well enough for the health authorities, and easier to get approval. If you’re at the forefront and going to the health authority, it’s always difficult. But nowadays, the FDA is working together with pharmaceutical companies in implementing new technologies.” Collaboration, it seems, is key.
Procyshyn pointed to a successful example of the acceptance process. “Over the past eight years — more in North America than in Europe, where they’ve done a better job of applying good engineering — we saw isolator technology come from something that was considered risky to a standard in modern manufacturing.” So even though the pace of change is slower in bioprocessing than in some other industries, he affirmed, it is possible. “The fact is that pharmaceutical companies can do it. It’s not a technical issue, and it’s no longer a real regulatory issue. It’s more about communication and development and investment. I think equipment vendors can do a lot better to invest in their technology rather than relying on their customers to do that.”
Hentschel pointed out, too, that the current interest in automation for bioprocesses may well stem from the growing costs of labor over the past decade or two colliding with a recessionary economy. For whatever reasons, process developers are learning that the time is now for implementing many types of automation into their work, from cell line characterization to process optimization. Whether it goes on to establish itself as a major aspect of the actual processes they create remains to be seen. But in the analytical laboratories at many companies, automated testing has already become indispensable.
Larry West says automation is a valuable tool that “can be used to benefit all of us as an industry. It doesn’t have to be perceived as a weapon with which to reduce headcount or eliminate people’s roles. It really is a tool. And as long as we keep that in perspective and wield it accordingly, this industry will make some serious advances.”
Case in Point: Vendor Gets Together with Users and the FDA. Online at www.bioprocessintl.com/bpiextra, you’ll find contributing editor Lorna McLeod’s interview with Caliper Life Sciences, Inc. CEO Kevin Hrusovsky about the training session his company was invited to present for FDA reviewers.