In the Laboratory Automation Zone


When you hear the phrase “laboratory analysis” on a TV commercial, maybe you imagine a technician in a white coat and safety goggles pouring a chemical from one test tube to another.

Technicians still wear white coats and goggles, but today, in many labs, they’re not the ones pouring the chemicals. Instead, tiny trays carrying minuscule dabs of samples are whisked by robots from one analytical workstation to another. The workstations are equipped with ultraprecise instrument systems to prepare the plates, apply dyes, dispense reagents, mix whatever needs mixing, incubate cells, maintain temperature controls, apply UV light or chromatography or spectroscopy or X-rays, and measure and record reactions — thousands of them in an hour. The whole operation is governed by a PC on the laboratory bench, which tells the workstations and robots how to conduct the testing and tracks the experiment’s progress from start to finish.

Laboratory automation offers scientists “the ability to set up large sequences of experiments and walk away — in theory to do something productive,” says Steve Hamilton, an analytical chemist with the consulting firm Sanitas, a charter member of the Association for Laboratory Automation (ALA). He co-taught a short course on laboratory automation at the ALA’s LabAutomation conference in January 2008.

“Automation also increases reproducibility,” he says, “because [testing by hand] gets very repetitive. It’s the kind of thing a person gets tired of doing, and so you can sometimes get sloppy.” The right machine can do the same thing over and over — whether cell-based or enzyme assays, ELISA, sequencing of genes, or PCR clean-up — and get accurate results every time. And it works nights and weekends without overtime pay.

Since the early days of lab automation, back in the 1970s, a whole industry has grown to provide a wide range of automated analytical processes, specialized workstations, robotics, and information-management software to help scientists design, conduct, and track experiments and analyze the data. Decades of improvements have sped up the work by orders of magnitude. In 1960 an efficient technician could screen perhaps 100 samples of, say, potentially cancerous breast tissue in a day. Now a fully automated laboratory can perform tens of thousands of screenings in a single day. Along the way, lab automation equipment and techniques have evolved to handle laboratory tests that were unheard of 40 years ago: sequencing of genes, tissue culture, stem-cell differentiation, polymerase chain reaction (PCR), and X-ray crystallography.


It is not surprising that the pharmaceutical industry is the biggest consumer of laboratory automation, but it is not alone. Laboratory automation systems are also important in the food-processing industry, in agricultural research and development, in materials science, in law enforcement, and in counter-terrorism.

One of the first automated devices for laboratories appeared in the 1950s. It was the AutoAnalyzer, made by a company called Technicon (the product is manufactured these days by Seal Analytical). The AutoAnalyzer sipped up liquid samples and reagents into a flowing stream, separated them with plugs of air, and then combined each sample with the proper test chemicals and routed it through silicone tubing to various analytical stations.

The AutoAnalyzer allowed a researcher to set up laboratory operations in a continuous flow. “This was a really big development,” says Hamilton. “You could set up maybe 100 samples, have all the reagents [prepared], and walk away and leave. You’d come back in several hours and the AutoAnalyzer would have performed all the tests.”

By the late 1970s, researchers were exploring the use of robots. For some years, the automobile industry had been using big, relatively simple electromechanical devices that performed single tasks over and over. Why couldn’t robots be made smaller and smarter for laboratory work?

In what proved to be a serendipitous convergence, the microprocessor was being developed at the same time, and the researchers put the two technologies together. By the early to mid-1980s, microprocessor-driven robots were operating in laboratories. The first robot to appear on the market was made by a company called Zymark (now a division of Caliper Technologies). “It took off and was very popular,” says Hamilton. “It could do a lot of lab unit operations that people had been doing on a lab bench.”

The Zymark robot was given a try in the actual mixing of reagents, rotating its “wrist” back and forth and dumping the contents into a test tube. “It was great fun to watch,” says Hamilton, “but we quickly realized that, while this was cool, it was slow. A robot arm is more of a general-purpose device, best used to move things from one place to another, not for delicate manipulations.”

Within a few years, lab unit operations evolved into the arrangement most common today. The main analytical work is done at workstations, some of which are composed of electromechanical devices designed to perform part of the test sequence — for example, squirting reagents into tubes, or, later, into microplates — and others consisting of instruments for processing the samples: sequencing genes, identifying and characterizing proteins, performing immunoassays, or conducting whatever tests are called for. The workstations are linked by robotic systems that transport the test materials from one workstation to the next. The whole sequence is practically untouched by human hands — carried out by the workstations, the robots, and the computer program that oversees it all.
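The workstation-plus-robot pattern described above can be sketched in a few lines of scheduling logic. The sketch below is purely illustrative: the workstation names, plate IDs, and the `run_batch` function are hypothetical stand-ins for what a real scheduling program would command over instrument and robot interfaces.

```python
from collections import deque

# Hypothetical workflow: the ordered list of workstations each plate visits.
WORKFLOW = ["dispense_reagents", "incubate", "wash", "read_absorbance"]

def run_batch(plate_ids, workflow=WORKFLOW):
    """Route each plate through every workstation in order.

    In a real system, each step would command a robot arm to move the
    plate and then wait for the workstation to signal completion; here
    we just record the sequence of (plate, station) operations.
    """
    log = []
    queue = deque(plate_ids)
    while queue:
        plate = queue.popleft()
        for station in workflow:
            log.append((plate, station))
    return log

log = run_batch(["plate-001", "plate-002"])
# Every plate visits every station, in workflow order.
assert len(log) == 2 * len(WORKFLOW)
```

The key design point is that the sequencing lives in one overseeing program, not in the individual instruments, which is what lets the whole run proceed "untouched by human hands."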


The 14 companies exhibiting in the Lab Automation Zone at the 2008 BIO International Convention cover the range of products in this category.

Acurian (Booth #4236)

Ahram Biosystems (Booth #4142)

Air Products and Chemicals (Booth #4238)

Andersen Products (Booth #4136)

AntiCancer Inc. (Booth #4037)

CapitalBio Corporation (Booth #4138)

Cobra Biomanufacturing (Booth #4135)

Competitive Technologies (Booth #4035)

Greiner Bio-One (Booth #4039)

e Pharmaceuticals (Booth #4139)

Kelly Scientific Resources (Booth #4143)

Micromeritics Instrument Corporation (Booth #4137)

Tecnisco (Booth #4045)

thinXXS Microtechnology (Booth #4043)

Because the pharmaceutical industry is automation’s biggest customer, says Hamilton, its needs are largely driving the development of lab automation technology. In the race to bring new drugs to the marketplace, companies must screen enormous numbers of chemical compounds in the hope that a few will prove efficacious against disease.

It is essentially a job of sifting a few grains of wheat from a mountain of chaff. Pharmaceutical companies are constantly collecting potential drug-like compounds. A few will be candidates for breakthrough drugs. But which ones? The bottleneck has always been in the testing. Companies needed a way to test multiple compounds for multiple characteristics and get results fast. High-throughput screening, a key development of the past decade, is answering that need.

High-throughput screening takes advantage of an important early development of lab automation, the specification of a set of standards for microtiter plates. A standard configuration for these microplates (as they’re nicknamed) allowed manufacturers of instruments and robots to use known dimensions for their designs.

Microplates were developed in the early days of lab automation to replace racks and racks of test tubes. Essentially, a microplate is a molded plastic container with a grid of little depressions, or wells, to hold the compound being tested. A microplate is roughly the size of a thin paperback book. Initially, microplates had 96 wells, and test materials were dispensed into the wells by hand. As laboratories became more automated and robots gained functionality, microplates evolved to contain smaller but more numerous wells. The newest microplates contain 1,536 or even 3,456 wells.
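The densification described above, from 96 wells up to 1,536 or 3,456, works because the plate footprint is standardized while the well-to-well spacing shrinks: each fourfold jump in well count halves the center-to-center pitch. The sketch below illustrates that geometry; the pitch values are the commonly cited ones for standard-footprint plates, included here as assumptions rather than as any particular vendor's specification.

```python
# Standard-footprint microplate formats: quadrupling the well count
# halves the center-to-center pitch while the plate size stays fixed.
PLATE_FORMATS = {
    96:   {"rows": 8,  "cols": 12, "pitch_mm": 9.0},
    384:  {"rows": 16, "cols": 24, "pitch_mm": 4.5},
    1536: {"rows": 32, "cols": 48, "pitch_mm": 2.25},
}

def well_center(wells, row, col):
    """Offset of a well center from well A1, in mm (row/col are 0-based).

    A robot or dispenser designed to these dimensions can address any
    well on any conforming plate with the same simple arithmetic.
    """
    pitch = PLATE_FORMATS[wells]["pitch_mm"]
    return (col * pitch, row * pitch)

# Sanity check: rows x cols matches the nominal well count.
for wells, spec in PLATE_FORMATS.items():
    assert spec["rows"] * spec["cols"] == wells
```

This predictability is exactly why the microplate standard mattered: instrument makers could hard-code the geometry instead of calibrating to each manufacturer's plates.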

Because the tiny wells hold less of everything, the smaller plates are more cost-effective. Largely for that reason, the technology of test plates is continuing to evolve toward ever-smaller test surfaces. “One motivation to miniaturize is to save money on compounds and reagents,” says Jim Sterling, an engineer at Keck Graduate Institute of Applied Life Sciences in Claremont, CA, who’s studying microfluidics. “Compounds are very expensive, and they’re the jewels of the company.” Not only can miniaturization save money, he says, but it can also accelerate some processes and simplify the type of automation required — possibly eliminating the need for external robots.

Microfluidics and microarrays are the latest evolution in the trend toward miniaturization. Microarrays are like microplates minus the wells: The test material is dotted onto the surface by a sophisticated dispensing system that operates much like an inkjet printer. A microarray of one square centimeter can hold 250,000 dotted samples. Microfluidics harks back to the early days of the AutoAnalyzer. Instead of flowing through silicone tubes, however, fluid reagents and compounds are dribbled through tiny channels etched into a plastic surface the size of a credit card.
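The microarray density quoted above implies remarkably fine spacing: 250,000 spots on one square centimeter works out to a center-to-center pitch of about 20 micrometers, assuming a uniform square grid. A quick back-of-the-envelope calculation:

```python
import math

def spot_pitch_um(spots, area_cm2=1.0):
    """Center-to-center pitch in micrometers for a square grid of
    `spots` evenly covering `area_cm2` square centimeters."""
    side_um = math.sqrt(area_cm2) * 10_000   # 1 cm = 10,000 um
    spots_per_side = math.sqrt(spots)
    return side_um / spots_per_side

# 250,000 spots per square centimeter: 500 spots per side,
# so 10,000 um / 500 = 20 um between spot centers.
pitch = spot_pitch_um(250_000)
```

At that scale, each spot is far smaller than a single microplate well, which is why microarray dispensing systems work more like inkjet printers than like pipetting robots.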


Association for Laboratory Automation (ALA)

ALA LabAutomation conference (January 2008); see especially Short Courses and New Product Launches.

Journal of the Association for Laboratory Automation

Keck Graduate Institute of Applied Life Sciences; www.kgi.edu

Microfluidics, among its other virtues, holds promise to answer one of the pharmaceutical industry’s main questions: how to reliably test drug efficacy and side effects in human cells or tissues without having to test them on humans. The tiny etched channels of a microfluidic system are a potential stand-in for the human vascular system, says Hamilton. Test compounds fed through such a system can test the reaction of, say, a culture of liver cells to various druglike substances. “Liver toxicity is a common reason many promising drugs don’t make it to market,” says Hamilton. Microfluidics could provide a reliable way to detect drug toxicity outside the human body.

Understandably, the pharmaceutical industry and others have embraced the new technology — some to the point of bedazzlement, says Hamilton. “Everybody got so enamored with the ability to test millions and millions of compounds, they forgot the science and let technology decide what they would do.” If unchecked, this tail-wagging-the-dog dynamic — common in technology-driven environments — can hinder effective research and development. “I tell my students, lab automation can enable your ability to do good science,” says Hamilton, “but it cannot create good science.”

Another challenge facing users of the technology is that it can be expensive. Not only are the systems costly, but they are complicated, and it often takes longer than anticipated to get them up and running. In an environment where new discoveries prompt rapid changes in screening processes, a company risks investing in a system that’s obsolete on the day it’s launched.

Cell biologist Tim Gryseels, a senior scientist at Pfizer in St. Louis, MO, was on the implementation team for a project to automate a cell-line screening system for the company. The team chose a robotic component from one vendor and other instruments from other vendors. Integrating all the functions proved a slow and frustrating task, says Gryseels. “When we had issues with the robot, we’d call the company, and they’d say, ‘Well, it’s probably the third-party vendor’s problem.’ So we’d go back to them, and they’d say, ‘Well no, it’s not our problem, go talk to the robotics company.’ When you have third-party instruments on the system, you might have a hard time figuring out who’s going to fix it.”

In addition, the robotics vendor was bought out by a larger company soon after the contract was signed, and most of the people who had designed Pfizer’s system left. As a result, says Gryseels, customer service was less than adequate during the installation period.

“We started designing the system in 2004. By the time it finally worked, it was early 2006, and our process had changed,” says Gryseels. “So when people invest in automation, they need to realize that they won’t have it up and running the next day.”

This particular job was exceptionally troublesome, Gryseels acknowledges — other automation projects he’s been involved with have gone much more smoothly. Despite the initial trouble and expense, Gryseels believes that, to stay competitive, companies such as Pfizer must invest in some form of automation to achieve higher throughput in screening. “You just need to put a lot of thought into it,” he says, “and make a careful decision.”
