In the Information Technology Zone


Successfully driving your global business requires vigorous, secure information exchange within your facility, from site to site, and with your partners and contract service providers. Tools to capture data in real time support decision making and enable companies to manage the volumes of historical data needed for regulatory submissions. Software is available for data mining, managing clinical trial networks, and assessing toxicology data, among many other things. Information technologies in the biotechnology industry facilitate development and delivery of new therapies and diagnostics, improvements in patient care, and other life-science and healthcare technologies, goods, and services (see the “A Sampling of IT Products” box).

Those who were working in the biotech industry in 1997 when 21 CFR Part 11 was introduced — detailing the FDA’s then-new requirements for incorporating electronic records and signatures — will remember how daunting (and how expensive) those processes appeared to be (1,2,3). Enforcement of Part 11 began in 1999, but given the complexities of interpretation, the guidance was somewhat narrowed in scope in 2003. That version, intended to better align with the agency’s evolving risk-based approaches, was geared only toward records required by predicate rules and records required to demonstrate a company’s compliance with predicate rules. So discussions continued in industry publications and at conferences seeking ways to upgrade legacy systems, ensure the security of electronic signatures, facilitate the continued readability of electronic submissions — and come to a consensus on just what a predicate rule is (4). One follow-up draft guidance details how the FDA determines the receipt date if a submitted file arrives corrupted and can’t be opened (5). The FDA is expected to release a revised version of Part 11, perhaps this year.

For many biotechnology IT departments, those early discussions of Part 11 marked their first introduction to the good manufacturing practices (GMPs) long underpinning the departments they supported. I recall one speaker at a conference several years ago who described how his IT department would occasionally announce pending upgrades to software systems — only belatedly involving the R&D users of those systems in discussions about accessibility of older files, data sharing and maintenance, and continuity of associated records. The industry continues to benefit from the evolution of technologies for information management. And IT departments have of necessity become better involved in decisions affecting a company’s entire enterprise rather than its individual functional environments. Software systems today allow multiple users to share data across departments and throughout technology transfer.

Data Management Challenges

In an article scheduled for an upcoming issue of BioProcess International, Pietro Forgione, biopharmaceutical business development manager of IDBS, outlines core challenges faced by biotechnology IT departments. The following list is excerpted from his manuscript in preparation (6).


Types of systems and services offered range from networked instrumentation, to customer relations and sales tracking, to data tracking and report generation. Here are the companies scheduled to appear in the IT Services Product Focus Zone. Other IT-related companies can be found at other locations in the hall, so please plan your meetings accordingly.

Amico Accessories (Booth #5013);

CDC Software/Ross Enterprise (Booth #5023);

Dell Healthcare and Life Sciences (Booth #4913);

EvaluatePharma (Booth #4923);

i3 Statprobe (Booth #5018);

Moritex Corporation (Booth #5120);

Technology Vision Group LLC (Booth #4919).

Different Data Formats: Data are generated by several different functional groups throughout a process development lifecycle. Typically, Microsoft Word and Excel documents hold scientific process data in folders on validated servers, and paper notebooks follow sign-off procedures and are manually archived.

Multiple Locations: Separate data sources typically contain data as varied as instrument output held on a file system on a network, analytical data captured and stored in silos, SCADA/Historian software information, and LIMS-based databases (SCADA describes supervisory control and data acquisition software for process control; LIMS describes laboratory information management systems). Because data sources are often spread across multiple locations, organizations cannot easily retrieve knowledge from their data and fail to exploit the full value of this important asset.

Report Generation: Generating reports is often frustrating and time-consuming because data are stored in multiple applications and disparate systems. When results are compiled from contract manufacturing organizations (CMOs) for client organizations, reports are manually retrieved from each data silo and then stored in another silo on a validated server. Researchers copy and paste from multiple paper, Word, or Excel files, and those error-prone transcription steps introduce a large element of risk into the reporting process.
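To make the copy-and-paste risk concrete, here is a minimal, hypothetical sketch of the alternative: pulling results from multiple data silos (represented here as CSV exports) and merging them into one report programmatically, so no value is ever retyped by hand. The file contents, batch IDs, and field names are illustrative only, not any particular organization’s data model.

```python
# Hypothetical sketch: consolidate results from two data "silos"
# (CSV exports) into one report without manual transcription.
import csv
import io

# Illustrative exports from two separate systems.
upstream = "batch,titer_g_per_L\nB001,1.2\nB002,1.5\n"
qc = "batch,purity_pct\nB001,98.7\nB002,99.1\n"

def load(text):
    """Parse a CSV export into a dict keyed by batch ID."""
    return {row["batch"]: row for row in csv.DictReader(io.StringIO(text))}

def merge_report(*sources):
    """Combine per-batch records from any number of silos."""
    merged = {}
    for src in sources:
        for batch, row in src.items():
            merged.setdefault(batch, {}).update(row)
    return merged

report = merge_report(load(upstream), load(qc))
print(report["B001"])  # one combined record per batch
```

Because every value flows straight from the source files into the combined record, a transcription error can only come from the source itself, which is exactly the property manual copy-and-paste lacks.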

Sharing and Communicating: Scientists often work on different parts of a process (fermentation, downstream processing, analytical development) and have to collaborate across different unit operations to pull it all together. Currently many organizations lack a reliable and accessible method of communicating and distributing collaborative information to departments involved in the process lifecycle. Sharing information is also difficult between groups. Organizations that partner with CMOs experience problems in sharing and transferring data, potentially across a number of global sites.

Tracking Changes: Tracking changes is almost impossible. For example, changes made to the value of a cell when transcribing data into a spreadsheet cannot be detected, which makes it a significant challenge to prove who edited and authenticated record changes. Regulatory compliance requires identifying each change and the person responsible, and that is an overarching consideration for organizations addressing data management issues.
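The requirement just described — knowing who changed what, and when — is usually met with an audit trail. The following is a minimal, hypothetical sketch of that idea (not any specific vendor’s schema): every edit to a record appends an immutable entry capturing user, timestamp, field, old value, and new value.

```python
# Hypothetical sketch of an audit trail: edits are never silent;
# each one appends a who/when/what entry to an append-only log.
from datetime import datetime, timezone

audit_log = []  # append-only; entries are never modified or deleted

def set_value(record, field, new_value, user):
    """Change a field on a record, logging the change for later review."""
    old_value = record.get(field)
    record[field] = new_value
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "field": field,
        "old": old_value,
        "new": new_value,
    })

batch = {}
set_value(batch, "yield_g", 41.0, user="jsmith")
set_value(batch, "yield_g", 44.5, user="mlee")  # the correction is traceable
print([(e["user"], e["old"], e["new"]) for e in audit_log])
```

With this structure, the question “who edited this value, and what did it say before?” is answered by reading the log rather than by reconstructing events from memory or paper.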

Quality By Design (QbD): The FDA’s current good manufacturing practice (CGMP) for the 21st century and process analytical technologies (PAT) initiatives are a significant current consideration for most biopharmaceutical and pharmaceutical organizations. CGMP encourages adoption of QbD, a best-practices approach that integrates quality and risk management into the development and clinical manufacturing process (7, 8).

When data are on paper and in scattered file sources, associated contextual information such as QC results and QA documentation may be lost or extremely time-consuming to compile. Right now it is still difficult for many companies to adopt a true continuous improvement approach to manufacturing and development that follows the new philosophy.

IT Products and Solutions

At the 2008 BIO International Convention, a number of companies in the IT Services Zone and at other locations in the exhibit hall will showcase products and services that, together, define the range of industry approaches to data and documentation management. Examples of just a few of the issues being addressed by IT companies follow here, illustrating the close working relationships between IT service providers and their clients. In one example, Keith O’Leary, director of product marketing for LabVantage Solutions, Inc., offers his thoughts about dealing with legacy systems in the box titled “Replacement LIMS: Configuring for the Future.”

Software as a Service (SaaS)

The acronym SaaS stands for software as a service. Web-based software applications have been offered for several years now, and more traditional desktop applications are beginning to make their way onto the Internet as well. Probably the best-known application is Salesforce, an on-demand customer relationship management (CRM) service. Some larger companies that have introduced this delivery model with their products include Microsoft and Oracle.

In my recent conversation with Dirk Beth, president of Mission3, Inc. (Phoenix, AZ), I learned about his company’s SaaS model and the ways in which it addresses needs for compliant information management.

Beth began by explaining how his company’s model differs from traditional approaches. “From an overall perspective, I think there is a trend in information technology, not only in life sciences, toward software as a service, which is really a delivery mechanism. Traditionally a company — we’ll talk life sciences now — would buy a product for its functionality, and it would pay a significant licensing fee. Then it would hire people, consultants usually, to come in and work through the implementation, gathering requirements, customizing, and sometimes configuring the system to work the way that company wanted it to. And then the consultants would go off, and every time something didn’t work right, the customer would have to pay to get things fixed or pay maintenance on a yearly basis. That process is still going on, of course, and people are still buying enterprise licenses.”

Web-based applications are usually configured by the purchasing company itself, incorporating its required features and functionality, and the application is then hosted by the solution provider. An advantage is that a company doesn’t pay a large up-front fee but instead pays a monthly fee, similar to paying an electricity bill or a lease on office space.

Validation Processes: Beth believes this makes a great deal of sense from a cash-flow perspective. “A company doesn’t have to hire IT to maintain the applications, and a company doesn’t have as heavy a burden in the validation of those applications because the data center environments have been validated, and the software’s been validated separately. A company performs its own validation of the application’s functionality and can move on more quickly.”

Beth explained that records of validation conducted internal to the data center are made available to his customers, including all documentation associated with that process and the validation scripts related to the software. Although a validation process is still necessary, it’s significantly less effort than it would be if a system were installed at a customer’s own site.

Lifecycle Management: I asked Beth how the new systems support a product throughout its lifecycle. “We deliver comprehensive document, project, and submission management publishing solutions. So from a project management standpoint, a biotech or pharmaceutical company can roll out a complete product from basically nonclinical through postmarketing. It can roll that into our product, manage all the documentation, mostly on the regulatory side, and then automatically roll that into either eCTD (electronic common technical document) or ad hoc type of submission management and submission publishing with a high level of reuse.” To facilitate sharing of data across departments and functions, the product therefore integrates and flows data from one system to another. The client is consulted when upgrades become available. “Everything in our system is built to be audited, every transaction is audited, and it’s Part 11-compliant by design.”

Protecting Data: This doesn’t mean, however, that life sciences companies are embracing this and other external data-storage models without reservations. With a SaaS model, a company no longer actually houses the software — and thus its own data — within its own four walls. Beth discussed this in the context that data management seldom falls within a biotechnology company’s set of core competencies.

“The concern is that data are not as protected. But the truth of the matter is that most biotech and pharmaceutical companies, especially the small to medium ones, aren’t IT organizations, and they don’t really focus on providing data security and their own data center. By outsourcing responsibility for software availability, backups, and disaster recovery, a company using this model is actually providing more security, more protection for its documents and data.”

As an example of the need for increased understanding of data security issues, he mentioned how employees will express reservations about the security of their R&D data, yet otherwise freely attach a sensitive document to an e-mail and send it off. But “in our environment you never have to do that. You’d go directly to our secure environment and share documents with other collaborators that are maybe not part of your company but maybe work for your CRO, working directly through our environment so you don’t have to attach something to an e-mail and let it bounce off 40 different servers before it gets there.”


For many very early adopters of laboratory information management systems (LIMS), much of the return on investment has run its course. The early LIMS were purchased and customized to address specific laboratory requirements and were effective production, automation, and research tools. However, as new laboratory technology and process changes have been introduced over the past 20-plus years, LIMS to support such changes have also evolved significantly.

This has resulted in outdated systems with custom coding that is not only expensive to maintain but also addresses only the specific needs of one laboratory in one isolated location. Worse yet, such systems may no longer even be applicable to a laboratory’s needs. Nevertheless, organizations are forced to keep old processes in place solely because their legacy systems allow nothing new.

This evolution has many laboratories reevaluating their current LIMS requirements while keenly aware that fast-paced technology advances require them to consider their long-term future needs. According to Jonathan Witonsky, manager and industry analyst with Frost & Sullivan, LIMS market growth will be fueled by large pharmaceutical companies replacing their aging legacy LIMS (1). Witonsky adds that laboratory requirements have changed dramatically and that a LIMS must provide an enterprise-wide solution capable of supporting multiple business units spread out over several geographic locations. He also comments that easy-to-integrate, off-the-shelf programs are the future-proof solutions offering expansion flexibility that will continue to provide value over longer periods of time.

Because many of the larger companies have a global footprint for both research and manufacturing, a LIMS needs to be evaluated as a strategic component of an organization’s overall operational and IT infrastructure. LIMS decision makers should not come from the laboratory alone but should include operations executives and IT professionals who can evaluate the strategic impact and value of the LIMS on a global scale. Ultimately, a LIMS should be evaluated based on its functionality, flexibility, and technology.

Functionality: Expenditure and investment in both capital and human resources are usually at the top of the list for evaluation. A LIMS must provide a measurable return on investment and demonstrate a lower cost of ownership. On the one hand, point-specific systems that require either multiple LIMS or extensive customization to meet enterprise-wide needs run counter to those objectives. On the other hand, a configurable off-the-shelf LIMS, such as LabVantage’s SAPPHIRE, can operate on the same thin-client enterprise platform and provide out-of-the-box solutions for research and development, biobanking, stability, quality management, and more. (A thin-client architecture requires only a Web browser with no plug-ins or applets; all significant processing functions are conducted directly on a server rather than using the server primarily for storage and backup.) This offers the advantage of managing only one functionally rich LIMS across an entire enterprise, increasing knowledge sharing, easing end-user adoption, and reducing cost of ownership.

Flexibility: The impact on existing policies and procedures, both within and beyond laboratories, plays a significant role in the evaluation. Establishing standard policies and procedures can often take more time and investment than implementing the technology itself. Therefore, the configurability and adaptability of a LIMS should be scrutinized. Flexibility is critical for process adaptation, expansion, and interaction across and beyond the laboratory. Accordingly, decision makers should evaluate the extent of a LIMS’s configuration capabilities. Are they limited to layouts and user roles, or can you configure the specific fields, labels, rules, and workflows in the LIMS? Is that configuration achieved through hard-to-maintain custom code or through easy-to-use configuration tools? Will such flexibility and tools reduce validation cost and effort and eliminate the reliance on high-cost programming resources? Flexibility is also demonstrated in solutions with open platform architectures that allow for ease of integration and interfacing, especially if they offer the latest in certified interfaces to critical enterprise resource planning systems.

Technology: At the end of the day, the purchase of a LIMS is a technology purchase, and therefore all the typical technology concerns have to be considered. At the top of the list are a few key questions: How current is the underlying technology? Does it fit into an organization’s overall IT infrastructure? How easy is it to deploy, and how difficult is it to maintain? LIMS purchasers should seek out vendors that can best address these questions with a solution that uses current thin-client, browser-based technologies. A LIMS with a zero-footprint architecture can provide secure enterprise-wide access with no plug-ins, downloads, or applets on the client, making it easier to access, deploy, and maintain — ultimately enhancing ease of use and lowering cost of ownership. In addition, a LIMS with true multinationalization support (M18N) can provide multisite, multilanguage capabilities, meeting global requirements and enhancing the user’s experience.

As companies continue to evaluate the pros and cons of replacing a legacy LIMS, they should keep in mind that although replacing a LIMS comes with a cost, not replacing it carries not only the actual cost of maintaining an outdated system but also an opportunity cost that could far exceed the new investment. Although business requirements and objectives differ from company to company, a configurable off-the-shelf LIMS that leverages the latest in thin-client computing is designed to meet those varied demands and deliver value today and into the future.


1.) Witonsky J. Biomarket Trends: LIMS Growth Driven by Replacement Purchases. Gen. Eng. News 27(21) 2007.

Keith M. O’Leary is director of product marketing, LabVantage Solutions, Inc., 1160 US Highway 22 East, Second Floor, Bridgewater, NJ 08807 USA; 1-781-587-1827, mobile 1-908-444-1012, fax 1-781-623-0410.

Another Outsourcing Model: He agreed, also, that outsourced IT is becoming a more accepted piece of the industry’s outsourcing portfolio. “Like many other things in life sciences where you are outsourcing clinical studies or trials, outsourcing some of your IT actually makes a lot of sense. It’s not a core competency of biotech companies.”

Smaller companies and start-ups that are “virtualized” can especially benefit from outsourcing data management. Many customers are actually very small organizations with 10 to 20 employees that nonetheless manage and coordinate a much larger network of people and data. “It doesn’t make sense for software vendors to provide software that doesn’t facilitate that kind of collaboration,” Beth adds.

Managing Access and Security with Multiple Users and Over Time: I asked how the traditionally rapid turnover in the still-entrepreneurial biotech industry impedes continuity of data security. He told me that his company’s software is licensed per project, not by the number of users. As long as someone is working on the project, he or she can have access to the software. “It goes toward our understanding,” he added, “that life sciences employees do not work in a vacuum.” This approach is also more realistically designed to accommodate the long-term nature of drug development projects and the fact that many different people and groups come and go throughout that process.

And as those groups come and go, so too do program versions. How does a company most seamlessly handle migration or inoperability if its IT partner needs to upgrade its systems?

Beth mentioned that one factor easing such transitions is that the Adobe portable document format (PDF) is now the global de facto standard for submissions. “The eCTD [electronic common technical document] kind of laces those together with the XML [extensible markup language], but the archival standard for PDF is that it has to be accessible for the next 50 years.” He affirmed that a great deal of effort is going into making sure that data will be accessible by regulators even 50 years from now.

He commented that the Internet also provides a platform that allows people not to think about information storage media. “Data storage is getting cheaper and cheaper and more resilient than ever before. If you put something out there, it really never goes away. That can be bad for a lot of people, but from a data archiving standpoint, that’s pretty good. It’s just a matter of who you give access to that information.”

Training and Technology Comfort Levels: Beth’s company initially conducts an on-site training session with its customers. Although that might seem counter to using a Web-based program, he says that “anytime you introduce change, it’s nice to be there helping people through that, and so we mostly do go on-site. But training is not really a main line of business for us, so we leave behind all our training materials so that customers can train the trainer or distribute those materials. We provide training and materials electronically as well, so customers become more independent and can train future users of their system.”

In answer to my question about the comfort levels he encounters with electronic systems, Beth acknowledges that for the foreseeable future there will still be different levels of comfort in companies implementing such systems, so it is important to provide help in every way possible. “Some customers may have years of experience of taking a pharmaceutical product from discovery all the way through market and postpatent life. Some have been in the business for quite a long time with, therefore, a significant experience level. But they may not have the acumen with the software that is seen among younger start-up companies.” In contrast, many computer-savvy but less experienced researchers may not know what functionalities are specifically necessary to ask for to bridge future drug-development phases.

A newer trend that Beth confirmed is that he doesn’t run into many situations in which IT department personnel lack some level of understanding of “GXPs.” In fact, he told me about a consortium (LSIT) that is developing “GIPs,” or good informatics practices. Other industry groups such as the BioIT Alliance are also working to further collaboration and development of data management criteria.

Beth pointed out that every company interprets guidances somewhat differently. The introduction of and discussions about Part 11 several years ago helped bring more people to an understanding of the regulations. “We’re keenly aware of the fact that every time we go in and install our system there is the need to comply with Part 11. IT folks and CFOs understand that the question needs to be asked about how you deal with the X guidance or X standard. And as a submissions technology company, we have to deal with quite a few other standards and guidances from the ICH, FDA, EMEA, the Ministry of Health in Japan, and Health Canada — dealing with their interpretations of ICH guidances plus their own regional requirements.”

We returned to the issue of career mobility in the biotechnology industry (e.g., the amount of turnover in general), and I suggested that IT service providers are probably always encountering people who are brand new to the whole regulatory framework.

Beth said that software systems can address much of that type of transition. “You know, you build processes. You interpret those processes into software. You have a software product where data flow from one system or one group of functionality to another. So you have less chance of getting dinged, if you will. This makes it easier for the management of certain departments to change without anything being interrupted in the process.”


1.) US FDA 1997. Guidance for Industry: Electronic Records; Electronic Signatures. Code of Federal Regulations Title 21, Part 11.

2.) US FDA 2003. Guidance for Industry: Part 11, Electronic Records; Electronic Signatures — Scope and Application. Code of Federal Regulations Title 21, Part 11.

3.) Huber L. 2007. Update on FDA’s 21 CFR Part 11: Guidance on Scope and Applications. Exclusive Monthly Compliance News.

4.) Frequently Asked Questions: 21 CFR Part 11. Site hosted and supported by Waters Laboratory Informatics.

6.) Forgione P. Data Management in the Supply Chain. BioProcess Int. (in press); excerpted here.

7.) US FDA 2006. Guidance for Industry: Quality Systems Approach to Pharmaceutical CGMP Regulations. Department of Health and Human Services, US Food and Drug Administration.