Better, Faster, Cheaper, Compliant

Modern Drug Development Demands a Streamlined, Informatics-Driven Approach to Product Quality and Regulatory Compliance   

For the businesses involved in the design, development and manufacture of pharmaceutical products, addressing compliance and quality is more challenging than ever. In an increasingly complex regulatory environment, companies are under pressure to bring profitable new therapies to market quickly, while also slashing development costs and managing a growing global network of external partners. These priorities often compete with one another: How, for example, can compliance be managed efficiently and cost-effectively when complex laboratory methods and batch processing steps need to be checked and validated multiple times? How does an organization keep a tight rein on cGMPs (Current Good Manufacturing Practices) when a number of research, development and manufacturing activities have been outsourced to CROs and CMOs based in different locations and jurisdictions around the world? When there’s only so much budget to go around, is it even possible to slash cycle times, improve innovation, and still ensure that no one with a stake in product quality drops the ball?

Bringing Quality by Design (QbD) to Drug Development

Compliance issues—prompted by the all-too-common process deviations that happen in quality control laboratories every day—can lead to serious business headaches for participants across the drug development value chain. FDA Form 483 citations, production shutdowns, or delays in the regulatory submission process can set a company back tens or even hundreds of millions of dollars.

With so much at stake, consistent and more controlled information management and process execution during the QA/QC stages of drug development is a must, as businesses simply cannot afford to let data transcription errors and other small mistakes snowball into big problems that wreak havoc downstream. This is the driving principle behind the FDA’s embrace of Quality by Design (QbD): the concept posits that quality can’t be inspected into products. Instead, it needs to be built into the innovation lifecycle from day one, beginning with early research and continuing through quality testing, quality control, production scale-up, manufacturing and beyond. The key is to enable a compliance management and cGMP strategy that’s both effective and efficient. The benefits of QbD can’t be fully realized if it costs too much (in terms of time and resources) to put into practice.

Getting Quality Right

Here’s a startling statistic: According to a 2009 report in Validation Times focused on pharmaceutical quality control, nearly 60 percent of the 483s issued that year were due to simple deviations in procedure execution. SOPs existed, but either weren’t followed or weren’t documented correctly. Why? A big part of the problem is that many laboratory operations still rely predominantly on paper-based systems and manually driven, disjointed processes to capture information and carry out the tasks essential to quality assurance and quality control. Because these systems are especially susceptible to human error (such as data entry and transcription mistakes or an overlooked testing step), they require constant checking, re-checking and validation that adds no value to actual product innovation but contributes significantly to development time and costs.

Other industries, such as automotive, semiconductor manufacturing and electronics, have integrated QbD into their operations by deploying informatics solutions that help them better capture and share data, codify workflows and create standardized, reproducible best practices. The pharmaceutical industry would do well to follow this example, and fortunately there are a number of technologies now available that can help companies accelerate the development of high-quality products while reducing costs and risk. Here are four key elements essential to creating a more streamlined, effective and efficient cGMP operation:

  1. Automated, Electronic Data Capture. The saying “garbage in, garbage out” is especially relevant when it comes to QbD. If a transcription error results in a test result being incorrectly reported in a LIMS, it matters little if every quality assurance step thereafter is executed correctly. Backtracking and rework will be required once the mistake is discovered, and the further down the product development chain this happens, the higher the remediation cost. This is why eliminating manual, paper-based data entry and data transfer processes from cGMP operations is so important. Digital laboratory solutions have come a long way since early electronic laboratory notebooks (ELNs) were first introduced as a means of capturing experimental data. Today there are solutions capable of automatically capturing information directly from the hundreds of applications, systems, instruments and devices used in the QA/QC lab or on the processing floor. When data related to method preparation (reagent information, weighing operations, etc.), from analytical instruments (chromatography and spectroscopy) and from analyst or operator observations can be electronically captured and moved through the various product development stages, companies can greatly reduce and even eliminate data entry errors, minimize time-consuming data handling tasks and rework, and ultimately improve quality and compliance. (A minimal sketch of this kind of automated capture appears after this list.)
  2. Real-Time Process Control. Catching compliance deviations as they happen, rather than after the fact, requires an informatics solution that’s capable of guiding and managing procedure execution in real time. The transition from paper to digital data capture is a critical piece of such a system, but it also needs to be able to automate and control quality processes “under glass.” Imagine a set of electronic eyeballs that tracks laboratory execution and alerts operators when something is outside of the expected norm (such as a non-compliant sample weight). With this kind of tool in place, deviations can be identified and fixed immediately. In addition, validation and review redundancies can be greatly reduced, or eliminated altogether. In a traditional paper-based environment, a huge amount of checking and double-checking is needed to ensure that everything has been done correctly. But when procedure execution is digitized, companies can capture a complete audit trail of every action taken in the lab (or, in the case of batch execution, on the plant floor). In this way, procedure validation and review is automated through a process known as “review by exception”: deviations are flagged by the system, so reviewers only need to focus their attention on procedures that have been flagged, instead of having to examine every single piece of data and step in the process. (A simple illustration of this flagging logic also appears after the list.)
  3. Purpose-Built for cGMP. There’s a key difference between the activities typical of early research (lead discovery and compound design, for example) and those that are essential to ensuring product quality and compliance. Early research is far less structured, as scientists need the flexibility to experiment with new ideas and hypotheses. In a QA/QC or batch processing environment, however, loosely structured procedures are a liability. Ensuring process control, and hence quality, demands that operators execute procedures consistently, every single time. Thus, any solution deployed in these areas must be purpose-built to guide cGMP operations specifically, including safety and stability testing, quality assurance, late-stage quality control and commercial production. Without a system uniquely created for this purpose, it will take a good deal of customized coding and point-to-point integration (a costly, multi-year IT endeavor) to integrate the data critical to QbD with the procedure execution workflows that move quality, testing, scale-up and early production forward.
  4. Designed for Integration and Collaboration. Finally, while cGMP procedures do require electronic capabilities geared to structured lab execution and batch processing environments, any solution deployed cannot operate in a vacuum. An organization’s quality operation needs to be able to easily connect with the information systems and stakeholders responsible for product innovation and manufacturing up and down the value chain, so that different experts and disciplines can learn from one another. This is where a web services-based enterprise informatics platform can play an important role. By allowing users to electronically integrate diverse information silos and processes, this type of system empowers development contributors to combine and share knowledge that improves understanding of the variables affecting product efficacy and quality. It also allows QA/QC to feed observations back into the ideation and design stages of development. Beyond an organization’s “four walls,” cloud-based technologies are another useful way to bring growing networks of external partners into the mix, by providing a simple way for third parties to contribute to QbD efforts.
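
To make the automated capture described in point 1 concrete, here is a minimal sketch in Python of moving weighing results from an instrument’s file export into a system of record with no retyping. The file layout, column names and the `record_in_lims` stub are illustrative assumptions, not any vendor’s actual interface; commercial solutions do this through validated instrument drivers rather than ad hoc scripts.

```python
import csv
from dataclasses import dataclass


@dataclass
class WeighingResult:
    sample_id: str
    weight_mg: float
    balance_id: str


def parse_balance_export(path: str) -> list[WeighingResult]:
    """Read weighing results from a balance's CSV export.

    The column names used here are invented for illustration;
    real instrument exports vary by vendor and model.
    """
    with open(path, newline="") as fh:
        return [
            WeighingResult(
                sample_id=row["sample_id"],
                weight_mg=float(row["weight_mg"]),
                balance_id=row["balance_id"],
            )
            for row in csv.DictReader(fh)
        ]


def record_in_lims(result: WeighingResult) -> None:
    # Stand-in for a real LIMS interface: the point is that the value
    # travels from instrument to system of record without being retyped.
    print(f"LIMS <- {result.sample_id}: {result.weight_mg} mg "
          f"(balance {result.balance_id})")


for result in parse_balance_export("balance_export.csv"):
    record_in_lims(result)
```

Because the value never passes through a human hand between the balance and the record, there is simply no step at which a transcription error can be introduced.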
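The “review by exception” approach in point 2 reduces, at its core, to comparing each captured value against its specification limits and surfacing only the outliers. The sketch below uses invented step names and limits; a production lab execution system would also record the full audit trail (who, what, when) around every step.

```python
from dataclasses import dataclass


@dataclass
class StepRecord:
    step: str      # SOP step identifier
    value: float   # value captured during execution
    low: float     # lower specification limit
    high: float    # upper specification limit


def review_by_exception(records: list[StepRecord]) -> list[StepRecord]:
    """Return only the steps whose captured value is out of spec;
    everything in spec needs no reviewer attention."""
    return [r for r in records if not (r.low <= r.value <= r.high)]


batch = [
    StepRecord("weigh reference standard", 250.4, 249.0, 251.0),  # in spec
    StepRecord("weigh sample aliquot", 252.7, 249.0, 251.0),      # deviation
]

for flagged in review_by_exception(batch):
    print(f"FLAG {flagged.step}: {flagged.value} outside "
          f"[{flagged.low}, {flagged.high}]")
```

In a real deployment the same comparison would also fire at the moment of capture, alerting the operator “under glass” rather than waiting for the review stage.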

Improve Productivity, Slash Cycle Times   

The experiences of real-world global pharmaceutical, biotech, and contract research and manufacturing organizations speak for themselves: by deploying integrated software products specifically designed to facilitate data capture, automate procedure execution and enhance collaboration across the product development pipeline, users have reported productivity improvements of up to 25 percent and 50-75 percent reductions in quality- and compliance-related cycle times.

Here’s how one large pharmaceutical company has benefited from such a solution: The company deployed a process management and compliance software suite that includes an ELN for capturing critical data, a lab execution system to guide cGMP operations and an enterprise informatics platform to integrate data and ensure knowledge transfer at crucial research, development, pilot and manufacturing handoffs. The software takes existing written protocols and presents them electronically, alongside data captured directly from lab instruments (no manual entry required). Analysts and operators interact with the SOPs “under glass,” using PCs or tablets, and are alerted when any step is out of compliance. At the end of the process, all data is aggregated in a review dashboard for audit and validation purposes. Compared to paper-based or hybrid manual/electronic approaches, this is a huge leap ahead, and a testament to the value of leveraging informatics to drive QbD, improve compliance and deliver better, more innovative and profitable drug products to market, fast.