Transforming Upstream Bioprocessing in the Digital Era

Data Analytics
Nov 16, 2021

Today in biopharma, only one out of every 10,000 new drug candidates reaches the market, and the probability of clinical success from Phase 1 to approval is less than 10%. As a consequence, the cost of drug development is constantly increasing, now exceeding 2 billion euros per approved drug, and it takes 10 years on average from discovery to approval. This paints a clear picture that the biopharma industry is ready for better options.


Digital bioprocessing shifts the focus of upstream bioprocessing from the process state to the biological state, using data analytics to create in-silico models for process monitoring, optimization, and control.

To be fair, the idea of biopharma 4.0 has been “in discussion” for nearly a decade now. How far have we come? Are we there yet? Well, yes and no. One thing is certain: the era of digital transformation has reached biopharma, and the companies that succeed in the next phase will be those that adopt an analytics-based approach to accelerate innovative solutions in product development and production. Here, tools that help optimize upstream bioprocesses will be essential.

In upstream bioprocessing – the part that involves cultivating a living organism to produce a large molecule – manufacturers strive to produce a quality product with good efficacy, which means meeting a list of Critical Quality Attributes (CQAs). CQAs are defined as physical, chemical, biological, or microbiological properties or characteristics that should be within an appropriate limit, range, or distribution to ensure the desired product quality (definition from the ICH Q8(R2) annex).


Unfortunately, these CQAs cannot be easily monitored online. Therefore, manufacturers typically track quality through the process variables that impact CQAs, known as Critical Process Parameters (CPPs).


This means that in order to ensure quality, manufacturers rely on keeping their process consistent, rather than having to test every batch of every product.

What does that look like in practice? Consider this example, where it’s necessary to ensure a specific charge variant for a protein in order to meet the specified quality target of the drug being produced. We know that the pH influences the evolution of the charge variant of the product, and we can plot this influence.

Figure: The influence of pH on charge variants, used to define the acceptable range and operating range for the quality target.

From this, we extrapolate the acceptable range of pH variation that ensures the quality target is reached. However, it’s not a one-to-one relation: pH also influences other CQAs, and there may be additional experimental constraints, such as keeping pH variation small. So typically, manufacturers implement an operating range that is narrower than the acceptable range but still ensures the quality target can be reached.

In conclusion, we constrain the operating range (the CPPs) to ensure the CQAs are met. This is a well-accepted and FDA-promoted method of ensuring quality. But is there an even better way? While multiple researchers have highlighted the potential benefit of shifting the focus from CPPs to CQAs, process performance is still mainly evaluated in the process space using risk analysis, design of experiments (DOE), and multivariate data analysis (MVDA). One way to enable such a shift is to use data analytics and associated modeling tools.
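To make this concrete, here is a minimal Python sketch. The fitted curve relating pH to the acidic charge-variant fraction is hypothetical (the coefficients and spec limit are invented for illustration), but the logic mirrors the workflow described above: derive the acceptable range from the quality target, then trim a safety margin to obtain the operating range.

```python
import numpy as np

# Hypothetical fitted relationship between culture pH and the acidic
# charge-variant fraction of the product (illustrative coefficients only).
def acidic_variant_fraction(ph):
    return 0.18 + 0.12 * (ph - 7.0) ** 2  # minimum near pH 7.0

SPEC_LIMIT = 0.25     # quality target: variant fraction must stay below this
SAFETY_MARGIN = 0.1   # pH units trimmed from each side of the acceptable range

ph_grid = np.linspace(6.0, 8.0, 2001)
within_spec = acidic_variant_fraction(ph_grid) <= SPEC_LIMIT

acceptable = ph_grid[within_spec]
acc_low, acc_high = acceptable.min(), acceptable.max()

# The operating range is deliberately narrower than the acceptable range,
# leaving headroom for measurement error and other pH-sensitive CQAs.
op_low, op_high = acc_low + SAFETY_MARGIN, acc_high - SAFETY_MARGIN

print(f"Acceptable pH range: {acc_low:.2f} to {acc_high:.2f}")
print(f"Operating pH range:  {op_low:.2f} to {op_high:.2f}")
```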

Problems with Today’s Biopharma R&D Strategy


The biopharma sector is facing an economic challenge: R&D costs are growing faster than the revenues those investments return over the same period. This is not due to a lack of innovation (technology) or market need (an aging population needs drug therapies more than ever), but rather, perhaps, to the way drugs are discovered and developed. A couple of theories have been proposed around this:

  • Kelvin Stott, an experienced consultant and executive, introduced the “law of diminishing returns,” proposing that biopharma companies have already picked all the “low-hanging fruit” among successful drug ideas, so more effort and money are now needed for any marginal progress.
  • Standish Fleming, founding managing member of Forward Ventures and a veteran of venture capital investing in therapeutics discovery, pointed to the inherent unpredictability of drug discovery and an outdated business model as contributing factors.

In this context, we observe that biopharmaceutical companies tend to outsource their early-stage activities to reduce costs and to stay agile and flexible in the face of potential market disruption. Their main focus then becomes go-to-market activities, which means they have a growing need to accelerate their operational tasks.

How Can We Accelerate Biopharma Operations?

The path to accelerating biopharma operations rests on four pillars:

  • Digitalization – Acceleration relies on the computerization and digitalization of the information used and generated at each step of product development (aka digital transformation).
  • Data management – Once all the information is digitized, it needs to be accessible, organized, and contextualized.
  • Analytics – This structured digital information can then feed data analytics and associated modeling tools, generating valuable insights for further process optimization and control (see the sketch after this list).
  • Automation and control – Development and implementation of advanced automation and control strategies that leverage the knowledge generated by analytics.
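As a toy illustration of the analytics pillar, the following Python sketch assumes a handful of contextualized batch records (all numbers are invented) and fits a plain linear regression, standing in here for full MVDA/DOE tooling, to predict a quality outcome from candidate setpoints:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical contextualized batch records: each row pairs process
# conditions (CPPs) with a measured outcome, once the data has been
# digitized and organized.
batches = pd.DataFrame({
    "ph":          [6.8, 7.0, 7.1, 6.9, 7.2, 7.0],
    "temperature": [36.5, 37.0, 36.8, 37.2, 36.9, 37.1],  # degrees C
    "do_percent":  [40, 45, 50, 42, 48, 46],              # dissolved oxygen
    "titer_g_l":   [2.1, 2.6, 2.4, 2.3, 2.2, 2.5],        # measured outcome
})

# A simple data-driven model turns the structured records into an
# insight usable for optimization and control.
model = LinearRegression().fit(
    batches[["ph", "temperature", "do_percent"]], batches["titer_g_l"]
)
candidate = pd.DataFrame({"ph": [7.0], "temperature": [37.0], "do_percent": [45]})
print(f"Predicted titer at candidate setpoints: {model.predict(candidate)[0]:.2f} g/L")
```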

Process Optimization in the Biopharma Industry

The design, optimization, and transfer of technology (abbreviated as DOT) of processes in the biopharma industry is a long and tedious endeavor that relies mostly on trial and error. Companies generate a lot of data along the way, yet often do not know what to do with it or how to use it efficiently.

Furthermore, biological systems are complex, not fully understood, and exhibit nonlinear behavior, which makes data analysis difficult. This is one of the reasons that standardized workflows applicable to the entire drug development process are not yet available.

But there is an opportunity to improve this with in-silico modeling and simulation using real-time analytics and process optimization. The potential benefits of this sort of digital twin approach are:

  • Rapid prototyping and process development
  • Improved process performance and product quality
  • De-risked transfer to manufacturing

How Do You Implement In-Silico Modeling?

Creating a digital twin that enables in-silico modeling and simulation involves going beyond simply monitoring what happens in a bioreactor and implementing consistent control. The data collected during the process is used to build a digital model, so you can not only understand but also predict what is happening in the process.

Here is what a digital bioprocessing platform looks like in diagram form:

Figure: In-silico bioprocessing shifts the focus of upstream bioprocessing from the process state to the biological state, using data analytics to create in-silico models for process monitoring, optimization, and control.

 
Macro Versus Micro View

The usual approach implemented in such a digital bioprocessing platform relies on describing what happens within the bioreactor from a “macroscopic” point of view. This type of description is typically used for optimization and control during process development. However, if one wants to use modeling tools to pinpoint potential growth boosters (i.e., media design) and gain more detail on the metabolism, a more detailed, microscopic-level description of what happens within the cell is needed.
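As a minimal sketch of this macroscopic view, the following Python example simulates a batch culture with an unstructured Monod model; the kinetic parameters are illustrative, not fitted to any real process:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Macroscopic (unstructured) batch-culture model: biomass X grows on
# substrate S following Monod kinetics. Illustrative parameters only.
MU_MAX, KS, YIELD_XS = 0.08, 0.5, 0.6    # 1/h, g/L, gX per gS

def batch_rhs(t, y):
    X, S = y
    mu = MU_MAX * S / (KS + S)           # specific growth rate (Monod)
    return [mu * X, -mu * X / YIELD_XS]  # dX/dt, dS/dt

# Simulate a 72 h batch starting from 0.1 g/L biomass and 20 g/L substrate.
sol = solve_ivp(batch_rhs, (0, 72), [0.1, 20.0])
X_end, S_end = sol.y[:, -1]
print(f"Final biomass: {X_end:.2f} g/L, residual substrate: {S_end:.2f} g/L")
```

A model of this kind can be calibrated against online measurements and then used to predict and optimize the process state, which is the macroscopic use case described above.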

This difference in the level of abstraction comes down to two big modeling strategies: process engineering and systems biology.


With systems biology, we are talking specifically about the metabolic network. Metabolic networks enable the coherent organization of large datasets into biological networks and provide non-intuitive insights into biological systems that in vivo experiments alone cannot provide. These networks take into account all of the reactions that occur in a specific organism, linking the genotype (information encoded at the level of the DNA) with the phenotype (observations occurring at the bioreactor level). They can also be used to overlay diverse -omic information.
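A common computational workhorse for such networks is flux balance analysis (FBA), which solves a linear program over the reaction stoichiometry assuming metabolic steady state. Here is a toy three-reaction example in Python; the network and flux bounds are invented purely for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis: at steady state the stoichiometric matrix S
# satisfies S @ v = 0, and we maximize flux through a biomass drain.
# Metabolites (rows): A, B.  Reactions (columns):
#   v1: substrate uptake -> A,   v2: A -> B,   v3: B -> biomass
S = np.array([
    [1, -1,  0],   # mass balance on A
    [0,  1, -1],   # mass balance on B
])
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 mmol/gDW/h
c = [0, 0, -1]                            # linprog minimizes, so negate biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("Optimal fluxes v1..v3:", res.x)    # expected: [10, 10, 10]
```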

While systems biology tools have proven to be of paramount importance in drug discovery and cell engineering, their use for process DOT is still at a very early stage. We at Sartorius believe that the next generation of bioprocessing will be based on the hybrid modeling concept, bridging the gap between process engineering and systems biology to provide an integrated picture of the bioprocess components: the bioreactor and the living organism.
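To give a flavor of what “hybrid” means here, the sketch below keeps the mechanistic mass balances from the Monod example but swaps the fixed kinetic law for a growth-rate model learned from observations; the training data is invented and the regressor choice is arbitrary:

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.ensemble import RandomForestRegressor

# Hybrid model: mechanistic mass balances, data-driven specific growth
# rate mu(S). The (substrate, growth-rate) observations are made up.
S_obs = np.array([[0.1], [0.5], [1.0], [2.0], [5.0], [10.0], [20.0]])  # g/L
mu_obs = np.array([0.013, 0.04, 0.053, 0.064, 0.073, 0.076, 0.078])    # 1/h
mu_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(S_obs, mu_obs)

YIELD_XS = 0.6  # gX per gS, kept as a mechanistic assumption

def hybrid_rhs(t, y):
    X, S = y
    if S <= 0:                          # substrate exhausted: growth stops
        return [0.0, 0.0]
    mu = mu_model.predict([[S]])[0]     # learned rate term inside the ODE
    return [mu * X, -mu * X / YIELD_XS]

sol = solve_ivp(hybrid_rhs, (0, 72), [0.1, 20.0])
print(f"Hybrid-model final biomass: {sol.y[0, -1]:.2f} g/L")
```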

Moving Toward Biopharma 4.0

The term 4.0 was introduced in 2011 at a German conference talk about the “smart factory” and the computerization of manufacturing. This transition was predicted to have a huge impact on the drug development timeline. Unfortunately, COVID aside, such predictions have so far failed to materialize. One reason lies in our heavy use of digital tools based only on numerical data. The biopharma sector will not succeed in the 4.0 digital transformation without merging knowledge with statistics. What comes next is focused not only on numeric information, the data, but also on the metadata – the features. This also highlights the need to shift the existing paradigm for defining regulated biomanufacturing processes from data-based to knowledge-based approaches.

The big picture, then – and the takeaway for biopharma companies – is to move toward an enhanced, optimized approach to upstream process development that makes use of existing information to bring transformation, optimization, and ultimately, profitability. The key dynamic behind all of it is the integration of advanced data analytics, process knowledge, and digital tools that transcend the traditional method of process monitoring and move toward digital twins powered by a systems approach to bio-simulation.


“The [AI] field's heavy use of statistical techniques to pick regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer.” – Noam Chomsky, 2012

This content was presented by Anne Richelle at the World Bioprocessing Pharma 4.0 Summit in October 2021 and developed with the help of Chris McCready and Johan Trygg.

Want to Know More? 

Download the presentation deck (from the World Bioprocessing Pharma 4.0 Summit) to learn more about the shifting paradigms in biopharma R&D strategy and how bioprocess modeling can add value.

Download Presentation

Subscribe to Get Updates From Sartorius Data Analytics
