When it comes to biotherapeutics manufacturing, downstream processing groups tend to get “dumped on.” Advances in cell lines, bioreactors, and culture media formulations have increased production output, providing both higher expression titers and greater volumes, but the filters and chromatography columns on the downstream side haven’t kept pace. These century-old technologies haven’t evolved as much and are reaching their limits. Regulatory agencies have contributed to innovation stagnation because they are cautious about manufacturing process changes for fear of undermining quality and consistency of final product. But inconsistency is a hallmark of many current processing methods. For example, chromatography resins can vary in their behavior from batch to batch and degrade over time.
As a drug candidate inches closer to licensure, the pressure mounts on manufacturing. “You’ve got people on one side tweaking the cell culture,” says Shawn Anderson, global product manager at Bio-Rad, “and they want as much product as possible to analyze [and track] how well that cell culture is performing. Then you have formulation, with stability issues and storage on the downstream side. Then if someone is doing [studies], they want reproducible product. The pressure ends up on the people doing the production.”
“The pressure has moved from upstream to downstream,” says Blanca Lain, a process development consultant currently working with Eleven Biotherapeutics, “and it was quickly observed that classical methods would not be able to take care of higher titers.”
Downstream processors hope to stave off some of that pressure by increasing efficiency. That should free up human and instrumental resources, and it can increase yields. Fewer and faster steps will prevent product from languishing in suboptimal conditions (in-process hold) or from being whittled away through one too many columns.
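The arithmetic behind that point is simple: overall recovery is roughly the product of the individual step yields, so every extra column or filter compounds the loss. The short sketch below uses hypothetical round numbers (90% recovery per step), not figures from any company quoted in this article.

```python
# Illustrative only: overall downstream yield is the product of step yields.
# The 90% per-step recovery used here is a hypothetical round number.
def overall_yield(step_yields):
    total = 1.0
    for y in step_yields:
        total *= y
    return total

five_steps = overall_yield([0.90] * 5)   # ~0.59 -> ~59% of product survives
three_steps = overall_yield([0.90] * 3)  # ~0.73 -> ~73% of product survives
print(f"5 steps: {five_steps:.0%}, 3 steps: {three_steps:.0%}")
```

Cutting a five-step train to three steps at that per-step recovery lifts overall yield from roughly 59% to 73%, which is the kind of gain downstream groups are chasing.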
Efficiency has wider implications as well. A smooth-running facility can provide resources and capacity for new products or pick up the slack for other facilities if they experience unexpected down time. “It’s also about having the capacity available to produce product in case of unanticipated increases in volumes or issues that you have elsewhere,” says Koenraad Swinnen, managing senior scientist and head of the purification group at Genzyme’s plant in Geel, Belgium. “It’s a little bit about being adaptable to changing conditions, but at the same time being cost efficient.”
Competition from biosimilars is another factor. Greater efficiency translates to lower manufacturing cost. “If you’re launching an innovative product,” says Abhinav Shukla, vice president of process development and manufacturing at KBI Biopharma, “you have to make your process as efficient as you possibly can because in 10 years you will be facing competition from biosimilars.”
Biosimilars appear poised to enter the US market, with the US Food and Drug Administration (FDA) having received applications this summer for Celltrion’s Remsima version of infliximab (Janssen’s Remicade) and Sandoz’s Zarzio version of filgrastim (Amgen’s Neupogen). “How do the name brands compete against it?” asks Justin Allison, principal developer for supplier Aroostook Pure. “Part of the solution will be reducing the manufacturing cost.” First and foremost, that means increasing yields on the upstream side, but purification costs and product losses are just as important.
Blueprint or Special Case?
Monoclonal antibodies (MAbs) are among the most straightforward biotherapeutics to process. They have a relatively long manufacturing history and a well-established purification routine that simply needs to be “tweaked” to suit a specific product’s qualities. This platform starts with clarification to remove cellular debris. A capture step follows — almost invariably protein A affinity chromatography, which is based on a cell-surface protein found in the bacterium Staphylococcus aureus that binds tightly to the Fc region of antibodies. Elution from that capture step is so effective that often all that remains is ion-exchange chromatography in flow-through mode and a final virus filtration.
Other biotherapeutics (e.g., nonantibody proteins and enzymes) require chromatography steps that can be much more complicated. Antibodies are an evolving class, too, with variants such as antibody fragments, bispecifics, and antibody–drug conjugates (ADCs) now in clinical development. All those complex molecules can have isoforms that need to be selectively removed. For example, a bifunctional antibody meant to cross-react with two envelope proteins may be produced along with antibodies that double up on one envelope protein or the other. Such product-related contaminants need to be removed, but their properties are often very similar to those of the bispecific product itself.
Generally, companies meet such challenges with “brute force, experimentally evaluating all their traditional resins,” says Michael Phillips, director of bioprocess development at EMD Millipore. But he adds that “they’re looking for new chemistries [because] the current ones often do not provide the necessary selectivity to meet both purification and yield targets.”
His company is developing novel multimodal chromatography resins that offer salt tolerance or present multiple modes of interaction within a given resin — but won’t yet divulge any specifics. Multimodal and salt-tolerant properties combined in a single resin could achieve separations that wouldn’t be possible separately and sequentially. “They need these multimodal resins,” says Phillips, “especially with these new types of molecules with which you’re trying to separate species that are very similar to each other. There might be only a very slight difference in hydrophobicity or charge, and you need to be able to exploit that.”
Another approach is to extend the concept of affinity capture beyond MAbs and protein A. “The advantage of using something like an affinity-capture step is that it makes subsequent purification steps simpler,” says Steve Burton, CEO of ProMetic Life Sciences. “They tend to be flow-through, and usually you just require one or two more steps.” His company has developed synthetic ligands that can capture other biotherapeutics including plasma proteins. Those ligands are comparatively simple to make, and they can stand up to harsh cleaning agents that would damage complex ligand molecules such as protein A. “Using a combination of computational chemistry and high-throughput screening,” Burton explains, “we can develop an affinity capture step for virtually any protein, and [the hardware] can be used for many cycles.”
Affinity chromatography has long been hailed for its exquisite ability to selectively purify proteins, and that’s still important. But now yield is king. “The real advantage of using affinity capture technology is the increase in yield that is possible,” Burton says. That’s because affinity reagents can bind and then release nearly all of the protein of interest, whereas other separation techniques divide protein mixtures into fractions that contain only a proportion of the protein of interest as well as other impurities that must be removed by subsequent process steps. ProMetic is using a sequence of affinity capture steps to extract therapeutic proteins from human plasma with what Burton reports as much higher yields than can be achieved with the traditional Cohn fractionation approach using cold ethanol precipitation.
Alternatives to protein A are increasingly attractive even for purifying MAbs. Protein A is expensive, and because only a few suppliers can make this complex ligand, there’s also the potential for shortage. WR Grace and Company’s ProVance prepacked protein A columns are disposable. The company has reduced cost by using a recombinant protein A and working with a proprietary platform. The result is a ~60% savings in cost compared with standard protein A columns, according to Kiran Chodavarapu, business development manager for bioprocessing at Grace.
Evolving Chromatography
“We’ve made small, incremental increases in efficiencies,” says Allison of Aroostook Pure, “and that hasn’t been enough. We have to make a quantum leap forward to fix the problem. Whether that can be accomplished or not . . . Well, it’s going to have to be. There’s no choice. Traditional chromatography is reaching its limits.”
Shukla of KBI explains that there are two general ways to increase chromatography efficiency. One is to reduce the number of overall steps in the process, thus eliminating opportunities for inevitable product loss and occasional failures. Alternatively, companies can move away from chromatography altogether.
In some instances, KBI Biopharma is taking the latter approach by exploring sequential precipitation during the early stages of bioprocessing: Varying conditions such as pH, salt concentration, polyelectrolytes, and polyethylene glycol encourage impurities to drop out of solution. Shukla likens combining multiple precipitation mechanisms to increasing the number of theoretical plates in a chromatography column. “Can we combine or develop unique chemistries,” he asks, “or combine them in a single solution that can drive precipitation by multiple mechanisms? That’s under development.”
Realistically, however, the flexibility and power of chromatography will keep it relevant. But the technology itself is going through some changes. Key drivers include new resins and advances in automation. There has been movement toward higher-capacity resins. Some are more rigid than past versions, which allows them to withstand higher flow rates while maintaining separation performance.
Automation is rapidly advancing, and it isn’t just about saving time. Automation also provides consistency. “You feel a level of comfort knowing that [using the same] separation you did the last time you purified this protein, you’re going to get the same [results] again,” says Jeff Habel, senior scientist in protein technologies R&D at Bio-Rad. “You remove the human-error factor because you may not always collect fractions exactly the same way if you’re doing it by hand. The biggest effect is time savings because you can run the instrument overnight, but I do think that consistency and reproducibility are a big issue, especially when you’re making a therapeutic.”
Until recently, the main function of automation was to improve reproducibility and relieve operators of the repetitive task of loading samples and collecting fractions. Newer elements include failsafes that can automatically adjust instrument parameters. For example, Bio-Rad’s NGC chromatography system can detect spikes in pressure and automatically lower flow rates to compensate for viscous samples or blockages. Air sensors can detect when a sample has been fully loaded, so the system will move on to another sample or wash a column. The unit also can automatically shut down the flow and notify an operator in the event of a leak or if a buffer container isn’t properly loaded. The controller can even send a text or email notification of a problem.
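Conceptually, those failsafes amount to simple rule-based logic layered over the pump and valve controls. The sketch below is a generic illustration of that kind of logic, not Bio-Rad’s NGC control software; the threshold values, parameter names, and actions are all hypothetical.

```python
# Generic sketch of automated failsafe logic. The thresholds, parameter names,
# and actions here are hypothetical, not any vendor's actual control software.
MAX_PRESSURE_PSI = 45.0   # hypothetical pressure limit for the packed column
MIN_FLOW_ML_MIN = 0.5

def adjust_run(pressure_psi, flow_ml_min, air_sensor_tripped,
               leak_detected, buffer_present):
    """Return (new_flow, action) for one polling cycle of the controller."""
    if leak_detected or not buffer_present:
        return 0.0, "stop_and_notify"        # shut down flow, text/email operator
    if air_sensor_tripped:
        return flow_ml_min, "advance_step"   # sample loaded: next sample or wash
    if pressure_psi > MAX_PRESSURE_PSI:
        # Viscous sample or partial blockage: back off the flow rate instead.
        return max(flow_ml_min * 0.8, MIN_FLOW_ML_MIN), "continue"
    return flow_ml_min, "continue"

# Example polling cycle: a pressure spike with no other faults detected.
print(adjust_run(52.0, 5.0, False, False, True))   # -> (4.0, 'continue')
```

A production system would of course embed such decisions in validated instrument firmware with an audit trail rather than a simple polling script.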
Automation is also commonly used in process optimization, in which users run automated experiments to identify ideal conditions for separations, altering buffer conditions such as pH and salt concentration to determine which combination will provide the best purification results. “You can do a run at pH 6.5, 6.75, and so on,” Habel explains. “The software allows you to automate that as well. You can hit a button and walk away. That’s the current state of automation.”
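In scripting terms, that kind of method scouting is just a programmed sweep over buffer conditions with a scoring step at the end. The grid values and scoring function below are placeholders for illustration only, not the conditions Habel describes.

```python
# Illustrative buffer-condition sweep for automated method scouting.
# The grid values and scoring function are placeholders, not real parameters.
from itertools import product

PH_VALUES = [6.5, 6.75, 7.0, 7.25]     # hypothetical screening grid
SALT_MM = [50, 100, 150, 200]          # hypothetical NaCl concentrations (mM)

def run_and_score(ph, salt_mm):
    """Stand-in for triggering an automated run and scoring purity/yield (0..1)."""
    return 1.0 - abs(ph - 7.0) * 0.4 - abs(salt_mm - 150) / 1000.0

results = {(ph, s): run_and_score(ph, s) for ph, s in product(PH_VALUES, SALT_MM)}
best_ph, best_salt = max(results, key=results.get)
print(f"Best screened condition: pH {best_ph}, {best_salt} mM salt")
```

In practice the score would come from analytical results (purity, yield, aggregate levels) for each automated run rather than a stand-in function.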
Decision-making may be the next stage. Two-dimensional or tandem chromatography would be a natural application: an in-line detector would trigger collection of samples to be held for automatic loading onto a second column. “That’s kind of where the future is going,” Anderson predicts.
But the technology is not quite there yet. Decision-making will require innovation in supportive instruments. Detectors and pressure sensors haven’t changed much for decades, and many were built with manual operation in mind. Ultraviolet (UV) sensors produce chromatograms, but they aren’t designed to trigger sample collection. “The detectors just aren’t designed to be robust decision-makers,” Anderson laments.
Even pressure sensors are a product of the analytical high-performance (or high-pressure) liquid chromatography (HPLC) environment. They’re not as accurate at the lower pressures of bioprocess applications. “If we’re going to rely on these detectors rather than humans to make the decisions,” Anderson points out, “then we need to improve both hardware and software.”
But as is so often true with new technologies, the same features that provide benefits also can give users pause. Fans of Disney’s The Sorcerer’s Apprentice may envision an automated system going haywire with their precious stock, not unlike the movie’s over-eager brooms. “There’s this distrust of machinery,” Anderson points out. “Will it behave as it’s supposed to behave? The onus is on manufacturers to make reliable and robust products that ensure confidence.”
Automation is a natural partner to continuous chromatography, which is receiving increasing interest from the industry. Rather than sequential loading of individual columns, continuous systems use a “Gatling-gun” style approach, with an automatic loader that injects sample into one column, then moves on to the next one when it detects that the first column has been loaded to capacity. This method uses smaller columns and can save on the cost of resins — a key factor when it comes to expensive protein A, for example. “We’re nearly there if not already there in some processes,” Anderson says.
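One way to picture that loading logic is as a round-robin scheduler that diverts the feed to the next column once the current one approaches its binding capacity. The toy simulation below uses made-up capacities and flow rates purely to illustrate the idea; it does not describe any particular commercial multicolumn system.

```python
# Toy round-robin loading scheduler for multicolumn continuous capture.
# Capacity, feed rate, and switch threshold are made-up illustrative numbers.
COLUMN_CAPACITY_G = 2.0      # hypothetical binding capacity per small column
SWITCH_AT_FRACTION = 0.9     # divert feed before full breakthrough
FEED_G_PER_MIN = 0.05

def simulate_loading(num_columns, total_feed_g):
    loaded = [0.0] * num_columns
    eluted_batches = []
    current, minutes, remaining = 0, 0.0, total_feed_g
    while remaining > 1e-9:
        step = min(FEED_G_PER_MIN, remaining)
        loaded[current] += step
        remaining -= step
        minutes += 1.0
        if loaded[current] >= SWITCH_AT_FRACTION * COLUMN_CAPACITY_G:
            # Divert feed to the next column; assume this one is washed, eluted,
            # and regenerated before the loader cycles back around to it.
            eluted_batches.append(loaded[current])
            loaded[current] = 0.0
            current = (current + 1) % num_columns
    return minutes, eluted_batches

minutes, eluted = simulate_loading(num_columns=3, total_feed_g=4.5)
print(f"Loaded in ~{minutes:.0f} min; captured batches so far (g): "
      f"{[round(b, 2) for b in eluted]}")
```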
Many such techniques are already tested in R&D-scale laboratories, where time savings can be more important than absolute purification consistency. “If [scientists] can use automation to save some time and not babysit a system, then that’s good,” says Anderson.
Another intriguing development is moving-bed chromatography. Traditional chromatography methods frequently overload columns, causing “tailing” effects in which product and impurities coelute, forcing processors to rerun batches until they reach sufficient purity. But doing so takes time and inevitably leads to product loss.
Moving-bed chromatography doesn’t just send sample unidirectionally through a column; the separation medium itself moves in the opposite direction of a buffer stream. At the correct speed, slower-moving species actually can reverse direction and exit the opposite end of the column from the protein of interest. Such a system could be divided into linked but separate columns, each with individual countercurrent flow rates to maximize separation.
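The underlying idea can be captured with a simple velocity balance: a solute migrates forward with the buffer at a rate set by how strongly it is retained, while the moving bed carries it backward at the bed’s own speed. In the illustrative calculation below (all numbers arbitrary), a species whose net velocity comes out negative exits the inlet end with the resin instead of the outlet end with the buffer.

```python
# Illustrative velocity balance for a true moving bed. Retention factors and
# velocities are arbitrary numbers chosen only to show the change in direction.
def net_velocity(buffer_velocity_cm_min, bed_velocity_cm_min, retention_factor):
    # A solute migrates with the fluid at u/(1 + k') but is carried backward
    # at the speed of the moving packing.
    return buffer_velocity_cm_min / (1.0 + retention_factor) - bed_velocity_cm_min

for name, k in [("weakly retained impurity", 0.5), ("strongly retained protein", 4.0)]:
    v = net_velocity(buffer_velocity_cm_min=3.0, bed_velocity_cm_min=1.0,
                     retention_factor=k)
    direction = "toward the outlet" if v > 0 else "back toward the inlet"
    print(f"{name}: net velocity {v:+.2f} cm/min ({direction})")
```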
The challenge to creating a true moving-bed system is moving resin without disrupting it. Instead, some companies have developed simulated moving beds consisting of as many as eight linked columns in a closed loop. The resin remains stationary while the input lines for a raw product stream and outlet lines for target product and impurities all are shifted between the connected columns at designated time points. That line shifting is slower than the rate of buffer flow, which simulates a counter-flow in the resin. One example of this technology is Novasep’s Varicol system.
At least one supplier is focusing on the true moving-bed concept, however. Aroostook Pure’s “continuous countercurrent chromatography” system is a true moving bed, says the company’s principal developer Justin Allison. The technology is still at an early stage, but conceptually it consists of a loop of pipe divided into sections by a valve system. Resin moves around as “pucks” in a piston-like motion. At each section, the resin stops and undergoes a chromatographic process step.
That system should reduce resin and buffer use by 50% or more, according to Allison. With a smaller footprint than classic chromatography skids, this technology can be made disposable. The columns don’t have to be packed, and gradient elutions are possible (they are not with simulated moving-bed systems). A commercial launch is still some time away, however. “We are finishing construction of a larger developmental-scale unit to characterize the operation of the system and develop a set of functional specifications for it,” Allison reports. “Our funding is extremely limited, so progress is much slower than we [would] like.”
Single-Use Technology
Over recent years, single-use technologies (designed to be used once and then discarded) have been gaining momentum in the biopharmaceutical industry. Bioreactors, holding bags for buffers and eluent, filters, prepacked columns, and many other devices can be disposable.
Disposables have a complex relationship with cost. Operating costs may increase when single-use technology is implemented. But it is cheaper and faster to set up a new facility using disposable columns and filters than one based on stainless steel holding tanks, bioreactors, and purification equipment. “A manufacturer doesn’t need to make large capital investments in a stainless steel plant with the costly cleaning and steaming utilities that are associated with that,” says Nick Hutchinson, global market development manager at Parker domnick hunter.
Single-use technologies have the potential to reduce cross-contamination in multiproduct facilities and even contamination between separate lots in single-product plants. As therapeutic proteins have become more and more potent over recent years, the “allowable carryover” limit has become an increasing point of concern — even more so with products such as ADCs that deliver a more powerful therapeutic punch. “That [allowable] amount is much lower the more potent a product is,” explains Paul Bezy, vice president of production at Genentech. “The cleanest solution is to use single-use technology.” Otherwise, manufacturers pursue stringent cleaning of stainless steel equipment. Disposables also eliminate much time and cost associated with cleaning and validating standard equipment.
But single-use products have drawbacks, too. “Single use is helpful,” says Bezy. “It does cure a lot of problems, but it brings some of its own as well.” For one thing, a biomanufacturer no longer has full oversight over the equipment involved in its processes when different single-use components come from different suppliers. Being able to control that risk in the supply chain is essential, Hutchinson points out: “That’s one of the bigger fears that customers have, which is why they should work with the right suppliers with the right quality control systems to ensure that their supply chains are fully traceable.”
Caveats: Leachables, extractables, and bag integrity are also concerns. Although single-use manufacturers do their best to make their products inert, they don’t always fully succeed. Disposable products may not stand up as well to some biomanufacturers’ specific conditions and handling. That can be particularly troublesome with bags that store cell culture or product solutions for in-process holding. “That’s something we’ve seen and been concerned about,” Bezy says, “especially in cell culture but even with purification. There can be variability between different bag vendors, and even with the same vendor over time if changes are not properly controlled and tested.”
When it comes to final purified drug substance and formulated drug products, extractables, leachables, and leaks become even greater concerns. “We’ve definitely lost product due to leaking bags,” Bezy admits. “We are starting to move toward more storage of final bulk drug substance in single-use bags.” That puts even more pressure on understanding the extractables and leachables profiles of those materials, as well as the strength and integrity of bag systems in real-world conditions, which can include shipping and storage at temperatures as low as –40 °C. Single-use storage of final product has economic advantages provided that there is no product interaction with the container and the bags remain integral and sterile over the shelf life of the product. Unfortunately, as Bezy points out, “one leak will wipe out years of savings that a single-use storage system might offer.”
Those issues are critical for therapeutic products, in which impurities can affect formulation integrity and potentially pose safety risks to patients. “You’d be surprised at how subtle a change can be relevant to a patient,” says Bezy. “We can see differences in glycosylation and product aggregation that can cause changes in pharmacokinetics or immunogenicity.”
Serious Investment: However, most experts agree that single-use technologies are changing the landscape of bioprocessing, and most companies are incorporating disposables into their long-range plans. Facilities are being designed and built right now to take advantage of such systems. Amgen is building a US$200 million plant based on single-use technology in Singapore, and Novartis is breaking ground on a $500 million Singapore facility that will involve both stainless steel and single-use systems. Those plants are expected to open for business in 2015 and 2016, respectively. Catalent completed construction of a facility in 2013. “More facilities are being built today,” Grace’s Chodavarapu says. “Companies are realizing that the investment needs to be made today.”
Together, single-use technology and automation could be a powerful combination. “You can imagine that if you marry the idea of a disposable flow path with automation,” says Anderson of Bio-Rad, “you have a turn-key solution for companies trying to produce biotherapeutics quickly. All the big players are moving toward that,” he says, including his own company.
Chodavarapu points to general trends toward more outsourcing of manufacturing, with multiproduct facilities and more flexible solutions driving them. “You can see the connection between single-use technologies and the growth of the CMO market. They want to avoid cross-contamination issues.”
The days of single-product facilities probably will be over soon. Biotherapeutics are trending toward higher specificity and smaller, niche markets. “It’s almost like the low-hanging fruit has been grabbed,” Phillips explains. “The indications are clearly more targeted, and the volumes and masses required are not nearly as significant.”
Although it facilitates flexibility, single-use technology isn’t as well suited to larger-scale operations. “For large-scale manufacturing with multiple thousands of liters,” Swinnen of Genzyme says, “disposables are not always capable of dealing with those larger volumes. In large-scale manufacturing for legacy products, we haven’t seen [much adoption of single use].”
Regulatory oversight also can present a barrier to adoption of some technologies. Some companies “want to see greater evidence from regulators that they’ll accept single-use processing for licensed drugs,” Hutchinson points out. He thinks that they eventually will get that empirical evidence. “The technology will prove itself.”
Despite being less expensive to set up, single-use technologies also may represent higher ongoing costs in some operations. They may not be among the most environmentally friendly of options. “There’s a lot of waste,” Lain of Eleven Biotherapeutics cautions. “It might not be very green.” Some experts, however, point to the use of water and chemicals involved in cleaning stainless steel and indicate that decreasing them (together with choosing appropriate disposal options) can give disposables the edge (1–3).
Although filtration in the form of cartridges was one of the first types of single-use technology used in bioprocessing, chromatography is lagging far behind. But a few companies — e.g., Grace with its ProVance columns — are applying the disposables concept to chromatography and attempting to build a market. Bezy remains skeptical, however. “I just haven’t seen it work economically. It might work for some really niche situations. With filters, the economics work there.”
Continuous Processing
Manufacturing strategies are evolving. Legacy facilities were built to produce one or a few blockbuster biotherapeutics, and their systems and processes tended to be big and inflexible: big bioreactors, big columns, and big filtration set-ups. “Trends toward local manufacturing and multiproduct facilities have companies rethinking their manufacturing strategy,” Phillips says, “with many beginning to have more reliance on flexible single-use systems and technologies. That’s a clear trend.” His company’s Mobius line of systems and equipment exemplifies that approach.
Similar trends favor continuous processing. “Adoption starts with bench-top systems,” Phillips explains, “where process-development scientists and engineers can become comfortable with continuous processing. Once they get the comfort level that they currently have with batch processing, you’ll begin to see more processes moving toward connected and continuous processing.”
Not every company is so enthusiastic about continuous processing, however. Phillips continues: “I would say that it is almost there for the early adopters. If you were to poll people out there, some companies are hot about it, some would say they’ll never do it, and some are on the sidelines waiting. I think it’s going to be reality, but that’s not going to happen overnight. It’s going to be an evolutionary approach. People have a batch process, they start moving to single-use technologies, then they start expanding their toolbox to have some flow-through technologies, they see they can connect those to another unit, and all of a sudden they are almost at a continuous operation.”
As usual, regulators will play a key role in encouraging (or discouraging) such adoption. “Regulators are comfortable with the batch method,” cautions Allison of Aroostook Pure. “There’s a beginning, middle, and end. With its indefinite mode of operation, [continuous processing] will have to be better understood by both manufacturers and regulators, and this will take some time.”
The earliest adopters are likely to be companies making labile products such as enzymes, which need to be processed quickly. A continuous process that speeds up purification could conserve product. Phillips predicts that MAbs will also be likely candidates because their purification strategy is well established.
Process analytical technology (PAT) is a likely companion to continuous processing. The concept involves measuring critical process parameters in-line and in real time, monitoring performance and allowing processors to intervene or shut a system down before product loss occurs. For example, with chromatography columns, performance can decline as resins are reused and regenerated. Inline measurements could help companies monitor resin performance and other separation parameters and ultimately allow a chromatography system to automatically make adjustments and deal with inherent process variations. “That’s a real application of PAT,” Swinnen points out, “maintaining product quality consistency rather than accepting variations.”
PAT also could help operators understand what factors have the greatest impact on the downstream process and again help them control variation. PAT doesn’t have to be simply “analytical”; it can help bioprocessors make use of existing data that are routinely collected, such as UV traces for chromatography cycles. KBI Biopharma correlated in-line UV measurements with column performance and resin degradation, then built a predictive model to flag when UV signals indicated imminent resin failure (4). “Even though we wanted to use a resin out to 100 cycles,” Shukla says, “if we knew that it would expire on the 87th cycle, we could replace the resin then rather than lose a batch.”
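A heavily simplified sketch of that multivariate-monitoring idea follows: treat each cycle’s UV trace as a vector, model traces from healthy early cycles with a few principal components, and flag later cycles whose residual drifts beyond a limit. This is a generic illustration on synthetic data, not the actual model described in reference 4.

```python
# Simplified multivariate monitoring of chromatography UV traces on synthetic
# data -- a generic illustration, not the model described in reference 4.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)

def cycle_trace(drift):
    # Later cycles broaden and shrink slightly as the (synthetic) resin degrades.
    peak = (1 - 0.3 * drift) * np.exp(-((t - 0.5) ** 2) / (0.01 * (1 + 2 * drift)))
    return peak + rng.normal(0, 0.01, t.size)

healthy = np.array([cycle_trace(0.0) for _ in range(30)])   # early "good" cycles
mean = healthy.mean(axis=0)
_, _, vt = np.linalg.svd(healthy - mean, full_matrices=False)
components = vt[:3]                                         # top principal components

def residual(trace):
    centered = trace - mean
    reconstructed = components.T @ (components @ centered)
    return np.linalg.norm(centered - reconstructed)

limit = 1.5 * np.percentile([residual(h) for h in healthy], 99)
for label, drift in [("early", 0.0), ("mid-life", 0.3), ("late", 0.7)]:
    r = residual(cycle_trace(drift))
    status = "OK" if r <= limit else "flag: performance drifting, plan resin change"
    print(f"{label} cycle: residual {r:.3f} ({status})")
```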
Interesting Times
It all adds up to a volatile landscape in downstream processing. Evolving business environments, the changing nature of biotherapeutics with bispecific antibodies and novel engineered proteins, and the inevitability of biosimilars will continue to put pressure on process developers. Those pressures are forcing industry’s hand toward change and disruption, with even reluctant companies considering new paths forward. “We’re entering into a little bit of an innovation cycle,” Shukla explains, “and there’s more acceptance of that innovation in spite of the fact that the regulatory hurdles haven’t changed.”
References
1 Rawlings B, Pora H. Environmental Impact of Single-Use and Reusable Bioprocess Systems. BioProcess Int. 7(2) 2009: 18–25.
2 Rawlings B, Pora H. A Prescriptive Approach to Management of Solid Waste from Single-Use Systems. BioProcess Int. 7(3) 2009: 40–46.
3 Jobin JC, Krishnan M. Reducing the Environmental Impact of Single-Use Systems. BioProcess Int. 10(5) 2012: S66–S68.
4 Hou Y, et al. Improved Process Analytical Technology for Protein A Chromatography Using Predictive Principal Component Analysis Tools. Biotechnol. Bioeng. 108(1) 2011: 59–68.
Based in Bellingham, WA, Jim Kling is a freelance science writer with a background in organic chemistry and over a decade of experience covering topics from biotechnology to astrophysics to archeology. His credits include WebMD, The Washington Post, Scientific American, Technology Review, and Science magazine. He also occasionally writes science fiction; jkling@gmail.com.
Developments in Viral Clearance
Bioprocessing has evolved from multiple discrete, discontinuous unit operations to a focus on process streamlining and optimization. This latter approach has multiple advantages: reduced process time, decreased handling and storage, lowered potential for introduction of adventitious agents, and overall cost savings associated with reduced numbers of buffers, hold times, and manufacturing personnel.
Downstream processing (DSP) must be designed as a sequence of steps that ensures removal of process-related impurities to safe levels in a final product. Viral clearance strategies are an integral component of DSP operations. Chromatography methods that form the foundation of most DSP processes may serendipitously provide some virus removal. But unit operations such as virus-removal filtration are specifically included for viral clearance.
The Basics: A key enabler in bioprocessing is use of platform technologies for products with similar characteristics to leverage similarities in their biochemical properties and a knowledge base gained from prior experience with similar products. Such an approach has been applied to manufacture of monoclonal antibodies (MAbs), for example, when determining which steps to consider for viral clearance.
The best approach to minimizing virus contamination is to prevent access of adventitious agents into a production bioreactor. That involves extensive testing of cell lines and other raw materials, including cell culture media, for the absence of viruses. Regulators and industry agree on a triad approach to ensuring virus safety: appropriate sourcing, demonstrating that DSP unit operations provide robust virus clearance, and testing product at appropriate process stages for the absence of contaminating viruses. In general, the industry has moved away from using animal-derived additives (e.g., serum, transferrin, growth factors). Technologies such as irradiation, high-temperature–short-time (HTST) treatment, and UV-C inactivation are used to treat raw materials and minimize viral burden. Although they are not a regulatory requirement, such upstream risk-mitigation approaches constitute a form of business insurance.
Virus contamination of bioreactors and manufacturing environments is rare but catastrophic when it does occur. Consequences of such events range from risks to patient safety and possible drug shortages to legal, regulatory, and financial implications. To date, no biotherapeutic product has transmitted a pathogenic virus, which suggests that the current strategy is effective. The excellent record of safety in biopharmaceutical production has been achieved through use of several overlapping processes directed at eliminating or inactivating viruses and other adventitious agents. According to regulatory guidelines, each orthogonal method must operate by a different mechanism of action to help ensure virological safety of the biotherapeutic.
Risk assessment and management are key to maintaining a high level of virus-safety assurance. Determining how much risk is acceptable requires negotiating a complex decision tree. Assessment of causality and management approaches are very different for biopharmaceuticals than for their small-molecule counterparts. Benefit–risk assessment for a nonbiologically derived drug typically involves collating a body of data to provide evidence beyond a reasonable doubt that an adverse event is or is not attributable to the drug. By contrast, each discrete reported case of potential virus transmission must be viewed as a possible indicator of an infectious biologic batch, which carries some level of probability that a disease may be transmitted to patients.
Changes to bioprocess operations will have to be undertaken with a continued eye on viral safety. Continuous bioprocessing is under evaluation.
Although promising in many respects, it could pose challenges to viral safety because of concerns related to scale-down and process considerations. So from a virus-safety standpoint, the concept is not ready for prime time.
Some promising innovations include affinity chromatography with novel ligands that bind specifically to target viral particles. For example, trimeric peptide ligands have been developed to have significant selectivity for mouse minute virus (MMV). Other approaches include developments in virus-removal filtration that provide high-log reduction for small viruses and concomitant high flux rates, achieved by optimizing pore size, membrane chemistry, and other membrane attributes.
Virus safety assurance is a moving target. New and emerging viruses will continue to threaten the safety of biopharmaceuticals. Biological materials are naturally diverse in origin. The high number of known and unknown viruses and increased globalization — not just with travel across countries and continents, but also trade and commerce that involves raw materials and supply chain management — will continue to maintain our vulnerability. From risk analysis to benefit–risk assessment, a case-by-case approach is warranted. Ultimately, the protection of patients is of paramount concern.
Hazel Aranha is head of virus safety at Catalent Pharma Solutions; hazel.aranha@catalent.com.