Barring fire, major earthquakes, or volcanic catastrophe, concrete is good for centuries — the Pantheon has been in continuous use since 126 AD. The long expected life and high initial cost of biomanufacturing buildings and equipment build legacy into the system from the start. And the imperatives of launching a new biotechnology industry in the 1980s led to the building of many facilities within a few years to produce the first wave of recombinant DNA products.
I spoke with several industry experts regarding legacy technologies in bioprocessing for a special report in the September issue of BPI (1). The discussions herein delve deeper into some examples among upstream production and downstream processing unit operations.
Concrete and Steel
Andrew Skibo (executive vice president of operations at MedImmune) pointed out some key factors that produced that clutch of concrete and steel dinosaurs. Most plants built in the 1980s were “designed by no more than three or four engineering firms — Jacobs Life Sciences, Foster Wheeler, Fluor, Bechtel — so plants tended to look alike even down to operating details.” Likewise, only a few vendors made steel vessels and instruments, and validation was virtually identical from plant to plant, so even total capital costs converged.
A typical commercial-scale plant was based on the “four-pack.” These legacy plants are heavier on fermentation than on downstream capacity. That relates largely to expression technology. Skibo explained, “In the 1980s, production titers were so low that facilities were designed under the assumption that those would improve threefold by the time one came online. By 2000, the industry assumed that titers couldn’t get much better and designed plants accordingly. But titers kept improving! By the time those facilities came online, titers had gone up by a factor of three or four.”
Around the turn of the millennium the blockbuster era peaked (along with the biotech financial bubble), and the average projected revenue from newly approved biologics trended downward. This led to smaller bioreactors, shorter runs, and more products: plants with multiple production suites, each based on two 2,500-L tanks. And as the first wave of biologics patents started running out, the prospect of generic competition had to be considered. Skibo describes the change “from huge production and rapid scale-up to less instrumented, varying technology, higher titers, more automation, and lots of small-line products.”
Legacy or Albatross? In the past decade, with improved upstream yields and corporate consolidation, the industry has moved from a potential shortage of global bioreactor capacity to a glut of battleship facilities in the hands of fewer companies concentrated on the US eastern seaboard. The time required to engineer, build, and validate a plant made some obsolete before they could go online (2). Prohibitive refit and revalidation costs led to those plants being mothballed, written off, or sold to other companies and CMOs.
Having been in the biomanufacturing business since developing bacterial production of lactic acid in 1893, Boehringer Ingelheim became a major CMO. The company bought an Amgen facility in Fremont, CA, and turned it into the largest contract mammalian cell culture facility on the west coast (3). By contrast, MedImmune (founded 1988 as Molecular Vaccines) took a different route. Maker of the blockbuster Synagis for pediatric RSV since 1998, the company is now (as the global biologics arm of AstraZeneca) developing a pipeline of new biologics to address unmet medical needs.
“Until recently,” Skibo says, “almost every company had two or three flagship products with very large protein demand. The big battleship manufacturing facility might run one campaign per year, making the same product for 12 months straight. In 2020, instead of two or three products, we’ll have 10 or 12. Instead of just making our own products, we’ll be sharing that capacity.”
Boehringer Ingelheim solved its own facilities issue by becoming an effective CMO. But Skibo says, “We didn’t believe that MedImmune could manage that as competitively [as others have]. In fact, we use companies like BI; CMOs will always be in our mix. So we elected to go in the trusted-partner direction. We looked for a partner with reciprocal capacity. We have no commercial microbial capacity; Merck does.” So mammalian products from Merck go to MedImmune’s Maryland facility, and MedImmune’s microbial products go to Merck. “Within six months of our announcing the Merck transaction, Amgen, Genentech, and Biogen Idec also announced various arrangements with outside companies to take better advantage of their capacity.”
The post-legacy future is one of constant change and flexible factories. Single-use technology is an important part of the changing bioprocess paradigm. Given the wide and increasing variety of single-use components, tools, and services now available, it has become more cost-effective than before to retrofit an old facility as a flexible factory with modular, portable, and integrated parts (4, 5). And the latest advances in instrumentation, automation, and control technology are easier to install as part of a rip-and-retrofit effort. If deftly performed, such a retrofit can help a company keep the hard-to-replace legacy assets of people and their process knowledge. In a process not unlike encapsulating legacy IT systems, MedImmune’s legacy upstream-heavy facility in Maryland now has an adjacent, much larger, state-of-the-art flexible factory that houses multiple lines and the latest in process control, along with training simulation and single-use technologies.
Expression Systems — Go with the CHO
CHO expression is a legacy in several senses. The cells are linear clonal descendants of established Chinese hamster ovary cell lines, some >50 years old (6). In use since the 1980s, CHO systems today produce four of the top five blockbuster biologic products and remain the most common cell expression technology (making ∼70% of biologics) (7, 8). How did a few immortal cells from an obscure Asian rodent become the dominant expression system and hold onto that dominance for a quarter century despite the hundreds of alternative expression systems that have emerged?
In 1982, the success of recombinant human insulin made at commercial scale by Eli Lilly under contract with Genentech using an Escherichia coli system encouraged research and attracted capital for developing other therapeutic proteins using recombinant DNA technology. One path that seemed especially promising was to create true Ehrlich magic bullets: monospecific antibodies targeting particular antigens to treat diseases or correct faulty metabolic pathways. But no prokaryote has the intracellular machinery to produce such complex proteins.
As eukaryotes, yeasts were an obvious next choice: Saccharomyces cerevisiae had thousands of years of domestication and decades of use as a workhorse scientific model behind it. However, S. cerevisiae tends to hyperglycosylate secreted proteins and was thus not used (9). Later elucidation of its genome (in 1996) and advances since then in humanizing it and other yeasts (e.g., Pichia pastoris) are making yeast a much more attractive option today than in the 1980s (10).
Back then, bioengineers turned to mammalian cells. Rodents were familiar to genetic and medical research. In 1975, Milstein and Köhler developed hybridomas: fusions of B cells that secrete an antibody of interest with immortalized myeloma (B-cell cancer) cells. That enabled monoclonal antibody (MAb) production based on inducing intraperitoneal tumors in mice (and rats, rabbits, or guinea pigs) secreting MAb-rich fluid (ascites) that could be drawn off and purified. Several early MAb therapeutics were made in ascites. The method continued in use through the mid-1990s, but it has serious disadvantages, including nonscalability and the use of whole animals in an (at best) uncomfortable procedure.
Susan Dana Jones of BioProcess Technology Consultants reminisced, “We started with ascites at Miles. CHO was in the mix by the late 1980s and the early 1990s, and slowly the benefits of CHO made it emerge as an industry leader. But in the late 1980s, if you asked which cell line to use to produce a particular therapeutic protein, you wouldn’t necessarily have heard CHO.”
A number of immortalized cell lines were already established in research: NIH 3T3 (mouse embryo, 1962), Vero (African green monkey kidney epithelial cells, 1962), HeLa (cervical cancer cells from Henrietta Lacks, 1951), HepG2 (human hepatocellular carcinoma, 1979), and BHK (baby hamster kidney) cells (1961), as well as the first CHO lines (1957). With a relatively low diploid chromosome number (2n = 22), highly adaptable CHO cells were already widely used in biomedical research.
Creation of auxotrophic mutant CHO cells unintentionally made possible the DHFR (dihydrofolate reductase) expression system, which became a standard molecular biology tool for vector-mediated gene transfer. “This ability to transfect, select, amplify, and stably express biologically active heterologous proteins became an immense boon for biopharmaceutical companies involved in large-scale protein therapeutics” (11). Several DHFR-deficient cell lines have become established biomanufacturing expression hosts.
A suitable industrial host cell line should be easy to genetically engineer, should grow happily in suspension at large volumes, should be free of adventitious pathogens, and should perform appropriate posttranslational modifications (12). CHO fits all those criteria and has proven itself to be a remarkably safe host — most known human pathogenic viruses do not replicate in CHO cells — that is amenable to cost-effective downstream purification processing (12).
The first commercial rDNA therapeutic produced in mammalian cell culture (Activase enzymatic therapy from Genentech) was expressed by CHO cells. In a classic conservation of process, nearly every year after its launch in 1987, one or more recombinant therapeutics made the same way received FDA approval. More than a dozen were on the market by the beginning of the 21st century, including breakthrough treatments for previously untreatable Gaucher’s disease, rheumatoid arthritis, and multiple sclerosis.
Ideally, an industrial cell line should produce large amounts of a target protein. Initially, CHO was a modest producer — expressing up to hundreds of milligrams of protein per liter of culture. By the end of the 20th century, average CHO MAb titers had improved to 1–3 g/L — and to the surprise of most people in the industry, they have continued to improve (to 10 g/L and beyond), which contributes to the current glut in upstream capacity. That improvement was accomplished even without a thorough understanding of the genetic basis of protein hyperproductivity, which is now being addressed with ongoing efforts to sequence the CHO cell line and overall Chinese hamster genomes.
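To see why rising titers translate directly into surplus capacity, a rough back-of-the-envelope calculation helps. The following is a minimal sketch; every figure in it is an illustrative assumption, not data from any particular product or facility:

```python
# Rough estimate of the culture volume needed per year to meet a fixed
# product demand at different titers. All numbers are illustrative
# assumptions, not data from any actual facility or product.

annual_demand_kg = 500    # assumed annual demand for a blockbuster MAb
downstream_yield = 0.7    # assumed overall recovery through purification

for titer_g_per_L in (0.5, 3.0, 10.0):
    # harvest volume (in liters) required to net the annual demand
    volume_L = annual_demand_kg * 1000 / (titer_g_per_L * downstream_yield)
    print(f"{titer_g_per_L:4} g/L titer -> {volume_L / 1000:8,.0f} kL of culture per year")
```

At these assumed numbers, a twentyfold titer gain (0.5 to 10 g/L) cuts the required culture volume twentyfold, from roughly 1,400 kL to about 70 kL per year. That is how a four-pack of large tanks sized for 1980s titers becomes surplus steel.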
The persistence of CHO cell lines is not due to a lack of innovation in the field of protein expression. Several hundred alternative expression systems have been in development or production. These include whole organism systems: transgenic mammals (goats, rabbits) that express a protein of interest in milk or blood; research plants such as Arabidopsis; field crops from alfalfa to tobacco; and algae, duckweed, and moss species grown in photobioreactors. Alternative cell lines include many species of bacteria (e.g., Pseudomonas); algal chloroplasts; carrot, coconut, and tobacco cells; protozoa; insect cells transfected using baculovirus; several species of yeast; chicken stem cells; and a number of human cell lines. Some of these are already used in production; others (e.g., GlycoFi’s yeast system) have been validated by merger. Merck paid $400 million to acquire GlycoFi and is using its system to manufacture a biosimilar version of Amgen’s second-generation erythropoietin (13).
Many alternative-expression companies were founded with the expectation of beating the cost of CHO production, only to find the bar rising higher than when they were funded. “If you don’t know how much more efficient those new systems are going to be,” explained Dana Jones, “it’s really hard to evaluate them against something that has 20 years of installed base, both in knowledge and infrastructure.” Recent genomic advances are providing biomarkers and other tools that, along with modifications to batch processing, make CHO titers of 20–30 g/L seem possible.
Jones put this in context: “CHO is so dominant in the industry now, with serum-free media for CHO, and now that the genome is being sequenced, people are really starting to understand the metabolism and find biomarkers for CHO growth. The industry is partial to CHO, so we keep inventing new ways to get continual improvement out of CHO systems.” CHO is a robust, resilient, and familiar legacy process, and its continued productivity improvement and consequent cost reduction have produced a Red Queen’s race against alternative expression systems. Along with a comfortable familiarity among regulators, that is probably the secret of CHO’s continuing success — a lively legacy after nearly 30 years in use.
Batch Fermentation and Culture Media — The Oldest Legacy
Batch fermentation has by far the oldest roots of any legacy process in biomanufacturing, at least in terms of using living microorganisms to produce a chemical of human interest. The earliest physical evidence of fermented (carbohydrate-to-alcohol) beverages dates back nearly 10,000 years, coinciding with technical innovations (domestication of grain and fruit, development of pottery vessels) that enabled purposeful (if not quality-controlled) manufacture. A “winery” in Anatolia produced parallel batches in clay-lined pits some 6,000 years ago. The Sumerians and Egyptians left documentary evidence of commercial-scale beer production, with the first quality regulations in the Code of Hammurabi (1772 BCE). Batch production of wine and beer/ale was a major industry in most Old World cultures through Classical times and the Middle Ages to the present day.
Scientific understanding of fermentation begins with Lavoisier’s chemical equation describing the conversion of sugar to alcohol. Although Theodor Schwann correctly attributed alcohol fermentation to “sugar fungus” in 1837, the scientific community resisted his hypothesis that microbes ingested sugar and excreted alcohol and carbon dioxide, preferring instead to credit extracellular enzymatic processes. The paradigm shift was inspired by adventitious contamination, when a manufacturer of industrial alcohol consulted Louis Pasteur in 1854 about batch failures. Fermentation has made many other useful chemicals: tartaric, citric, and lactic acids, as well as vitamins and antibiotics. Inventors of such novel bioprocesses founded many pharmaceutical companies still in business today.
All those were batch processes. The idea of continuous processing has origins in the ancient use of a chain of pots to remove water from excavations — and more recent assembly line concepts. However, continuous processing came late to brewing, arguably in 1953 with the patenting of a continuous process for beer by New Zealander Morton Coutts.
In modern biotechnology, “fermentation” of single-celled microbes and “cell culture” for animal cell systems are similar: keeping cells alive and growing in a tank of raw materials that they metabolically process into a product of interest. Bioreactors have operated in batch mode since their invention in the 1970s. Scale-up has generally meant moving from bench-top to roller bottles or 2-L pilot bioreactors and up batch by batch to 2,700-L or 15,000-L tanks.
Consider MAbs. In the 1980s, ascites production in whole animals was sometimes used (also a batch process), but today all but a very few MAbs are made in legacy batch or fed-batch cell culture systems (with a few semicontinuous perfusion processes) (13). The very dominance of batch cell culture, especially CHO systems, makes for a big installed base of existing batch process capacity.
Bill Whitford (senior manager for the bioprocessing market in Thermo Scientific Cell Culture and Bioprocessing at Thermo Fisher Scientific) explained how that freezes batch as a legacy: “Even if someone is intellectually convinced that continuous would be better, if everything’s working fine, with legacy or pioneer drugs, why incur the cost and risk to change it? Also, even though some upstream continuous processes have been available for 20 years, for the most part people don’t know them, haven’t even considered them. People have been working on improving fed-batch culturing for so long that it’s a distraction from moving to continuous.”
The FDA specifically mentions batch processing as part of “conventional pharmaceutical manufacturing” in its process analytical technology (PAT) initiative of 2004 (14). So this is likely to be one legacy the agency had in mind when encouraging the adoption of innovations in its pharmaceutical industry guidance for the 21st century (15). As yet there has been no industry stampede into even semicontinuous processing. Ironically, one commonly cited reason is concern about regulatory delays.
Despite the obvious commitment from FDA leadership to drive towards such innovations, Whitford points out that many bioengineers think “everybody’s been using batch, and maybe my agency reviewers won’t like this if I am one of the first to go to a continuous process.” He offered a long list of other reasons innovations are resisted, including the need for new testing and monitoring capabilities as well as new strategies in process control and knowledge management. It’s especially difficult to move away from batch process systems when a combination of installed base and business/financial/regulatory imperatives continues to weigh against major innovations.
Culture Media — A History of Thawing
The culture medium is an integral component of a fermentation or bioreactor process. Originally, the medium was the product: Grape juice became wine, for example. But once the germ theory was accepted, the understanding that different inputs (or differently timed inputs) affected the final product transformed the engineering approach to fermentation. Early bioengineers also found that adding oxygen and supplements as well as controlling pH and temperature could improve process yields.
A practical understanding of controlling nutrient media developed in parallel with development of bacterial and fungal fermentation processes to make various chemicals, especially antibiotics. That provided a scientific and engineering foundation to the greater challenge of feeding living mammalian cells in bioreactors. Bill Whitford provides a historical perspective: “Early on, when we first began culturing mammalian or animal cells outside an organism, we used a cocktail of amino acids, carbon sources, and sugars plus animal sera. This was already established practice in research when they began using cell culture in bioproduction about 30 years ago.”
But mammalian cells are not at home in a culture vat (unlike yeast, for which 10,000 years of domestication has produced species whose natural habitat is bread dough, barley mash, or grape juice). So once a medium formulation worked for a particular recombinant cell line, bioengineers tended to stick with it, especially when that formulation became part of the market authorization.
“Let’s say we want to create a specific supplement that improves titer,” Laurel Donahue-Hjelle of Life Technologies explained. “We’ve got to create this using components that are already used in cell culture media so that anyone looking at that list of ingredients could say there’s nothing problematic, nothing that hasn’t been taken through the FDA, the EMEA, and used in other medium formulations. You can tweak the combination, how it’s formulated, a number of things. This conservatism is driven by safety concern for patients and the need to have a robust, reproducible process. That leads to a desire to not change unless you’re really driven to.”
On the other hand, Whitford explains, “These cocktails were variable in composition and hence in performance — especially serum. Every lot of serum is significantly different, not just in performance. If you do a chemical analysis, you find different levels of many key components based on what the cows were eating, the time of the year, and so on.” In mammalian cell culture systems, such variability can spoil a batch or dramatically change yields. That makes quality control engineers unhappy — and biologics regulators more so. Even frozen formulae are subject to unwanted variability. Animal products also are susceptible to contamination by infectious agents.
As Whitford points out, bioengineers began to seek alternatives to whole sera because, although “risk from adventitious agents such as contamination by virus was a factor, it was more a matter of consistency and performance. So we moved away from serum, but media still contained many animal components, including meat hydrolysates. Many amino acids also came from animal sources, including some for which the major source was human hair!” So although media formulations tended to “freeze” legacy-wise, other forces were pushing for thaws. As is so often the case, disaster was one of those.
In the 1980s, an epidemic of bovine spongiform encephalopathy (BSE) broke out in the United Kingdom, spread by commercial cattle feed containing meat and bone meal from slaughtered cows that was contaminated with the BSE prion. A spike in the number of cases of a new variant of Creutzfeldt-Jakob disease prompted growing anxiety about human transmission from infected beef. As infectious agents, prions are even more frightening than viruses. They cause inevitably fatal degenerative neurological diseases with long incubation periods, making the potential harm enormous and cumulative. They are smaller than viruses, harder to detect (especially if you aren’t looking), prone to sporadic emergence, and can cross species barriers. Several transmissible spongiform encephalopathies (TSEs) were known by the 1960s, both acquired by transmission (e.g., scrapie in sheep and kuru, spread through ritual human cannibalism) and familial (Creutzfeldt-Jakob disease). The prion-protein hypothesis was proposed in the 1960s, but it was not until 1982 that the first prion was identified. The high-profile crisis alerted bioengineers to yet another previously “unknown unknown” in animal-source ingredients (1). And no one was comfortable with beef hydrolysates anymore.
Whitford said, “Then we formulated media with no serum and no animal products, but these media were still undefined because we were adding, for example, plant hydrolysates from rice or wheat, which were continued sources of variability. Each lot of hydrolysate performs a little differently, and you can’t control that. Our goal became that the only inputs in culture media would be known reagent chemicals: serum free, protein free, animal-product free, and chemically defined. Then we increased our focus on, ‘What are your cells actually doing? What do they specifically need? And what’s going on genetically or in terms of systems biology? What is controlling the metabolisms here?’”
Those are particular issues with CHO systems, for which the genomics are only now becoming known. And that poses another concern, one that Donahue-Hjelle explained: “Now that we are able to deep-sequence our host cell lines, we’re probably going to uncover things in those cells that we weren’t able to test for previously. But if you’re actually looking for sequences to see whether there are any viruses or remnants of virus present, these are surely going to show up. What’s going to be our response collectively as an industry and in our regulatory bodies, what do those mean? I think we’re going to get the data ahead of knowing what to do with it, and how to interpret it.” That has implications for media as well — having the data, knowing what they mean, and knowing what to do with that information are three separate things.
Downstream Legacies — Protein A
Protein A affinity chromatography is frequently mentioned as an example of a legacy technology in biomanufacturing. Downstream processing is adapted to each upstream process. At any time in the past 10,000 years, careful and lucky makers of beer or wine needed to do little more than pour their finished products into clean containers for consumption. For centuries, simple precipitation has been the major means of separation for byproduct chemicals of wine such as tartaric acid. The downstream processing of proteins is far more complicated. For over 50 years, blood plasma fractionation has used precipitation with organic solvents — and more recently, chromatography (16).
For biopharmaceuticals, chromatography and membrane filtration are the main purification tools (16). Affinity chromatography uses resins or other supports with attached ligands that attract and bind the protein of interest, sequestering it from the byproducts of bioreaction and making it available for further purification steps. Staphylococcal Protein A was one of the first immunoglobulin-binding molecules to be discovered, and its very high affinity for antibodies made it the ligand of choice for MAb chromatography.
Donahue-Hjelle told me why. “Protein A, like many other legacy systems, remains in place because it was the first platform for that use that really worked well. Once it gets locked into multiple processes, then it has history, and then it is a legacy. So many different companies have used it now, it’s been approved in so many products, that the FDA, EMEA, and all the other regulatory bodies are comfortable with it. And we know it’s been safe. We know there haven’t been adverse events tied to that product or that process. It just takes on a life of its own.” And that’s the essence of a legacy system: It’s there, and it works. People are afraid to change, regulators are not asking for a change, and cost pressures have not yet pushed it away.
Bryan Monroe of Primus Consulting commented on cost and rising titers: “It’s really ironic. The motivation for using Protein A made sense in the 1990s because you were looking for two grains of sand in a five-gallon bucket. Today, the primary material that’s in supernatant is your protein of interest, so why are you still using affinity? There’s nothing to prevent you from finding your protein of interest, yet we still use affinity simply because it’s well-established and easy. But unfortunately it’s also really expensive.”
Tom Ransohoff of BioProcess Technology Consultants explains why that high cost hasn’t yet pushed Protein A from use: “Protein A is expensive when you look at it per liter. But when you look at what it does to reduce development time, enable platform processes, and combine several steps into one, there is an argument that it’s still pretty cost-effective technology. The value is still there, which is why it remains the most commonly used capture step for antibody purification.” If the essence of what makes a legacy technology is that the cost of changing is higher than the cost of continuing, then Protein A purification is certainly a type specimen.
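Ransohoff’s per-liter versus per-gram distinction is easy to quantify. Here is a minimal sketch; every number in it is a hypothetical assumption chosen for illustration, not vendor or industry data:

```python
# Rough cost-per-gram estimate for a Protein A capture step.
# Every figure here is an assumption chosen for illustration only.

resin_cost_per_L = 10_000      # assumed price of Protein A resin, $/L
binding_capacity_g_per_L = 30  # assumed dynamic binding capacity, g MAb per L resin
lifetime_cycles = 100          # assumed reuse cycles before the resin is retired

lifetime_grams_per_L = binding_capacity_g_per_L * lifetime_cycles
cost_per_gram = resin_cost_per_L / lifetime_grams_per_L
print(f"Resin cost: ~${cost_per_gram:.2f} per gram of antibody captured")
```

At these assumed figures, the resin works out to roughly $3 per gram of antibody captured over its lifetime, a small number next to what a therapeutic MAb sells for. Seen that way, the sticker price per liter matters less than lifetime throughput, which is the gist of Ransohoff’s argument.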
Author Details
Ellen M. Martin is managing director of life sciences for the Haddon Hill Group Inc., 650 Kenwyn Road, Oakland, CA 94610; 1-510-832-2044, fax 1-510-832-0837; emm4@pacbell.net, emartin@haddonhillgroup.com. She is also a principal consultant for Kureczka I Martin Associates, www.kureczka-martin.com.
REFERENCES