Manufacturing Strategy

What’s on the minds of leaders involved in manufacturing and operations strategy in 2014? They want to maximize efficiencies and cost savings across their manufacturing networks by implementing potentially disruptive technologies, pursuing operational excellence, and adopting new facility concepts. But they also have to remain flexible to the production needs of diverse pipelines in a multiproduct environment. New concepts in biomanufacturing can help companies reach those goals through technology transfer, scale-up/scale-down, continuous manufacturing, single-use applications, and possibly relaxing some biomanufacturing facility environmental controls.

The BioPhorum Operations Group (BPOG) is a consortium of senior leaders (500 representatives from 18 member companies) of the biopharmaceutical industry. They come together to share and discuss emerging trends and challenges facing their industry and establish best practices for a wide range of biotech drug-substance operations. This year, the group has spearheaded a session highlighting its initiatives, discussions, and best practices from four strategic workstreams: continued process verification, microbiological monitoring, variability in raw materials, and the single-use supply chain.

Scale-Up, Scale-Down, Technology Transfer: As new products move through the development pipeline, companies are taking a more strategic approach to process scale-up than ever before. Decisions are made with future needs in mind as well as current realities. Most facilities now have to be multiproduct in nature, something with which contract manufacturers have long experience.

“CMOs face unique challenges when producing multiple products for multiple customers,” says Bob Munday (vice president of technical operations at CMC Biologics). “Some challenges include extremely rapid development-to-manufacture timelines. A flexible, robust transfer process must be implemented to support customer demands and increasing industry pressure to execute at-scale good manufacturing practice (GMP) batches without at-scale engineering batches.” On Tuesday morning, Munday will present lessons learned and improvements implemented during development and rollout phases at his company.

Larger biopharmaceutical companies are also sharing related experiences at the BPI Conference. On Wednesday morning, presenters from Pfizer and Genentech/Roche will discuss scale-up of both animal cell culture and microbial fermentation processes.

Single-Use Technology: One strategy that’s helping biopharmaceutical companies modernize is the implementation of disposables throughout their operations. On Thursday morning, Venkatesh Natarajan (senior process engineer III in global engineering sciences at Biogen Idec) will describe his company’s investment in a flexible manufacturing facility for drug-substance production. This ballroom-style plant runs clinical campaigns entirely with single-use systems in a controlled, nonclassified space.

“Multiple programs have been executed in this facility,” says Natarajan. “In addition, cost models have been developed for this facility to enable a variety of comparisons and evaluations.” In his case study, he will discuss those cost models and address operational, regulatory, engineering, and facility validation of executing a clinical program there.

Berthold Boedeker (chief scientist at Bayer Healthcare) is a well-known proponent of the ballroom concept. That same morning, he will elaborate on the “facility of the future concept” with less segregation and parallel operations for several products at a time. Advances in closed processing combine with disposables and continuous processing to support such manufacturing approaches.

One thing that may well be necessary for such ideas to become widespread reality is standardization of single-use systems. But this is still a highly contentious issue, and the lack of standardization has impeded progress on some fronts. So those presentations will be followed by a repeat of last year’s town-hall meeting with participants from multiple industry organizations. This year, they will identify the types of equipment that are most amenable to and most in need of alignment for minimum requirements and/or best practices.

BPI’s marketing and digital content strategist, Leah Rosin, conducted the following interviews as the conference program came together this summer. Participants addressed antibody–drug conjugates (ADCs) and continuous approaches to downstream processing. Here is what they had to say.

Rakesh Dixit (MedImmune)

Rakesh Dixit (vice president of R&D and global head of biologic safety assessment at MedImmune) will be joining us for the “Antibody–Drug Conjugates: Developing ADC Technologies to Increase Therapeutic Windows” symposium on Monday, 20 October 2014. His presentation is titled “Off-Target Toxicities of Highly Targeted ADCs: Risk-Mitigation Strategies.”

Abstract: Technological advances in ADCs with innovative targets, payloads, linkers, and warheads have revolutionized opportunities to treat deadly cancers. Despite new linker technologies with innovative cancer antigen targets, the off-target toxicities of ADCs continue to pose significant safety challenges limiting selectivity and therapeutic index. I will provide case studies and discuss both off-target and on-target toxicities of next-generation ADC molecules as well as opportunities for risk mitigation of toxicity to improve therapeutic index.

Can you explain how much of a problem off-target toxicity is for drug developers interested in ADCs? An ADC has multiple parts: an antibody (which is highly targeted); in most cases, a highly potent cytotoxic drug; and some means of conjugation. That involves linkers and other mechanisms that ensure the combined antibody and cytotoxic drug remain intact in circulation. Basically, the antibody acts as a vehicle to get the drug inside a tumor. But despite key advances in linker technologies for conjugating drugs to antibodies, we still get off-target toxicity: toxicities somewhere other than in tumors. The goal is tumor cytotoxicity alone, but in real life these molecules can enter normal, healthy cells. Once they get into healthy cells, they can release their payload and cause toxicity. This is a huge problem for ADCs, both for molecules in clinical development and for approved products.

Off-target toxicity also means that you might reach the maximum tolerated dose (MTD) at relatively low doses. Thus, not enough of the antibody dose can be delivered to tumors, and that limits efficacy. That’s why this is a huge concern.
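To make that dose-limiting logic concrete, here is a minimal sketch (all numbers invented, not MedImmune data) of how a low MTD narrows an ADC’s therapeutic window, taken here simply as the ratio of maximum tolerated dose to minimum effective dose:

```python
# Hypothetical illustration: off-target toxicity caps the maximum
# tolerated dose (MTD), which narrows the therapeutic window.
# All doses below are invented for illustration.

def therapeutic_index(mtd_mg_kg: float, med_mg_kg: float) -> float:
    """Ratio of maximum tolerated dose to minimum effective dose."""
    return mtd_mg_kg / med_mg_kg

# ADC whose off-target toxicity caps the MTD at 1 mg/kg:
print(therapeutic_index(mtd_mg_kg=1.0, med_mg_kg=0.5))  # 2.0 (narrow window)

# Same payload with a more stable linker, tolerating 6 mg/kg:
print(therapeutic_index(mtd_mg_kg=6.0, med_mg_kg=0.5))  # 12.0 (wider window)
```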

Can you share some details of the case studies you’ll be presenting? In principle, these highly targeted ADCs target antigens present on tumors but not much present in healthy tissue. However, in our experience — despite a low presence of these antigens in normal, healthy tissues — we still see quite a bit of off-target toxicity. This suggests that despite high tumor targeting, some of these molecules could produce off-target toxicities through some nonspecific mechanism by which the antibodies enter normal, healthy cells and produce damage.

When the ADCs were being developed, most of our concern surrounded on-target toxicity. We focused on targets that are predominantly expressed on tumor cells, with limited expression by normal, healthy cells. But in real life, we find that to be the lesser problem. Off-target toxicity is much more of a problem because of how antibodies can enter normal, healthy cells. Once we start using very potent molecules, even at very low concentrations they can produce toxicity in normal, healthy tissues.

The two examples I’ll give cover tumor antigen targets and the off-target toxicities seen with both molecules. Those will be the case studies.

Can you briefly discuss opportunities for risk mitigation? Mitigation approaches are coming with more stable linkers: those that cannot be broken down easily by normal, healthy tissues or in circulation. We also want to look for target antigens that are disproportionately more present in tumor tissue than in normal, healthy tissues. We are learning that it is impossible to find target antigens that are exclusively present in tumors; there will be some presence in normal, healthy tissues.

So what we are trying to do is select targets with much more dominant expression in tumor tissue. We are also trying to mitigate by learning how tumor antigens internalize ADCs: any differences between the way that normal, healthy tissue internalizes them and the way tumor tissues do so. If we can understand those mechanisms, we can create better targeted molecules.

Thus, one method is to come up with better conjugation technology, and the second is to get a better understanding of the way that ADCs are internalized and processed in both healthy tissues and tumor tissues. A third method would be trying to make these products better from a manufacturing perspective: better, more stable conjugates that do not disintegrate or release small-molecule cytotoxins either in serum or even just in formulation. I’ll be discussing some of these approaches.

What other challenges face ADC product developers? Another challenge is the conjugation itself. On paper, it sounds simple: take antibodies and cytotoxic chemicals, put them together through some linker, and you have an ADC. But in real life, conjugating chemicals at specific sites on antibodies is not that simple. In fact, this is probably the slowest process in ADC development. Creating a stable ADC that will survive systemic circulation and biometabolism is the biggest challenge. Chemists, biologists, and protein chemists are working very well together now to come up with more stable ADCs and more site-specific conjugation.

Not all sites on an antibody are equally protected from degradation; some sites are more protected than others. We are learning by trial and error (and some other ways) to conjugate toxic chemicals only at those sites that are not easily broken down by enzymes in a patient’s body. This is where the challenges lie — and opportunities come along with that.

Why are you attending the BPI Conference? My interest is in ADCs, and the list of speakers and topics really impressed me. That made me excited about this meeting. It also helps that I am speaking at the conference.

Robert Gronke (Biogen Idec)

Robert Gronke (senior principal scientist in technical development at Biogen Idec) will be joining us for the “Continuous Processing for Manufacturing” session on Wednesday afternoon, 22 October 2014. Gronke’s case study is titled “Using Continuous Precipitation for the Purification of High Titer Monoclonal Antibodies.”

Can you describe the general difficulties in purification of high-titer antibodies and why standard purification is problematic? Expression titers have been increasing to more than 5 g/L. Generally, the bottleneck in the process is the capture step, which is typically a protein A chromatography column that can handle antibody loading ratios of maybe up to 50 mg/mL. But you have to start using larger and larger columns, and that’s expensive. You have to cycle them more, and the time gets longer, so we are trying to figure out ways to alleviate this bottleneck in the process.
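To put rough numbers on that bottleneck, here is a back-of-the-envelope sizing sketch. The ~50 mg/mL loading capacity comes from the comment above; the harvest volume, titer, and column sizes are invented for illustration:

```python
# Rough sizing sketch for a protein A capture step.
# The ~50 mg antibody per mL of resin figure is from the interview;
# bioreactor volume, titer, and column sizes are assumptions.

import math

def capture_cycles(bioreactor_L: float, titer_g_per_L: float,
                   column_L: float, capacity_mg_per_mL: float = 50.0) -> int:
    """Number of column cycles needed to capture one harvest."""
    mass_g = bioreactor_L * titer_g_per_L
    capacity_g_per_cycle = column_L * capacity_mg_per_mL  # mg/mL == g/L
    return math.ceil(mass_g / capacity_g_per_cycle)

# A 2,000 L harvest at 5 g/L is 10 kg of antibody:
print(capture_cycles(2000, 5.0, column_L=200))  # 1 cycle on 200 L of resin
print(capture_cycles(2000, 5.0, column_L=50))   # 4 cycles on 50 L of resin
```

Either the column (and resin cost) grows with the titer, or the cycle count and processing time do, which is the bottleneck Gronke describes.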

What purification challenges generally come with continuous downstream processing? Most processes are batch designs, whether fed-batch or batch cell culture. Using a perfusion bioreactor would generate more of a continuous process. How do you process things downstream when the upstream is fixed-in-time processing, such as batch culture? One way is to break it into pieces and continuously process small portions, then move that downstream while other batches are being processed upstream. That’s how you might try to turn a batch process into a continuous process.
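A minimal sketch of that “break it into pieces” idea, with a generator standing in for the perfusion harvest; all names, rates, and volumes are invented:

```python
# Sketch: turning a continuous upstream stream into discrete downstream
# sub-batches. All names, rates, and volumes are invented.

from typing import Iterator

def perfusion_harvest(total_L: float, rate_L_per_h: float) -> Iterator[float]:
    """Yield hourly harvest volumes from a perfusion bioreactor."""
    harvested = 0.0
    while harvested < total_L:
        vol = min(rate_L_per_h, total_L - harvested)
        harvested += vol
        yield vol

def process_in_subbatches(stream: Iterator[float], subbatch_L: float) -> Iterator[float]:
    """Accumulate the stream into fixed-size portions for downstream steps."""
    buffer = 0.0
    for vol in stream:
        buffer += vol
        while buffer >= subbatch_L:
            buffer -= subbatch_L
            yield subbatch_L  # hand one portion to the downstream train
    if buffer > 0:
        yield buffer          # final partial portion

for portion in process_in_subbatches(perfusion_harvest(100.0, 4.0), 25.0):
    print(f"send {portion:.0f} L downstream")
```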

Did you develop your method specifically for a single process or product? Will it be implemented for future high-titer monoclonal antibodies as well? The beauty of continuous precipitation is that it tends to work better with higher-titer product streams: the higher the titer, the better the method works. So with the industry moving to higher and higher titers, this approach looks more and more attractive.

We developed this initially for a single product. But once we understood the conditions and reactions and all the necessary elements, we then started looking quickly to see how “platformable” this is. Will it work for every monoclonal antibody, and will it work for products beyond monoclonals? We’ve been successful in demonstrating that it works for a number of monoclonal antibodies.

Are there any challenges or downsides to using continuous processing this way? There are some challenges in using continuous processing downstream if you are not used to running unit operations continuously rather than in a carry-out-a-step, pause, carry-out-a-step, pause rhythm. But the continuous polyethylene glycol (PEG) precipitation process really works best in continuous mode rather than in batch mode because of its ability to deliver reagents and get precipitation to occur uniformly. If you try to do this in batch culture, where you add a reagent and don’t get good mixing in large tanks, then you can get inconsistent precipitation.
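The uniform-delivery point reduces to simple inline-mixing arithmetic: what stock flow rate brings the combined stream to the target PEG concentration? A minimal sketch, with all concentrations and flow rates assumed rather than taken from the Biogen Idec process:

```python
# Inline-mixing arithmetic for continuous PEG precipitation.
# Concentrations in % w/v; all numbers are assumptions for illustration.

def peg_stock_flow(feed_mL_min: float, stock_pct: float, target_pct: float) -> float:
    """Stock flow (mL/min) that brings the mixed stream to target_pct PEG.

    Mass balance: target = stock_flow * stock_pct / (feed + stock_flow)
    =>  stock_flow = feed * target / (stock - target)
    """
    if not 0 < target_pct < stock_pct:
        raise ValueError("target must be between 0 and the stock concentration")
    return feed_mL_min * target_pct / (stock_pct - target_pct)

# 100 mL/min of feed mixed with a 40% PEG stock to reach 10% PEG:
print(peg_stock_flow(100, stock_pct=40, target_pct=10))  # ~33.3 mL/min
```

Holding that flow ratio constant through a static mixer gives every element of the stream the same PEG exposure, in contrast to slow reagent addition into a large, imperfectly mixed tank.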

Can you talk about the regulatory side of this process? How do batch records work in ensuring product quality and consistency? Well, batch records would be designed to carry out a continuous process. They would be written in such a way that you would start a step with in-process monitoring to be sure that everything is consistent. Then there would be some pooling of intermediate at the end, and it would be analyzed again. The whole downstream is not continuous, but the PEG precipitation portion is.

Regulators are really concerned about how to control the process and maintain parameters within a desired state. I think you just have to have the right tools. You can take samples early, middle, and late (for example) to demonstrate that your process has been controlled the entire way. Or you can take alternative approaches: Maybe there is some type of online monitoring tool that you can use to ensure that some parameter is being maintained throughout the unit operation. Flow rate is a simple one, but maybe there is even a level of quantitation around the precipitation as it occurs in real time.
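As a trivial illustration of the early/middle/late sampling idea, the sketch below checks a monitored parameter against acceptance limits across a run; the parameter, values, and limits are all invented:

```python
# Sketch: verify an in-process parameter stayed within its acceptance
# range at early, middle, and late sampling points. Values are invented.

SAMPLES = {"early": 9.8, "middle": 10.1, "late": 10.3}  # e.g., measured % PEG
LOW, HIGH = 9.5, 10.5                                   # acceptance limits

def run_controlled(samples: dict[str, float], low: float, high: float) -> bool:
    """Report each sample and return True only if all are within limits."""
    all_ok = True
    for phase, value in samples.items():
        ok = low <= value <= high
        print(f"{phase:>6}: {value:5.2f}  {'PASS' if ok else 'FAIL'}")
        all_ok = all_ok and ok
    return all_ok

print("process controlled:", run_controlled(SAMPLES, LOW, HIGH))
```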

Lastly, why are you attending the BPI Conference? I think this conference is going to be very interesting based on the agenda, and I’m really happy to be presenting. And it is fairly convenient for me: I live in the Boston area.

David Rabuka (Redwood Bioscience)

David Rabuka (chief scientific officer and president of Redwood Bioscience) will be joining us for the “Antibody–Drug Conjugates: Developing ADC Technologies to Increase Therapeutic Windows” symposium on Monday, 20 October 2014. His presentation is titled “Generation of ADCs Conjugated Site Specifically with Distinct In Vivo Efficacy and PK Outcomes.”

What is SMARTag technology? The SMARTag approach is a platform that essentially provides a way to site-specifically modify a protein of interest and then conjugate it where you’ve programmed it to occur. Essentially this is a chemoenzymatic method. We use formylglycine-generating enzyme (FGE) to create a chemical “handle” on the surface of the protein. Then we can very selectively and specifically conjugate.

For example, we would take an antibody and engineer it to have our chemical handle in the position where you want to conjugate. Then we would selectively conjugate our payload to that particular site to generate a site-specifically modified ADC.

What is the background behind this technology? It came out of UC Berkeley a number of years ago while I was in the laboratory of professor Carolyn Bertozzi. We were working on a class of enzymes called sulfatases. In their active site, they have this interesting amino acid called formylglycine, which has an aldehyde side chain. In about 2004, it was discovered that a second enzyme was responsible for generating this particular side chain on sulfatases. That was called formylglycine-generating enzyme.

That enzyme recognizes a very specific consensus sequence of five amino acids. It binds to that sequence and then oxidizes a cysteine embedded in that sequence to the formylglycine. So you’re going from a thiol to an aldehyde. Our leap of logic was, “What happens if we take that consensus sequence (found only in sulfatases) and insert it into a protein of interest? Will FGE still do its job?” It turns out that it does.

You can then use standard molecular biology to program that consensus sequence into your protein. And as that protein is produced, the enzyme will create formylglycine on the protein, and that’s where we do our conjugation to make a site-specifically modified conjugate. Essentially the “Aha!” moment was seeing that FGE recognizes a particular sequence.
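For readers curious what “programming” the tag looks like in silico, here is a toy scan for the five-residue consensus. The CxPxR motif below is the minimal consensus commonly cited in the aldehyde-tag literature; treat it as an assumption here, not necessarily the exact sequence Redwood uses:

```python
# Toy scan for the FGE recognition motif in a protein sequence.
# CxPxR is the commonly cited minimal five-residue consensus; the
# cysteine (C) is the residue FGE oxidizes to formylglycine.

import re

FGE_MOTIF = re.compile(r"C.P.R")  # C-x-P-x-R, x = any amino acid

def find_aldehyde_tags(sequence: str) -> list[int]:
    """Return 0-based positions of candidate FGE recognition sites."""
    return [m.start() for m in FGE_MOTIF.finditer(sequence)]

# Toy sequence with one engineered tag (...CTPSR...) spliced in:
seq = "MKTAYIAKQRLCTPSRQISFVKSHFSRQ"
print(find_aldehyde_tags(seq))  # [11] -- the C of the embedded CTPSR
```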

At Berkeley, we spent a lot of time thinking about bioorthogonal chemical handles. By bioorthogonal, we mean chemistries that are completely unique to that particular site, without reacting with any other side chain or amino acid that you naturally find on proteins. So this formylglycine ended up being the perfect bioorthogonal “handle” that we could program into the protein. The big discovery for us was taking an enzyme out of its natural context and leveraging it for other purposes.

Has this technology been used in products at the clinical level? So far, the technology is used in preclinical-stage discovery, both by Redwood Bioscience and its partners.

Does Redwood have other products under development? We’ve developed this SMARTag platform and work with a number of partners to help enable their pipelines. At the same time, we have a number of products that Redwood is developing. We have a bit of balance in working with partners while developing a couple of our own compounds.

Why are you attending the BPI Conference? We have the ability to meet with customers and attend valuable presentations. For me, it’s always exciting to get exposed to new industry trends and new technologies. This is one of the larger bioprocessing events. So we can meet with key decision makers who attend, we can drive awareness of the SMARTag technology to prospective customers and collaborators, and we can have the opportunity to evaluate new partnering opportunities. At the same time we can strengthen our existing relationships, given that we are all going to be under the same roof for the meeting.

Andrew Zydney (Penn State University)

Andrew Zydney (department head and Walter L. Robb Family Chair for the department of chemical engineering at the Pennsylvania State University) will be unable to join us for the “Continuous Processing for Manufacturing” session on Wednesday afternoon, 22 October 2014. However, his colleague Oleg Shinkazh (founder and CEO of start-up company Chromatan) will present their material: “Design and Optimization of Countercurrent Tangential Chromatography (CTC) for Monoclonal Antibody Purification.” Although Shinkazh will present their new data, BPI’s marketing and digital content strategist, Leah Rosin, spoke with Zydney this summer.

Can you briefly describe how CTC works and list its advantages? CTC is a new technology for purifying proteins (particularly monoclonal antibodies) using conventional chromatography resins. Separation steps are performed on a flowing resin slurry instead of a packed column. Contact between resin and product stream occurs in a static mixer, and separation between particles and the surrounding fluid occurs in hollow-fiber membrane modules. This allows resin to flow directly from a binding stage through a washing stage and an elution stage before recycling back into a mixing tank, where the resin can be used all over again.

There are three main advantages of CTC: First, because the resin is used as a flowing slurry in hollow-fiber membrane modules, the system operates at very low pressures and thus requires no stainless steel equipment. So this technology can involve single-use disposable flow paths.

Second: Because resin is pumped through the system, it can run continuously. That allows CTC to be connected directly (for example) to a perfusion bioreactor. Finally, because the system has resin in operation at all points in time, it achieves much higher productivities than conventional chromatography. So it uses significantly less chromatography resin for the same separation.

Are there any disadvantages to CTC? The main challenge right now is one that faces all new technologies in bioprocessing: Everyone tends to want somebody else to go first in implementing a new downstream process technology. Right now, there are no commercial data for CTC because the process is still in development. In addition, some small technical hurdles need to be overcome. They include the fact that CTC does require slightly larger volumes of water than conventional chromatography because the resin is used in a suspended slurry instead of a packed column.

Can you tell me a little more about the process you’ll focus on in your talk? Our talk is going to focus on initial product capture from harvested cell culture fluid. We are targeting smaller-scale production levels typical of what you would see in production of a clinical batch: something between 10 and 100 L. And our focus is on monoclonal antibody purification using a protein A resin, which is typically used in the initial capture step right after clarification of cell culture fluid.

Why and how does CTC yield tenfold higher productivity than traditional column chromatography? Productivity in this case is defined as grams of product that can be purified per liter of resin per unit of time. The main difference between CTC and conventional chromatography is that all resin in a CTC system is used effectively at any instant in time. Some resin is mixed with the feed stream to bind antibody product while other resin is being washed, eluted, regenerated, and then reequilibrated for use all over again.

By contrast, in a traditional packed column, only a small fraction of resin is used at any instant in time. At the beginning of a chromatography column process, resin near the exit of the column is just sitting there, not seeing any antibody at all. Toward the end of that process, when most of the column is saturated, the resin at the top of the column isn’t active. Only a narrow band of resin functions in terms of binding product at any instant in time.

In addition, you don’t move on to the washing and elution steps until the entire column is saturated. In countercurrent tangential chromatography, those steps occur simultaneously, and all resin is actively used at all times. So we typically see at least tenfold higher productivity than what you can get in a traditional column chromatography system.
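A toy calculation of the productivity metric defined above (grams purified per liter of resin per hour); the numbers are invented to illustrate a tenfold-style gap, not measured CTC data:

```python
# Toy comparison of the productivity metric: grams of product purified
# per liter of resin per hour. All numbers are invented for illustration.

def productivity(mass_g: float, resin_L: float, hours: float) -> float:
    return mass_g / (resin_L * hours)

# Packed column: 50 L of resin, 3 h per cycle (load/wash/elute/regenerate),
# capturing 2,000 g per cycle:
print(productivity(2000, resin_L=50, hours=3))  # ~13 g/L/h

# CTC-style operation: the same 50 L of resin split across binding, wash,
# elution, and regeneration stages running simultaneously, processing
# 6,500 g per hour of continuous operation:
print(productivity(6500, resin_L=50, hours=1))  # 130 g/L/h
```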

Why are you attending the BPI Conference? What else do you plan to get out of it besides speaking? Obviously, I’m very interested in the opportunity to introduce this new technology to other people. This conference is also an opportunity for us to obtain feedback from end users who might be interested in using this technology in their own downstream processes. We are also at a point where we are beginning to identify alpha and beta test sites where we can actually use CTC on feed streams across the industry to demonstrate the capabilities of the technology. This will be a great opportunity to talk to representatives from companies who might be interested in ultimately adopting or at least testing the technology on their product streams in the near future.

Listen Online! These interviews have been edited from transcripts for space and style. You can access the original conversations at www.bpiroundtables.com/bpi14.