Special Report – 11th Annual BioProcess International European Summit


Now in its 11th year, the BioProcess International European Summit will host more than 400 bioprocessing professionals from academia and industry. Five major streams, seven keynotes, poster presentations, and an exhibition hall will take place 14–15 April at the Swissôtel Düsseldorf Neuss in Germany. Together, these opportunities fulfill the conference’s mission: to be the event where the biopharmaceutical industry connects to share new ideas and innovations across all phases of bioprocess development.

In two days, the BPI European Summit will offer presentations from more than 90 speakers, including representatives from major industry facilities and regulatory agencies. BPI editors asked some of them to share perspectives on key issues facing the bioindustry today, including processing innovations, novel technologies, and therapeutic advancements.

Keynotes
This year’s conference will include keynote addresses during both days, before and after sessions (see “Keynotes” box). Manufacturing topics will focus on quality management, low-cost and high-volume processes, flexible and continuous processing, future technologies, and efficiency strategies. The three presentations highlighted below focus on FDA’s breakthrough designation process, on future technologies, and on the next generation of biologics.

Keynotes
Tuesday, 14 April 2015
8:30 am: Managing Product Quality Across a Manufacturing Network (Anthony Mire-Sluis, Amgen)
9:10 am: Innovation That Drives Process Efficiency (Craig E. Smith, Thermo Fisher Scientific)
9:50 am: Supporting Projects with FDA Breakthrough Designation (Niklas Engler, Roche)
5:35 pm: Challenges and Opportunities in Vaccine Manufacturing (Rahul Singhvi, Takeda Vaccines)
6:05 pm: Future of Biomedicines (Günter Jagschies, GE Healthcare)
Wednesday, 15 April 2015
9:00 am: Manufacturing Plants of the Future (Roman Necina, Baxter GmbH)
9:30 am: The Future of Biologics CMC (Jens Vogel, Boehringer Ingelheim)

Accelerating Development and FDA Approval: The FDA’s new breakthrough designation process is intended to expedite the development and review of drugs for serious or life-threatening conditions. As of 29 December 2014, the agency had approved 16 drugs designated as “breakthrough therapies,” 11 of them first-time approvals for novel drugs. In November 2013, Roche’s monoclonal antibody (MAb) Gazyva became the first drug product approved through this process. Niklas Engler (head of technical development, biologics Europe, at Roche) shares his company’s experience with the process in his keynote address, “Experiences and Strategies to Support Projects with FDA Breakthrough Designation for Biologics.”

How does the FDA’s breakthrough designation affect bioprocessing and technical support teams? “The big challenges we face at the moment aren’t necessarily in protein production; that is fairly well understood. Rather, the challenges are in how technical teams support variations in clinical trials, whether it’s an accelerated program such as FDA breakthrough designation or heavily gated programs where you don’t do any technical support for a long time and then have to catch up very quickly.

“The FDA’s breakthrough designation has created a direct dialogue between sponsors and the agency. For Gazyva, which attacks targeted cells both directly and together with the immune system, we had weekly exchanges with the FDA. The speed at which the FDA was working with us was really impressive. Obviously, the financial and personnel investments are huge for such a product, so you have to make sure all the departments are aligned, including technical, clinical, and nonclinical development. If you are not prepared, then you can lose any accelerated advantage you once had. My presentation will report some recent experiences and strategies to prepare for the acceleration.”

What is the future of bioprocessing? “I think the biomanufacturing industry has come of age. A lot of the technical challenges have been solved in terms of titer and productivity. Now it’s more about speed and scale. The key is using the right technologies and analytics. More than ever before, manufacturers need to focus on speeding up and preventing slip-ups. In previous years, you could afford to lose ground because you knew you could make it up later. That option isn’t available anymore.”

Current and Future Trends: In just a few decades, the industry has experienced great advancements in processing, manufacturing, and analytical methods and technologies. In his keynote address, “Future of Biomedicines,” Günter Jagschies (senior director of strategic customer relations at GE Healthcare Life Sciences) reviews the milestones of the industry and provides his perspective of what the future may look like.

What are the significant trends in the industry? “At the moment, the industry still has different compartments. Some manufacturers are making recombinant proteins and others MAbs, for example. Increasingly that is diversifying. Now, a MAb is no longer simply a MAb; it keeps its functionality, but the molecule’s structure may change. The industry is developing bispecific MAbs, single-chain antibodies, antibody–drug conjugates (ADCs), various kinds of fusion proteins, and so on.

“At the same time, we are seeing a revitalization of the vaccine industry. Ebola has recently reminded us that we still need to cover many more infectious disease threats, but there is also new hope that vaccines could become a profitable undertaking for companies. Everyone understands that vaccines are so much more cost efficient for healthcare because they prevent disease rather than treat patients. For example, the success of Prevnar (Pfizer) and Gardasil (Merck) has shown that manufacturers can, in a relatively short time, get a blockbuster vaccine. And they have shown it can be done with a modular, platform approach, whereas in the past you needed a new manufacturing plant and platform for every vaccine coming to market. The vaccine sector also is consolidating (e.g., Novartis and GlaxoSmithKline). Research is quite active: People are modifying and improving their processes and targeting some of those 30–40 diseases that don’t yet have a treatment. A vaccine for dengue fever seems to be close to market and will meet a very significant demand.

“The industry is also facing greater competition. Biosimilars are coming, as evidenced by Celltrion’s Remsima (infliximab), the first biosimilar MAb to be approved in Europe and now filed with the FDA. It’s to be expected that selling-price pressure will increase. Then even more focus will be on manufacturing and R&D spending.”

What do you see as the future of bioprocessing? “Advanced biomanufacturers can produce titers in the 5–10 g/L range. Those companies are increasingly looking to grow cells at very high cell densities. The seed-train part of the process can be shortened by having a high-density n – 1 seed train that can inoculate a large production bioreactor. We have tested that method in our laboratories. With a high-density culture we can inoculate 500 L from a 1-L seed-train bag. This saves you steps and time in upstream processing, which is still slow overall.
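A quick back-of-the-envelope sketch of the seed-train arithmetic behind that 1-L to 500-L inoculation: the n – 1 culture must reach the target seeding density multiplied by the volume ratio. The target density in the sketch below is an assumed typical value, not a figure from the interview.

```python
# Back-of-the-envelope seed-train sizing for the 1-L to 500-L inoculation
# described above. The target seeding density is an assumed typical value,
# not a figure from the interview.

def required_seed_density(target_density_cells_per_ml, production_volume_l, inoculum_volume_l):
    """Cell density the n-1 culture must reach so that a small inoculum
    seeds the production bioreactor at the target density."""
    return target_density_cells_per_ml * production_volume_l / inoculum_volume_l

target = 0.3e6            # cells/mL, assumed seeding density in the production bioreactor
needed = required_seed_density(target, production_volume_l=500, inoculum_volume_l=1)
print(f"n-1 culture must reach ~{needed:.1e} cells/mL")  # ~1.5e8 cells/mL
```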

“Increasingly, manufacturers are taking an interest in either semi- or fully continuous processes. Those approaches can apply to the capture step alone or combine several steps into one. That intensifies the use of equipment and reduces hold times and nonproductive stages of the process. My company has just introduced a laboratory-scale system for testing a continuous chromatography process. We are also helping people connect several columns into one operation. Continuous downstream processing is reaching manufacturing now.

“However, I would caution that continuous processing isn’t going to come in a rush. It’s still something you either believe in or don’t. There are still people who question whether it makes sense, but everyone is testing it and creating a proof of concept for later implementation. If you consider continuous processing as a concept in which all equipment is kept busy to a maximum degree, though, there is little disagreement that this is the future direction.

“In general (and this is my personal view), the industry will see an intensification of vaccine development and an expansion of therapeutic affordability into new markets and regions. Access is driven by lower prices even as incomes grow in the emerging markets. To penetrate those markets, the industry will have to go local with its products and its marketing strategies.

“I believe that the industry will see more therapies such as antibody–drug conjugates and other, more potent oncology products. Some manufacturers will explore new mechanisms; they will get a better tool kit. There will be more cancer treatments to treat patients in a better, longer lasting, truly lifesaving way. With first-generation biologics, there was a gap between the business and the understanding of disease mechanisms. The industry will close the gap. We will see the same for autoimmune disease. I think we will have the first cell therapies coming through — maybe not in my work life, but they will come as the next level of therapy, lifting us from treatment to cure.”

Future Biomanufacturing Facilities: The need for shorter timelines and reduced costs has affected the way bioprocessing facilities are designed. Roman Necina (vice president of process science and technical operations at Baxter Innovations GmbH) will present “The Future of Biomanufacturing.” He will discuss the importance of flexibility in today’s manufacturing processes for addressing clinical trial risk and meeting specific process requirements.

What are the key challenges and opportunities within the framework of bioprocessing? “Over the past five to ten years, manufacturing, testing, and commercialization of recombinant proteins have become more and more standard activities. It is much more predictable, especially for MAbs or large-volume products. That is what the biopharmaceutical industry is very good at.

“But in the future, we probably will not have as many blockbusters. Future processes will be manufacturing small-volume products and products that are much more complicated than recombinant proteins. The industry will have chemically modified proteins and new gene therapy vectors, for example. That means that the bioprocess industry will need to come up with new solutions for complex new biological entities. From a manufacturing perspective, the biopharmaceutical industry must become much more flexible to accommodate different products in one plant.”

How does flexibility relate to process development and analytics for this next wave of more complex molecules? “From a process and assay development perspective, we have to broaden our range of available technologies and unit operations. So far, we have been focusing on unit operations that are low risk and scalable from a process perspective. They are good enough to meet our requirements. But the more complex and small-volume products we have, the faster and more cost efficient we will have to be. So costs will become a huge driver for innovation and for simplification of our relatively complex manufacturing processes today.”

What is your perspective on the most exciting parts of the industry to come? “The biopharmaceutical industry is nowadays much broader in terms of the molecules we are currently focusing on. We can broaden the range of therapies that we offer to patients and bring new molecules forward with efficient manufacturing and testing processes in place. That should have a positive impact on the affordability of therapies.”

What technologies are you excited about? “We are very excited about continuous manufacturing technology because that enables us to significantly reduce the footprint of future plants. We can reduce the capital needed to build facilities and decrease the overall time from plant design to operation because continuous manufacturing plants are smaller.”

Stream 1: ADCs, Novel Therapies, and Biosimilars
ADCs and biosimilars are poised to become the next industry game changers, following on the success of monoclonal antibodies. By linking the targeting specificity of MAbs with the potency of small-molecule drugs, ADCs represent new opportunities in both disease treatment and market growth. Their development, however, presents unprecedented challenges for researchers and manufacturers. Likewise, emerging therapies, biosimilars, and biobetters now in the pipeline are driving advancements in analytics and development.

ADCs, Novel Therapies, and Biosimilars Sessions
Tuesday, 14 April 2015
11:35 am – 1:00 pm: Quick and Clean ADC Process Development
2:15–3:45 pm: Manufacturing ADCs in Multipurpose Facilities
4:30–5:30 pm: Novel Therapies and Novel Production Methods
Wednesday, 15 April 2015
11:05 am – 12:10 pm: Biosimilars Market/Setting Standards
2:00–3:30 pm: Bringing Biosimilar MAbs to Market
4:00–5:00 pm: Process Development for Biosimilars

ADC Process Development: Minimizing manufacturing costs means reducing development time from bench scale to full good manufacturing practice (GMP) production while addressing project constraints. Eric Lacoste (ADC team leader at sanofi-aventis R&D, France) will discuss QbD strategies for developing a scalable ADC process in his presentation, “From Bench to GMP, Developing Quick and Clean ADC Processes.”

What is the biggest challenge in scaling from a bench-top process to a phase 1–3 clinical trial process? “The main challenge is to make sure that we have a downscale model or at least a representative model that is effective to perform small-scale experiments (to minimize raw material need). That will help us predict what we target at bigger GMP scales and facilities. With different clinical phases, the challenge is to ensure that we produce sufficient and robust data to file a dossier.”

How does quality by design (QbD) address that challenge? “First, we need to reduce time to market with as few resources as possible. Second, we need to secure process robustness to deliver targeted product quality. We try to have a better understanding of the product and process all through development. Therefore, we need to have a good design of process research. Applying QbD rules seems to be an effective way to do it. Indeed, moving along project phases with a rational and/or standardized approach, we will increase product knowledge. Therefore, we expect to generate sufficient data to secure process robustness and thus maximize confidence for dossier approval.”

Biosimilars Development: The biosimilars industry took a major leap forward in 2013 when Celltrion’s Remsima (infliximab) became the first biosimilar MAb to receive regulatory approval from the European Medicines Agency (EMA). On Wednesday, Elizabeth Pollitt (vice president and head of CMC regulatory affairs at Celltrion) will present on “CMC Challenges in Development of Remsima.”

Would you share more about Remsima and its indications? “CT-P13 is a chimeric murine–human immunoglobulin G1 (IgG1) MAb that binds with high affinity to TNFα, in both its soluble (sTNFα) and transmembrane (tmTNFα) forms. Remsima was developed to be a biosimilar of Remicade in terms of active pharmaceutical ingredient (infliximab), formulation, and finished-product strength and presentation. Clinical studies were conducted in ankylosing spondylitis and rheumatoid arthritis patients. Extrapolation to the additional indications of Remicade, namely psoriatic arthritis, plaque psoriasis, Crohn’s disease, and ulcerative colitis, was achieved fully in the European Union, Japan, and Korea, whereas partial extrapolation of indications was accepted by Health Canada.”

What do you think the future holds for biosimilar products? “From a scientific and regulatory perspective, the future of biosimilars is bright. Modern analytic techniques are sufficiently sensitive to identify differences between a biosimilar and reference product where they exist, providing assurance to the regulatory authorities that products demonstrated to be biosimilar will be biosimilar in the clinical setting. The experience of companies and regulators reduces uncertainty surrounding regulatory and data requirements and the design of studies necessary for development. The potential benefits of biosimilars in increasing patient access and reducing healthcare costs are clear. However, biosimilars face competition from originator products, in addition to the competition from novel products and changing standards of care that all products face.”

What are some regulatory concerns when dealing with biosimilarity? “The basis underlying biosimilars in all regions is that a molecule shown to be highly similar to a reference product can be anticipated to behave like the reference product in the clinical setting. The principal regulatory concerns center on the level of data required to demonstrate biosimilarity. For example, the number and age range of batches of reference medicinal product required to be included in testing can differ as well as the statistical hypotheses and tests used to confirm similarity or the absence of a difference. In my talk, I’ll show some analytical data in the context of the regulatory background in different countries to illustrate how the same data can be interpreted differently by regulatory agencies, based on different legislation and regulatory requirements.”

What was the biggest challenge of working within a global regulatory landscape? “In some regions, there is no specific legislation or specific regulatory pathway for a biosimilar, although in some cases there is guidance or guidance adopted from other regions. So existing legislation may be adapted, and the legislative requirements for a biosimilar and demonstration of biosimilarity can be unclear. In many regions, additional data are required to show comparability of the reference medicinal product used in studies with the biosimilar and the reference product sourced from the local market, adding to the analytic comparability studies required.

“Where there is a long history of legislation, guidance, and experience of reviewing biosimilar applications, the regulatory and data requirements are clear. This is exemplified by the approval of 19 biosimilars in the European Union and revision of the EMA scientific guidelines that incorporate the experience gained by EMA and EU assessors in the past 10 years. By contrast, the US legislation on biosimilars (the Biologics Price Competition and Innovation Act of 2009) was signed into law in 2010, and the FDA draft guidance was issued in early 2012. Although several biosimilars have been submitted to the FDA in the past year, it remains to be seen how the guidance will be interpreted in practice and, importantly, whether biosimilars will be accepted by the public and physicians.

“In some regions, there are specific nonproprietary naming requirements. In 2014, the World Health Organization (WHO) published a proposal to add a biological qualifier code to identify active substance from a given source. It remains to be seen whether such a scheme will be widely adopted by regulatory authorities and the impact such a scheme might have on collection of pharmacovigilance data and allaying the understandable concerns from patients and physicians about switching and interchangeability.”

Stream 2: Cell Culture, Upstream Processing, and Process Analytics
In this session, presenters will focus on methods for improving upstream processes through QbD and process analytical technology (PAT) as well as on strategies for overcoming scale-up difficulties and preventing cell culture contamination.

Cell Culture, Upstream Processing, and Process Analytics Sessions
Tuesday, 14 April 2015
11:30 am – 3:45 pm: Translating Scale-Down Models to Scale-Up Success
4:30–5:30 pm: Controlling Pluronic in Cell Culture
Wednesday, 15 April 2015
11:05 am – 12:10 pm: Latest Developments in PAT and QbD
2:00–3:00 pm: Choosing the Best Online Analyzer
4:30–5:00 pm: From Theory to Lab

Developments in PAT and QbD: Julian Morris is a professor and technical director at the Center for Process Analytics and Control Technology (CPACT) at the University of Strathclyde. His presentation, “Improving the Efficiency, Controllability, and Robustness of Pharmaceutical Manufacturing through Smart PAT,” will discuss PAT calibrations and models, calibration transfer, process performance monitoring, and building software-sensor models.

Can you briefly describe the concept of smart PAT? “Some of what the pharmaceutical industry does that is considered new in terms of PAT and QbD has been practiced for some time in the chemicals, materials, formulations, electronics, and other industries. But for biologics and larger molecules, PAT can be much more complicated because of the complexity of both development and processing as well as the range of raw materials that are used. Those factors introduce confounding variabilities into a final drug product. A big issue in the industry now is finding ways to help manufacturers get a handle on that variability. Process analytical technologies and new miniaturized analytics help with that.

“Investing in PAT involves considerable cost. For a near-infrared analyzer, for example, costing say 80,000 euros (even more for a Raman device), there are similar additional costs for installation and ongoing costs for chemometrics, signal processing, and expertise. That’s a difficult step to get over, and it gets even more difficult in biologics.

“However, coming to the rescue is miniaturization, specifically miniaturized probes and spectrometers. Spectrometers now use microelectromechanical systems (MEMS) technology. Hand-held spectrometers can now be connected to miniaturized probes that are about the size of a pencil. New developments are exploring multiple spectrometries — such as near infrared (NIR), UV, and potentially Raman — together in a single probe. A new challenge is being able to integrate all of those data. Smart PAT aims at taking those analytical technologies and using data fusion along with smart chemometrics, smart signal processing, and smart statistics to provide enhanced product and process understanding and know-how. That is what I call smart PAT.”

Would you describe an industrial case study that you will be presenting and what it generally shows about using PAT? “We have had a number of European projects working with various companies and universities. In one study, we had a process that included a chemical reaction, but the chemistry was going wrong within the reaction. It was common practice to take samples from the reaction and analyze the chemistry offline in a laboratory and then apply corrective action to the reaction process. Offline analysis involves taking samples to a laboratory and using laboratory analyzers to tell you what’s happening within your process, which takes time to do. An on-line NIR was installed in one of their reactors with the aim of using the calibration to provide information on the chemistry changes and control them without having to do lab analysis. Unfortunately, the single calibration for reaction composition did not provide the information needed to quickly detect the chemistry changes.

“The solution was to understand the reaction chemistry and develop a method of using that information to advise process operators of reaction changes in real time so problems could be corrected without having to do a laboratory analysis. We used a signal processing technology called wavelets to decompose the NIR spectra into a small number of wavelet coefficients that reflected the chemistry changes and the impact of the reactor temperature on the NIR spectra.

“The end result was online monitoring with NIR — not with calibration prediction but with wavelet coefficients reflecting the process chemistry. Operational staff were then able to monitor what was going on in the reaction and perform real-time corrections within the batch. That eliminated extended batch completion times and potential rework. I’ll discuss that in the presentation.”
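As a rough illustration of that wavelet idea, the sketch below decomposes a synthetic NIR spectrum into a small set of coarse wavelet coefficients and tracks how far they drift from a reference; the wavelet family, decomposition level, and spectra are illustrative assumptions, not details from Morris’s case study.

```python
# Minimal sketch of the wavelet idea described above: decompose an NIR
# spectrum into a small set of coarse wavelet coefficients and track their
# drift from a reference instead of relying on a full calibration model.
# Wavelet family, decomposition level, and the synthetic spectra are
# illustrative assumptions, not details from the case study.
import numpy as np
import pywt

def spectrum_features(spectrum, wavelet="db4", level=4):
    """Return the coarse (approximation) wavelet coefficients of a spectrum."""
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    return coeffs[0]  # low-frequency coefficients capture broad, chemistry-related shape

wavelengths = np.linspace(1100, 2500, 512)                 # nm, synthetic NIR range
reference = np.exp(-((wavelengths - 1700) / 120) ** 2)     # reference batch spectrum
shifted = np.exp(-((wavelengths - 1730) / 120) ** 2)       # peak shift mimics a chemistry change

drift = np.linalg.norm(spectrum_features(shifted) - spectrum_features(reference))
print(f"Coefficient drift vs. reference: {drift:.3f}")     # compare against a control limit
```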

Would you provide more information about the difficult-to-measure critical process parameters and how software sensor models can help process scientists overcome these challenges? “Soft sensors or software sensors use modeling techniques such as partial least squares (PLS), neural networks, or a combination of known process mechanisms such as dynamic mass and energy balances with data modeling of unknown reaction kinetics. Software sensors make use of ‘easy to measure variables’ to predict ‘difficult to measure parameters’ or ‘quality measures’ analyzed in a laboratory.”
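The following minimal sketch shows the general soft-sensor pattern Morris describes, using partial least squares to predict a laboratory-measured quality value from easily measured process signals; the data are synthetic, and the variable choices are assumptions for illustration only.

```python
# Minimal soft-sensor sketch in the spirit described above: a partial least
# squares (PLS) model predicts a hard-to-measure quality value (e.g., an
# offline assay result) from easy-to-measure process signals. The data are
# synthetic and the variable choices are illustrative assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_batches = 200
easy = rng.normal(size=(n_batches, 5))   # e.g., temperature, pH, DO, feed rate, turbidity
true_weights = np.array([0.8, -0.3, 0.5, 0.1, 0.4])
hard = easy @ true_weights + rng.normal(scale=0.1, size=n_batches)  # lab-measured quality value

soft_sensor = PLSRegression(n_components=3)
soft_sensor.fit(easy[:150], hard[:150])   # calibrate on historical batches with lab results

predicted = soft_sensor.predict(easy[150:]).ravel()  # real-time estimates for new batches
print("Predicted quality values:", predicted[:5].round(2))
```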

What is the purpose of using software sensors? “Measuring critical quality parameters can be difficult. If we can’t measure them properly, or we have to measure them off-line, then our process and product understanding is not as good as it could be. Measurement difficulties can have many causes, including a lack of appropriate online analytics or online instrumentation and process operations that depend on laboratory assays. Depending on how those laboratory analyses are carried out, they can also be subject to reliability problems.

“Online sensors may be available, but they might be subject to long delays. For example, a gas chromatograph may have a cycle time of 20 minutes. So you are waiting 20 minutes for the chemical or biological measure.

“We started looking at software sensors in the late 1980s, and the recent rebirth of this concept is very interesting. Back in 1989–1992, we developed software sensors for on-line prediction of biomass (the mass of cells in a fermentor making penicillin). We also developed software sensors for continuous fermentation of a mycoprotein process, which was known as Quorn — an artificial meat. It was produced in a 140,000-L continuous fermentor. The software sensors were used online to predict biomass in that fermentor. So instead of running an assay every eight hours, process operators were able to get a five-minute review of process biomass.”

Stream 3: Downstream Processing and Continuous Manufacturing
Manufacturers continue to take interest in strategies for optimizing their downstream operations. Continuous and semicontinuous processing, automation, and advanced purification technologies are major points of discussion in this session.

Downstream Processing and Continuous Manufacturing
Tuesday, 14 April 2015
11:30 am – 12:05 pm: Removal of Process-Related Impurities and Aggregates During Harvest Pretreatment
12:05–3:45 pm: Development and Implementation of Next-Generation Purification Technologies
4:30–5:30 pm: Acceleration of Purification Process Development and Characterization Using High-Throughput Automation
Wednesday, 15 April 2015
11:05 am – 2:30 pm: Implementation of Continuous Downstream Processing
2:30–5:00 pm: Innovative Purification Techniques for More Complex Proteins

Chromatography: Manuel Carrondo (professor of chemical and biochemical engineering and director of Instituto de Biologia Experimental e Tecnológica, or IBET) will present a case study in the implementation of continuous downstream processing.

Would you describe the semicontinuous chromatographic process that you’ll be sharing in your talk? “I will present work that we’ve carried out on semicontinuous chromatographic processes for purification of viruses. The semicontinuous process — what has been referred to as simulated moving-bed chromatography — has been used for much smaller molecules in the chemical industry for decades. Recently it has been applied to proteins — which, of course, are much larger and much more complex than the small molecules typical of chemical engineering. Now we are also applying that process to viruses, which are even more complex than many large proteins. But the basic technique is the same: With a couple of columns and different valves, you change continuously when you feed, when you add buffers, when you remove material, and so on. By doing that, you are simulating a continuous process; that is, the proteins or viruses ‘see’ the process as being continuous.”
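To make the valve-and-column cycling concrete, here is a minimal sketch of a rotating duty schedule of the kind used in periodic counter-current or simulated moving-bed setups; the three columns and the duty names are illustrative assumptions, not the specific virus process Carrondo will present.

```python
# Minimal sketch of the column-cycling idea behind semicontinuous
# (simulated moving-bed / periodic counter-current) chromatography: a fixed
# set of columns rotates through duties at each valve switch, so the feed
# "sees" a continuous process. Three columns and these duty names are
# illustrative assumptions, not the virus process described in the talk.

COLUMNS = ["column_A", "column_B", "column_C"]
DUTIES = ["load", "wash_and_elute", "regenerate"]   # rotate one position per switch

def schedule(n_switches):
    """Yield (switch index, {column: duty}) for a rotating duty assignment."""
    for step in range(n_switches):
        yield step, {col: DUTIES[(i + step) % len(DUTIES)] for i, col in enumerate(COLUMNS)}

for step, duties in schedule(4):
    print(f"switch {step}: {duties}")
```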

How does this method compare with the standard batch chromatography method? “The method already has been used in the biopharmaceutical industry, and it is starting to be used more generally. Semicontinuous chromatographic methods have been shown to produce higher yields and productivity and even better quality of a final product than traditional methods.

“Furthermore, the industry is becoming more robust — and from that point of view, it is getting closer to the chemical engineering industry. So if you run your chromatography step continuously and you run your fermentation continuously, you can eventually link the two together to create a closed system. A closed system requires fewer intermediate steps and less stringent air control outside of the process. Those are long-term trends that are going to take place.

“For the method’s application to a particular virus, I will show that both productivity and yield improved and the number of purification steps can be reduced.”

Automation: Sibylle Herzer (principal scientist, biologics development and recovery and purification at Bristol-Myers Squibb) will present “Acceleration of Purification Process Development and Characterisation Using High-Throughput Automation.”

Can you describe the problem regarding design-space mapping for tangential-flow filtration (TFF)? “Because of the rapid codevelopment of the production process and formulation, process and formulation incompatibilities are sometimes discovered at the 11th hour, which can put timelines in jeopardy. There is currently no high-throughput option available for screening out TFF conditions, which makes screening and optimization time- and resource-intensive.”

Why is microscaling of TFF fluid mechanics such a barrier? “In my opinion, this is a result of several factors: First, membranes are cast as large sheets. Even when membranes are cut to 50 cm2 surface areas, discrepancies can be seen between scale-up and small scale because of the differences in both the hydraulic resistances and membrane selectivity profiles. Too small an area is simply not representative enough. Such discrepancies may be attributed to differences in membrane properties, and also to difficulties in having the same channel configuration and length for smaller devices.

“So although we have very good mechanistic models for TFF, bridging the gap to a true microscale-down model is not trivial. And although the medical device industry uses small-scale devices regularly, that industry requires demonstrating comparability at scale, not scale-up, which I believe is not an indicator that this gap could be easily closed. For high-throughput screening (HTS), miniaturization is required. It is our ability to replicate while minimizing material consumption that allows us to develop robust design spaces early in process development when material is still limited.”

Purification Technologies: One major difficulty in bioprocessing is addressing the problem of protein unfolding. But manufacturers are making strides in this area. Linda Gombos (postdoctoral scientist at Boehringer Ingelheim RCV) will present “High-Pressure Refolding as an Alternative Technology in Protein Manufacturing.”

Can you briefly describe traditional approaches to protein refolding and what limitations they have? “High-level bacterial protein expression often results in the accumulation of incompletely folded aggregates, known as inclusion bodies. The formation of inclusion bodies facilitates convenient and effective purification of recombinant proteins. However, inclusion body proteins need to be solubilized and refolded to regain their biologically active native structure, which represents one of the most challenging steps in the production of biotherapeutics.

“In general, inclusion bodies are solubilized using high concentrations of denaturants such as urea or guanidinium hydrochloride along with a reducing agent, then refolded by removal of the denaturant in the presence of an oxidizing agent. Because refolding is limited by aggregate formation, low protein concentrations (typically in the range of 0.1–1 g/L) are required to achieve acceptable yields, necessitating very large process volumes. Moreover, refolding often requires long process times (up to 48 hours) and may still result in poor recovery of bioactive protein, rendering this step a bottleneck in the manufacturing process.

“By contrast, high hydrostatic pressure (1–3 kbar) maintains preexisting secondary structure and enables dissociation of aggregates and protein refolding under nondenaturing conditions, thereby inhibiting aggregation-prone intermediates. Consequently, application of high pressure allows for high protein concentrations (up to 30 g/L) in relatively small vessels.”

Why was high hydrostatic pressure applied? “The application of high-pressure technology for protein refolding from aggregates was first published in 1999. Since then, there has been a growing body of literature illustrating the potential of high pressure to refold proteins. Proteins refolded by high pressure belong to different structural classes and vary greatly in size as well as the number of disulfide bridges — suggesting that high pressure may be a universally applicable technique for protein refolding. Besides, high pressure is used in the food industry as an alternative food preservation technology for several types of products such as fruit juices, guacamole, and seafood. Consequently, pressure vessels are available at industrial scale, varying from 55 to 525 L.

“So we set out to evaluate this emerging technology for our specific manufactured molecular formats. We entered a collaboration with BaroFold Inc. and installed a reactor at our microbial facility in Vienna, Austria. It gives us the opportunity to leverage this new technology on a number of proteins and carefully identify its potential economic benefits compared to conventional refolding techniques.”

In your abstract, you mention that there is significantly higher productivity and less alteration of quality using this method. Would you elaborate on that? “When optimizing process parameters, we focus on maximizing throughput by increasing protein concentrations and minimizing refolding times, while maintaining yields. Our experience shows that high-pressure refolding can be performed at protein concentrations between 2 and 30 g/L, typically in two to six hours. The combined effect of higher protein concentrations and shorter process times compared with chaotrope-based refolding improves productivity by one or two orders of magnitude. In addition to evaluating manufacturing feasibility, it is of paramount importance to address concerns on therapeutic efficacy and safety. Therefore, we compared structure, stability, and bioactivity of protein variants refolded using high pressure with those refolded using conventional methods and found no significant difference.”
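As a back-of-the-envelope check on that claim, the sketch below compares volumetric productivity using the concentration and time ranges quoted above (0.1–1 g/L over up to 48 hours for chaotrope-based refolding versus 2–30 g/L in two to six hours at high pressure); the midpoint values chosen are illustrative assumptions.

```python
# Back-of-the-envelope check on the productivity comparison above, using the
# concentration and time ranges quoted in the interview (0.1-1 g/L over up to
# 48 h for chaotrope-based refolding versus 2-30 g/L in 2-6 h at high
# pressure). The midpoint values chosen here are illustrative assumptions.

def volumetric_productivity(conc_g_per_l, time_h):
    """Grams of refolded protein per liter of refolding volume per hour."""
    return conc_g_per_l / time_h

conventional = volumetric_productivity(conc_g_per_l=0.5, time_h=48)
high_pressure = volumetric_productivity(conc_g_per_l=10.0, time_h=4)

print(f"conventional : {conventional:.3f} g/L/h")
print(f"high pressure: {high_pressure:.3f} g/L/h")
print(f"improvement  : ~{high_pressure / conventional:.0f}x")   # roughly two orders of magnitude
```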

Stream 4: Manufacturing, Single-Use, and Quality Management
Increasing optimization in upstream and downstream processes is driving a need for flexible facility designs. Often, those designs include single-use systems, for which standardization and testing strategies continue to be major discussion points for both manufacturers and suppliers.

Manufacturing, Single-Use, and Quality Management
Tuesday, 14 April 2015
11:30 am – 2:15 pm: Interpreting the Latest FDA Guideline on Contract Manufacturing
2:15–4:30 pm: Successful Project Management of Outsourced Partners
4:30–5:00 pm: Win–Win Relationship with Your CMO
5:00–5:30 pm: Update on Continued Process Verification Workstream
Wednesday, 15 April 2015
11:05 am – 2:00 pm: Facilities of the Future
2:00–5:00 pm: Improving Manufacturing Operations

Single-Use Standardization and Testing, BPOG: Kenneth Wong (MTech process technology extractables and leachables lead at Sanofi in Swiftwater, PA) will present the work of the BioPhorum Operations Group (BPOG) in developing a standardized extractables protocol (SEP) for the bioprocessing industry.

Would you talk about the BPOG SEP and what methods it includes? “In November 2014, BPOG published a consensus article in the Journal of Pharmaceutical Engineering on a standardized extractables testing protocol for single-use systems. The SEP applies to several big areas, including sample preparation, model solvents and time points used for extraction processes, analytical methods used to analyze the extracts, and data reporting to customers.

“Although the SEP applies to standardization, it still provides a lot of flexibility so that you can make adjustments when necessary. For example, if you know that your material is not compatible with a certain model solvent that we propose, you can document that and provide some rationale for not using that solvent. Also, when selecting your study, you can use a “family approach.” For example, if a single-use supplier uses one material to construct many different single-use products (e.g., a bag), then that supplier can select a worst case on which to perform a single study. The supplier would then provide that information to its customers, including very concise information of what the data represent and how the data can be used.”

Can you share some discussion points around the BPOG proposed SEP? “Let me go back to why we want to have a standardized extractable protocol. The current situation is that all customers need extractable data. So when we request information from a supplier, we tell them that we need a specific condition because that condition will match our needs. All suppliers are hearing very different messages from different customers. So it is very hard for suppliers to execute a single study that will satisfy all customers.

“When BPOG members come together (at this point we have almost 27 companies, end users, and contract manufacturing organizations), we look at ‘worst cases’ for how single-use components are used by our own members. We then discuss how we can simplify requests to suppliers and come out with a standardized extractable protocol that everyone can live with.

“There are three key concerns. The main one is the model solvent: why we choose the six model solvents compared with others that are proposed elsewhere. The second point is the extraction condition: how the sample is to be treated. And the third is the time point: how many time points are needed to analyze specific component categories, such as tubing, bags, connectors, sensors, and so on.”

Single-Use Standardization and Testing, BPSA: Jerold Martin (senior vice president, global scientific affairs at Pall Life Sciences) will discuss the work and perspective of the Bio-Process Systems Alliance (BPSA) in single-use standardization.

Can you talk about how industry consensus standards development works? “Starting as far back as 2007, BPSA recognized that to facilitate the development of the single-use industry and single-use applications, it would be quite helpful to coordinate the activities of suppliers and somehow standardize their work. That way, end users and regulators would not be confused by competing positions of different suppliers, which would create uncertainty about what best practices should be used.

“In 2007, BPSA began publishing a series of white papers to highlight the best practices and standardization in the industry. That included our first publication, which was just to assemble the existing standards that were already being applied within the pharmaceutical and biotech industries and vaccine industries relevant to single-use equipment. That was published back in 2008. We are actually going through a revision of that (to be published in 2015). The revision updates some of the earlier standards we commented on with regard to tubings, filters, connectors, and biocontainers, expanding those into some areas that really didn’t even exist very much back in 2007 (such as sensors and chromatography membranes and prepacked columns that might be used in single-use). Those have been published along the way.

“We highlighted a piece on gamma sterilization following the ISO 11137 standard and expanding on how it would apply to larger-scale process equipment. We also published two best-practices guides on extractables and leachables back in 2008 and 2010. We met with the FDA and established some best practices that were implemented by a number of bioprocess companies. Those have worked well to date. Now we are focusing on newer standards areas related to extractables testing, control of particulates, and other quality issues as the industry becomes more established. End users want to use single-use technologies not only in the media and buffer preparation — the area where single-use technologies started — but also in bioreactors on the upstream side and all the way down to final bulk drug containers and final filling systems.

“All of these standards help the industry determine best practices, reducing the kind of extra effort that end users have to put in when suppliers operate with different methods or provide results in different ways. We are trying to harmonize all of that moving forward.”

Quality Agreements: A successful relationship between a manufacturer and a contract manufacturing organization (CMO) starts with clearly defined responsibilities for each party. On Tuesday, Peter Calcott (president of Calcott Consulting) will present “Implementing Quality Agreements in Your Operations and Assuring They Meet the New FDA Expectations.”

Would you describe the evolution of quality agreements between sponsors and CMOs? “I first became aware of quality agreements about 16 years ago while working at a relatively small company in the United States. My company worked with a CMO in Europe, and when we approached the CMO to set up systems to get our product into its facility, the CMO brought out a document that was totally new to me: the quality agreement. As I was going through the agreement with the CMO, I became aware that it was a great idea. I took the concept back to my company, and we started implementing quality agreements. It was before the year 2000 that we put them in place. Interestingly, we discovered some US CMOs had already begun using them, as well. It was something that started by accident and then moved to become very mainstream.

“Today, if you work with essentially any European CMO, that company will demand that a quality agreement be put in place. It’s becoming a lot more common in the United States — definitely with CMOs — but also with a lot of pharmaceutical and biopharmaceutical companies. It really has evolved to be mainstream.”

How did the FDA’s guidance change the playing field in the United States? “As I mentioned, quality agreements are becoming common in many companies. In fact, if you look at the quality agreements in place today, many are driven to put into place clear understandings of roles and responsibilities. The practice is becoming a requirement now with the FDA guidance, but it’s also a very good business practice. In many respects, you can look at it as the FDA guidance putting into print what’s been quite common in the industry for a long time, particularly when it involves Europe, but for US companies as well.

“The guidance is changing the playing field because it solidifies expectations for all parties. Obviously, those companies that already have such agreements will read the guidance and say, “Yes, we’re doing that.” Other companies might be contemplating the practice, and the guidance might be the element that pushes them over to do it. Then, of course, for some companies this practice will be totally new. They might see it as a regulatory burden when, in fact, it’s really not a burden whatsoever. It’s a reflection of what I consider to be a good business practice.”

Outsourcing Market: As biopharmaceutical manufacturing capacity use continues to increase, so too does the demand for contract manufacturing services. William Downey (president of High-Tech Business Decisions) will highlight the current and future outsourcing market for CMOs in his presentation, “Biopharmaceutical CMO Market Overview.”

What is driving the increase in the demand for outsourced biopharmaceutical manufacturing services? “Primary factors affecting the growth of outsourced biopharmaceutical manufacturing services include the increased use of biopharmaceutical therapeutics (especially MAbs), the pipeline of biopharmaceuticals in development that are reaching phase 3 clinical trials, and the continuing trend of large pharmaceutical and biotechnology companies to outsource more of their biopharmaceutical production. From our research, we see that large pharmaceutical and biotechnology companies expect to increase the proportion of their manufacturing spending for outsourced activities. However, the expected higher spending levels by those companies are dependent on successful clinical results and commercial launch.” 

What is one concern that biomanufacturing directors experience in choosing among CMOs? “The biggest concern for biopharmaceutical manufacturing directors is mitigation of project risk. Thus, they look for CMOs with experience producing biopharmaceuticals that are technically similar to their project. Furthermore, biopharmaceutical directors look for CMOs with good quality and quality systems, so that a CMO can understand its processes and demonstrate process control.

“In addition, sponsors look for CMOs that have a track record of meeting project timelines. When we ask CMOs the reasons why they win business, they most often mention flexibility, technical expertise, and their track record of on-time delivery.”

Would you provide an overview of how CMOs are changing their strategy and what has spurred that? “Over the years, we have seen biopharmaceutical CMOs offering more services to their clients, not just access to production facilities. We see more companies offering one-stop-shop services and products and lower costs because of better production efficiency. We also see CMOs investing in information technologies that provide their clients with more insight into the process characteristics of their projects. In short, client–CMO relationships are becoming more collaborative in nature.”

Stream 5: Analytics, Quality Control, and Formulation
Real-time analytics and online tools can help manufacturers understand complex processes and formulations. In this session, speakers will discuss strategies for analyzing complex molecules and high-concentration formulations and review approaches for improving efficiency and robustness.

Analytics, Quality Control, and Formulation
Tuesday, 14 April 2015
11:30 am – 1:00 pm: Analyzing Complex Molecules
2:15–5:30 pm: High-Concentration Formulations
Wednesday, 15 April 2015
11:05 am – 12:40 pm: Process Analytics
2:00–4:30 pm: Choosing the Best Online Analyzer
4:30–5:00 pm: From Theory to Lab


Protein Interactions: Current research into high-concentration formulations involves studies of their functional properties through the use of an energetic framework. Thomas Laue (director, Biomolecular Interaction Technologies Center (BITC), University of New Hampshire) will discuss his work in this area during his presentation, “Mechanisms of Protein Association and Their Impact on Solubility and Viscosity.”

Can you briefly describe the proximity energy framework and how it relates to formulations development? “Scientists tend to consider aggregation using a thermodynamic framework, where we infer the mechanism of protein–protein interactions in terms of the initial and final states of the protein. That view leads to a focus on contacts made between the protein surfaces. However, molecules interact through space electrostatically as well as through interactions between their solvent shells. Such interactions take place over distances of 0.1–10 Å and lead to a potential-energy-versus-distance profile that describes the kinetics of those interactions. Negative potential energies result in attraction between adjacent molecules, and positive energies result in repulsion. The profile may have a minimum at very short range (from contact to 0.1 Å or so) and a maximum at 1–3 Å. The maximum potential energy results from charge–charge repulsion. If the maximum is high enough (10–20 kT), then the molecules will never get close enough to reach the minimum, so they cannot aggregate.

“In protein formulations, the solubility of a protein often is dependent on there being sufficient charge–charge repulsion to keep the molecules from colliding. As the concentration of the molecules increases, the height of the maximum decreases due to “molecular crowding,” but the depth of the minimum is not affected. Hence, problems with solubility increase at higher protein concentrations. Such problems may be diminished by increasing the measured charge on the protein and by manipulating the solvent to increase the charge–charge repulsion. There also are interactions between the solvent shells around proteins. Those interactions may lead to viscosity problems at high concentrations and may promote aggregation by decreasing the height of the repulsion maximum. Again, these solvent-shell interactions may be altered by manipulation of the protein and solvent conditions.”
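The repulsion-barrier picture Laue describes can be illustrated numerically. Below is a toy sketch, assuming simple functional forms and parameters chosen only for illustration, that produces a contact minimum and a repulsive maximum and checks the barrier against the 10–20 kT range mentioned above.

```python
# Illustrative numeric sketch of the potential-energy-versus-distance picture
# described above: a short-range attraction plus screened charge-charge
# repulsion give a minimum near contact and a repulsive maximum. The
# functional forms and parameter values are assumptions chosen so the barrier
# falls in the 10-20 kT range; they are not Laue's model or data.
import numpy as np

def proximity_energy(r_angstrom, attraction=10.0, repulsion=50.0, screening=2.0):
    """Toy profile in units of kT: -attraction/r^2 plus screened repulsion."""
    return -attraction / r_angstrom**2 + repulsion * np.exp(-r_angstrom / screening)

r = np.linspace(0.4, 10.0, 500)        # separation in angstroms (illustrative scale)
profile = proximity_energy(r)

well, barrier = profile.min(), profile.max()
print(f"contact minimum ~ {well:.1f} kT, repulsive barrier ~ {barrier:.1f} kT")
print("aggregation suppressed" if barrier > 10 else "aggregation possible over time")
```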

What tools/equipment do you use to determine the thermodynamic and kinetic views of proteins in formulations? “We use analytical ultracentrifugation to monitor protein interaction thermodynamics. There are other methods for assessing protein–protein interactions. In particular, dynamic light scattering (DLS) measurement of the apparent diffusion coefficient as a function of concentration is very useful for assessing the interaction energy. HIC and self-interaction chromatography also are useful. In my experience, charge measurements provide invaluable insights into how a molecule will behave and how to mitigate unwanted behaviors.”

Mass Spectrometry: A well-established technology for structural characterization of complicated molecules such as antibodies, mass spectrometry remains a fundamental tool for analysts. Alain Beck (Center for Immunology at Pierre Fabre and Associate Editor for MAbs) will present “Emerging Mass Spectroscopy ‘Toolbox’ for Antibodies, Biosimilars, Bispecifics, and Antibody–Drug Conjugates.”

Would you describe the toolbox mentioned in your title? “The idea is to adapt different types of mass spectrometry and different tools that tend to give different resolutions and different information. For example, we can have a special focus on glycosylation (which is very important for the activity of cytotoxic antibodies), or we can use special techniques to resolve hydrophobic molecules that are used for ADCs. So for each type of molecule, you can have an adapted mass spectrometric technique.”

What are top-, bottom-, and middle-level structural assessments? “Antibodies are very large molecules (about 150 kDa). So first they are analyzed at the top level without any sample treatment. At the bottom level, you can digest antibodies using enzymes such as trypsin. In that case, you will analyze fragments ranging from 0.3 to 4 kDa. Having all this information together in parallel, you can really have a clear idea of the structure of a molecule.”

Assessing Critical Quality Attributes (CQAs): Markus Haberger, group leader, characterization, at Roche Diagnostics, will present “Assessment of Chemical Modifications of Sites in the CDRs [complementarity-determining regions] of Recombinant Antibodies: Susceptibility versus Functionality of Critical Quality Attributes.”

What is important about understanding the susceptibility and functionality of critical quality attributes in recombinant antibodies? “A stress model system can help manufacturers identify potential chemical degradation sites of therapeutic proteins, such as for asparagine deamidation, methionine oxidation, and so on. But it is important to know that not all highly susceptible sites are critical for protein function. In my presentation, I will show examples in which the functionality of a protein was and was not affected by chemical posttranslational modifications. None of the degradation products we investigated led to a complete loss of functionality of our therapeutic protein.”

In your abstract, you mention a number of methods and tools that are used to measure the effects of stress conditions. Would you elaborate on why those tools are useful? “In the first steps after applying harsh stress conditions, it is very important to prove that your protein integrity is still as desired. Therefore, we used classical biochemical analytical tools like size-exclusion chromatography and ion-exchange chromatography. For the detection and relative quantification of chemical modification sites, we used liquid chromatography–mass spectrometry (LC–MS) tryptic peptide mapping. In our case, we used an ultraperformance liquid chromatography (UPLC) system combined with an Orbitrap or time-of-flight mass spectrometry (TOF–MS) system. That combination provides very high resolution and sensitivity. Most modification sites can be detected only with specific ion chromatograms. They cannot be detected in the total ion chromatogram of the MS system or by UV detection. For functional testing, we used surface plasmon resonance (SPR) and bioassays, but my presentation will focus only on SPR.”
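To illustrate the extracted (“specific”) ion chromatogram idea Haberger mentions, the sketch below sums signal only within a narrow m/z window around a peptide of interest at each retention time, so a low-abundance modified peptide is not buried in the total ion chromatogram; the scan data, target mass, and tolerance are synthetic assumptions.

```python
# Minimal sketch of the "specific (extracted) ion chromatogram" idea mentioned
# above: sum intensity only within a narrow m/z window around a peptide of
# interest at each retention time, so a low-abundance modified peptide is not
# buried in the total ion chromatogram. Scan data, target mass, and tolerance
# are synthetic illustrative assumptions.
import numpy as np

def extracted_ion_chromatogram(scans, target_mz, tol_ppm=10.0):
    """scans: list of (retention_time, mz_array, intensity_array) tuples."""
    tol = target_mz * tol_ppm * 1e-6
    return [
        (rt, float(intensity[np.abs(mz - target_mz) <= tol].sum()))
        for rt, mz, intensity in scans
    ]

# Two synthetic MS scans; the modified peptide (m/z 655.32) appears in the second
scans = [
    (10.0, np.array([500.10, 655.90, 900.40]), np.array([1e5, 2e4, 5e4])),
    (10.5, np.array([500.10, 655.32, 900.40]), np.array([1e5, 8e3, 5e4])),
]
print(extracted_ion_chromatogram(scans, target_mz=655.32))  # signal appears only at 10.5 min
```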

Would you briefly describe your research? “In my presentation, I will describe our generic internal strategy on how to identify potential degradation sites of a therapeutic antibody and which specific stress systems we are using for the identification of chemical degradation hot spots. Those stress systems were adjusted specifically for the slightly different molecules we have in our pipeline.”

Monday, 13 April 2015, 10:00 am – 4:00 pm
Continuous Processing: Strategy, Technologies, Economics and Challenges (Margit Holzer, scientific director at Ulysse Consult, France, and Roger-Marc Nicoud, founder and CEO at Ypso-Facto, France)
Extractables and Leachables: Study Design for Disposables Systems (Kenneth Wong, MTech/process technology — extractables and leachables at Sanofi and BPOG representative, USA)
Practical Quality By Design for Biopharmaceuticals (Richard Dennett, director, Voisin Consulting Life Sciences, France)
Thursday, 16 April 2015, 9:00 am – 3:00 pm
The Challenge of Protein Aggregation and Subvisible Particles in Biopharmaceuticals (Professor Tudor Arvinte, School of Pharmacy Geneva–Lausanne, Switzerland, and CEO of Therapeomic Inc.)
Process Analytical Technology (Professor Julian Morris, technical director, Centre for Process Analytics and Control Technology, CPACT, University of Strathclyde, UK)
Outsourcing, Technology Transfer, and CMO–Client Relationships (Firelli Alonso-Caplen, senior director, external supply, Pfizer, Inc., USA, and Morten Munk, vice president of business development, CMC Biologics A/S, Denmark)

Maribel Rios is managing editor of BioProcess International, mrios@bioprocessintl.com.

Listen Online! These interviews have been edited from transcripts for space and style. You can access the original conversations at www.bioprocessintl.com/BPIEU-2015.