Since the concept of “well-characterized biologics” entered the biopharmaceutical industry’s vernacular late in the 20th century, increasing emphasis has been placed on product and process characterization. Analytical laboratories have always been vital to bioprocessing, whether performing in-house quality assessments; preformulation and other product characterization; process scale-up and optimization support; or outsourced viral safety and product testing. Recent advances in analytical technologies such as high-performance liquid chromatography (HPLC) and mass spectrometry (MS), as well as other forms of spectroscopy, have made it possible for companies to know more about their protein products and manufacturing processes than ever before. And regulators, in turn, are asking for more data when it comes time to inspect facilities and review market applications.
What’s Different: For most of the BPI Conference’s history, analytical topics were sprinkled throughout the program. Cell-line characterization and media formulation happened upstream, process optimization and product characterization downstream. Scale-up had its own track, and formulation had its own colocated conference. For 2013 the latter remains a track apart while analytical and quality issues have come together in sessions of their own.
Look back at early programs, and you’ll be hard-pressed to find mention of design of experiments (DoE) or biostatistics anywhere. Another rarity was discussion of extractables and leachables from plastic, but those subjects were certainly brought up in the first questions attendees asked when confronted with single-use technologies. Meanwhile, process analytical technologies (PATs) and quality by design (QbD) were little more than twinkles in the regulators’ eyes. Now they’re finding their way from process development and manufacturing suites into analytical and formulations laboratories — just like good manufacturing practice (GMP) itself did many years ago.
What Remains the Same: Every biomolecule is different. Every protein, peptide, oligonucleotide, or cell therapy, whether therapeutic or vaccine, diagnostic reagent or parenteral drug, liquid or lyophilized or otherwise, poses unique challenges to the analytical laboratories tasked with characterizing it. Despite all efforts in platform technologies, even similar molecules such as monoclonal antibodies (MAbs) cannot be expected to behave the same. They react differently to changes in temperature, pH, osmolality, and so on. Some proteins are more stable in solution than others. Some are prone to aggregation in formulation or to numerous types of degradation in vivo. Simply shake a vial, and you might watch clouds of valuable product appear and settle uselessly to the bottom. The more we learn about these products, the more we realize just how fragile they can be — and how best to provide them with the tender loving care that they need. Analysts have more (and more powerful) “toys” to play with now — but with great power comes great responsibility.
Where Do We Go from Here? Under QbD, you can’t determine your product’s design space without extensive characterization work. Your process won’t maintain itself within those specifications without some kind of monitoring and controls to keep it there. You can’t prove your follow-on product’s biosimilarity to an original without intense analysis of both. You can optimize your culture feed, determine the best separation parameters, and find the right excipients for your product formulation without laboratory automation and DoE, but it’s going to take a lot longer to do so! And as the old adage says, “time is money” — especially in the biopharmaceutical industry.
As always, the BPI Conference is all about the people involved in the industry and the work that they do. IBC’s Jennifer Pereira spoke with several presenters this past summer about their topics as well as experiences with this event over the years. Here, in Q&A format, is what they had to say.
Parastoo Azadi, PhD (technical director of the University of Georgia’s Complex Carbohydrate Research Center), will be joining us for the “Analytical Strategies for Biosimilars” session on Tuesday morning, 17 September 2013. Azadi’s presentation is titled “Challenges in Comparability Studies of Carbohydrate-Containing Biosimilars.”
What glycoconjugate biosimilars have been approved and for what market? Well, no actual biosimilar has been approved in the United States. Enoxaparin, a low–molecular-weight heparin, has been approved, but not under exactly the same guidelines as a biosimilar; it is really the only kind of carbohydrate biosimilar that has been approved here. In Europe, however, a couple of glycoproteins and antibodies are approved for the European and other markets outside the United States. But biosimilars are still new, and the US market will be opening up. It is an unknown frontier for biotech companies and the US Food and Drug Administration (FDA). So there are many things to learn about biosimilars and carbohydrates.
What data are most important to get biosimilar approval? Glycans are very heterogeneous. You need to know not only the composition of the glycans — how much mannose, glucose, and galactose is present — but also the linkage of those residues to one another and their sequence. So there are really three or four steps to the structural characterization of biosimilar glycoproteins: the composition of the attached glycans, their linkages, the oligosaccharide and polysaccharide sequence, and the percentage of overall carbohydrate in a sample. Is it 5% carbohydrate, or is it 100% carbohydrate? What is the overall sequence?
Why do some laboratories need help in fully characterizing their glycan structures? Not many people have specific expertise in carbohydrate analysis, and that specific expertise is needed. Even larger companies need special expertise for looking at oligosaccharide and polysaccharide sequences, any noncarbohydrate constituents that may be present, and whether molecules are sulfated or acetylated. The position of those noncarbohydrate constituents is quite difficult to determine. So I picture a future in which companies need a lot of help and expertise in looking at these intricate details of carbohydrate analysis.
Where is the technology heading with carbohydrates and biosimilars? We will need to determine the exact structure of a carbohydrate within the biosimilar and compare that with the innovator. So we need very accurate data. Probably the future is using mass spectrometry in combination with chromatography. We are separating out carbohydrate components and using a number of new fragmentation capabilities in mass spec to sequence oligosaccharides. Of course, nuclear magnetic resonance (NMR) and 2-D NMR will always be there to accompany mass-spec data when they are ambiguous or fragmentary. Combining LC–MS with NMR will be important going forward for biosimilar characterization.
What processing challenges come with carbohydrate-containing biosimilars? Very subtle differences in the manufacturing process will make a difference in the carbohydrate sequence, structure, and composition, depending on the production cell line. If the biosimilar company’s cell line is in any way different from the innovator’s, then different carbohydrates will be put on that glycoprotein. In the case of polysaccharides — e.g., heparin and low–molecular-weight heparin — subtle differences in manufacturing will make a difference in their final molecular weight and maybe the position of the sulfate group.
Intricate details will be very different. So it is very important to ensure that the sequence and noncarbohydrate constituents are properly determined for a biosimilar. We need to make sure that they are as close to the innovator as possible using the best available technology.
Dave Kolwyck, MS, MBA (principal scientist in material science at Amgen) will be joining us for the “Consistency Through Control of Raw Materials and Media Optimization” session on Wednesday morning, 18 September 2013. His case study is titled “Controlling Process Variability Due to Raw Material Variability.”
Can you describe some challenges that pharmaceutical companies face in managing their raw materials? One challenge is the breadth of suppliers to the industry. Some have products and services focused specifically on the biopharmaceutical industry; others became suppliers to the industry indirectly because biopharmaceutical companies adapted their products and services. And some may be unknowing suppliers to the industry: They may make process changes that they don’t expect to affect our processes because they never intended their products for this industry.
Analytical and Quality Sessions
Tuesday, 17 September 2013
8:00–8:45 AM Balancing Analytical Requirements with Limited Resources
8:45–10:45 AM Comparability and Similarity
10:45–11:45 AM Analytical Strategies for Biosimilars
11:45 AM–12:45 PM Concurrent Technology Workshops
1:45–3:30 PM Analytical Technologies for Diverse Molecules
Wednesday, 18 September 2013
8:00 AM–12:00 PM Product Attribute Control (PAC): A Future Vision for Protein Process Development and Manufacturing
12:00–12:30 PM Concurrent Technology Workshops
1:45–3:30 PM Analytical Strategies for Synchronized Formulation and Device Development
Thursday, 19 September 2013
8:00 AM–12:00 PM Quality Risk Management and Quality Auditing
12:00–12:30 PM Concurrent Technology Workshops
1:40–3:15 PM Control of Process and Product Quality
3:45–5:15 PM Advances in Analytics and Models to Aid Downstream Process Development
A good example would be bulk chemicals. They may contain various impurities, such as trace metals and organics. Those might have little effect on products in an industrial application, but they do affect our cell culture processes because of the very sensitive nature of the cells and processes using those raw materials. So that is a challenge. An opportunity for the industry is to reach out to these suppliers and begin to better understand the realities of their manufacturing processes. How can we work with them to reduce the variance in the raw materials they supply, specifically the variance that can affect our processes? How can we take the information that we get from those suppliers and engineer robustness into our own manufacturing processes so they are not as sensitive to that variance?
I think there is an opportunity for dialogue and to improve our processes both ways — both on the supply side and the end-user side. And I think the BPI Conference provides a great opportunity for us to engage in those discussions and have that dialogue in a meaningful way.
What technologies offer solutions to current raw material management challenges? If you look at the bill of materials for a manufacturing process, there can be literally hundreds of raw materials on that list. Some are actually aggregates of multiple raw materials themselves. A good example is cell culture media. Chemically defined media often contain 60–70 individual components that are blended together. That “recipe” of chemicals is what’s used to grow our cells. Each of those individual components may involve a certain amount of variance, and each may carry 8–10 different specifications. To track all of those, you’re talking about 8–10 specifications for each of 60–70 components, and lots of those raw materials could vary throughout the year. Now we’re looking at thousands of attributes and trying to trend across lots of subcomponents to an individual or primary component in a cell culture process.
Digital transfer technologies help us manage that large amount of data. All of this information is imported into databases so we can use tools such as multivariate analysis or principal-component analysis to look for trends that correlate with outputs in our manufacturing process. Through those trends, we can start to gain knowledge and insight as to which individual components in our bill of materials may actually be driving variability in our process. That just wouldn’t have been feasible if we were manually trying to review all the data and look for correlations and relationships between individual raw materials and our manufacturing output.
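To illustrate the kind of multivariate trending Kolwyck describes, here is a minimal sketch in Python: per-lot raw-material attributes are standardized, projected onto principal components, and each component’s scores are checked for correlation with a process output. All data, column names, and the choice of scikit-learn are hypothetical illustrations, not a description of Amgen’s actual systems.

```python
# Minimal sketch: trending raw-material lot attributes against a process
# output with principal-component analysis. All values are hypothetical.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical certificate-of-analysis data: one row per raw-material lot,
# one column per measured attribute (e.g., trace-metal impurities in ppm).
lots = pd.DataFrame(
    rng.normal(size=(40, 5)),
    columns=["Cu_ppm", "Mn_ppm", "Mg_ppm", "Fe_ppm", "moisture_pct"],
)
# Hypothetical process output (e.g., final titer), partly driven by copper.
titer = 2.0 + 0.5 * lots["Cu_ppm"] + rng.normal(scale=0.2, size=len(lots))

# Standardize attributes, then project each lot onto principal components.
scaled = StandardScaler().fit_transform(lots)
scores = PCA(n_components=2).fit_transform(scaled)

# Correlate each component's scores with the output to flag which
# combination of attributes trends with process performance.
for i in range(scores.shape[1]):
    r = np.corrcoef(scores[:, i], titer)[0, 1]
    print(f"PC{i + 1} vs titer: r = {r:.2f}")
```

In practice, the value of such an approach comes from automated import of supplier data into a database, so the analysis can run across thousands of attributes rather than the handful shown here.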
How can suppliers support raw material variation monitoring? We are looking in particular at analytical technologies that support impurity profiles. Several pharmaceutical companies have published on trace-metal impurities: on copper, manganese, and magnesium levels and some of the other divalent cations. Those are some things we can look at in the raw materials, particularly as they relate to upstream cell culture processes. The divalent cations are known to have various stimulatory and signaling activities in cell cultures and to activate catalytic processes that drive metabolism and product-quality attributes (PQAs), particularly carboxylation.
Historically, cell culture raw materials used upstream were characterized for bioburden and endotoxins. Some suppliers have been able to create a theoretical contribution based on individual levels of endotoxin and bioburden in each component used to create a culture medium. You could envision that those same suppliers — if they were characterizing trace metals — could create a total theoretical contribution for trace-metal impurities in media formulations as well. We could use those data, then, to look at the sensitivities of our cell lines and processes and engineer them in such a way that they will be more robust and less affected by the subtle variations in trace metals that come along with media components.
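That “total theoretical contribution” is essentially a weighted sum: each component’s measured impurity level scaled by its inclusion rate in the formulation. The short Python sketch below illustrates the arithmetic; all component names and values are invented for illustration.

```python
# Minimal sketch: total theoretical copper contribution to a culture medium,
# summed across components. All names and numbers are hypothetical.
components = [
    # (name, inclusion rate in g/L of medium, copper impurity in µg/g)
    ("glucose",     5.0, 0.02),
    ("amino_acids", 2.5, 0.15),
    ("trace_salts", 0.1, 4.00),
]

total_cu = sum(g_per_l * cu_per_g for _name, g_per_l, cu_per_g in components)
print(f"Theoretical copper contribution: {total_cu:.2f} µg per liter of medium")
```

A supplier (or end user) could compare such a lot-specific estimate against a cell line’s known sensitivity range before releasing or accepting a media lot.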
Organic and polymeric materials have been known to affect cell culture processes, buffers, and the dissolution of raw materials. Spectral methods — NIR, Raman, and so on — aren’t applicable to every media component. (For instance, Raman does not work with inorganic salts; you need a certain level of vibrational energy to detect a Raman spectral shift.) But those technologies are useful in generating a rich data set around raw material components in media, and maybe in indicating potential process effects.
What are the dangers of not taking this seriously enough? Sometimes in the pharmaceutical industry we don’t understand which changes at the supplier level could affect our manufacturing processes. We develop a very rigid state in which we essentially try to force suppliers not to make any changes, whether those changes would be beneficial or their effects unknown. I think that starts to constrain the ability of our suppliers to improve their manufacturing processes. And it ultimately indicates a low level of understanding regarding how those raw materials affect our process. If a change does occur, it could take us some time to react to it.
I think that these types of technologies — if implemented proactively — will allow us to begin to understand and appreciate the normal variance of the raw materials used in our processes. As suppliers work to make changes, we have a better opportunity to understand what their potential impact will be on our processes.
This benefits us in two ways. First, we can assess whether a change is likely to affect our process and what the likely output will be, so we don’t necessarily have to do a full requalification ahead of time when we know a change is unlikely to have an impact. Second, if a change potentially would affect our process, we are in a better position to design experiments that accurately assess its impact and ensure that our process will remain consistent despite it. By leveraging the data we’ve already generated, we can reduce the number of new studies we have to run and make those studies more precise in assessing the impact of a given change.
Both of those things are critical in moving toward a QbD approach with our suppliers as they work to make their processes and raw materials more amenable to — and more focused on — the biopharmaceutical market. We can assess and manage their changes efficiently, allowing them to be made in a timely way, and we can benefit from those changes through a lower level of variability in our own manufacturing processes.
Can you tell us about your experiences with the BPI Conference? I’ve been at BPI Conferences both as a supplier and now as an end user in the pharmaceutical industry, which probably provides a unique experience. One of the neat things about the conference is that it is an excellent way to discuss new ideas, to see some new technologies being developed by suppliers to help out the industry. And it’s a good place to provide feedback on those new technologies. I think it’s a good way for the industry as a whole to share needs and concerns with the supplier base that supports it.
If you have some regulatory input into that conversation, I think it’s particularly useful. Then you have all three parts of the biomanufacturing triangle: the manufacturers, which are the biopharmaceutical companies; the suppliers who supply that manufacturing unit; and the regulatory bodies that oversee it and ensure patient safety. If all three can be together in this conversation, you can really have some beneficial outcomes. Especially with the dialogue and questions that arise after presentations, you can gain some insight and increased understanding across the whole industry regarding ways to move forward.
The most interesting thing is always new technology. I remember when the first 2,000-L bioreactors for disposable manufacturing systems were rolled out at the BPI Conference. I think it will be interesting to see how disposable manufacturing technologies have evolved. There have been some announcements in the industry about leachables and extractables and about understanding the effects of raw materials pre- and postirradiation. We are getting more sophisticated there in understanding the polymer chemistry around packaging systems. Now what will be interesting to see is how other raw material suppliers respond to that, both as developers and users of disposable technology.
The other interesting area is analytical instrumentation. I’m looking for more real-time monitoring systems. With improvements in disposables and digital data acquisition, now there is a real opportunity to truly realize the dream of continuous monitoring for active control of bioprocesses. I’m interested to see how QbD and analytical monitoring equipment have progressed, particularly in relation to disposables.
Robert Steininger (senior vice president of manufacturing at Acceleron Pharmaceuticals) will be joining us on Thursday afternoon, 19 September 2013, for a town-hall–style forum on “Harmonization of Single Use Systems: Extractables and Leachables and Other Initiatives.” He will be offering an ASTM perspective on leachables and extractables.
How and why is the American Society for Testing and Materials (ASTM) involved in setting standards for single-use technology? Both the users and suppliers of single-use technology would like to see more consistency between the information that is provided and what is actually requested. Also, there could be more consistency in the methods by which that information is gathered and transmitted.
The process for setting a standard requires consensus among all the members of ASTM. That involves a very controlled and balanced set of voting members — both users and makers of a technology — as well as consultants and regulators. In this case, each organization has one vote, and a rather tough vetting process ensures that all comments and objections are addressed in the final draft. That draft has to be approved by 90% of all members. So the process is recognized as being very fair. Standards are acknowledged by the US Food and Drug Administration (FDA) and other regulatory authorities as the opinion of experts in the industry. As such, the draft carries considerable weight. Regulatory authorities can follow it as a way to minimize their own workload.
Who’s involved in setting this standard? We have enlisted quite a varied group of individuals, including some members of the BioPhorum Operations Group (BPOG), which is primarily made up of users from major pharmaceutical companies; BioProcess Systems Alliance (BPSA) committee members; and members of the Parenteral Drug Association (PDA) and the International Society for Pharmaceutical Engineering (ISPE), who have been involved in developing documents on single-use technology. The core of the biopharmaceutical subteam within E55.04 — ASTM’s biopharmaceutical subcommittee — comprises users, consultants, and suppliers of products and services who are very familiar with existing documents being generated by other groups.
Formulation and Delivery Sessions
Tuesday, 17 September 2013
8:00–11:45 AM Particle Identification and Characterization for Realizing Stable, Safe and Effective Formulation
11:45 AM–12:45 PM Concurrent Technology Workshops
1:30–3:15 PM Analytical Strategies for Determining Formulation Stability
Wednesday, 18 September 2013
8:00–9:45 AM Product Attribute Control (PAC): A Future Vision for Protein Process Development and Manufacturing
10:30 AM–12:00 PM Strategies for Achieving Stability for High-Protein Concentration Formulations
12:00–12:30 PM Concurrent Technology Workshops
1:45–3:30 PM Analytical Strategies for Synchronized Formulation and Device Development
Thursday, 19 September 2013
8:00–10:15 AM Strategies for Achieving Stability for High-Protein Concentration Formulations
11:00 AM–2:15 PM Biosimilar Formulation and QbD Considerations for Biologic Development
12:00–12:30 PM Concurrent Technology Workshops
2:15–5:15 PM Localized and Targeted Delivery Strategies
What are the main industry concerns? The major concern is how to agree upon a standard approach. It is a huge consensus-building task, and our goal is to avoid debating so long that the effort fails to produce standards that are useful in the near term. The key is setting timelines that keep us focused on getting standards out there for people to comment on, and I hope we can get them approved in a relatively short period.
There are many areas I think the biotechnology industry could agree upon and should look at simply because we’ve done these things so many times. One specific area that ASTM is also focusing on is setting standards for virus-removal process steps to take advantage of the nearly 30 years’ worth of data. We’ve already generated the data and can use those results to set standard processes that could be used consistently by others.
How will standards affect process efficiency, product quality, and costs associated with manufacturing? I hope that by setting standards, our industry won’t have to reinvent the wheel time and time again. What we do, particularly in validation, is often to confirm data that we already have. By avoiding that, we could save money, time, and effort. This could also provide a standard way for new companies coming into the business to be assured that their own processes for biotherapeutics can be safe and deliver products that are safe for patients, particularly in clinical trials.
Listen Online! These interviews have been edited from transcripts for space and style. You can access the original conversations at