An Interview with Chris Roberts, Professor of Chemical and Biomolecular Engineering at the University of Delaware (UD)

Chris Roberts

We sat down in December 2020 over a virtual cup of coffee to interview Chris Roberts, Professor of Chemical and Biomolecular Engineering at the University of Delaware (UD). He received a Bachelor of Chemical Engineering degree from UD and a Ph.D. in Chemical Engineering from Princeton University. His research spans experimental and theoretical fundamentals and applications of protein physical and chemical stability, (mis)folding, aggregation, statistical mechanics, and kinetics, applied to questions of biopharmaceutical product design, stability, and manufacturing.

Editor:
Why did you choose protein physical and chemical stability, misfolding, aggregation, statistical mechanics, and kinetics, applied to questions of biopharmaceutical product design, stability, and manufacturing, as your area of study?

Chris Roberts:
When I was working as a formulation scientist in the late 90s and 2000s, a perennial problem for us was protein aggregation. There were only a few academic labs in the U.S. that worked heavily on industrially relevant protein aggregation problems (for example, the John Carpenter lab and the Ted Randolph lab), and far fewer people working in this area than today. My background is in chemical physics and biophysics, and I saw many analogies to the protein folding field. I had the opportunity to move to academia, to one of the top chemical engineering departments in the world, so I decided to build my lab around protein aggregation. I had been working to attack the issue at a fundamental level while also developing solutions that would be practical for industry to deploy. Many labs approached the problem in hindsight, trying to explain results after the fact, but few were trying to be truly predictive.

There was, and still is, little focus on the challenge of combining high-quality experimental analytics and formulation science with cutting-edge modeling approaches to give the field more of a physics basis. The field has evolved since we were doing this in the 90s, but there is still a long way to go.

Editor:
What would be the one major area of focus you recommend for the future?

Chris Roberts:
It would be to try to be predictive across a range of indicators of the biophysical stability of therapeutic proteins, for example aggregation, in the sense of partially unfolded species that are difficult to reverse. These aggregates can be soluble, insoluble, or so-called subvisible particles that may form through adsorption at bulk interfaces (such as liquid-liquid, solid-liquid, and vapor-liquid). This may also include behavior due to protein-protein cross-interactions as one looks at more complex products. My lab is interested in both experimental predictive measurements and fundamental computational predictors, and both have value, whether approached empirically or phenomenologically or from first principles.

Similarly, studying changes in conformational stability is important, which ties into the need for spectroscopy solutions. Microfluidic Modulation Spectroscopy is one example of a technique for examining those factors in protein solutions.

Editor:
What have been the biggest challenges the protein characterization sector has faced over the past few years?

Chris Roberts:
It depends on what stage of the drug development process you are in. At the early stages, the challenge is how to be very sample-sparing. In early discovery, a major issue is that sample material is limited, yet you need to screen many candidate molecules for selection.

At the early stage, you need a prediction of how the molecules will behave at the late stages, as quickly as possible and with as little material as possible. At the later stages, you will have locked in the molecules to pursue from a clinical perspective, and the task becomes how to formulate around any problems the molecule might have. So, at the beginning, you are trying to pick the best molecule with very little material; later, the question is which experimental indicators best predict whether a given molecule will behave well in standard formulations.

One can create models that correlate data, but a key challenge is that we need to conduct many assays to truly characterize the protein. We can't completely characterize proteins in the same way as small molecules, where solid-state characterization, an NMR structure, and a validated HPLC method may cover most of the key techniques. With proteins, one can have dozens of assays to properly characterize the product. Add in sample-sparing constraints, the need for fast, high-precision analyses, a wide detection range, and the ability to use the technology without diluting the sample, including at high protein concentrations. One would benefit from instruments that achieve these simultaneously; many of these analyses are perhaps fine to run off-line, but if they can be done at-line or in-line, that will strengthen the connections between development and manufacturing efforts.

Editor:
What challenges do you see for market adoption of this type of technology used in-line?

Chris Roberts:
One thing would be how complicated or "dirty" a sample can be and still be handled. Can you take something directly off a bioreactor? Do you need a filtration step to remove cell debris? How messy can the media be? How many components can be present that add to background effects? If the sample has to flow through a sample cell, do you have back-pressure issues? Another issue is the detection range: if you are looking at multiple species, how do you tease apart the good from the bad and use that to quantify species (e.g., aggregates at low levels against a background of monomers, and which aggregates can you detect)? By quantify, I mean how to get a reproducible, reliable number to build a decision tree around. Some people do this by qualifying key components such as raw materials using Raman spectroscopy or related techniques, and it can be a multivariate fitting exercise, but you need a large enough data set. It then ties into data science and big-data efforts.
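[Editor's note: To make the multivariate fitting idea concrete, here is a minimal sketch of a chemometric calibration in Python, using scikit-learn's partial least squares regression on synthetic spectra. The band positions, noise level, and aggregate fractions are illustrative assumptions for this sketch, not data from the interview or from any instrument.]

```python
# Illustrative multivariate calibration: estimating a low-level aggregate
# fraction from overlapping spectra with partial least squares (PLS).
# All spectra here are synthetic stand-ins for measured data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavenumbers = np.linspace(1600, 1700, 200)     # amide I region, cm^-1

def band(center, width):
    """Gaussian absorbance band on the wavenumber axis."""
    return np.exp(-((wavenumbers - center) ** 2) / (2 * width ** 2))

monomer = band(1655, 8)      # helix-like band (assumed position)
aggregate = band(1620, 6)    # beta-sheet-like band (assumed position)

# Training set: mixtures with known aggregate fractions plus noise.
fractions = rng.uniform(0.0, 0.05, size=50)    # 0-5% aggregate
X = np.array([(1 - f) * monomer + f * aggregate for f in fractions])
X += rng.normal(0, 1e-3, X.shape)              # instrument noise

model = PLSRegression(n_components=2)
model.fit(X, fractions)

# Predict the aggregate fraction in a new "unknown" mixture.
unknown = 0.98 * monomer + 0.02 * aggregate + rng.normal(0, 1e-3, 200)
print(model.predict(unknown.reshape(1, -1)))   # expect ~0.02
```

[As Roberts notes, the weak point of any such model is the training set: the calibration is only as trustworthy as the range and realism of the samples behind it.]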

It will require real-world samples that people can test against. A major challenge for instrument designers is how to test against realistic, challenging samples if they can't get access to them. Instrument companies don't make biologic therapeutics, so how do they get access to them other than buying product off the shelf, and those final products are already purified. How do you deal with the dirty samples that would be found upstream during manufacturing? This is a major challenge for the analytical industry. Industry-based consortia are one of the only mechanisms that exist to help solve this.

Editor:
What have been the technological challenges faced by biophysical characterization scientists? How are these now being addressed?

Chris Roberts:
Most of the biophysical techniques have been developed using relatively clean samples. For example, if you have a monomer at 99% and you want to pick up aggregates at the 1% or 0.01% level, what about 1 part per million? You can say that someone took a purified aggregate and saw a beautiful signal, but can you quantify and detect aggregates or other degradant species at low levels against the background of the monomer parent molecule? That is extremely challenging and highly unlikely with typical assays. Many techniques, such as light scattering, microscopy, and spectroscopy, struggle with this.
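[Editor's note: To put rough numbers on why part-per-million detection is so hard, consider the back-of-the-envelope sketch below; the noise floor and response values are illustrative assumptions, not specifications of any technique.]

```python
# Back-of-the-envelope: aggregate signal versus instrument noise.
# Assumes the aggregate's response overlaps the monomer's, so the
# aggregate contributes roughly in proportion to its fraction.
monomer_signal = 1.0    # normalized signal from the monomer parent
noise_floor = 1e-4      # assumed relative noise (0.01% of full scale)

for fraction in (1e-2, 1e-4, 1e-6):    # 1%, 0.01%, 1 ppm aggregate
    snr = fraction * monomer_signal / noise_floor
    print(f"aggregate fraction {fraction:.0e}: signal/noise ~ {snr:g}")

# Under these assumptions, 1% sits 100x above the noise floor, 0.01%
# sits right at it, and at 1 ppm the aggregate signal is ~100x below
# the noise, which is why typical assays cannot see it without
# enrichment or purification.
```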

Microfluidic Modulation Spectroscopy doesn't necessarily need sample purification first, because it automatically does background subtraction with relatively high precision. However, the technology still needs defined operating ranges for how "dirty" the baseline system can be. For example, comparing a placebo with an active in a placebo background is a totally different scenario from pulling a sample from a flow stream entering or leaving a chromatography column during manufacturing, or from even earlier in the process than the chromatography steps. This likely will not be solved by an instrument alone; it will come down to how you design the preparation of the material and where in the process you choose to use a given instrument. There needs to be collaboration among instrumentation manufacturers, academia, and industry to solve these challenges. These are challenges that no one area of expertise is going to be able to solve alone.
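[Editor's note: The core idea behind that automatic background subtraction can be sketched very simply: difference the spectrum of the sample against a matched buffer reference so the shared baseline cancels. The toy version below uses synthetic data and a single subtraction; real instruments average many modulation cycles, and all numbers here are assumptions.]

```python
# Toy sketch of buffer-referenced background subtraction (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
wavenumbers = np.linspace(1600, 1700, 200)

# A sloping buffer baseline that would otherwise swamp the protein band.
buffer_baseline = 0.5 + 0.002 * (wavenumbers - 1600)
protein_band = 0.02 * np.exp(-((wavenumbers - 1655) ** 2) / 128)

sample = buffer_baseline + protein_band + rng.normal(0, 1e-4, 200)
reference = buffer_baseline + rng.normal(0, 1e-4, 200)

# Differencing cancels the shared baseline, leaving the protein signal.
differential = sample - reference
print(f"recovered peak absorbance ~ {differential.max():.3f}")   # ~0.02
```

[The subtraction only cancels what the two streams share; components present in the sample but not in the reference still appear, which is exactly the "dirty baseline" question Roberts raises.]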

Editor:
How have things changed since you started working with FTIR on protein characterization? How do you think the industry will develop technologically?

Chris Roberts:
It has changed quite a bit, particularly with the introduction of instruments that allow one to work in non-deuterated solvents. Those were developed in the early 2000s, but often with manual background subtraction; Microfluidic Modulation Spectroscopy is a next generation that does this automatically and in essentially real time. Those earlier instruments were also limited in sensitivity, requiring protein concentrations on the order of 10 mg/mL or higher for liquid samples to obtain the signals you wanted. Historically, samples have been run in "batch" mode, and while that is still the case with the most recent advancements, technologies such as MMS seem well suited to use in a flow mode (in-line). The most recent developments with MMS also show a significantly larger working range, from low to high concentration, compared to older technologies.

Thank you, Chris, for your great insights.