The first part of an in-depth interview exploring the use of flow imaging microscopy (FIM), a new technology used for subvisible particle characterization in biologics.
Introduction
In Part 1 of this two-part interview, we speak with Austin Daniels, application scientist for Yokogawa Fluid Imaging Technologies, about the subject of flow imaging microscopy (FIM).
Ensuring the quality and safety of biologics requires precise monitoring of subvisible particles, which can impact drug efficacy and patient health. Traditional analysis methods, such as light obscuration (LO) and membrane microscopy (MM), provide limited data on particle types and morphology, making it difficult to pinpoint contamination sources.
Flow imaging microscopy (FIM) offers a more comprehensive approach, capturing high-resolution images and quantitative data on particle size, shape, and composition. Instruments like FlowCam enable detailed characterization, distinguishing protein aggregates from contaminants and monitoring process stability. Recognized in regulatory guidelines such as USP <1787> and <1788>, FIM enhances compliance and quality control in biotherapeutic development (1–16).
In this interview, we discuss the advantages of FIM over traditional techniques, its role in regulatory compliance, and real-world applications for improving biologic safety and efficacy. Join us as we explore the impact of this innovative technology on subvisible particle characterization.
How was flow imaging microscopy (FIM) developed and what are some of its unique characteristics?
Flow imaging microscopy’s origins are unique in that it was not primarily developed as a tool for pharmaceutical development. It was invented as a tool for imaging phytoplankton and other algae found in marine research. The core idea at its inception was to combine the insight microscopes provide researchers about the types of objects in a sample with the quantitative concentration and sizing data high-throughput particle counters like flow cytometry provide. The resulting instrument essentially is a high-throughput light microscope. During a measurement, a liquid sample flows into a microfluidic flow cell which centers the sample and any particles it contains in the focal plane of a light microscope. The instrument then captures digital microscopy images of the sample at regular intervals which can be processed to determine the concentration, size, and morphology of particles in a sample.
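To make that measurement principle concrete, here is a minimal sketch of how captured frames might be reduced to a particle count and concentration. It is an illustration only: it assumes grayscale frames in which particles appear darker than the background, and the pixel size and imaged flow-cell depth are hypothetical values rather than the specifications of any particular instrument.

```python
# Minimal sketch: reduce FIM frames to particle counts and a concentration.
# The pixel size and imaged flow-cell depth below are hypothetical values.
import numpy as np
from skimage import filters, measure

PIXEL_SIZE_UM = 0.5      # microns per pixel (assumed optics)
FRAME_DEPTH_UM = 80.0    # flow-cell depth imaged per frame (assumed)

def count_particles(frame: np.ndarray) -> int:
    """Segment one grayscale frame and count the detected particles."""
    thresh = filters.threshold_otsu(frame)
    mask = frame < thresh                  # particles imaged darker than background
    return int(measure.label(mask).max())  # number of connected regions

def concentration_per_ml(frames: list[np.ndarray]) -> float:
    """Particle concentration = total particles / total imaged volume."""
    total = sum(count_particles(f) for f in frames)
    h, w = frames[0].shape
    frame_volume_ul = (h * PIXEL_SIZE_UM) * (w * PIXEL_SIZE_UM) * FRAME_DEPTH_UM * 1e-9
    return total / (len(frames) * frame_volume_ul * 1e-3)  # particles per mL
```

In practice, instruments also subtract a background image and apply size and edge filters before counting, but the count-per-imaged-volume arithmetic follows the same idea.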
The problems the inventors were trying to solve in marine research are very similar to those faced by researchers characterizing large subvisible particles (2–100 μm) in pharmaceuticals. In the late 2000s, there were concerns that changes in the number and types of these particles were responsible for some of the adverse reactions to these therapies observed in the clinic, most notably the formation of anti-drug antibodies. These changes were not detected by the common particle analysis techniques of the time, LO and MM, so researchers were interested in investigating new methods for characterizing subvisible particles. FIM ended up being adopted by the pharmaceutical community as part of these efforts.
The most distinctive characteristic of FIM is its ability to simultaneously capture numeric information about a sample’s particle content (for example, concentration and size) and information about particle morphology, such as shape and color. The latter is often related to particle type and source, which is critical information for assessing the risks particles may pose to patients and for reducing the particle content of the therapy. This differs from many legacy techniques used to characterize subvisible particles. Some, like LO, provide only quantitative particle data. Others, like MM, provide morphology information. Very few provide both.
Are there some surprising discoveries you have encountered since you have explored this technology for different applications?
The biggest surprise on my end is just how many samples that either I or another researcher thought were particle-free ended up containing particles from an unexpected source. For example, while I was in graduate school, my lab got a new FIM instrument and needed to clean its fluidics with particle-free water. A labmate gave me what he thought was particle-free ultrapure water, only for me to run it through the instrument and see dozens of bacterial cells. It turned out he had been using the same water sample for years without replacement, which led to it becoming contaminated. After that incident, the lab got much better about using fresh ultrapure water for experiments whenever possible.
This type of situation pops up regularly in the research I’ve been involved with—with FIM, researchers run what they’re convinced is a particle-free sample and they see dozens of particles. This isn’t to say that these particles are necessarily dangerous (though they certainly can be), but they are still not something we want to administer to patients if we can avoid it. It’s very similar to the ongoing public scrutiny of microplastics in our water supply.
How does FIM outperform traditional methods like LO or electron microscopy (EM) in terms of data captured and throughput, especially for biologic particle analysis?
In addition to combining count, size, and morphology data, the main improvement FIM provides over traditional methods is its sensitivity to translucent particles, those with a refractive index similar to that of the medium they are suspended in. Translucent particles are extremely common in many biologics because aggregates of the active ingredient, such as proteins and viral vectors, are often very close in refractive index to the buffer. These particles are challenging to analyze with most older particle analysis techniques. FIM measurements are also performed with the particles suspended in their native formulation buffer. Buffer removal is required for several particle analysis techniques, most notably membrane microscopy. Removing the buffer makes it easier to detect some translucent particles, but it often changes the number, size, and types of particles observed. Possible effects include removing “soft particles” like silicone oil droplets or inducing the agglomeration of nanometer-sized particles into micrometer-sized particles that would not otherwise be detected by subvisible particle methods. Because FIM is a solution-based technique, it avoids these issues while remaining sensitive to translucent particles.
FIM is also a relatively high-throughput technique for collecting particle morphology data. While not as fast as some “pure” particle counters, it can process many more particles per unit time than forensic techniques like electron microscopy. This makes it a great screening tool for identifying the broad types of particles in a sample while also capturing concentration and size information.
What optical technologies or spectroscopic methods does FIM use to enhance the detection of protein aggregates or transparent particles?
FIM’s sensitivity toward transparent particles primarily comes from the use of digital imaging as part of its measurement principle. Most established optical techniques for particle analysis, such as LO or laser diffraction, use only the amount of light blocked or scattered to assess particle size. These signals depend strongly on particle refractive index and often require the user to assume that their particles have a refractive index similar to that of the particles used to calibrate the instrument. Imaging-based platforms like FIM instead use the area a particle occupies in an image to detect and size particles, which greatly reduces the dependence of the measurement on particle refractive index. There still needs to be at least a small difference in refractive index between the particles and the buffer so that the particles are visible when imaged. Once that criterion is met, FIM can capture accurate concentration and size distribution measurements of the particles.
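As a rough, hypothetical illustration of that difference in sizing principles (not a description of any instrument’s firmware), the snippet below contrasts sizing from projected area with sizing from a bead-calibrated extinction signal; the calibration values are made up.

```python
# Illustrative contrast between area-based sizing (imaging) and signal-based
# sizing through a bead calibration curve (obscuration-style); values are made up.
import numpy as np

def ecd_from_area(area_um2: float) -> float:
    """Equivalent circular diameter from projected area (imaging principle)."""
    return 2.0 * np.sqrt(area_um2 / np.pi)

def size_from_extinction(signal: float, calibration: dict[float, float]) -> float:
    """Nearest-neighbor lookup on a hypothetical bead calibration curve.
    Accuracy depends on the sample matching the beads' refractive index."""
    return min(calibration.items(), key=lambda kv: abs(kv[0] - signal))[1]

# A translucent 10 um aggregate blocks little light: the area-based estimate is
# unaffected, while the calibration lookup reads it as a much smaller particle.
print(ecd_from_area(78.5))                                # ~10 um from area alone
print(size_from_extinction(0.2, {0.2: 4.0, 1.0: 10.0}))   # reads only 4 um
```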
How does FIM distinguish between intrinsic, extrinsic, and inherent particle types in biologics, and how does this help optimize formulations?
This mostly comes down to the image data FIM provides for each particle. When a sample contains different particle types, each type will look slightly different when imaged. That image data can then be processed in different ways, including using methods like artificial intelligence (AI), to help assess the particle types that may be there.
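As one hedged example of how that processing might look (the feature set, labels, and model are illustrative assumptions, not the AI built into any instrument), simple shape descriptors computed from each particle image can be fed to an off-the-shelf classifier trained on particles of known origin:

```python
# Sketch of feature-based particle classification; the features, labels, and
# model choice are illustrative, not any instrument's built-in AI.
import numpy as np
from skimage import measure
from sklearn.ensemble import RandomForestClassifier

def shape_features(mask: np.ndarray) -> list[float]:
    """Simple morphology descriptors for one binary particle image."""
    region = max(measure.regionprops(measure.label(mask)), key=lambda r: r.area)
    circularity = 4.0 * np.pi * region.area / (region.perimeter ** 2 + 1e-9)
    aspect_ratio = region.major_axis_length / (region.minor_axis_length + 1e-9)
    return [circularity, aspect_ratio, region.solidity]

def train_classifier(masks: list[np.ndarray], labels: list[str]) -> RandomForestClassifier:
    """Fit a classifier on particles of known origin (for example, spiked
    silicone oil, lab-generated protein aggregates, glass fragments)."""
    features = np.array([shape_features(m) for m in masks])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(features, labels)
    return clf
```

The deep-learning approaches cited in the references follow the same idea but learn features directly from the raw images rather than from hand-crafted descriptors (5, 9).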
Knowing the types of particles present in a sample can help scientists developing drug products identify the relevant sources of particles in their formulations. That data can help bring the particle content of a new formulation or drug product under control. For example, if FIM shows a lot of glass particles, it may suggest an incompatibility between the container and the rest of the formulation. This type of optimization to reduce particle content can make the product easier to manufacture and potentially perform better in the clinic.
What challenges does FIM face in real-world applications, especially with automation, calibration, or sample handling, and how does this affect compliance with USP <788>?
The main practical challenge facing FIM at the moment is taking full advantage of all the information it provides users about their samples. In a typical experiment, FIM instruments can easily give researchers thousands of images and often more than a gigabyte of data to process. Many users focus mainly on the particle concentration and size data they capture, but there is a wealth of untapped information in the images about the types of particles in a sample. Improving the tools for processing the image data FIM already captures would make the method even more useful.
The other challenge is standardizing the analytical technique in general. FIM is still a relatively new analytical technique, and there are not yet many standard methods for calibrating the instrument and performing measurements. In contrast, for LO, the USP provides very explicit recommendations for how the instrument should be calibrated and used to obtain consistent subvisible particle data. Given how important its measurements are in biopharmaceutical development, standardizing FIM as an analytical technique would benefit everyone who uses it. It would also pave the way for the technique to be used as part of standard quality control testing.
How can FIM be adapted to analyze novel particle sources, like lipid nanoparticles and viral vectors, in gene and cell therapies?
There is very little work required to adapt FIM to analyze particles in gene and cell therapy products. One key feature of the technique is that it can analyze most particle types, so long as they fall within the detectable size range of the instrument and there is at least a small refractive index difference between the particles and the background. While the exact particle types differ and may require different identification strategies (for example, adeno-associated virus [AAV] and lipid nanoparticle [LNP] aggregates), a standard operating procedure (SOP) for analyzing particles in a parenteral therapy via FIM will look nearly identical regardless of the active ingredient.
How does FIM complement LO in meeting USP <788>, <787>, and <1788>, particularly for particle morphology analysis?
FIM has two main advantages over LO: its sensitivity for important particle types in biopharmaceutical samples and its access to morphology data. The former is important as it helps researchers track particle types like protein aggregates, viral vector aggregates, and fatty acid particles that LO tends to undercount and undersize. Since FIM is more sensitive, it will often report that samples have particle concentrations outside the range that is allowed by USP <788> and its equivalents—even if the sample would meet the criteria if tested via LO. However, minimizing the particle content measured via FIM can help ensure that the sample will meet pharmacopeia guidelines when tested via LO. As some translucent particles are thought to pose the greatest risks for safety, doing so may also have a positive impact on efficacy.
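For illustration only, a check of per-container counts against the commonly cited USP <788> light obscuration limits for small-volume parenterals might look like the sketch below; the limits should be confirmed against the current pharmacopeia, and the particle sizes shown are invented.

```python
# Illustrative per-container check against the commonly cited USP <788> light
# obscuration limits for small-volume parenterals (<= 6000 particles >= 10 um
# and <= 600 particles >= 25 um per container); confirm against the current
# pharmacopeia before relying on these numbers.
def counts_per_container(diameters_um: list[float], sampled_ml: float,
                         container_ml: float) -> dict[str, float]:
    """Scale counts measured in the sampled volume up to a whole container."""
    scale = container_ml / sampled_ml
    return {
        ">=10um": sum(d >= 10 for d in diameters_um) * scale,
        ">=25um": sum(d >= 25 for d in diameters_um) * scale,
    }

def meets_limits(counts: dict[str, float]) -> bool:
    return counts[">=10um"] <= 6000 and counts[">=25um"] <= 600

sizes_um = [3.2, 11.5, 27.0, 8.8, 14.1]   # invented equivalent circular diameters
print(meets_limits(counts_per_container(sizes_um, sampled_ml=0.2, container_ml=1.0)))
```

As noted above, the same routine applied to FIM data will often flag samples that would pass when tested via LO, because FIM detects more of the translucent particles.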
The morphology data is beneficial because it is often needed to identify the sources of particles in a drug product and its manufacturing process. There are often situations where a scientist needs to minimize the particle content of a therapy, either as part of formulation design or following batch rejection in manufacturing. With only the size and concentration measurements that LO provides, it can be challenging to know how to adjust the formulation or process to reduce the particle content. With the morphology data FIM provides, it is possible to identify potential sources of particles in a sample and make targeted changes to reduce its particle content.
How does FIM help identify protein aggregates versus other particles like silicone oil droplets, and how does morphology data contribute to ensuring product safety?
This primarily comes from the morphology data FIM provides. Silicone oil droplets are generally round and circular when imaged via FIM, while protein aggregates are more amorphous in shape. This stark difference in morphology can be used to recognize particles of both types either visually or by using various automated approaches including artificial intelligence (AI).
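A toy version of that visual rule (circularity near 1 suggests a droplet, lower values suggest an amorphous aggregate) is sketched below; the 0.85 cutoff is a hypothetical value, and real analyses typically combine several features or an AI model.

```python
# Toy circularity rule for separating round droplets from amorphous aggregates;
# the 0.85 cutoff is hypothetical, and real analyses use richer feature sets.
import numpy as np

def circularity(area: float, perimeter: float) -> float:
    """4*pi*A/P^2, which equals 1.0 for a perfect circle."""
    return 4.0 * np.pi * area / perimeter ** 2

def rough_label(area: float, perimeter: float, cutoff: float = 0.85) -> str:
    return ("likely silicone oil droplet"
            if circularity(area, perimeter) >= cutoff
            else "likely protein aggregate")

print(rough_label(area=78.5, perimeter=31.4))   # near-circular outline -> droplet
print(rough_label(area=78.5, perimeter=60.0))   # ragged outline -> aggregate
```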
The ability to do this is often helpful since protein aggregates and silicone oil droplets differ in their impact on patient safety. While its impacts on patient safety are an area of active discussion, silicone oil is generally thought to be less dangerous for patients than protein aggregates. Quantifying both particle types can help scientists better assess the safety risks of a given sample’s particle content and make formulation and quality control decisions accordingly. If it is then desired to eliminate some particles in a sample, knowing which particle type is dominant can suggest strategies to pursue to achieve this.
How does the ability to combine FIM and LO in a single instrument improve efficiency in lot release testing, in line with USP guidelines?
Some instruments offer simultaneous FIM and LO measurements. Processing samples with such a combined instrument can streamline acquiring both measurements, especially when sample volume is limited. This makes it straightforward to obtain the LO data needed for regulatory compliance while also capturing the more accurate concentration and size data, as well as the morphology data, that FIM provides.
The biggest opportunity for instruments like this is in bridging studies comparing legacy LO measurements with FIM measurements. We don’t have a great sense of how FIM data correlates with LO data, so having an instrument that is capable of directly comparing the two techniques can help companies establish new acceptable particle limits for FIM based on legacy LO data.
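A hypothetical bridging calculation is sketched below: paired LO and FIM counts from the same lots are regressed so that a legacy LO acceptance limit can be translated into a provisional FIM-based limit. All values and the linear model are illustrative assumptions, not a validated procedure.

```python
# Hypothetical bridging-study sketch: regress paired LO and FIM counts from the
# same lots, then translate a legacy LO acceptance limit into a provisional
# FIM-based limit. All numbers and the linear model are illustrative.
import numpy as np

lo_counts = np.array([120.0, 300.0, 450.0, 800.0, 1500.0])      # per container, >= 10 um, by LO
fim_counts = np.array([900.0, 2100.0, 3300.0, 5600.0, 9800.0])  # same lots, by FIM

slope, intercept = np.polyfit(lo_counts, fim_counts, 1)

legacy_lo_limit = 6000.0                       # example legacy LO acceptance limit
fim_equivalent_limit = slope * legacy_lo_limit + intercept
print(f"Provisional FIM-equivalent limit: {fim_equivalent_limit:.0f} particles")
```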
References
(1) Molina, S. A.; Davies, S. J.; Sethi, D.; et al. Particulates Are Everywhere, but Are They Harmful in Cell and Gene Therapies? Cytotherapy 2022, 24 (12), 1195–1200. DOI: 10.1016/j.jcyt.2022.07.014
(2) Rosenberg, A. S. Effects of Protein Aggregates: An Immunologic Perspective. AAPS J. 2006, 8 (3), E501–E507. DOI: 10.1208/aapsj080359
(3) Kotarek, J.; Stuart, C.; De Paoli, S. H.; et al. Subvisible Particle Content, Formulation, and Dose of an Erythropoietin Peptide Mimetic Product Are Associated with Severe Adverse Postmarketing Events. J. Pharm. Sci. 2016, 105 (3), 1023–1027. DOI: 10.1016/S0022-3549(15)00180-X
(4) Shibata, H.; Harazono, A.; Kiyoshi, M.; Ishii-Watabe, A. Quantitative Evaluation of Insoluble Particulate Matters in Therapeutic Protein Injections Using Light Obscuration and Flow Imaging Methods. J. Pharm. Sci. 2021. DOI: 10.1016/j.xphs.2021.09.047
(5) Calderon, C. P.; Daniels, A. L.; Randolph, T. W. Deep Convolutional Neural Network Analysis of Flow Imaging Microscopy Data to Classify Subvisible Particles in Protein Formulations. J. Pharm. Sci. 2018, 107 (4), 999–1008. DOI: 10.1016/j.xphs.2017.12.008
(6) Rosenberg, A. S. Effects of Protein Aggregates: An Immunologic Perspective. AAPS J. 2006, 8 (3), E501–E507. DOI: 10.1208/aapsj080359
(7) Kotarek, J.; Stuart, C.; De Paoli, S. H.; et al. Subvisible Particle Content, Formulation, and Dose of an Erythropoietin Peptide Mimetic Product Are Associated with Severe Adverse Postmarketing Events. J. Pharm. Sci. 2016, 105 (3), 1023–1027. DOI: 10.1016/S0022-3549(15)00180-X
(8) Srivastava, A.; Mallela, K. M. G.; Deorkar, N.; Brophy, G. Manufacturing Challenges and Rational Formulation Development for AAV Viral Vectors. J. Pharm. Sci. 2021, 110 (7), 2609–2624. DOI: 10.1016/j.xphs.2021.03.024
(9) Gambe-Gilbuena, A.; Shibano, Y.; Krayukhina, E.; Torisu, T.; Uchiyama, S. Automatic Identification of the Stress Sources of Protein Aggregates Using Flow Imaging Microscopy Images. J. Pharm. Sci. 2020, 109 (1), 614–623. DOI: 10.1016/j.xphs.2019.10.034
(10) Wright, J. F.; Le, T.; Prado, J.; et al. Identification of Factors That Contribute to Recombinant AAV2 Particle Aggregation and Methods to Prevent Its Occurrence During Vector Purification and Formulation. Mol. Ther. 2005, 12 (1), 171–178. DOI: 10.1016/j.ymthe.2005.02.021
(11) Saggu, M.; Bou-Assaf, G. M.; Bucher, R.; et al. Evaluating Clinical Safety and Analytical Impact of Subvisible Silicone Oil Particles in Biopharmaceutical Products. J. Pharm. Sci. 2024, 113 (5), 1401–1414. DOI: 10.1016/j.xphs.2024.01.002
(12) Chisholm, C. F.; Nguyen, B. H.; Soucie, K. R.; Torres, R. M.; Carpenter, J. F.; Randolph, T. W. In Vivo Analysis of the Potency of Silicone Oil Microdroplets as Immunological Adjuvants in Protein Formulations. J. Pharm. Sci. 2015, 104 (11), 3681–3690. DOI: 10.1002/jps.24573
(13) Mazaheri, M.; Saggu, M.; Wuchner, K.; et al. Monitoring of Visible Particles in Parenteral Products by Manual Visual Inspection—Reassessing Size Threshold and Other Particle Characteristics That Define Particle Visibility. J. Pharm. Sci. 2024, 113 (3), 616–624. DOI: 10.1016/j.xphs.2023.10.002
(14) Liu, F.; Hutchinson, R. Visible Particles in Parenteral Drug Products: A Review of Current Safety Assessment Practice. Curr. Res. Toxicol. 2024, 7, 100175. DOI: 10.1016/j.crtox.2024.100175
(15) Telikepalli, S. N.; Carrier, M. J.; Ripple, D. C.; et al. An Interlaboratory Study to Identify Potential Visible Protein-Like Particle Standards. AAPS PharmSciTech 2023, 24 (1). DOI: 10.1208/s12249-022-02457-9
(16) Amara, I.; Germershaus, O.; Lentes, C.; et al. Comparison of Protein-Like Model Particles Fabricated by Micro 3D Printing to Established Standard Particles. J. Pharm. Sci. 2024, in press. DOI: 10.1016/j.xphs.2024.04.011
About the Interviewee
Austin Daniels is an application scientist for Yokogawa Fluid Imaging Technologies. He received his Ph.D. in Chemical and Biological Engineering from the University of Colorado. His research focused on flow imaging microscopy and similar subvisible particle imaging techniques combined with artificial intelligence-driven image analysis tools. These methods were used to compare protein aggregates generated via different stress conditions in biotherapeutics. Currently, he is working on exploring and improving applications for flow imaging microscopy in biotherapeutics development and beyond.
About the Interviewer
Jerome Workman, Jr. serves on the Editorial Advisory Board of Spectroscopy and is the Executive Editor for LCGC and Spectroscopy. He is the co-host of the Analytically Speaking podcast and has published multiple reference text volumes, including the three-volume Academic Press Handbook of Organic Compounds, the five-volume The Concise Handbook of Analytical Spectroscopy, the 2nd edition of Practical Guide and Spectral Atlas for Interpretive Near-Infrared Spectroscopy, the 2nd edition of Chemometrics in Spectroscopy, and the 4th edition of The Handbook of Near-Infrared Analysis. Author contact: JWorkman@MJHlifesciences.com