The indefinite particle

23 March 2018



Jonas Hoeg Thygesen, research scientist at the Microanalysis Centre, Novo Nordisk, speaks to Kerry Taylor-Smith about methods and techniques for particle analysis and characterisation. He discusses the strengths and weaknesses of current analytical methods for particle identification and characterisation, and examines future methods.


Knowing what is in a product and how it might behave is important not just in the pharmaceutical industry, but in every sector. There are numerous techniques available to study the composition of, for example, a new drug, and getting down to the nitty-gritty is often necessary to determine the actions of the particles within.

Defining the properties of particles offers manufacturers the chance to gain a better understanding not only of the product, but also its ingredients and processes. It also enables them to see how a product might be affected by external variables. It might mean they can improve product performance, troubleshoot manufacturing and supply issues, optimise efficiency, increase yield or output, and stay ahead of the competition. It presents the opportunity to gain better control of product quality, because its behaviour is mapped out, which might offer an economic advantage and demonstrate compliance.

Determining a particle’s properties, and understanding how these might impact a product and its processes, is often critical to the success of many manufacturing businesses. The process involves identifying various particles and grouping them into categories based on criteria such as size, shape, surface properties and microstructure.

In good shape?

The size of a particle can have a direct influence on a material’s properties, such as the reactivity or dissolution rate in a tablet, or the efficacy of delivery of an asthma inhaler. Particles are found in various forms: powder and granules, emulsions and slurries, and in aerosols and sprays. They are three-dimensional objects, and unless they are perfectly spherical – like in an emulsion or bubble – they are hard to describe using a single dimension such as radius or diameter. To simplify, it’s convenient to define particle size using the concept of equivalent spheres – the particle is defined by the diameter of an equivalent sphere with the same properties, such as mass or volume. This works well for regularly shaped particles, but is more complex for irregular shapes.
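The equivalent-sphere idea can be made concrete with a short calculation: given a particle's measured volume, the volume-equivalent diameter is the diameter of a sphere with that same volume. The helper function and the cube example below are illustrative, not from the article.

```python
import math

def equivalent_sphere_diameter(volume):
    """Diameter of a sphere with the same volume as the particle.

    From V = (4/3) * pi * (d/2)**3, solving for d gives
    d = (6V / pi)**(1/3).
    """
    return (6.0 * volume / math.pi) ** (1.0 / 3.0)

# An irregular particle with the same volume as a 10 um cube
# (1000 um^3) reports a volume-equivalent diameter of ~12.4 um.
d = equivalent_sphere_diameter(1000.0)
```

The same particle would give different equivalent diameters if mass or projected area were used instead of volume, which is why irregular shapes are harder to describe with a single number.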

Shape can affect the performance and processing of particulate matter. It can also impact the reactivity and solubility of pharmaceutical actives, or powder flow and handling in drug delivery systems, for example. The shape of a particle can be used to determine the state of dispersion of the particulate materials, particularly if it clumps together or if primary particles are present. Shape is usually simplified from imaging data and characterised with simple parameters, such as aspect ratio, which capture regular symmetry or differing dimensions along each axis.
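A minimal sketch of one such image-derived parameter: the aspect ratio of a particle outline's bounding dimensions. The function name and values are invented for illustration; real image-analysis software derives these from the segmented particle image.

```python
def aspect_ratio(width, length):
    """Shortest dimension divided by longest dimension of a
    particle outline: 1.0 for a symmetric (round) particle,
    approaching 0 for an elongated one."""
    short, long_ = sorted((width, length))
    return short / long_

# A round droplet: equal bounding dimensions -> aspect ratio 1.0
round_particle = aspect_ratio(10.0, 10.0)

# An elongated fibre, 5 um wide and 50 um long -> aspect ratio 0.1
fibre = aspect_ratio(50.0, 5.0)
```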

Differing techniques

There is a wide range of particle characterisation techniques, each with its own strengths and limitations. There are several criteria to consider when choosing which technique to use: which particle properties are important; what particle size range is needed; how quickly are measurements needed; and do they need to be measured at a high resolution?

All techniques will require a degree of subsampling, and it is important that this subsample is as representative of the whole as possible in order to ensure that the analysis is not skewed. Many techniques require the sample to be analysed in a dispersed form, where individual particles are spatially separated. Samples might undergo wet dispersion – in a liquid, which lowers the surface energy and reduces the attraction between touching particles – or dry dispersion, usually in a flowing stream of clean dry air.

“Traditional techniques would include particle counting and sizing; for example, by light obscuration (LO) or optical microscopy as described in USP 788 and Ph.Eur. 2.9.19, and laser diffraction as described in USP 429 or Ph.Eur. 2.9.31,” explains Jonas Hoeg Thygesen, research scientist at Novo Nordisk. “Both these methods are very well established at many pharmaceutical QC laboratories – they are generally accepted compendial methods for particle sizing and counting, often with a large amount of historical data for comparisons. The methods have higher throughput than many of the other methods available for particle analysis and characterisation.”

LO, or single particle optical sensing, is a high-resolution analysis technique that yields an overall particle size distribution. It works by passing a dilute stream of particles in liquid suspension between a light source and a detector. The detector measures the reduction in light intensity and, using a calibration curve, processes the signal to determine particle size. Laser diffraction measures particle size distribution by measuring the angular variation in the intensity of light scattered as a laser beam passes through a dispersed particulate sample. Large particles scatter light at small angles relative to the laser beam, while small particles scatter light at large angles. The angular scattering intensity data is then analysed to calculate the size of the particles responsible for creating the scattering pattern.
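The calibration-curve step in light obscuration can be sketched as a simple interpolation: standard beads of known diameter establish (signal, size) pairs, and an unknown particle's signal is mapped onto that curve. The calibration values below are invented for illustration and do not come from any instrument.

```python
def size_from_obscuration(signal, calibration):
    """Interpolate a particle diameter from a light-obscuration
    signal using a calibration curve of (signal, diameter) pairs
    measured on size standards. Clamps outside the curve."""
    points = sorted(calibration)
    if signal <= points[0][0]:
        return points[0][1]
    if signal >= points[-1][0]:
        return points[-1][1]
    for (s0, d0), (s1, d1) in zip(points, points[1:]):
        if s0 <= signal <= s1:
            frac = (signal - s0) / (s1 - s0)
            return d0 + frac * (d1 - d0)

# Hypothetical calibration from polystyrene size standards:
# (fractional light reduction, diameter in um)
cal = [(0.02, 2.0), (0.10, 10.0), (0.25, 25.0)]
size = size_from_obscuration(0.06, cal)  # interpolates to 6.0 um
```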

“The drawback of these methods would be the lack of morphological and chemical identification or characterisation,” says Thygesen. “Flow microscopy, also known as MFI or FlowCam measurements, may be used to address this. This method has the same high throughput as LO, for example, but comes with the added benefit of images of the individually detected particles. This allows image-based classification of the particles.”

Go with the flow

Flow microscopy is considered the de facto industry standard for subvisible particle characterisation – that is, particles below 100μm that are often found in pharmaceuticals and biopharmaceuticals. It combines the imaging capabilities of digital microscopy with the precise control of microfluidics. It offers precise counts and sizing, with morphological detail for all subvisible particles. The images are captured and analysed to give a database of particle size, count, transparency and shape. FlowCam is a fluid imaging technology that analyses particles using digital imaging, flow cytometry and microscopy to measure the shape and size of microscopic particles in a fluid medium.


“The high throughput of flow microscopy, combined with the added benefit of the image information, does indeed make it a very attractive method for subvisible particle characterisation,” says Thygesen. “The main drawback of flow microscopy is, in my opinion, the lack of chemical identification using the method. As previously stated, image-based classification is possible using the method. This classification is often done on a basis of historical data. The appearance of silicone droplets, for instance, are well known using flow microscopy. I think, however, that solely basing identification of unknown particles on morphology may be a risky business. Both protein aggregation and excipient, or polysorbate, deterioration may form translucent particles that may be hard to distinguish.”

So, what could be used instead? Thygesen says a range of different methods are available that may differentiate between polysorbate and protein particles, including Raman and Fourier-transform infrared (FTIR) microscopy, which may provide chemically specific spectra of the different analysed particles.

“Combining either Raman and/or FTIR with energy dispersive X-ray spectroscopy, known as EDS or EDX, may provide further insight into the composition of the particles,” he says. “While EDS provides a fingerprint of the elements present in a given sample, Raman and FTIR especially are sensitive towards organic components in the particle. EDS analysis of a given particle will therefore allow the identification of metals that may be present. I therefore see the combination of Raman and FTIR with EDS as a very powerful tool for particle identification.”
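One common way such chemically specific spectra are turned into identifications is a library search: the measured spectrum is compared against reference spectra and the best match wins. The sketch below uses cosine similarity over made-up four-channel "spectra"; it illustrates the general approach, not Thygesen's specific workflow.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two spectra, treated as vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(spectrum, library):
    """Return the library entry whose reference spectrum best
    matches the measured one."""
    return max(library, key=lambda name: cosine_similarity(spectrum, library[name]))

# Invented band intensities standing in for Raman/FTIR spectra:
library = {
    "polysorbate": [0.1, 0.8, 0.3, 0.05],
    "protein":     [0.7, 0.2, 0.1, 0.6],
}
match = identify([0.65, 0.25, 0.15, 0.55], library)  # "protein"
```

Real spectral search engines add baseline correction, normalisation and peak weighting, but the matching principle is the same.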

Big limits

Characterising the very small can be complex, and the compendial literature is regularly revised to address limitations. USP 787 evolved in response to the drawbacks of USP 788 for therapeutic proteins, and provides a smaller-volume testing framework to address protein-based particles and the immunological effects of a sub-10μm particle load. It came into force in May 2015, and requires pharmaceutical and biopharmaceutical manufacturers making injections and infusions to follow rigorous rules regarding the quantity of particles present in final drug products. This regulation focuses specifically on the development of more costly protein formulations and the unique sensitivities inherent in producing protein-based therapeutics.

“USP 787 specifically addresses therapeutic protein injections and related preparations, allowing the use of smaller test product volumes for LO. This is advantageous, as the often more potent biologics are commonly supplied or available in smaller volumes,” explains Thygesen. “May 2015 also marked the official date for a new general information chapter: 1787. This is a very interesting chapter that, among other things, discusses different technologies for subvisible particle measurement and characterisation. It lists each technique’s principle of operation and working range, as well as advantages and limitations.”

While there are already well-established methods for particle analysis and characterisation, Thygesen sees three different trends for the future. The first, he believes, will be an increase in the specificity of flow microscopy. “The lack of chemical fingerprint is, as previously mentioned, one of the drawbacks of flow microscopy,” he says. “Several new technologies are now on the market where fluorescence labelling may be used while performing flow microscopy. This will allow increased specificity of flow microscopy where, for example, the potential for differentiation between translucent polysorbate and protein particles is increased.”

There will also be a smarter use of existing data, Thygesen suggests. “Deep learning, chemometrics and similar data analysis tools are, in general, improving analytical chemistry. These methods are also finding interest in particle analysis and characterisation where, for instance, examples of the application of artificial neural networks have been seen as a way to improve flow microscopy results,” he says.
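The kind of data-driven classification Thygesen describes can be sketched very simply: each imaged particle becomes a feature vector (for example size, aspect ratio and transparency) and is assigned the label of its most similar known example. A nearest-neighbour rule stands in here for the neural networks mentioned in the text, and all the numbers are invented.

```python
import math

def classify_particle(features, labelled_examples):
    """Toy nearest-neighbour classifier over particle image
    features. A stand-in for the neural-network approaches the
    article mentions; the training data below is invented."""
    nearest = min(labelled_examples,
                  key=lambda ex: math.dist(features, ex[0]))
    return nearest[1]

# Invented labelled examples: (size um, aspect ratio, transparency)
examples = [
    ((15.0, 0.95, 0.20), "silicone droplet"),
    ((40.0, 0.40, 0.80), "protein aggregate"),
]

# A small, round, fairly opaque particle lands near the droplet class.
label = classify_particle((12.0, 0.90, 0.25), examples)
```

A production system would train on thousands of labelled images rather than two hand-picked points, but the principle of mapping morphology features to historical classes is the same.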

And finally, he says, there is the emergence of higher-throughput methods. “EDS and Raman will, as previously discussed, allow chemical fingerprinting of the analysed particles. However, the issue with these methods is their lower throughput as compared with flow microscopy. This is being addressed by increased automation. So, vendors of EDS and Raman equipment are now offering automation options that may increase the throughput of the methods,” concludes Thygesen.

Jonas Hoeg Thygesen works with microscopy and microanalyses, and develops new analytical methods for material identification. Among his areas of expertise are spectroscopy, microanalysis and advanced data analysis.
Flow microscopy counts subvisible particles, giving precise details on their shape, size and transparency.

