In this article, the authors discuss ways to ensure that users of varying skill levels will achieve the same results when analyzing mixtures.
Many analytical challenges facing modern laboratories involve mixtures, whether formulated or contaminated. The infrared spectrum of a mixture exhibits peaks from each component, making the separation of peaks due to specific components an essential part of the analysis. This can be accomplished in two ways: spatial separation of components through microscopy, or spectral separation through multicomponent searches. In either case, the final result should not be just a chemical image or spectrum, but actionable information. Further, users of varying skill levels need to obtain the same results, so consistency must be a core metric of the problem-solving tools.
Analytical and forensic laboratories are inundated with samples from many origins: contaminated products, competitors' materials, and crime scene evidence. Their task often focuses on identification. In infrared spectroscopy, this is based primarily on comparison of the sample spectrum to library spectra. This was once done by overlaying spectra on a light box. Computers accelerated the process but did not change the fundamental approach of spectrum-by-spectrum comparison. Both methods are still useful but share one major shortcoming: a limited ability to handle mixtures. Digital spectral subtraction partially filled that gap, but suffered when totally absorbing peaks occurred or when environmental changes altered peaks slightly. Because mixtures represent such a large fraction of the sample load described above, new approaches have been sought. The two leading approaches involve spatial separation and multicomponent searching.
Infrared (IR) microscopy permits the spatial analysis of mixtures through the collection of spectral images. For example, a tablet consists of various components pressed into a solid piece. The individual components are not distributed continuously on the micro scale, but reside in domains within the tablet. Thus, IR images of the tablet show spectral variations over the surface. In a real sense, how the image is collected, point by point or using an array detector, is not important (except with regard to time). What matters is the analysis of the final image to extract information that can be acted upon. The key answers are the identity, domain size, and distribution of each material in the tablet, and the relative concentrations of the components. Armed with this information, the scientist can make intelligent decisions.
In contrast, the IR spectra of homogeneous mixtures exhibit simultaneous absorptions from the constituents. Simple searching followed by spectral subtraction works in some cases but can result in erroneous starting points and difficult-to-handle residuals. Subsequent searches might return the same result again, or might be affected by derivative-shaped bands or totally absorbing peaks. True multicomponent searching would not rely upon subtractions or other spectral processing. The result would be a list of candidate constituents and a visual display, which would again represent actionable information.
Consistency must be an essential component of mixture analysis. Subtraction requires the use of a variable factor, which can lead to poor agreement between different users. Completely removing this step is thus required to reach the consistency needed to enable less technically advanced users to obtain usable results. Further, the increased throughput demanded of analytical laboratories, coupled with the multitechnique skill sets required of the staff, means that these tools must be automated and reliable. The following sections examine heterogeneous and homogeneous mixtures with automated analysis protocols in both micro and macro domains.
The Thermo Scientific Nicolet iN10 MX FT-IR imaging microscope with OMNIC Picta microscopy software (Thermo Fisher Scientific, Madison, Wisconsin) was used to collect images from pharmaceutical tablets. The ultra-fast scanning of this microscope allows an area of several square millimeters to be imaged in a few minutes at 25-μm spatial resolution. The complete analysis presented used the OMNIC Picta Random Mixtures Wizard.
The Thermo Scientific Nicolet iS10 FT-IR Spectrometer, Smart iTR diamond attenuated total reflection (ATR) accessory, and OMNIC spectroscopy software were used to collect the IR spectrum of a pharmaceutical powder and a computer monitor cover. The thermogravimetric analysis (TGA) data were collected using the same spectrometer equipped with an internal TGA accessory. Data were exported to OMNIC Specta spectroscopy software for the multicomponent search.
Many solid materials display granularity in the 5–200 μm range; pharmaceutical tablets, fisheye distortions in polymer sheets, and sectioned museum artifacts are common examples. Consider the fisheye example. The questions may be: What contaminant caused the fisheye? How prevalent is the contamination? How is it distributed in the sample? An image alone does not convey all of this information. Consider Figure 1, which shows an image of a two-component pharmaceutical tablet. The color coding represents the intensity of the signal at the scroll bar location in the bottom spectrum (about 1680 cm–1): red shows an intense signal, and blue shows a weak signal. The image communicates some distribution and identification information, but there is still untapped potential in the data. Further, if there were three components in this region, the image would not convey that information.
Figure 1
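As an illustrative sketch of how such a false-color image is generated from a hyperspectral data cube, the code below selects the spectral channel nearest the band of interest and maps each pixel's intensity to a color scale. The random cube, channel count, and plotting details are placeholder assumptions, not the instrument's actual processing.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder hyperspectral cube: 64 x 64 pixels, one spectrum per pixel.
rng = np.random.default_rng(0)
wavenumbers = np.linspace(4000, 650, 1738)        # cm-1 axis, high to low
cube = rng.random((64, 64, wavenumbers.size))     # stand-in for measured absorbance

# Pick the spectral channel nearest the band of interest (~1680 cm-1 in Figure 1).
idx = np.argmin(np.abs(wavenumbers - 1680.0))
chem_image = cube[:, :, idx]                      # per-pixel intensity at that band

plt.imshow(chem_image, cmap="jet")                # red = intense, blue = weak
plt.colorbar(label="Absorbance near 1680 cm$^{-1}$")
plt.show()
```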
Figure 2 captures an intermediate step in the wizard-driven analysis of another tablet sample. The spectral image has been digitized into a red–gray image, with red points correlating strongly to a particular library spectrum (acetaminophen, in this case). This form of image processing, based upon X-ray imaging techniques, allows additional physical information to be extracted. The analysis shows the spatial distribution and now also calculates the area percentage for the selected component. Cycling through the matches allows the distribution of each component to be visualized.
Figure 2
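Conceptually, the red–gray step can be thought of as correlating each pixel spectrum against the matched library spectrum and thresholding the result; the area percentage is then the fraction of pixels above the threshold. The sketch below is a hypothetical illustration of that idea, not the OMNIC Picta implementation; the function names and the 0.8 threshold are assumptions.

```python
import numpy as np

def correlation_map(cube, library_spectrum):
    """Pearson correlation of every pixel spectrum with one library spectrum."""
    pixels = cube.reshape(-1, cube.shape[-1])
    px = pixels - pixels.mean(axis=1, keepdims=True)
    ref = library_spectrum - library_spectrum.mean()
    r = (px @ ref) / (np.linalg.norm(px, axis=1) * np.linalg.norm(ref))
    return r.reshape(cube.shape[:2])

def area_percent(corr, threshold=0.8):
    """Percentage of the imaged area whose pixels match above the threshold."""
    return 100.0 * np.mean(corr >= threshold)

# Cycling through the library matches amounts to calling correlation_map once
# per matched component and reporting area_percent for each map.
```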
The whole solution (what, how much, how distributed, and where) is answered in the report shown in Figure 3. The report identifies each component, colorizes the image, and calculates the total area occupied by each component. The area percentages provide a calibrationless, semiquantitative analysis of the tablet. While semiquantitative information is useful in production QC/QA applications, in forensics laboratories it is not merely valuable but essential, because only one sample may be available and no calibration standards may exist.
Figure 3
Two conclusions are apparent. First, the analysis has mined the data extensively. The scientist now has the answers upon which to build action decisions. Formulations, blending, and tablet pressing modifications can be proposed based upon the identity, concentration, distribution, and domain sizes of each component. Second, the entire process occurred automatically, all the way to the final report shown in Figure 3, so any user in the laboratory would obtain the same results. This level of consistency and reproducibility greatly enhances the power and utility of IR microscopy in the laboratory.
Homogeneous mixture analyses typically are done using transmission, ATR, or diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) methods. The ability to extract component information relies upon the additivity of the individual spectra. Essentially, the spectrum must — to a good approximation — appear as a sum of the pure component spectra multiplied by relative concentrations. In gases, this is generally true because intermolecular interactions are weak. In solids, the assumption is less true, but many materials still come close to the ideal. In liquids, intermolecular interactions, like hydrogen bonding, can cause large shifts in spectral features. Some liquids mix almost ideally, while others interact strongly — acetone and water, for instance. In the latter case, special considerations are required. In this article, only mixtures approximating the ideal are considered.
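In equation form, the additivity assumption is the standard Beer-Lambert statement that the mixture absorbance is a concentration-weighted sum of pure-component absorbances. The notation below is introduced here for clarity; it does not appear in the original analysis.

```latex
% Additivity assumption for an ideal mixture (Beer-Lambert form):
% c_i = concentration, epsilon_i = absorptivity, l = pathlength,
% A_i = pure-component spectrum, x_i = relative weight.
A_{\mathrm{mix}}(\tilde{\nu}) \;\approx\; \sum_i c_i \,\varepsilon_i(\tilde{\nu})\, \ell
\;=\; \sum_i x_i \, A_i(\tilde{\nu})
```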
Traditional mixture analysis begins with a first search and choice of best match. Spectral subtraction follows, and the residual is searched. There are four problems with this approach. First, the top match in the list might not be the actual best component. Figure 4 shows a pharmaceutical mixture of known composition searched in this way. The top matches are not present in the mixture, and use of these in a subtraction would lead to an erroneous beginning.
Figure 4
Second, the molecular environment strongly affects the IR spectrum. Gas-phase spectra show small changes in bandwidth and relative peak intensity with temperature, and peaks can shift with pressure due to collisions. Subtraction in these cases leaves derivative-shaped peaks in the residual. Third, totally absorbing peaks, such as those seen in many transmission spectra of polymeric films, are very difficult to remove; most algorithms simply ignore them. Fourth, the ability to perform subsequent subtractions deteriorates as the noise in the residual spectrum increases at each stage.
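The derivative-shaped artifact is easy to demonstrate. The minimal sketch below uses synthetic Gaussian bands, not real spectra: a band shifted by only a couple of wavenumbers leaves a residual that no choice of subtraction factor can remove.

```python
import numpy as np

wn = np.linspace(1600, 1800, 2001)                 # cm-1
band = lambda center: np.exp(-((wn - center) / 8.0) ** 2)

sample = band(1700.0)                              # band in the sample
reference = band(1701.5)                           # same band, shifted 1.5 cm-1

f = 1.0                                            # user-adjusted subtraction factor
residual = sample - f * reference

# The residual swings negative on one side of 1700 cm-1 and positive on the
# other: a derivative shape that no choice of f can eliminate.
print(round(residual.min(), 3), round(residual.max(), 3))
```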
A much better approach is cumulative multicomponent searching. Figure 5 shows the result of a multicomponent search applied to the same spectrum shown in Figure 4. In a sense, the same sort of microgranularity considered in the previous section is present in this mixture, so the spectra are highly additive. The resulting three-component composite spectrum covers most, if not all, of the peaks in the IR spectrum and represents the known composition very well. Critically, the only input to the algorithm was the choice of library.
Figure 5
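Although the commercial implementation is proprietary, the problem can be posed, for illustration, as a non-negative least-squares fit of library spectra to the mixture spectrum. The sketch below is one standard formulation, not the OMNIC Specta algorithm; all function and variable names are hypothetical. As in the article's workflow, the only user input here is the library itself, and no subtraction factor appears anywhere.

```python
import numpy as np
from scipy.optimize import nnls

def multicomponent_fit(mixture, library, names, max_components=3):
    """Fit the mixture as a non-negative sum of library columns.

    mixture: (n_points,) spectrum; library: (n_points, n_spectra) matrix.
    Returns the top candidates with composite percentages, plus the composite.
    """
    weights, _ = nnls(library, mixture)                 # non-negative least squares
    order = np.argsort(weights)[::-1][:max_components]  # strongest contributors
    total = weights[order].sum()
    composite = library[:, order] @ weights[order]      # composite spectrum
    report = [(names[i], 100.0 * weights[i] / total) for i in order]
    return report, composite
```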
Figure 6 shows the same algorithm applied to a heat-treated plastic component of an electronics device. The first search identified the ABS plastic, but a series of small peaks below 1400 cm–1 remained unassigned. Adding the second component identified the flame retardant, which would allow the staff to reject this material on WEEE/RoHS grounds (1,2).
Figure 6
Two-component searches are reasonably straightforward. As the number of components increases, however, the potential for interferences also increases. Fortunately, gas-phase spectra are highly additive, and four-component searches have yielded excellent results. Figure 7 shows an analysis of a mixture arising from outgassing of an epoxy in a thermogravimetric analysis (TGA) experiment. Clearly, all of the main features are assigned.
Figure 7
In each of these cases, the visual agreement is very good. The match metric does not always reflect the quality of the result, for some of the same reasons discussed for spectral subtraction: peak broadening, small shifts, or high noise will reduce the metric. However, noise has considerably less impact here because there are no cumulative effects of subtraction. Totally absorbing peaks will affect the metric, but the visual comparison still shows good agreement.
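As a simple illustration of why the metric can understate the agreement, the sketch below applies a correlation-style score (an assumption; the actual metric used by the software is not specified here) to two synthetic bands. A 2 cm–1 shift barely changes the visual overlay yet measurably lowers the score.

```python
import numpy as np

wn = np.linspace(1600, 1800, 2001)                      # cm-1
band = lambda center: np.exp(-((wn - center) / 8.0) ** 2)

def match_metric(a, b):
    """Correlation-style match score on a 0-100 scale (illustrative only)."""
    a = a - a.mean()
    b = b - b.mean()
    return 100.0 * np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(match_metric(band(1700.0), band(1700.0)))  # 100.0: identical bands
print(match_metric(band(1700.0), band(1702.0)))  # lower, despite a near-identical overlay
```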
The algorithm would be even more effective if quantitative information could be gleaned from the composite percentages. However, most libraries are normalized when they are constructed, so the relative Beer-Lambert absorptivities of the components are lost. Hence, the fact that 20% of one spectrum and 80% of the other are required to build the composite does not translate to a 20:80 composition of the actual materials in the sample. However, the software does permit users to access spectra from the hard disk, not just data encoded in libraries, where normalization has not occurred. Using reference spectra collected in the same manner as the sample could then convert the composite percentages into good approximations of the actual concentrations, providing a reasonable quantitative analysis of the sample. As above, this capability can be of extreme value in forensics laboratories, where it may represent the only hope for obtaining quantitative values.
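A minimal sketch of that conversion, assuming unnormalized reference spectra measured at known concentrations under the same conditions as the sample: each fitted weight then scales directly to a concentration estimate. All names and numbers below are illustrative.

```python
import numpy as np

def estimated_concentrations(weights, reference_concs):
    """Scale fitted weights by the known concentrations of the unnormalized
    reference spectra they were fitted against, then express as percent."""
    conc = np.asarray(weights, float) * np.asarray(reference_concs, float)
    return 100.0 * conc / conc.sum()

# e.g. composite weights 0.2 and 0.8 fitted against references measured at
# equal, known concentrations (all values illustrative):
print(estimated_concentrations([0.2, 0.8], [1.0, 1.0]))   # -> [20. 80.]
```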
The two key metrics for multicomponent searching are met. First, the information is presented in an actionable form: the search result provides a qualitative measure of the composition of the mixture. Second, consistency is assured, because any user in the laboratory would obtain the same results.
Analytical and forensics laboratories spend a great deal of time and effort on the analysis of mixtures. There has been a long-standing, urgent need for high-quality, automated tools that remove as much user-to-user variation as possible. The next-generation tools discussed here represent a major step in this analysis and provide the required consistency and automation.
Michael Bradley, Federico Izzia, and Simon Nunn are with Thermo Fisher Scientific, Madison, Wisconsin.
(1) Directive 2002/95/EC of the European Parliament and of the Council of 27 January 2003 (Restriction on Hazardous Substances).
(2) Directive 2002/96/EC of the European Parliament and of the Council of 27 January 2003 (Waste Electrical and Electronic Equipment).