Spectroscopy
This article discusses emerging trends in the design and use of spectroscopic instrumentation. It focuses on recent research using new or modified spectroscopic techniques that are advancing scientists’ capability to obtain high-content, high-resolution data from ever-smaller sample sizes. To illustrate this trend, the article surveys novel approaches to complex measurement problems across a wide range of critical fields, including disease research, food safety, environmental monitoring, and drug development.
There are many possible ways to frame a discussion of trends in spectroscopic instrumentation: market reports on sales figures and forecasts, literature surveys of techniques described in hot papers, summaries of major presentations at international conferences, and so forth. This article incorporates information from all of the above sources, but with a timely twist that pays small tribute to a larger-than-life figure in the field: Tomas B. Hirschfeld.
Thirty years ago this April, the international spectroscopy community was mourning Hirschfeld’s untimely death at 46. Hirschfeld was widely regarded as one of the most innovative minds in the recent history of analytical chemistry: a scientific Renaissance man known for predicting future trends with uncanny accuracy. A native of Uruguay, Hirschfeld held 100 patents, five R&D 100 awards, and an interdisciplinary legion of admiring scientists in his thrall. He dazzled his peers with one out-of-the-box idea after another. In the chemical analysis field alone, Hirschfeld either invented, contributed to, or inspired innovations in fiber optics, process analysis, chemometrics, flow-injection analysis, separation–spectroscopy hybrids, near-infrared reflectance spectroscopy, Fourier transform infrared (FT-IR) and FT-Raman spectroscopy, and much more. His legend also looms large in fields such as analytical cytology, where his work at the interface of biology, chemistry, physics, and engineering helped advance that technique as a life-saving medical diagnostic tool (1).
One of Hirschfeld’s last publications was a 1985 commentary in Science (2) predicting big changes in store for analytical instrumentation in the decade ahead. His bold-for-the-times speculations suggested a future fueled by rapid advances in computation, miniaturization, and optical engineering. A few of his predictions follow:
· “As the average instrument achieves a rather considerable level of intelligence, ‘dumb’ systems will become the exception, and we will eventually begin to become proficient in exploiting the resulting capabilities.”
· “More sophisticated understanding of measurement science and of actual measurement needs will drive instrumentation design advances such as miniaturized sensors and yet more ‘hyphenated’ instruments and ‘mapping’ instruments.”
· “The combination of sensor-based instrumentation and microminiaturization will make possible distributed measurement by allowing point-of-use measurements by nonexperts.”
In late 1985, the personal computer (PC) was just making its mark on science. The technical challenges of interfacing mass spectrometers with high-performance separation systems were daunting. Many of today’s workhorse imaging modalities were embryonic or had yet to be invented. Nanotechnology was unheard of. But the rapid and dramatic progress in each of these areas over the ensuing three decades, especially in the context of 21st-century megatrends such as genomics and “Big Data,” continues to shape how analytical instruments are designed and used. The steps toward miniaturization, exceptionally high resolution and sensitivity, and automation that he foresaw have led to ultrasensitive, ultrasmart spectroscopic instruments that are driving progress today, and putting portable spectrometers in the hands of scientists and engineers from diverse backgrounds.
On the 30th anniversary of Hirschfeld’s death, it is timely to focus on a few of today’s trends in instrumentation through that predictive lens he held up in the 1980s.
Needs Drive Design
The first of Hirschfeld’s predictions we discuss holds that a “more sophisticated understanding of measurement science and of actual measurement needs will drive instrumentation design.” What that prediction looks like in real life depends on the “actual measurement needs” in question. In health care, doctors can now tailor personalized treatments for cancer and other diseases based on data from genomic analysis, multiscale cell and tissue imaging, and spectral characterization of molecular structure. In the food quality industry, a key task is to measure the presence and potential health impacts of nanoparticles. And in the electronics field, the perennial challenge of making smaller, smarter, faster, cheaper, higher-quality chips continues to drive measurement needs. As Hirschfeld pointed out, the dramatic size and cost reductions that industry achieved in designing successively more powerful computers have also translated into a new generation of analytical instruments that measure far more with a far smaller footprint than ever before. With so many ways to examine the big picture of the needs-drive-design paradigm, this article will zoom in on the small picture: single-particle, single-molecule, and single-cell spectroscopic analysis.
At Block Engineering in the late 1960s, Hirschfeld contributed to some of the earliest commercial FT-IR spectrometers while also taking the nascent cell-sorting technology of flow cytometry to new heights. His work to enhance instrumental sensitivity enabled biochemists to detect and measure single viruses and individual molecules. Spectroscopy has since emerged as a useful tool for measuring the physical and chemical properties, biological function, and structural architecture of individual cells, molecules, and particles. This level of resolution was once driven by instrument designers’ aspirations to see how far they could push the limits of microanalysis technology. But in application areas such as biomedical research, nanomaterials fabrication, and environmental safety, there are now large, practical questions to be answered within these small structures.
Single-Particle Analysis
One of the more contentious results of the nanotechnology revolution is growing scrutiny of the wide range of engineered nanoparticles (ENPs) added to foods and other consumer products. Nanoparticles are used to impart useful properties to food products and packaging: they can enhance a product’s color or texture, combat odors, or fight off microbial contamination. On the other hand, significant questions about their impacts on human health remain unanswered. The use of ENPs in foods has proliferated far more rapidly than the study of their health impacts or the development of regulatory guidelines (3).
ENPs made of materials such as silver, copper, gold, or titanium or zinc oxides are also used in cosmetics, sunscreens, nutritional supplements, and other products made to be ingested or applied topically. These tiny materials move through tissue more readily than conventionally sized particulate ingredients. Amid fears of a heightened risk that they could cross the blood–brain barrier or penetrate cell nuclei or the placenta, the presence of certain nanoparticles in foods and drugs is now under increased scrutiny by the U.S. Food and Drug Administration (FDA). In Europe, where food producers are required to disclose the presence of nanoparticles on food labels, the European Union (EU) launched the three-year NanoLyse project in 2010 to develop and validate standard analytical methods for the detection and characterization of engineered nanoparticles in food. To acquire such a wide range of data from such small constituents in a sample, teams designed multihyphenated techniques coupling field flow fractionation, ultraviolet–visible (UV–vis) detection, dynamic and multiangle light scattering, and inductively coupled plasma-mass spectrometry (ICP-MS) to detect, size, and quantify silver and silica nanoparticles in foods (4).
To help laboratories comply with federal rules and association standards, instrument manufacturers are developing instrument and software platforms customized for the job. Methods that measure the total analyte concentration, confirm the presence of certain particles in a sample, and characterize the size and number of particles are finding use in food, semiconductor, pharmaceutical, environmental, and other fields. Laboratories are turning to ICP-MS, energy-dispersive X-ray fluorescence (EDXRF), and specialized modes of electron microscopy to glean this hard-to-find data from complex samples.
Single-particle ICP-MS, used with or without a preliminary separation step, provides much of what the industry is seeking in a robust, cost-effective standard measurement method. It satisfies the regulatory need for data on particle size, concentration, and associated dissolved constituents in less than 3 min per sample, with minimal sample preparation or method development requirements (5).
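The sizing arithmetic at the heart of single-particle ICP-MS is simple enough to sketch: each detector pulse is converted to a particle mass through a calibration factor, and that mass to a sphere-equivalent diameter through the particle material’s density. The minimal Python sketch below uses hypothetical calibration values for silver nanoparticles; a real method would also determine nebulizer transport efficiency and correct for dissolved background.

```python
import numpy as np

# Hypothetical calibration: counts per femtogram of Ag delivered to the
# plasma (in practice derived from dissolved standards plus a measured
# nebulizer transport efficiency).
SENSITIVITY = 120.0       # counts per fg, illustrative value
RHO_AG = 10.49e-6         # density of silver, fg per nm^3 (10.49 g/cm^3)

def pulse_to_diameter(pulse_counts):
    """Convert a single-particle pulse (counts) to a sphere-equivalent diameter (nm)."""
    mass_fg = pulse_counts / SENSITIVITY          # counts -> particle mass
    volume_nm3 = mass_fg / RHO_AG                 # mass -> particle volume
    return (6.0 * volume_nm3 / np.pi) ** (1.0 / 3.0)

# Pulses picked out of the time-resolved signal, for example by thresholding
pulses = np.array([35.0, 80.0, 410.0, 52.0])
print(np.round(pulse_to_diameter(pulses), 1))     # diameters in nm
```

Running the pulse heights through this conversion yields the particle-size distribution directly; the pulse frequency, divided by sample flow and transport efficiency, gives the number concentration.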
Apart from the regulatory issues, nanoparticles remain a critical subject of basic chemical investigation. Laser-induced breakdown spectroscopy (LIBS), a technique that has flourished in the 30 years since Hirschfeld’s death, has recently been used to study metal-oxide nanoparticle aerosols. A team from Rutgers University and China’s Tsinghua University in Beijing developed a phase-selective LIBS variant using secondary resonance excitation from the same laser pulse to measure the flame synthesis of TiO2 nanoparticles (6). Led by Rutgers’s Stephen D. Tse, the group observed a number of experimental advantages to TiO2 measurement using the modified method, including a significant improvement in the detection threshold.
Silver is the most common nanoparticle used in consumer products, driving a number of new spectroscopic applications for techniques such as surface-enhanced Raman spectroscopy (SERS). For example, a group of agricultural chemists from the University of Massachusetts, Amherst tested the feasibility of SERS for the detection and quantification of silver nanoparticles in antimicrobial products (7) and concluded that SERS is a promising choice for studying environmentally relevant levels of silver nanoparticles in consumer products and related matrices.
At Rice University, where the modern nanotechnology field began with the discovery and characterization of buckminsterfullerene, scientists have developed a variance spectroscopy method to analyze carbon nanotubes in solution (8). The technique zooms in on small regions of a dilute nanotube solution and takes several thousand rapid spectral measurements, which are then compared to reveal the types, numbers, and properties of the nanotubes in the sample.
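The statistical bookkeeping behind variance spectroscopy is compact enough to illustrate. In the sketch below, which uses simulated spectra for a single emitting species rather than real nanotube data, scan-to-scan fluctuations in the number of particles inside the probed volume make the ratio of the variance spectrum to the mean spectrum approximate the spectral brightness of a single particle.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans, n_wl = 3000, 512
wavelengths = np.linspace(900, 1400, n_wl)          # nm, near-IR emission range

# Simulated stand-in data: one nanotube species whose count in the small
# observation volume fluctuates from scan to scan (Poisson statistics).
species_peak = np.exp(-0.5 * ((wavelengths - 1130.0) / 15.0) ** 2)
counts = rng.poisson(lam=20, size=n_scans)
spectra = np.outer(counts, species_peak) + rng.normal(0, 0.05, (n_scans, n_wl))

mean_spec = spectra.mean(axis=0)
var_spec = spectra.var(axis=0)

# For pure number fluctuations, variance/mean at an emission peak scales
# with the brightness of one particle of that species.
per_particle_brightness = var_spec / np.clip(mean_spec, 1e-9, None)
```

With multiple species present, peaks in the variance spectrum and covariances between wavelength channels help separate the contributions of each population.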
Single-Cell Analysis
Spectroscopists have been capable of characterizing single cells and even individual molecules for a few decades now. The rise of genomics and ultrahigh-resolution molecular imaging has been driving progress across the scientific landscape. In cellular biology, the ability to measure the myriad individual molecular components within a single cell feeds the growing understanding of how these structures work together as a system. In a tumor, for instance, researchers face a bewilderingly complex and heterogeneous matrix of healthy and diseased cells that are continually interacting with one another and with the extracellular microenvironment. Single-cell and single-molecule imaging and measurement tools are beginning to make sense out of the disease process. “Genetics has gotten very good at finding genetic variation associated with disease. Going from genetic results to biological understanding of disease is the next important challenge,” said Steve McCarroll, a Broad Institute member and director of genetics in the Stanley Center for Psychiatric Research at the Broad as well as an assistant professor of genetics at Harvard Medical School.
Several groups are exploring the use of FT-IR and Raman microspectroscopy for a range of diagnostic and pathology measurements. For example, Rohit Bhargava’s group at the University of Illinois Urbana-Champaign is developing FT-IR chemical imaging methods that help make sense of the molecular chaos within a malignant tumor. With the goal of identifying early molecular indicators of a potentially aggressive, likely-to-recur prostate tumor, the team records FT-IR data from patient biopsies to expose patterns of molecular expression that may be associated with recurrent or metastatic cancers. The researchers believe this method, which outperformed the two standard measures of recurrence risk, can be adapted for the assessment of other solid tumors (9). In the United Kingdom, a team at the University of Exeter led by Nicholas Stone has adapted Raman techniques for use as in vitro and in vivo tests in the treatment of cancer, diabetes, asthma, malaria, and other diseases. Stone’s group is using spatially offset Raman spectroscopy (SORS) and transmission Raman spectroscopy (TRS) to probe more deeply below the tissue surface. Whereas conventional Raman illumination penetrates tissue only a few hundred micrometers, SORS can measure volumes as deep as 10–20 mm by collecting the scattered laser light away from the illumination site (10). An emerging method combines surface-enhanced Raman with SORS (SESORS), creating the potential to measure much greater tissue depths.
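The first-pass data reduction behind SORS is worth sketching: a zero-offset spectrum, dominated by the surface layer, is scaled so that a band known to arise only from the surface cancels, and is then subtracted from the spatially offset spectrum to leave an estimate of the subsurface signal. The example below runs on synthetic spectra; published SORS processing is considerably more refined.

```python
import numpy as np

rng = np.random.default_rng(1)
shift = np.linspace(400, 1800, 700)                  # Raman shift axis, cm^-1

# Synthetic stand-ins: one band from the surface layer, one from below it.
surface_band = np.exp(-0.5 * ((shift - 1003.0) / 6.0) ** 2)
deep_band = np.exp(-0.5 * ((shift - 1450.0) / 8.0) ** 2)

# The zero-offset collection is surface-dominated; the offset collection
# carries relatively more subsurface signal (illustrative mixing weights).
zero_spec = 1.0 * surface_band + 0.1 * deep_band + rng.normal(0, 0.01, shift.size)
offset_spec = 0.4 * surface_band + 0.3 * deep_band + rng.normal(0, 0.01, shift.size)

# Scale the zero-offset spectrum so the surface-only band cancels, then subtract.
window = (shift > 985) & (shift < 1020)
scale = offset_spec[window].sum() / zero_spec[window].sum()
subsurface_estimate = offset_spec - scale * zero_spec
```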
A Raman-based technique developed at the University of Vienna, for example, has been used as a means of identifying physiologically active cells in microbial communities (11). The authors found the nondestructive technique advantageous for its speed, high resolution, and compatibility with widely used tools such as optical tweezers and fluorescence in situ hybridization (FISH). Other workers are interested in adapting the technique as a more specific way of selecting cells for single-cell sequencing. Many groups are adapting Raman and IR spectroscopy in disease-specific research. A group led by Max Diem at the City University of New York is exploring the use of IR spectral cytopathology (SCP) to screen for the presence of cancer in the upper respiratory and digestive tracts (12). Combining IR microspectroscopy and multivariate analysis, the technique reveals the biochemical composition of a cell as it changes over time.
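The multivariate step in techniques like SCP follows a common pattern that can be sketched briefly: normalize each cell’s IR spectrum, then project the set onto a few principal components whose scores can be clustered or classified. The example below uses synthetic stand-in spectra, not measured cell data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_wl = 450                              # points across the fingerprint region

# Synthetic stand-ins: "normal" cell spectra plus a group carrying a small,
# systematic biochemical shift superimposed on the same baseline.
normal = rng.normal(0.0, 0.02, (60, n_wl)) + np.linspace(0.2, 0.8, n_wl)
altered = normal[:50] + 0.05 * np.sin(np.linspace(0.0, 6.0, n_wl))

spectra = np.vstack([normal, altered])
# Vector normalization is a common pretreatment before multivariate analysis.
spectra /= np.linalg.norm(spectra, axis=1, keepdims=True)

scores = PCA(n_components=3).fit_transform(spectra)
print(scores.shape)     # (110, 3): three coordinates per cell for clustering
```

In practice, the score plots, rather than the raw spectra, are what the analyst inspects for separation between normal and abnormal cells.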
Hyphenation
Hirschfeld observed that many laboratories were encountering dramatically increased demands to provide more information in less time from increasingly complex, harder-to-measure samples. Although that is a perennial problem, demands on regulated environmental and pharmaceutical laboratories in the early 1980s were heightened in part by federal legislative landmarks creating the EPA’s Superfund for toxic waste site cleanup and establishing an abbreviated FDA approval process for generic bioequivalent drugs. In terms of both sample volume and complexity, these and other socioeconomic factors of the time underscored the need for fast, reliable analytical methods with broader capabilities. Hirschfeld viewed the rise of so-called hyphenated techniques as the most likely solution.
“There is no one instrument that can solve, for example, the problem of doing a rapid, first-run qualitative and quantitative analysis of a complex, totally unknown sample,” Hirschfeld wrote. “However, if one instrument cannot perform this task, a combination of them can.”
In the mid-1980s, intensive experimentation in instrument coupling yielded a panoply of new systems and applications, some of which are ubiquitous today while others have faded from the limelight. Hirschfeld predicted near-term commercial viability for techniques such as liquid chromatography–nuclear magnetic resonance spectroscopy (LC–NMR) and gas chromatography–ultraviolet (GC–UV) spectroscopy, while correctly forecasting a boom in the use of microcolumn capillary electrophoresis with MS in the life sciences. He identified optical emission spectroscopy as a likely contender for hyphenation, which came to pass with the coupling of an ICP source with an MS detector for elemental mass analysis.
The first commercial ICP-MS systems hit the market in 1983. The technique offered a step forward in elemental analysis, meeting or exceeding the performance of traditional atomic absorption (AA) or ICP systems in terms of detection limits, analytical throughput, and freedom from matrix interferences. As the technique has matured and the range of commercial instrumentation has broadened, ICP-MS has become a powerful choice for traditional materials and earth science laboratories and is finding important life science applications including genomics, bioimaging, medical diagnosis, and cancer biomarker analysis.
Current efforts to advance system performance are often focused on managing the conditions in the plasma source, especially for emerging applications in laser-ablation and nanoparticle analysis. Computational modeling can reveal how plasma properties are impacted by gas flow rates, power, the presence of certain accessories, and other factors. These insights guide users in properly calibrating their systems to optimize results. ICP modeling was a plenary topic at the 2015 European Winter Conference on Plasma Spectrochemistry in Münster, Germany, bringing an interdisciplinary group of users together to share experiences (13).
At the same event, other plasma emission experts reviewed recent progress in the coupling of laser-ablation ICP-MS with LIBS to bring determination of the full periodic table within reach of a single technique. This emerging hybrid for solid samples enables the analysis of organic and lighter elements, elemental mapping, and simultaneous measurement of major and trace elements and isotopes.
Big Brains, Small Packages
There are two more Hirschfeld predictions to review. The first holds that the efficient generation of large databases in computer-readable media, along with new administrative machinery to make them affordable to all, will be one of the critical needs of the coming years. The second states: “The combination of sensor-based instrumentation and microminiaturization will make possible distributed measurement by allowing point-of-use measurements by nonexperts.” Let’s explore these predictions in more detail.
When Hirschfeld predicted the rapid decline of what he called “dumb” instrument systems, the idea of a spectrometer with built-in intelligence was new. Technological advances and price decreases in microelectronics, driven by high-volume applications of the recently introduced PC, trickled down to the spectroscopy laboratory, bringing once-manual aspects of instrument management and operation under software control. Even then, the large volumes of experimental and control data generated by intelligent instruments raised questions of where to store all that information and how best to use it to ensure better measurements.
In 1986, the development of optical disks gave scientists a cheap means of adding gigabytes of data storage to a PC, a capacity that far exceeded the immediate requirements of most spectroscopic databases of the time. “When a computerized instrument can store the equivalent of a large library in a dozen compact disks, the nonexistence of the library will cause some unhappiness,” Hirschfeld quipped.
Today, the tables have turned. Big data is more than a buzzword: it’s a concise way to express the unimaginably vast scale of computational power coming within reach of spectroscopists today. Not every laboratory needs to store, share, and process multiple terabytes of data, but in a number of fields that issue looms large. Health care, for example, is moving steadily toward the precision medicine paradigm, in which the diagnosis and treatment of cancer and other diseases of genetic origin is premised on each patient’s specific results from a battery of sophisticated pathology tests, including whole-exome sequencing, high-resolution imaging, molecular tissue characterization, physical property analysis, and text from the patient’s electronic health records. Technology companies like Intel are partnering with medical centers and research universities to test and refine cloud-based analytics platforms that enable multiple institutions to share and add to one another’s patient-specific genomic, imaging, and clinical data (14). The advantages of cloud-based storage are not reserved for such rarefied applications, however. Commercial enterprises such as LabCognition Analytical Software offer computing solutions for general laboratory use. The company’s SciGear product stores and helps manage spectroscopic and sensor data from IR, near-IR, UV–vis, Raman, and other instruments. The product can be housed on a PC, on one or more servers, in a data center, in a customer’s private cloud, or on the company’s public cloud.
Beyond the exascale world of big data, there is plenty of complexity and intelligence in the programs that power day-to-day spectroscopic applications. As processing speed accelerates, electronics shrink, and prices fall, Hirschfeld’s prediction of durable, automated analyzers ruggedized for use beyond the laboratory continues to play out. In this context, small and smart go together because it is the machine, and not the operator, that increasingly is on the front lines of analytical measurement.
Hirschfeld was an advocate for the development of measurement instruments that empower end users to record and understand critical chemical data without calling in the specialists. At Lawrence Livermore National Laboratory, a major focus of his work was the development of practical miniature sensors capable of measuring key chemical parameters of remote processes over optical fibers. His affiliation with what is now the Center for Process Analysis and Control at the University of Washington brought his insights to bear on a wide range of innovative solutions for industrial process measurements. He and other early true believers in the power of spectroscopy to help engineers, plant managers, and other nonspecialist users improve productivity fueled a new niche in the instrument market for spectrometers designed for use beyond the laboratory.
The value of miniaturized spectrometers, Hirschfeld wrote, is connected to concurrent advances in the miniaturization of their internal electronics, which contribute to progressively smaller and more versatile analyzers. “It allows, in succession, portable, then hip-pocket, and then point-of-use implanted instrumentation. This reduction in size, already attained by some of the simpler electronic measurement instruments, will become a general feature of all kinds of instrumentation in the near future.”
Fast-forward to the introduction of Apple’s iPhone and the nonstop development of new applications (apps) that put astonishingly complex technologies within anyone’s reach. In the emerging area of citizen science, activists with groups such as Public Lab are acquiring kits to home-build simple but functional app-driven spectrometers with which they keep tabs on air quality near local industrial facilities. So far, the in-principle feasibility of this technology has not driven large consumer demand for state-of-the-art chemical measurement at the touch of a button. But that may be changing, at least in market niches where the need to know, right now, trumps all.
The opportunity for a breakthrough consumer or life science application will motivate the continued push for smarter, smaller spectroscopic analyzers. One example of this trend is a collaborative effort between the University of Western Australia and MEMS developer Panorama Synergy to develop optical spectroscopic sensors for new applications in the Internet of Things market. According to the company, Panorama is adapting the university’s patented spectrometer technologies for operation on cell phones and other personal electronic devices, which consumers could use, for example, while grocery shopping to evaluate food properties such as freshness.
These consumer measurements may or may not inspire the “killer app” that makes spectroscopy as commonplace in the household as global positioning system (GPS) technology. But instrument companies continue to expand the range of serious spectrometers available in handheld, portable, or remote configurations, delivering substantial returns for industrial customers who apply these tools to get the measurements they need, when they need them, at a lower cost than traditional laboratory analysis.
Handheld or mobile FT-IR instruments are finding increased use for quality or production control in the food, materials, energy, and mining sectors. To deliver laboratory-caliber performance in demanding point-of-use applications, these workhorse instruments are toughened for relatively maintenance-free operation in ambient conditions. Many are powered by software that guides nonexpert users through key measurement steps, yet they can also be adapted for benchtop use in more sophisticated method development and data handling tasks (15).
The food industry’s need for reliable, high-throughput safety and quality monitoring tools across the supply chain becomes apparent during highly publicized outbreaks of foodborne illness. According to the U.S. Centers for Disease Control and Prevention, an estimated 48 million cases of foodborne illness occur every year, resulting in some 3000 deaths. Last year, the fast-food chain Chipotle drew the spotlight with a multistate E. coli outbreak, just one of a dozen or more cases of foods tainted by microbes such as Salmonella and Cyclospora entering the food supply.
The IR group in McGill University’s Department of Food Science and Agricultural Chemistry has been adapting rapid FT-IR techniques for the analysis of oils and other fluid food products for decades (16). The fast, reagent-free nature of FT-IR is an obvious boon in the case of an outbreak, when quick identification of the offending toxin is critical. The team has been working to transfer its concepts to scaled-down FT-IR systems suitable for point-of-measurement use by nonexperts, such as food inspectors or production managers. Another example of miniaturized FT-IR in food analysis is the work of Luis Rodriguez-Saona of Ohio State University’s Food Innovation Center. Rodriguez-Saona develops chemometrics-based applications for portable analyzers to deliver benchtop-caliber results in pathogen analysis, materials characterization, freshness testing, and other common food quality measurements (17).
Industries that rely on advanced materials such as semiconductors, polymers, or specialty coatings also have a lengthy quality assurance (QA) and quality control (QC) checklist for which FT-IR is well suited. With easy swapping among sampling techniques such as attenuated total reflectance (ATR), specular reflectance, diffuse reflectance, and grazing-angle sampling, the new generation of toughened-up FT-IR instruments is useful on loading docks, factory floors, and production lines because these systems work in ambient conditions and maintain stable interferometer performance in unstable surroundings. Analysis of incoming raw materials shipments, real-time process monitoring, and measuring the impact of product wear and weathering are among the possibilities (15).
An emerging enabling technology that is putting FT-IR and other mobile techniques to work is the tunable IR laser. Lasers and spectroscopy were still getting acquainted in Hirschfeld’s time, at least in terms of routine industrial applications, and the idea of putting a laser-based spectrometer in a hostile environment like an oil well site or a crime scene was beyond the pale. The photonics industry, of course, rapidly changed the equation, producing lasers and electro-optical systems with remarkable flexibility for spectroscopic use. One of the greatest analytical matches for FT-IR’s measurement capabilities is the noncontact detection of suspected explosive devices and chemical agents in security and defense applications. Speed, specificity, and sensitivity are key in standoff analysis, and optical spectroscopy, especially FT-IR, appears to be an emerging choice, as are LIBS, Raman, and fluorescence. Along with optical parametric oscillators and quantum cascade lasers, ultrafast tunable lasers have emerged as promising mid-IR sources for field-deployed FT-IR devices. Tunable mid-IR fiber lasers are evolving to address some of the traditional signal-to-noise limitations of using lamp-source FT-IR instruments with mid-IR fiber probes in remote measurements. Beyond better signal-to-noise performance, these lasers deliver narrower linewidths for heightened selectivity and sensitivity, along with other features that make it faster and easier to get the instrument to the point of measurement, acquire the data, and get results out rapidly.
Mass Goes Mainstream
In Hirschfeld’s time, mass spectrometry was still one of the most specialized spectroscopic disciplines, the MS laboratory was the rarefied domain of an insular group of specialists, and there were few opportunities for MS and optical spectroscopists to share expertise. But as he predicted, pressing needs in measurement science have brought the power of MS instrumentation within reach of nonspecialists. The migration of MS technology into widespread use in many different laboratory environments is fueled by the same advances in hardware and computing that have miniaturized other spectrometers. Increasingly, the traditionally gargantuan machinery of mass analysis can be scaled down to create new applications in environmental and industrial settings. To track the state of the art in miniaturized MS, the scientific program of Pittcon has featured an in-depth session on the topic for the past several years; this year’s emphasis was on applications of autonomous, battery-powered mass spectrometers as light as 50 lb.
Just because a spectroscopy system is designed for use by nonexperts doesn’t mean it can be operated by just anyone. Intraoperative analysis is an emerging example of point-of-use spectroscopy and imaging that requires multidisciplinary teams of surgeons, scientists, engineers, and technicians to run. But the potential impact on surgical patients is profound. For example, giving a brain surgeon the ability to visualize the surgical site and verify progress in three dimensions during a procedure dramatically increases the likelihood of correcting the problem the first time, sparing patients the burden of multiple open-skull operations. Intraoperative MS techniques for the analysis of lipid and metabolite profiles are under development in R. Graham Cooks’s laboratory at Purdue University (18). The group is working to improve the intraoperative utility of desorption electrospray ionization (DESI) MS in distinguishing between healthy and diseased tissues.
Point-of-care MS systems are also bringing the technique’s established power in a range of drug monitoring applications directly into the doctor’s office, as ambient MS and other direct-measurement MS techniques enter the point-of-care era. Also at Purdue, the laboratory of Zheng Ouyang is working to refine compact MS instruments and experimental protocols for medical diagnostics in the clinic (19). The group has optimized a method that combines slug-flow microextraction with nanoelectrospray ionization for organic analysis in matrices such as blood or urine. The simplified technique has been adapted for clinic-based measurements such as the determination of therapeutic or illicit drugs in sample volumes as low as 5 μL. The group’s miniature spectrometer platform is engineered for low-cost, low-maintenance operation while performing critical quantitative clinical measurements. Among the end uses envisioned for the tool is point-of-care monitoring of patient response to oncology therapeutics, an increasingly important need in the development of precision, patient-specific cancer therapies. With protocols designed for use by physicians and nurses without chemical analysis experience, such systems fully validate Hirschfeld’s prediction.
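Whatever the front-end extraction chemistry, the quantitative readout in a system like this typically reduces to ratioing the analyte signal against an isotope-labeled internal standard and reading the ratio off a stored calibration line. The sketch below illustrates that logic with made-up calibration numbers; it is not the Purdue group’s actual protocol.

```python
import numpy as np

# Illustrative calibration: analyte/internal-standard intensity ratios
# measured for spiked calibrants (concentrations in ng/mL).
cal_conc = np.array([10.0, 50.0, 100.0, 250.0, 500.0])
cal_ratio = np.array([0.09, 0.47, 0.98, 2.41, 4.95])

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

def quantify(analyte_counts, istd_counts):
    """Return analyte concentration (ng/mL) from one paired MS measurement."""
    ratio = analyte_counts / istd_counts
    return (ratio - intercept) / slope

# One hypothetical patient-sample readout from a ~5 uL draw
print(round(quantify(8.2e4, 6.9e4), 1))
```

Because the internal standard travels through the same extraction and ionization steps as the analyte, the ratio cancels most run-to-run variability, which is what makes such a simple calibration workable outside the laboratory.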
Conclusion
It is impossible to know what else Tomas Hirschfeld may have contributed if his career had not been cut short. However, what he did accomplish or put in motion during his stints at North American Rockwell, Block Engineering, Lawrence Livermore National Laboratory, and the University of Washington has left an indelible imprint on the spectroscopy world. Today, a new generation of scientists is striving to channel his unique approach to solving difficult problems through the power of measurement.
References