Do We Qualify or Validate a Spectrometer?

Spectroscopy, November/December 2024, Volume 39, Issue 8, Pages 22–30

Spectrometers have to be fit for their intended use; however, regulators separate analytical instrument qualification from computerized system validation. We critically review the qualification and validation approaches in the World Health Organization Technical Report Series (WHO TRS) 1019 Annex 3 and their applicability to spectrometer systems.

In December 2023, the Indian Good Manufacturing Practices (GMP) and Requirements of Premises, Plant and Equipment for Pharmaceutical Products regulations (Schedule M) were revised, with Section 12 stating:

The guidelines published by the World Health Organization (WHO) on following aspects relating to GMP through their Technical Report Series from time to time may be considered for general guidance purposes. (4) GMP guidelines for validation (1).

The applicable WHO guideline is TRS 1019, Annex 3 guidelines on validation (2), containing Appendix 5 on Validation of Computerized Systems (3) and Guidelines on Qualification in Appendix 6 (4). A similar approach is taken in EU GMP with Annex 11: Computerised Systems and Annex 15: Qualification and Validation (5,6). Qualification of analytical instruments and validation of computerized systems are Level 1 of the Data Integrity Model (7–9) and the foundation of the Data Quality Triangle in USP <1058> (10). These activities must be undertaken before the analytical procedure validation.

In this column, we highlight gaps and inconsistencies in TRS 1019 when qualifying a spectrometer and validating the application software. We also provide practical advice and interpretation for implementing qualification of spectrometers and validation of the controlling software (such as commercially available USP <1058> Group C systems) (10). Our interpretation is not all-inclusive and is not intended as a step-by-step review.

Let’s see how practical and useful these guidelines are. Spoiler alert: they could be much better. The big gap in computer validation is the lack of mapping and redesigning the analytical process before implementing a system, which would provide business benefits (11). Furthermore, there is no mention of how to ensure data integrity, which is a bigger problem. From this stellar start, let’s see what else remains to be discovered.

Why Do Regulators Separate Qualification and Validation?

Regulated GxP laboratories require a structured qualification and validation approach to ensure an analytical instrument and its associated system demonstrate fitness for intended use. However, GMP regulations for equipment are vague, as seen in 21 CFR 211.63 (12) and EU GMP Chapter 3, clause 3.34 (13). Regulators treat qualification and validation as separate topics (5,6), but the problem is that you need the software to qualify the spectrometer and the instrument to validate the software, as shown in Figure 1. Therefore, an integrated approach to qualification and validation is essential. US Pharmacopeia <1058> does this under the umbrella of qualification; it connects analytical instrument qualification (AIQ) and computerized system validation (CSV) from the instrument’s perspective and avoids the gaps that arise if the two are treated as separate tasks (10).

FIGURE 1: Integrated Instrument Qualification (IQ) and Computerized System Validation (CSV) is essential for spectrometer systems.

Inconsistent Terminology

One of the problems with comparing regulations is inconsistent terminology; unfortunately, this is present throughout all regulations, including TRS 1019.

Incomplete Definition of Computerized System: As described in Figure 1 of PIC/S PI-011 (14), a computerized system includes the controlled function, consisting of equipment (the spectrometer) operated by trained people. The TRS 1019 definition is wrong because it lacks the controlled function; it merely mentions peripheral devices, such as printers (2).

Inconsistent Qualification Terms: The 4Qs model is shown in Figure 2, where the same terms have different meanings depending on whether an analytical instrument is qualified or a computerized system is validated. Figure 2 also illustrates, in red, a three-phase integrated approach to Analytical Instrument Qualification and System Validation (AIQSV) from the European Compliance Academy (ECA) (15).

FIGURE 2: Terminology used for Analytical Instrument Qualification (AIQ) and Computerized System Validation (CSV).

  • The final stage before instrument release is Operational Qualification (OQ) for Analytical Instrument Qualification (AIQ), but for Computerized System Validation (CSV), it is Performance Qualification (PQ). To overcome the problem, USP <1058> takes the AIQ approach and OQ includes software validation (10).
  • The Food and Drug Administration (FDA) avoided using 4Qs in the General Principles of Software Validation (16) as have the first and second editions of Good Automated Manufacturing Practice (GAMP) 5 (17,18). However, the 4Qs model is ingrained in pharmaceutical laboratories and suppliers; thus, laboratories must be very careful to ensure that AIQ and CSV cover all elements of the system before release. There have been cases where laboratories think that an instrument OQ also validates the software. It does not. The CSV PQ confirms the overall suitability of the system for its intended use against the User Requirements Specification (URS).
  • For a spectroscopic system, there is no Design Qualification (DQ) as a supplier has designed it, and this information is confidential. A supplier also does not have full knowledge of the intended use. Thus, the user must ensure that the system is suitable for intended use (of course you have an updated URS, haven’t you?). A selection report, shown in Figure 2, replaces DQ and should include relevant documents as well as the supplier assessment/audit.
  • A Stimulus to the Revision Process article for USP <1058> has proposed calling the operational phase Ongoing Performance (19) instead of PQ.

Misleading Use of Validation Master Plan (VMP): In Annex 3 (2), the definition of a VMP is the same as in PIC/S PI 006-3 (20); however, the way the term is used in practice is misleading. By definition, a VMP is a high-level document describing the overall approach to validation and qualification, but it is equated with a qualification protocol in Appendix 6 clauses 4.11, 9.2, and 10.1 (4). Although the names are similar and a VMP and a validation plan are related, one cannot replace the other (see Figure 5).

SOPs and Training: The availability of Standard Operating Procedures (SOPs) is not harmonized throughout TRS 1019. In one place, it says they should be available before starting PQ; in another, at the end of PQ. In practice, SOPs must be available, with trained staff, before GxP release of the system.

Audit Trail: Regulations for audit trails are focused on computerized systems (Annex 11, 21 CFR Part 11, and PI 041-1). Unfortunately, the definition in Appendix 5 includes paper records, which is inconsistent for computerized systems.

Glossary Inconsistencies and Gaps: The glossaries throughout TRS 1019 do not provide consistent definitions of key terms, as shown in Table I. The list is not exhaustive.

If WHO can’t get consistent definitions and terminology, what disasters remain to be discovered?

WHO TRS 1019

TRS 1019 Annex 3 separates qualification and validation, as shown in Figure 3 (2).

FIGURE 3: Structure of WHO TRS 1019 Annex 3 Guidelines on Validation with a focus on Appendices 5 and 6 (2).

Annex 3’s introduction contains an overarching text on qualification and validation concepts (2), but covers a miscellany of items, such as buildings, cleaning, analytical systems, water, and analytical instruments.

Appendix 5 covers specific aspects of computerized systems validation.

Appendix 6 was originally titled Validation on Qualification of Systems, Utilities, and Equipment, but has been changed to Guidelines on Qualification, though it still covers a miscellany of subjects, such as premises, systems, utilities, and equipment.

The guidance has no specific section for the analytical instruments and systems, so it is imperative to provide a practical interpretation, especially for the pharmaceutical companies that use TRS 1019, such as Indian GMP laboratories.

To Validate or Qualify? That is the Question

As Figure 2 demonstrates, integrated qualification and validation comprises three levels. Our focus in this section is to see what TRS 1019 says about lifecycle activities.

Annex 3 describes validation as a concept that incorporates qualification, but clause 4.2 states that qualification normally precedes validation, implying a separate activity. As shown in Figure 1, qualification must be integrated with system validation. You can’t do one without the other!

Lifecycles are mentioned throughout TRS 1019 (2) and are consistent with EU GMP (5,6); however, there is no elucidation of what phases a lifecycle should consist of. To define the extent of lifecycle actions and employ an integrated approach, you need to know your instrument criticality, the type of software, and its intended use (10,15). TRS 1019 only lists major equipment, critical utilities and systems, and critical or non-critical instruments, but there is no information provided to define what these terms mean (2).

Spectrometers are USP <1058> Group C systems with three types: non-configurable, configurable, and configurable with custom extensions (types C1, C2, and C3) (10). More details can be found in the ECA Guide on AIQSV (15). A spectrometer’s optical bench and sampling accessories can be parameterized via the software, but this is not customization.

Initial Requirements Specification

A current URS is the most important validation document, defining a system’s intended use (5,6,12,13). It enables you to buy the right system for the right job while protecting the organization’s investment. Before starting selection, a generic URS must be written that covers instrument and software requirements, mandatory GxP and data integrity requirements, and Pharmacopoeial requirements. Suppliers typically express instrument specifications in a way that maximizes the impression of performance (for example, signal-to-noise ratio), so it is important not to copy the supplier specifications and use them as the URS.

Appendix 5 (3) has the best URS description, but it covers no instrument control features. This is an example of how separating validation from qualification can lead to a gap at the start of the project (see Table II). For instance, you need to define adequate size for the spectroscopic system (12) and describe mandatory Pharmacopoeial requirements. Without documenting your intended use, you cannot select the right instrument and supplier, and vice versa. For example, the questions you may need to ask are:

  • What tests, such as identification or quantification, need to be performed?
  • What are the operating ranges of key instrument parameters?
  • How does the sample presentation work? For example, is it a solid or liquid?
  • How is data stored? For example, is it embedded, a standalone workstation, or part of a networked system?

System selection requires a written URS against which commercially available systems can be evaluated.
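
To make this concrete, the following is a minimal sketch in Python, with entirely hypothetical requirement IDs, categories, and wording, of how URS entries covering both the instrument and the software can be captured as discrete, testable statements that can later feed a traceability matrix:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    """One URS entry; the IDs, categories, and wording are illustrative only."""
    req_id: str      # unique identifier, used later for traceability
    category: str    # for example "instrument", "software", or "data integrity"
    text: str        # the requirement, written so that it can be tested or verified
    mandatory: bool  # mandatory (GxP/Pharmacopoeial) versus desirable

# Hypothetical examples of intended-use questions turned into discrete requirements
urs = [
    Requirement("URS-001", "instrument",
                "Identify and quantify solid samples by diffuse reflectance", True),
    Requirement("URS-002", "instrument",
                "Wavelength range 190-1100 nm with accuracy within +/- 1 nm", True),
    Requirement("URS-003", "software",
                "Spectra are acquired and stored on a networked server, not the local workstation", True),
    Requirement("URS-004", "data integrity",
                "The audit trail is enabled and cannot be disabled by any user role", True),
]

# A quick completeness check before the URS is used for system selection
for r in urs:
    assert r.text, f"{r.req_id} has no requirement text"
print(f"{len(urs)} requirements captured, {sum(r.mandatory for r in urs)} mandatory")
```

Writing each requirement as a separate, testable statement makes it straightforward to evaluate candidate systems against the URS rather than against a supplier’s specification.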

Specifications Are Living Documents

Appendix 5 (3) section 8.6 states:

Prior to the initiation of the system qualification phase, the software program and requirements and specifications documents should be finalized and subsequently managed under formal change control.

Specifications are living documents and need to be updated with approved versions as a project proceeds. As Figure 4 shows, a new project requires at least two URS versions. The first is a generic URS used to select the system and to identify any gaps against your requirements. The gaps should be addressed either by updating the URS or by the supplier improving the application; new user requirements can also be added during evaluation. The second version is generated after installation and user training to reflect the configured application and workflows (15,21). It is also essential to define user roles and the associated access privileges, as these are unknown during system selection (Figure 4).

FIGURE 4: User requirements and configuration specifications must be updated.

In Appendix 5, clause 5.3, a series of bullet points about URS content is presented, as shown in Table II (3). Some of these points list separate items that need careful evaluation and interpretation.

Problem of Separating Qualification and Validation

Table II does not mention how to specify the instrument operating ranges and Pharmacopoeial requirements. Although this guide only provides generic advice, it should reference other sources for specific instrument requirements. Here, only software functions are considered. Separating qualification of analytical instruments from software validation is a major problem, and regulations need to take an integrated approach, such as that of USP <1058> (10).

Alice in Wonderland?

If you want to buy a spectrometer system without a URS, STOP! Make sure your resume is up to date, and apply for a job in a supermarket. The reason? You will be following the “Alice in Wonderland” approach to system purchasing: if you don’t know your requirements, you can end up buying anything.

No Requirements Traceability

Traceability of requirements throughout the lifecycle is a regulatory requirement (5), yet it is a critical omission in TRS 1019, where traceability refers only to components installed during Installation Qualification (IQ) and to calibration materials. All requirements must be traceable from the URS to the test or verification documents, and vice versa (11).
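
As an illustration only, the following Python sketch, using hypothetical requirement and test IDs, shows the bidirectional check that a requirements traceability matrix supports: every requirement maps to at least one test or verification document, and every executed test maps back to a requirement:

```python
# Hypothetical traceability matrix: URS requirement IDs -> test or verification IDs
trace = {
    "URS-001": ["PQ-TC-01"],
    "URS-002": ["OQ-TC-03", "PQ-TC-02"],
    "URS-003": ["PQ-TC-05"],
    "URS-004": [],  # deliberately left untraced to show the forward check
}
executed_tests = {"PQ-TC-01", "OQ-TC-03", "PQ-TC-02", "PQ-TC-05", "PQ-TC-09"}

# Forward check: every requirement must be covered by at least one test
untraced_requirements = [req for req, tests in trace.items() if not tests]

# Backward check: every executed test must trace back to a requirement
covered = {test for tests in trace.values() for test in tests}
orphan_tests = executed_tests - covered

print("Requirements without tests:", untraced_requirements)  # ['URS-004']
print("Tests without requirements:", orphan_tests)           # {'PQ-TC-09'}
```

Whether the matrix is held in a validation tool or a simple spreadsheet, both checks should come back empty before the system is released.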

Supplier Assessment

A supplier assessment should be considered under Annex 11 clauses 3.1 and 4.5 (5). While TRS 1019 refers to supplier management (3), assessment is a better term that meets regulatory requirements. The focus in TRS 1019 is on the supplier’s Quality Management System (QMS), although it should also cover software development and testing to reduce in-house User Acceptance Testing/Performance Qualification (UAT/PQ) effort (11,22). For a Group C system with category 3 software, the assessment is based on risk, but for category 4 software, a supplier assessment should be performed (18). Appendix 5 (3) considers supplier assessment to be an ongoing process. In our opinion, supplier requalification depends on your internal company procedures.

Purchase

The hard work of selecting a system can be undone by the Purchasing Department picking an alternative to save money. It’s a spectrometer, isn’t it? Buying on cost alone can result in a higher total cost of ownership (TCO) if the substitute has poor compliance functions and is used as a hybrid system instead of an electronic one. To limit the risk of inappropriate purchasing decisions, some laboratories use the supplier’s specification so that only one instrument can be selected; however, such specifications typically cannot be tested. Bring purchasing into the selection process and explain why a specific system and supplier are required. A selection report is strongly recommended, as shown in Figure 2. The purchase order also defines the initial configuration of the system, which is an input to the IQ, as mentioned in TRS 1019 Annex 3 (2).

System Risk Assessment

A system risk assessment is performed early in a project to determine how much work is needed to demonstrate fitness for intended use, as shown in Figure 5. A risk assessment was published in an earlier “Focus on Quality” column with coauthor Chris Burgess (23), with an updated version found in the European Compliance Academy (ECA) guide (24). The outcome determines whether an integrated validation document suffices for simpler systems (22) or whether a full validation project is required. TRS 1019 mentions general principles of quality risk management but lacks any advice at the system level.

FIGURE 5: VMP, System Risk Assessment, Validation Plan, and Summary Report.

Validation Plan

After the system risk assessment, you need to write a plan for the tasks that you will conduct over the validation lifecycle. The guidance uses the phrase “qualification and validation protocols” to describe validation activities (2); we suggest instead writing a validation plan, which is a regulatory expectation in Annex 15 (6). The validation plan describes the overall intent of the validation and the documents to be written, as shown in Figure 5.

Installation and Commissioning

As shown in Figure 2 and Figure 6, the work here is also covered under IQ and OQ.

Plan for Installation: All regulations are consistent that equipment needs to be suitably located. Depending on the complexity of the purchased system, an installation plan may be required to cover any specialist power, gas supply, network connection, network server, or printer requirements. Any preparatory work should be undertaken before the qualification work starts.

Platform Installation: First, the IT platform is installed, configured, and qualified. This is typically a standalone workstation provided by the IT department or by the supplier, which should be connected to the network.

System IQ and OQ: The supplier installs the system components and integrates them following an approved IQ protocol. Next, the OQ is performed by the supplier on the unconfigured software. Any data generated should remain on the system, not on an engineer’s laptop, to avoid orphan data (25), or should be handed over with the documentation. IQ and OQ can be separate documents or integrated into a single document; the latter approach is supported by both TRS 1019 (2) and Annex 15 clauses 2.5 and 3.10 (6). Regardless of the approach, commissioning or OQ activities are performed on unconfigured software. Prototyping and configuration of the system occur after OQ, as shown in Figure 6, and once users have been adequately trained. Clause 9.1 of TRS 1019 Appendix 5 states that the software should be configured in the IQ (3), which shows that its writers lack practical validation experience. This is rubbish.

Integrated Instrument and Software IQ and OQ: As shown in Figure 6, supplier commissioning covers the documented GMP activities for spectrometers and their controlling software. Therefore, we need both the instrument and software to perform OQ, as shown in Figure 1. Table III positions OQ for all the system components.

FIGURE 6: Typical timeline of qualification and verification activities of a spectrometer system.

Prototyping and Application Configuration

Documenting the application configuration is a regulatory expectation for ensuring data integrity (26–28), and TRS 1019 Appendix 5, 6.3 covers configuration specification (3) as follows:

The system design and configuration specifications may include, as applicable, a software design specification, in case of code development, and configuration specifications of the software application parameters, such as security profiles, audit trail configuration, data libraries and other configurable elements.

This is a poor place to discuss configuration specifications under software design. Software design, code development, and testing occur at the supplier level. However, prototyping and configuration of the software are the responsibility of the laboratory after commissioning. Prototyping is an informal and optional step that can be performed by trained users to understand how the system software operates and select the best configuration settings for operational use, as shown in Figure 6.

Configuration can cover:

  • Use of e-signatures for paperless operation
  • Defining user roles and access privileges, such as providing no deletion privileges for any user
  • Avoiding conflicts of interest with access privileges
  • Unique user identities, allocation of a user role and password management
  • Turning on the audit trail and defining default reasons for change (DO NOT permit the audit trail to be turned off).

In addition, during prototyping, work can be started on developing some software test scripts and updating the URS, as necessary.
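
As a purely illustrative example, the configuration items listed above could be recorded in a configuration specification that can also be verified during PQ and at periodic review; the role names and settings in this Python sketch are hypothetical, not those of any particular spectrometer software:

```python
# Hypothetical configuration specification; role names and settings are illustrative only
config_spec = {
    "electronic_signatures": True,  # paperless operation
    "audit_trail": {
        "enabled": True,            # must never be turned off
        "default_reasons": ["Transcription error", "Wrong method", "Reprocessing"],
    },
    "roles": {
        # no role, including the administrator, has deletion privileges
        "analyst":   {"acquire": True,  "process": True,  "configure": False, "delete": False},
        "reviewer":  {"acquire": False, "process": False, "configure": False, "delete": False},
        "sys_admin": {"acquire": False, "process": False, "configure": True,  "delete": False},
    },
    "password_policy": {"min_length": 12, "expiry_days": 90, "unique_user_ids": True},
}

# Simple checks that can be repeated during PQ and at periodic review
assert config_spec["audit_trail"]["enabled"], "Audit trail must be enabled"
assert all(not role["delete"] for role in config_spec["roles"].values()), \
    "No user role may have deletion privileges"
print("Configuration specification passes the data integrity checks")
```

Recording the agreed settings in this explicit form makes it easy to compare the live system against the specification during periodic review.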

Demonstrating Fitness for Intended Use

As shown in Figure 6, this is testing the configured system against the URS. Let’s see what TRS 1019 Annex 3, clause 10.23, says about PQ/UAT:

Normally, PQ should be conducted prior to release of the… system (2).

At least they have this right! However, Appendix 5 states that commercial software is tested against the URS (right) and the DQ (very wrong) (3).

PQ or UAT is performed on the configured system to demonstrate that the system meets its intended use, as defined in the updated URS and configuration specification. Testing and verification will include:

  • Functional requirements, including data processing, modification, and associated audit trail entries.
  • Non-functional requirements, such as security, access control, backup, and recovery.

Although TRS 1019 refers to test, live, or production environments, for most spectrometers, testing will be conducted in the production environment.
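
As a sketch only, with hypothetical test IDs, steps, and acceptance criteria, PQ/UAT test cases can be recorded so that each one states whether it is functional or non-functional and references the URS requirements it verifies, feeding directly into the traceability matrix and the summary report:

```python
from dataclasses import dataclass

@dataclass
class PQTestCase:
    """One PQ/UAT test case; IDs, steps, and acceptance criteria are illustrative only."""
    test_id: str
    urs_refs: list              # URS requirement IDs verified by this test case
    kind: str                   # "functional" or "non-functional"
    description: str
    acceptance_criteria: str
    result: str = "Not run"     # "Pass", "Fail", or "Not run"

pq_tests = [
    PQTestCase("PQ-TC-05", ["URS-003"], "functional",
               "Acquire a spectrum and confirm the data file is written to the network share",
               "Data file and audit trail entry are present on the server"),
    PQTestCase("PQ-TC-07", ["URS-004"], "non-functional",
               "Attempt to disable the audit trail with each configured user role",
               "The audit trail cannot be disabled by any role"),
]

# Summarize outcomes for the traceability matrix and the validation summary report
for tc in pq_tests:
    print(f"{tc.test_id} ({tc.kind}) -> {', '.join(tc.urs_refs)}: {tc.result}")
```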

Summary Report

A Validation Summary Report (VSR) should summarize all validation work. TRS 1019 expects separate reports to be written for the IQ, OQ, and PQ phases (3). This approach does not align with the principles in USP <1058> (10) and Annex 15 (6), where some stages can be combined. IQ and OQ are typically performed by the supplier, and each protocol should include a proforma report, avoiding the need for a separate report. PQ testing can be summarized in the VSR.

A VSR is also the first calling point of a computerized system inspection, according to clause 23.10 in PIC/S PI-011 (14). It is important that what was intended in the validation plan is delivered in the VSR. Figure 5 shows that the VSR must tell the story of the validation, including any changes, incidents, deviations, additions, problems, and amendments to the validation plan, plus all deliverables from the work, such as the results of the PQ testing.

Control During Operational Use

Following GxP release, the system and instrument must remain validated and qualified over the lifecycle. There are four control levels shown in Figure 7, with each level being compared to TRS 1019. System retirement is out of scope.

  • Routine Use of Spectroscopy System: Daily system suitability tests are conducted to assure that the system can meet Pharmacopoeial criteria before analyzing samples. Trending test results is required by EU GMP Chapter 6.9 (29), but TRS 1019 does not consider this.
  • Instrument and IT Platform Repair: Fixing or replacing components is discussed in Annex 3, 10.25 (2), where a repair is a triggering factor to requalify the instrument. This is consistent with USP <1058> (10).
  • Instrument Preventative Maintenance (PM): This is a planned activity, usually conducted annually, immediately before instrument requalification. PM includes replacement of consumable parts, calibration, and cleaning. It is crucial that the laboratory understands the PM documentation so that it can justify the approach in an inspection. This topic should be included in Appendix 6 rather than in Appendix 5.
  • Instrument Requalification: This is a repeat of all or part of the instrument qualification following preventative maintenance or repair (15) to demonstrate that the instrument is fit for its intended use. Requalification is adequately covered in Annex 3 (2). Instrument requalification is performed more often than software revalidation.
  • System Revalidation: A repeat of all or part of the validation to provide assurance that changes in the system, like software upgrades, do not adversely affect the system operation or data integrity (15). Annex 3 is mostly centered on procedures, processes, and methods revalidation (2), and Appendix 5 provides no information on revalidation (3). Smith and McDowall have written an article discussing the upgrade and revalidation of analytical systems (30).
  • Periodic Review: An assessment that the computerized system remains in a validated state (5). This is addressed in Appendices 5 and 6 (3,4); however, both are inadequate, as sections on IT/system incident and problem management are missing. Problems are considered to be a collection of incidents with the same or a similar underlying cause, and they need to be reviewed, as required by Annex 11.11 (5,11).

FIGURE 7: Control levels during the operational use.

Summary

We provide a critical interpretation of TRS 1019 for the practical qualification and validation of computerized spectroscopic systems. It is another example of separating qualification and validation when systems require an integrated or combined approach. We highlight the inconsistencies and identify the gaps, such as the content of a URS, the use of DQ, the lack of requirements traceability, the absence of any mention of process improvement or data integrity, when to configure software, and unharmonized glossary definitions. However, TRS 1019 is referenced under Schedule M of Indian GMP, and it is not clear how it can be applied to analytical systems.

Because regulations separate AIQ and CSV, there is a need to demonstrate fitness for intended use from both perspectives. Combining the two approaches benefits both quality and the company’s business. Qualification applies to the hardware components, whereas validation applies to the process of use and the overall system.

Separation can result in a hole in your validation and qualification project, as discussed in this column.

In contrast, USP <1058> takes a combined approach with much more detail, as found in the recent ECA guide, which uses a three-phase lifecycle approach. A draft update of USP <1058> is anticipated to be published next year and may contain an integrated approach.

Acknowledgments

We wish to thank Behnusch Athenstaedt, Chris Burgess, Markus Dathe, Paul Smith, and Stefan Wurzer for their constructive review comments during the preparation of this column.

References

  1. Schedule M: Good Manufacturing Practice of Premises, Plant and Equipment for Pharmaceutical Products, Part 1: Good Manufacturing Practice for Pharmaceutical Products Main Principles. Department of Health and Family Welfare: New Delhi, India, 2023.
  2. WHO Technical Report Series No.1019 Annex 3. Good Manufacturing Practices, Guidelines on Validation. World Health Organisation: Geneva, 2019.
  3. WHO Technical Report Series No.1019 Annex 3, Appendix 5 Validation of Computerised Systems. World Health Organisation: Geneva, 2019.
  4. WHO Technical Report Series No.1019 Annex 3, Appendix 6 Guidelines on Qualification. World Health Organisation: Geneva, 2019.
  5. EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Annex 11 Computerised Systems. European Commission: Brussels, 2011.
  6. EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Annex 15 Qualification and Validation. European Commission: Brussels, 2015.
  7. McDowall, R. D. Understanding the Layers of a Laboratory Data Integrity Model. Spectroscopy 2016, 31 (4), 15–25.
  8. McDowall, R. D. Data Integrity Focus 1: Understanding the Scope of Data Integrity. LCGC N. Am. 2019, 37 (1), 44–51.
  9. McDowall, R. D. Do You Really Understand the Cost of Noncompliance? Spectroscopy 2020, 35 (11), 13–22.
  10. USP General Chapter <1058> Analytical Instrument Qualification. United States Pharmacopoeia Convention Inc.: Rockville, MD, 2016.
  11. McDowall, R. D. Validation of Chromatography Data Systems: Ensuring Data Integrity, Meeting Business and Regulatory Requirements, 2nd ed.; Royal Society of Chemistry, 2017. DOI: 10.1039/9781782624073
  12. 21 CFR 211 Current Good Manufacturing Practice for Finished Pharmaceutical Products. Food and Drug Administration: Silver Spring, MD, 2008.
  13. EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Chapter 3 Premises and Equipment. European Commission: Brussels, 2014.
  14. PIC/S Computerised Systems in GXP Environments (PI-011-3). Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-operation Scheme (PIC/S): Geneva, 2007.
  15. Guide for an Integrated Lifecycle Approach to Analytical Instrument Qualification and System Validation. European Compliance Academy: Heidelberg, Germany, 2023.
  16. FDA Guidance for Industry General Principles of Software Validation. Food and Drug Administration: Rockville, MD, 2002.
  17. Good Automated Manufacturing Practice (GAMP) Guide Version 5. International Society for Pharmaceutical Engineering: Tampa, FL, 2008.
  18. Good Automated Manufacturing Practice (GAMP) Guide 5, Second Edition. International Society of Pharmaceutical Engineering, 2022.
  19. Burgess, C. A Life Cycle Approach to the Calibration and Qualification of Analytical Instruments and Systems to Establish “Fitness for Purpose” for Pharmacopeial Purposes. Pharmacopeial Forum 2020, 46 (4).
  20. PIC/S Recommendations on Validation Master Plan, Installation and Operational Qualification, Non-Sterile Process Validation and Cleaning Validation (PI-006). Pharmaceutical Inspection Convention / Pharmaceutical Co-operation Scheme (PIC/S): Geneva, 2001.
  21. McDowall, R.D. What! I Have to Specify My Spectrometer? Spectroscopy 2018, 33 (4), 12–16.
  22. McDowall, R. D. Simple Spectrometer System, Simple Validation? Spectroscopy 2023, 38 (11), 16–19.
  23. Burgess, C.; McDowall, R. D. An Integrated Risk Assessment for Analytical Instruments and Computerised Laboratory Systems. Spectroscopy 2013, 28 (11), 21–26.
  24. Burgess, C. Laboratory Data Management Guidance, Out of Expectation (OOE) and Out of Trend (OOT) Results V1.2. European Compliance Academy: Heidelberg, 2023.
  25. McDowall, R. D. What Exactly Are Orphan Data? LCGC Europe 2022, 35 (9), 381–387.
  26. WHO Technical Report Series No.996 Annex 5 Guidance on Good Data and Records Management Practices. World Health Organisation: Geneva, 2016.
  27. PIC/S PI-041 Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments. Pharmaceutical Inspection Convention/Pharmaceutical Inspection Cooperation Scheme: Geneva, 2021.
  28. MHRA GXP Data Integrity Guidance and Definitions. Medicines and Healthcare products Regulatory Agency: London, 2018.
  29. EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Chapter 6 Quality Control. European Commission: Brussels, 2014.
  30. Smith, P.; McDowall, R. D. Museum of Analytical Antiquities. Technology Networks 2024. https://www.technologynetworks.com/tn/articles/the-museum-of-analytical-antiquities-392323 (accessed 2024-12-11).

R. D. McDowall is the director of R. D. McDowall Limited and the editor of the “Questions of Quality” column for LCGC International, Spectroscopy’s sister magazine. Direct correspondence to: SpectroscopyEdit@MMHGroup.com ●

About the Co-Author

Mahboubeh Lotfinia works as a Qualified Person and Quality Partner at F. Hoffmann-La Roche and is trained in GMP/GDP audit execution and CSV (Computerized System Validation). ●
