The new version of United States Pharmacopeia general chapter “Analytical Instrument Qualification” became effective August 1, 2017. What does this mean for you?
United States Pharmacopeia general chapter <1058> “Analytical Instrument Qualification” has been updated and became effective August 1, 2017. So, what has changed in the new version?
In the regulated world of good manufacturing practice (GMP), we have regulations that define what should be done, but leave it to the individual organization to interpret how to do it. However, in the regulated analytical or quality control (QC) laboratory we also have the pharmacopoeias, such as the European Pharmacopoeia (EP), Japanese Pharmacopoeia (JP), and United States Pharmacopeia (USP), to provide further information to help interpret the regulations. These tomes contain monographs for active pharmaceutical ingredients and finished products, as well as general chapters that provide requirements for how to apply various analytical techniques, such as spectroscopy.
Of the major pharmacopoeias, only the USP has a general chapter on analytical instrument qualification (AIQ) (1). This chapter originated from a 2003 conference on analytical instrument validation organized by the American Association of Pharmaceutical Scientists (AAPS). The first decision of the conference was that the name was wrong and should be changed to analytical instrument qualification. The conference resulted in a white paper (2) that, after review and revision, became USP general chapter <1058> on AIQ, effective in 2008. This chapter described a data quality triangle, general principles of instrument qualification, and a general risk classification of analytical equipment, instruments, and systems. General chapter <1058> did not specify any operating parameters or acceptance limits because those can be found in the specific general chapters for analytical techniques.
This column explains the changes that come with the new version and the impact they will have on the way you qualify instruments and validate the associated software.
The simplest answer to the question of why qualification matters is that it assures you that the instrument is functioning correctly and that you can trust the results it produces when it is used to analyze samples.
However, there is a more important reason in today's world of data integrity: integrated instrument qualification and computer validation form an essential component of a data integrity model. The complete four-layer model can be viewed in my recent book, Validation of Chromatography Data Systems (3), but the analytical portion of the model is shown in Figure 1 and is described in more detail in an earlier "Focus on Quality" column (4).
The model works from the foundation up, with each of the four layers providing input to the one above. As shown in Figure 1, AIQ and computerized system validation (CSV) sit above the foundation layer, illustrating that if the instrument is not qualified and any software is not validated, the two layers above (method validation and sample analysis) will not be effective and data integrity can be compromised. Interestingly, AIQ and CSV are missing from many of the data integrity guidance documents, but you can see their importance in the overall data integrity framework of Figure 1.
Figure 1: The four layers of the data integrity model for laboratories within a pharmaceutical quality system. Adapted with permission from reference 4.
The 2008 version of USP <1058> had several issues that I presented during a session co-organized by Paul Smith at an AAPS conference in 2010 and published in this column (5). There were three main problems with the first version:
Although the approach to handling embedded software in Group B instruments, where the firmware is implicitly or indirectly validated during instrument qualification, is sound, there are omissions. Users need to be aware that both calculations and user-defined programs must be verified to comply with the GMP requirements in 21 CFR 211.68(b) (6). Note that the qualification of firmware, which is a simple and practical approach, is now inconsistent with Good Automated Manufacturing Practice (GAMP) 5, which has dropped Category 2 (firmware).
Software for Group C systems is the weakest area in the whole chapter <1058>. The responsibility for software validation is dumped on the supplier: “The manufacturer should perform DQ, validate this software, and provide users with a summary of validation. At the user site, holistic qualification, which involves the entire instrument and software system, is more efficient than modular validation of the software alone.”
In these days of data integrity, this approach is completely untenable. The United States Food and Drug Administration (FDA) guidance on software validation (7), quoted by <1058>, was written for medical device software, which, unlike much of the laboratory software used today, is not configured.
To try to rectify some of these issues, the revision of USP <1058> started in 2012 with a "Stimuli to the Revision Process" article published in Pharmacopeial Forum, written by Chris Burgess and myself (8). This article proposed two items:
We used the feedback from that article to draft a new version of USP <1058> in the summer of 2013, which was circulated to a few individuals in industry and suppliers for review before submission to the USP.
Proposed drafts of the new version were published for public comment in Pharmacopeial Forum in 2015 (9) and 2016 (10), and comments were incorporated into the updated versions. The approved final version of USP <1058> was published in January 2017 in the First Supplement to USP 40 (11). The chapter became effective on August 1, 2017. An erratum was published in February, but the only change was to reference the operational qualification (OQ) testing to the definition of intended use (the user requirements specification [URS]).
First let us look at the overall scope of changes between the old and new versions of USP <1058> as shown in Table I.
Missing in Action
The following items were omitted from the new version of <1058>:
Of greater interest to readers will be the changes and additions to the new general chapter; again, these can be seen in Table I. Below, I discuss the following three areas, which reflect the main changes to the general chapter:
Roles and Responsibilities
The USP <1058> update to the “Roles and Responsibilities” section makes users ultimately responsible for specifying their needs, ensuring that a selected instrument meets them and that data quality and integrity are maintained (11). The manufacturer section now includes suppliers, service agents, and consultants to reflect the real world of instrument qualification. One new responsibility is for the supplier or manufacturer to develop meaningful specifications for users to compare with their needs. Incumbent on both users and suppliers is the need to understand and state, respectively, the conditions under which specifications are measured to ensure that laboratory requirements can be met. We will discuss this further under the 4Qs model in the next section.
Finally, there is a requirement for a technical agreement between users and suppliers for the support and maintenance of any Group B instrument or Group C system. The agreement may take the form of a contract; both parties need to understand its contents and their respective responsibilities.
An Updated 4Qs Model
At first sight, the new version of USP <1058> uses the same 4Qs model as the 2008 version. Yes . . . but there are some significant differences. Look at Figure 2, which presents the 4Qs model in the form of a V model rather than a linear flow. This figure was also published in a "Questions of Quality" column authored by Paul Smith and myself (14). However, Figure 2 has now been updated to reflect the changes in the new version of USP <1058>. Look at the green-shaded boxes to see the main changes:
Figure 2: Modified 4Qs model for analytical instrument qualification. Adapted with permission from reference 14.
Design Qualification
Design qualification now has two phases associated with it.
These two sections are where most laboratories get it wrong, for reasons such as "we know what we want, so why bother to document it?" or "we believed the supplier's literature." This is where most qualifications fail, because there is no specification upon which to base the testing in the OQ phase of the process, as shown in Figure 2. Executing an OQ without a corresponding URS or design document is planning to fail the qualification. This is one of the major changes in the new version of USP <1058>.
Installation Qualification
In the new version of <1058>, installation qualification now includes
Operational Qualification
Operational qualification has also been extended to include
Software Validation Changes
The major changes to this general chapter occur in the section on software validation. They are shown diagrammatically in Figure 2. Because the instrument examples have been removed in the new version of <1058> and replaced with the need to determine the group based on intended use, a formal risk assessment now needs to be performed and documented. A risk assessment based on Figure 3, which classifies instruments by their intended use, has been published by Burgess and myself (15); it follows the updated classification used in the new version of USP <1058> (11). As can be seen from Figure 2, the risk assessment should be conducted at the start of the process, in the DQ phase of work, because its outcome can influence the extent of work in the OQ phase.
Figure 3: Software validation and verification options with the new USP <1058>. Adapted with permission from reference 3.
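The risk assessment described above can be sketched as a simple decision function. This is a minimal illustration under my own assumptions, not the chapter's wording: the questions, their order, and the parameter names are invented for the sketch; only the group labels A, B, and C come from USP <1058>.

```python
# Hypothetical sketch of a documented risk assessment that assigns a
# USP <1058> group based on intended use. The decision questions are
# illustrative assumptions, not the chapter's text.

def classify_instrument(makes_measurement: bool,
                        has_embedded_firmware: bool,
                        has_separate_software: bool) -> str:
    """Return the USP <1058> group (A, B, or C) for an item of equipment."""
    if not makes_measurement:
        return "A"  # standard equipment, e.g., a magnetic stirrer
    if has_separate_software:
        return "C"  # instrument controlled by separate software
    if has_embedded_firmware:
        return "B"  # firmware-controlled instrument, e.g., a pH meter
    return "B"

# A stirrer makes no measurement:
print(classify_instrument(False, False, False))  # -> A
# A chromatograph run from a separate data system:
print(classify_instrument(True, True, True))     # -> C
```

In practice the decision and its rationale would be recorded in the DQ documentation, not in code; the sketch only shows that the classification is a small, repeatable decision tree once the intended use is stated.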
Rather than classify an item as either Group B or Group C, there is now more granularity, with three suboptions in each of these two groups. This increased granularity gives laboratories more flexibility in their qualification and validation approaches, and also fills the gaps left by the first version of <1058>.
Group B instruments now simply require qualification of the instrument plus either verification of any embedded calculations, if used, or the specification, building, and testing of any user-defined programs.
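Verifying an embedded calculation in a Group B instrument essentially means recomputing the result independently and comparing it with what the firmware reports, within a documented tolerance. A minimal sketch, assuming a hypothetical balance that displays the mean and %RSD of replicate weighings (the function name, tolerance, and values are all invented for illustration):

```python
import statistics

def verify_rsd(replicates, firmware_mean, firmware_rsd, tol=0.01):
    """Return True if the firmware-reported mean and %RSD agree with an
    independent calculation to within the stated tolerance."""
    mean = statistics.mean(replicates)
    # %RSD using the sample standard deviation
    rsd = 100.0 * statistics.stdev(replicates) / mean
    return abs(mean - firmware_mean) <= tol and abs(rsd - firmware_rsd) <= tol

weights = [10.01, 10.02, 9.99, 10.00, 10.03]
# The values below stand in for what a hypothetical firmware display shows:
print(verify_rsd(weights, firmware_mean=10.01, firmware_rsd=0.158))  # True
```

The same pattern applies to any embedded calculation: state the formula, recompute independently, and record the comparison as objective evidence of verification.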
For Group C systems, the new USP <1058> divides software into three types:
As can be seen from Figure 3, these three subtypes can be mapped to GAMP software categories 3, 4, and 4 plus category 5 modules. These changes align USP <1058> more closely, but not identically, with GAMP 5. The main difference is how firmware in Group B instruments is handled: validated directly under GAMP 5, or indirectly, when qualifying the instrument, under USP <1058>. A mapping of GAMP 5 software categories to the new USP <1058> groups has been published (16) for readers who want more information on harmonizing <1058> and GAMP approaches. The chapter is much improved and closer in approach, but not quite there yet!
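The subtype-to-category mapping above can be captured as a small lookup table. Note that the subtype labels here are my assumptions, back-formed from the GAMP category definitions they map to; only the category numbers come from the text:

```python
# Illustrative mapping of the three Group C software subtypes to GAMP 5
# software categories. The left-hand labels are assumptions for this
# sketch; the right-hand categories follow the text and Figure 3.
GROUP_C_TO_GAMP = {
    "non-configurable software": "Category 3",
    "configurable software": "Category 4",
    "configurable software with custom modules": "Category 4 plus Category 5 modules",
}

for subtype, category in GROUP_C_TO_GAMP.items():
    print(f"{subtype} -> {category}")
```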
However, the bottom line is that software validation of Group C systems under the new USP <1058> should be the same as any GxP system following GAMP 5. One item that is not mentioned in the new <1058> is a traceability matrix. For Group B instruments, it will be self-evident that the operating range of a single parameter will be tested in the OQ. However, this process changes with Group C systems, especially because software and networking are involved; a traceability matrix will be mandatory.
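Because a traceability matrix is, in my view, mandatory for Group C systems, here is a minimal sketch of what one does: each URS requirement is linked to the test(s) that verify it, and any untraced requirement is flagged as a coverage gap. All requirement and test identifiers below are invented for illustration.

```python
# Minimal traceability-matrix sketch for a hypothetical Group C system.
# Requirement IDs and test names are invented examples.
requirements = [
    "URS-01 injection precision",
    "URS-02 gradient accuracy",
    "URS-03 audit trail enabled",
]

trace = {
    "URS-01 injection precision": ["OQ-07 injector precision test"],
    "URS-02 gradient accuracy": ["OQ-11 gradient step test"],
    # URS-03 has no linked test yet -> a coverage gap
}

# Flag every requirement with no linked verifying test
untraced = [req for req in requirements if not trace.get(req)]
print("untraced requirements:", untraced)
```

In a real validation, the matrix would live in the validation documentation (or the requirements tool), but the check is the same: every requirement must trace forward to at least one test before the system is released.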
This column has highlighted the main changes in the new version of USP <1058> on analytical instrument qualification that became effective on August 1, 2017. We then discussed three of the main changes: roles and responsibilities, changes to the 4Qs model, and the much-improved approach to software validation for Group C systems.
In general, the USP is moving toward full life cycle processes. Now that the new version of <1058> is effective, it is likely that a new revision cycle will be initiated. If this occurs, a full life cycle will be the centerpiece of that revision.
I would like to thank Chris Burgess, Mark Newton, Kevin Roberson, Paul Smith, and Lorrie Schuessler for their helpful review comments in preparing this column.
R.D. McDowall is the Principal of McDowall Consulting and the director of R.D. McDowall Limited, as well as the editor of the “Questions of Quality” column for LCGC Europe, Spectroscopy’s sister magazine. Direct correspondence to: SpectroscopyEdit@UBM.com