Data process mapping is an effective way to identify data integrity vulnerabilities and remediate them. A modified example published by the Active Pharmaceutical Ingredients Committee (APIC) is reviewed critically. Can it be simplified?
Regulatory authorities expect laboratories to assess their manual processes and computerized systems to identify and remediate any data integrity vulnerabilities and risks. This approach is mentioned in the Medicines and Healthcare products Regulatory Agency (MHRA) guidance (section 3.4) and the Pharmaceutical Inspection Co-operation Scheme (PIC/S) guidance (section 9.1.6) (1,2). Typically, this assessment can be performed in one of two ways: either with a checklist or by data process mapping.
MHRA guidance goes into specifics (1):
3.4 … An example of a suitable approach is to perform a data integrity risk assessment (DIRA) where the processes that produce data or where data is obtained are mapped out and each of the formats and their controls are identified and the data criticality and inherent risks documented.
What happens if you don’t do a DIRA? The PIC/S Guidance describes data integrity deficiencies in Section 11.2.3 (2) as:
An example of the first is sharing user identities so that an action cannot be attributed to an individual. The second is more interesting: you have not done anything wrong, but there are no controls in place to prevent an analyst from taking a wrong action, such as unrestricted user access to data and the system clock via the operating system. It is the possibility that an analyst could do something that is preventable. Take note! Are you sure you have all necessary controls in place over the data lifecycle?
To undertake these assessments, organizations have published guidance documents (3–9). We review the second version of the APIC Practical Risk-based Guide for Managing Data Integrity Methodology for assessing systems and processes (9). This is a good guide, and, as the name suggests, it is very practical because it provides examples related to good and poor system and process design.
In this column, we do the following:
Before we start, my assumption is that your organization has an inventory of computerized systems and manual processes. Each entry is classified for data criticality (the process, data generated, and potential impact on the patient), which determines the priority order of assessment and remediation. You have an inventory, haven’t you?
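To make the inventory idea concrete, here is a minimal sketch of prioritizing systems for assessment by data criticality. The entry names, criticality labels, and ranking scheme are illustrative assumptions, not from any guidance document:

```python
# Hypothetical sketch: ordering a system inventory for assessment by data
# criticality. Category names and example entries are illustrative only.
from dataclasses import dataclass

CRITICALITY_RANK = {"high": 1, "medium": 2, "low": 3}  # 1 = assess first

@dataclass
class InventoryEntry:
    name: str
    process: str          # what the system or manual process does
    criticality: str      # impact of its data on product quality / the patient

def assessment_order(inventory):
    """Return entries sorted so the most patient-critical are assessed first."""
    return sorted(inventory, key=lambda e: CRITICALITY_RANK[e.criticality])

inventory = [
    InventoryEntry("Analytical balance", "sample weighing", "medium"),
    InventoryEntry("UV-vis spectrometer", "quantitative assay", "high"),
    InventoryEntry("Sample logbook", "manual sample receipt", "low"),
]
for entry in assessment_order(inventory):
    print(entry.name, entry.criticality)
```

However simple, this ordering is the whole point of the classification: it tells you which assessment and remediation to fund first.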
In addition, management must support both business and data mapping assessments and remediation by providing the resources for this work. If they don’t, perhaps you might like to show them my column on understanding the cost of non-compliance (10), or simply ask them if they would look good in orange?
Before delving into the details of data process mapping, consider the alternative method: a checklist to identify data integrity vulnerabilities. A comparison of the two approaches is shown in Table I. Data process mapping is more flexible, and it is also applicable to manual processes, whereas checklists focus only on computerized systems.
Regardless of the methodology, both take time and require trained staff to be involved. Therefore, management must ensure that adequate time is available to perform these assessments and remediate any data integrity vulnerabilities. Please avoid implementing just procedural solutions with analyst training. These are just temporary, error-prone sticking plasters to await automation of a manual process or replacement of a deficient system.
The APIC data process mapping methodology consists of six stages, and is shown in Figure 1:
The guide provides a working example of a QC process in Section 9.2 (9): a quantitative analysis using an analytical balance and an ultraviolet–visible (UV-vis) spectrometer system to calculate the results, which are entered manually into a laboratory information management system (LIMS). It also provides a spreadsheet of completed FMEAs before and after remediation.
The process steps are:
In the APIC process, the LIMS and balances are out of scope of the assessment. I have included them for completeness because you must always map the entire process. If the balance and LIMS were covered in separate process mapping exercises, that work should be cross-referenced in this assessment. I have also modified the APIC example as follows:
We’ll start the review by looking at the APIC classification of processes and systems, which are divided into six categories (Table II). I have edited the description to focus on the laboratory.
Some critical comments:
I have no comments about the remaining categories, except that I would add spreadsheets to category 6.
Section 9.2 of the APIC guide has been modified, as mentioned earlier, to include a transcription check of the result to LIMS, comply with Annex 11 clause 6 (11), and incorporate electronic signatures by the analyst and second person reviewer. The modified workflow is shown in Figure 2. Implicit in the data process mapping is to ensure that applicable ALCOA++ criteria are met (12).
The data process mapping of the analytical process is divided by APIC into four phases (9):
(1) Sample preparation
(2) Instrument set up and analysis
(3) Result calculation and entry into LIMS
(4) Second person review of analytical records.
The overall process is shown in Stage 1 of Figure 2.
What is missing from this description? The balance logbook! This is a specific requirement under Chapter 4.31 (14) and 21 CFR 211.182 (15). Moreover, a logbook is a critical component of data integrity correlating work performed and documenting the sequence of user actions (16). Oops!
Let’s look at the set up and analysis performed using the UV-vis spectrometer and start with some initial checks (stages 2 and 3 in Figure 2):
Not the brightest start to the assessment of this phase of the process.
On the plus side, the instrument setup and use are good, with instrument methods named for easy identification; data files are automatically saved by the system, and there is a procedure for file naming. Printouts are linked to the data file name and associated metadata, such as the method used.
However, the instrument logbook has been omitted from the APIC example— AGAIN! I want to reiterate the importance of this apparently mundane and forgettable document; it allows correlation between instrument printouts, e-records in the data system, and logbook entries. Put simply, if all three are congruent, it is an indication that the correct processes have been followed, and that they are consistent and traceable.
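The congruence test across the three record sources can be sketched in a few lines. The record structure, and matching on (sample ID, run date) pairs, are my assumptions for illustration; in practice you would compare whatever identifiers your logbook and data system share:

```python
# Hypothetical congruence check among the three record sources named above:
# instrument logbook entries, printouts, and e-records in the data system.
# Matching on (sample_id, run_date) tuples is an assumption for illustration.
def congruent(logbook, printouts, e_records):
    """True if all three sources list exactly the same (sample_id, run_date) pairs."""
    return set(logbook) == set(printouts) == set(e_records)

logbook   = {("S-1023", "2024-05-14"), ("S-1024", "2024-05-14")}
printouts = {("S-1023", "2024-05-14"), ("S-1024", "2024-05-14")}
e_records = {("S-1023", "2024-05-14")}  # one run missing from the data system

print(congruent(logbook, printouts, e_records))  # False: investigate the gap
```

Any mismatch is exactly the kind of inconsistency a second person reviewer, or an inspector, will pursue.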
In a deviation from the APIC example, measurements from the UV spectrometer printouts are typed into a validated Excel template, which is good, but it is another hybrid system (stages 4 and 5 in Figure 2). However, the data process mapping identifies the following problems:
The completed spreadsheet file is saved, but no file naming convention or specific storage location is defined. Therefore, a user can save the file on a local or network drive. Be warned that auditors and inspectors search local drives for files with .xlsx and similar file extensions.
The results are printed, but there is no link to the spreadsheet file, which fails the signature/record linking requirement of 21 CFR 11.70 (17) and PIC/S PI-041 section 9.5.2 on correlating hybrid records for completeness (2).
As a result, there is a failure of ALCOA++ criteria because the records are not consistent or traceable (18,19).
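Doing the auditors’ search yourself first is a cheap self-inspection. The sketch below, with an assumed directory layout and extension list, lists spreadsheet files found outside an approved storage location:

```python
# Hypothetical self-inspection sketch: find spreadsheet files saved outside an
# approved storage location, much as an inspector searching local drives would.
# The extension list and directory arguments are assumptions for illustration.
from pathlib import Path

SPREADSHEET_EXTENSIONS = {".xlsx", ".xls", ".xlsm"}

def stray_spreadsheets(search_root: Path, approved_root: Path):
    """Return spreadsheet files under search_root that lie outside approved_root."""
    strays = []
    for path in search_root.rglob("*"):
        if path.is_file() and path.suffix.lower() in SPREADSHEET_EXTENSIONS:
            try:
                path.relative_to(approved_root)  # inside the sanctioned tree: fine
            except ValueError:
                strays.append(path)              # saved somewhere it should not be
    return strays
```

Running this over users’ home directories before an inspection is considerably cheaper than the 483 observation that follows if you don’t.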
The analyst then enters the individual determinations and reportable value into the LIMS, where they are compared with the specifications, and, if within the limits, the analyst e-signs the result.
This phase of the process is both a scientific necessity and regulatory imperative to comply with 21 CFR 211.194(a) (15) and GMP Chapter 6.17vii (20), and it is stage 6 in Figure 2. It consists of a detailed review of all paper and electronic records from sampling to reporting to ensure the work was performed correctly, procedures were followed, results are not selected, data are not falsified, and any deviations are recorded and investigated.
Ideally, each software application should have a function to document audit trail review electronically without the need to print or use paper checklists. However, few laboratory applications have this option. If the audit trail does not have an adequate search function or cannot highlight changes (21), either review on screen or print to PDF.
The system has limited functionality, and only a specific user role called Level 90 can change data. Otherwise, the audit trail just records the functioning of the system.
To be contentious, why do you need to keep the PDF file since the original records are in the audit trail? The printing to PDF can be repeated at any time as the original records remain in the audit trail.
The data process mapping has found several data integrity vulnerabilities or poor practices as collated in Table III. Each one is classified as:
The last one is of interest because it is typically not covered in a checklist, but it is obvious when conducting data process mapping. The business process must be improved in three places to increase business efficiency and eliminate data integrity vulnerabilities:
Incorporation of technical rather than procedural controls is one of the improvements listed for the update of Annex 11 for computerized systems (22).
Now, we must consider what to do. The APIC process flow in Figure 1 shows an FMEA risk assessment should be conducted, BUT before starting down this road, let us take a value-added detour and see what ICH Q9(R1) on Quality Risk Management (23) says.
ICH Q9(R1) for Quality Risk Management has recently been updated; section 5.1, entitled Formality in Quality Risk Management (23), argues that not all situations require a formal risk assessment. The three factors determining such a decision are uncertainty, importance, and complexity.
Let’s see how the three criteria match up with the non-compliances listed in Table III; we’ll take the problem of shared user identities in the UV-vis spectrometer.
The process flow analysis has identified the data integrity vulnerabilities, and these will be documented and remediated in the assessment report. Do we need to complete the FMEA? A resounding NO should be the answer, because it is a total waste of time; you don’t need a crystal ball to know the outcome of the FMEA.
Alternatively, apply common sense. You already know what the problem is and the resolution: why waste time?
Put simply:
YOU ARE OUT OF COMPLIANCE.
FIX IT.
NOW.
Don’t wait for a risk assessment to tell you what you already know. I would argue strongly, based on ICH Q9(R1) above, that the data process mapping report (see Table I) is your risk assessment (23). The remedial action should be:
Take a similar approach with all other DI vulnerabilities, poor practices, and business inefficiencies: Fix them! Redesign the process, and incorporate remediation or eliminate the vulnerabilities.
On the other hand, you can waste time and effort if you have a fetish for risk assessments or if QA cannot think and insists you must complete an FMEA. Think of the fun hours entering into your FMEA spreadsheet the problem and its possible effect, and then allocating random numbers for severity, occurrence, and detectability. Finally, the final cell in the row has a value of 42 and the color turns red. Shock and horror: we are out of compliance!
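For the record, here is the whole of the spreadsheet’s arithmetic. The 1–10 scales and the red threshold of 40 are my illustrative assumptions; the point is how little the calculation adds:

```python
# Sketch of the FMEA arithmetic: risk priority number (RPN) =
# severity x occurrence x detectability, with the cell turning red above a
# threshold. The scoring scales and threshold of 40 are assumptions.

RED_THRESHOLD = 40  # assumed action limit for the spreadsheet's red cell

def risk_priority_number(severity: int, occurrence: int, detectability: int) -> int:
    """Classic FMEA product; each factor is typically scored 1 (best) to 10 (worst)."""
    return severity * occurrence * detectability

def cell_color(rpn: int) -> str:
    return "red" if rpn >= RED_THRESHOLD else "green"

rpn = risk_priority_number(severity=7, occurrence=3, detectability=2)
print(rpn, cell_color(rpn))  # 42 red: out of compliance, which you already knew
```

Three guessed integers multiplied together tell you nothing the data process mapping has not already documented.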
Hold on a minute! What is new? Nothing, apart from a completed and colorful spreadsheet stating the bleeding obvious. What is the point?
My modified approach to data process mapping is shown in Figure 3. It eliminates the FMEA risk assessment entirely and substitutes the data process mapping report. A further modification could be that the data process mapping report just identifies the DI vulnerabilities and links each one to the various CAPA and change requests raised.
This column has conducted a practical review of one data process mapping example and has highlighted some practical changes that were made to improve it. Instead of performing a formal risk assessment, once the data process mapping exercise has highlighted any data vulnerabilities and business inefficiencies, these can be scheduled for remediation.
I would like to thank Mahboubeh Lotfinia for constructive review comments.
(1) MHRA GXP Data Integrity Guidance and Definitions. Medicines and Healthcare products Regulatory Agency, London, United Kingdom, 2018.
(2) PIC/S PI-041 Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments. Pharmaceutical Inspection Convention/Pharmaceutical Inspection Cooperation Scheme, Geneva, Switzerland, 2021.
(3) GAMP Guide: Records and Data Integrity. International Society for Pharmaceutical Engineering, Tampa, Florida, 2017.
(4) GAMP Good Practice Guide: Data Integrity-Key Concepts. International Society for Pharmaceutical Engineering, Tampa, Florida, 2018.
(5) GAMP Good Practice Guide: Data Integrity by Design. International Society for Pharmaceutical Engineering, Tampa, Florida, 2020.
(6) Technical Report 80: Data Integrity Management System for Pharmaceutical Laboratories. Parenteral Drug Association (PDA), Bethesda, Maryland, 2018.
(7) Burgess, C.; et al. GMP, GCP and GDP Data Integrity and Data Governance, 3rd Edition. European Compliance Academy: Heidelberg, Germany, 2022.
(8) Practical Risk-based Guide for Managing Data Integrity, version 1. Active Pharmaceutical Ingredients Committee (APIC), 2019.
(9) Practical Risk-based Guide for Managing Data Integrity, version 2. Active Pharmaceutical Ingredients Committee (APIC), 2022. https://apic.cefic.org/publication/practical-risk-based-guide-for-managing-data-integrity/.
(10) McDowall, R. D. Do You Really Understand the Cost of Noncompliance? Spectroscopy 2020, 35 (11), 13–22.
(11) EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Annex 11 Computerised Systems. European Commission: Brussels, Belgium, 2011.
(12) McDowall, R. D. Is Traceability the Glue for ALCOA, ALCOA+, or ALCOA++? Spectroscopy 2022, 37 (4), 13–19. DOI: 10.56530/spectroscopy.up8185n1
(13) FDA 483 Observations Jiangsu Hengrui Pharmaceuticals Co. Ltd. 2024, (accessed 2024-06-22). https://www.fda.gov/media/179134/download.
(14) EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Chapter 4 Documentation, E. Commission, Editor. Brussels, Belgium, 2011.
(15) 21 CFR 211 Current Good Manufacturing Practice for Finished Pharmaceuticals. Food and Drug Administration, Silver Spring, Maryland, 2008.
(16) McDowall, R. D. The Humble Instrument Log Book. Spectroscopy 2017, 32 (12), 8–12.
(17) 21 CFR Part 11; Electronic Records; Electronic Signatures Final Rule. Federal Register 1997, 62 (54), 13430–13466.
(18) McDowall, R. D. Is Traceability the Glue for ALCOA, ALCOA+ or ALCOA++? Spectroscopy 2022, 37 (4), 13–19. DOI: 10.56530/spectroscopy.up8185n1
(19) EMA Guideline on Computerised Systems and Electronic Data in Clinical Trials. European Medicines Agency: Amsterdam, The Netherlands, 2023.
(20) EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Chapter 6 Quality Control. European Commission: Brussels, Belgium, 2014.
(21) McDowall, R. D. Why is My Application’s Audit Trail Rubbish? Spectroscopy 2017, 32 (11), 24–27.
(22) Concept Paper on the Revision of Annex 11 of the Guidelines on Good Manufacturing Practice for Medicinal Products–Computerised Systems. 2022. https://www.ema.europa.eu/en/documents/regulatory-procedural-guideline/concept-paper-revision-annex-11-guidelines-good-manufacturing-practice-medicinal-products_en.pdf.
(23) ICH Q9(R1) Quality Risk Management. International Council for Harmonisation: Geneva, Switzerland, 2023.
R. D. McDowall is the director of R. D. McDowall Limited and the editor of the “Questions of Quality” column for LCGC International, Spectroscopy’s sister magazine. Direct correspondence to: SpectroscopyEdit@MMHGroup.com ●