How to review audit trails


The FDA regularly issues warning letters regarding data integrity, and the agency is encouraging companies involved in GMP to get on top of these issues with regular audit trail review. You may have read FDA or EU publications relating to data integrity and audit trails and wondered how they might apply to your organization. This article will demystify audit trail review and provide you with an outline for your organization’s data integrity initiative.

What is an audit trail? What are some other data integrity terms?

The FDA’s draft guidance defines an audit trail as “a secure, computer-generated, time-stamped electronic record that allows for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record. An audit trail is a chronology of the ‘who, what, when, and why’ of a record.”
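As an illustration only (not any particular vendor’s format, and with made-up names), the “who, what, when, and why” of a single audit trail entry can be pictured like this:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: entries are never edited, only appended
class AuditTrailEntry:
    who: str        # user account that performed the action
    what: str       # the action, e.g. a change to a sample name
    when: datetime  # secure, computer-generated timestamp
    why: str        # reason for the change

entry = AuditTrailEntry(
    who="jsmith",
    what="sample name changed from 'LOT-01' to 'LOT-001'",
    when=datetime(2017, 2, 1, 14, 30, tzinfo=timezone.utc),
    why="incorrect entry",
)
```

Marking the record frozen mirrors the “secure” part of the definition: a real system prevents entries from being altered after the fact.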

The usefulness and power of the audit trail can vary quite a bit by system. In an older instrument such as a benchtop cell counter, for example, you may find that there is no audit trail at all: the printout containing the time, date, result, and sample name is the extent of the data produced by the machine.

In the case of a newer analytical instrument such as a mass spectrometer, you will find that the audit trail is exhaustive and essentially unalterable. It will be a dynamic, all-inclusive record that displays information about every user and instrument action, in a human-readable format. It may be searchable by date, user, type of action, etc. It may even be in its own view-only, data manager module for ease of use. It may be exportable as an Excel file or printable as a PDF.

Another instrument may be intermediate in user-friendliness: you can display a page or two of printable, static information surrounding a specific file or sample, but cannot easily search or scroll.

Audit trails are part of a broader concept called metadata. “Metadata is the contextual information required to understand data.” For example, a set of data may be a list of compounds and their concentrations in a sample. The metadata surrounding this data would include the username, time and date, the data path (where the files are automatically saved), any error messages that came up, and any changes to the sample name. As outlined above, the extent of the metadata collected by the system can vary quite a bit.

An even broader concept is data integrity. This term refers to “the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).”

A concrete analogy

To make data integrity a little more concrete, consider how you treat crossouts on a paper record. Most crossouts are justified (mistakes happen). But when someone does make a crossout to update a raw data entry, your organization’s SOPs require them to address it. They do this by drawing a single line through the original entry (leaving it still legible). They then need to provide an explanation (such as “incorrect entry”) along with the identity of the person who did the correction, and the date of the correction.

In this way the date and recorder of the original entry, the date and recorder of the correction, and the reason for the correction are all documented. In addition the original entry remains legible for comparison to the corrected entry.

Data integrity refers to these same controls but in a digital format. And reviewing audit trails helps the Quality unit to review these deletions and changes in the same way they review them on paper.

Everyone in your organization plays a part in ensuring data integrity. Reviewing audit trails is part of that effort.

Where do I start?

Start by looking at the list of all computerized systems in your organization. If you have no such list, make one! A spreadsheet is adequate.

Classify each system by the regulated activity it is used for (GMP, GLP, etc.) and by its criticality. Document who the system administrator is (i.e., who controls access and who sets up and deactivates users). Identify the subject matter expert who can answer your questions (see below). Then sort the systems by criticality. Audit trail review is resource-intensive, so you will want to start with the critical ones.
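A minimal sketch of such an inventory, with entirely hypothetical systems, names, and column headings, might look like this in code rather than in a spreadsheet:

```python
import csv

# Hypothetical inventory entries; the fields are illustrative only.
systems = [
    {"system": "Chromatography data system", "activity": "GMP",
     "criticality": 1, "admin": "J. Smith", "sme": "A. Lee"},
    {"system": "Training database", "activity": "GLP",
     "criticality": 3, "admin": "IT helpdesk", "sme": "C. Jones"},
    {"system": "Stability chamber logger", "activity": "GMP",
     "criticality": 2, "admin": "J. Smith", "sme": "B. Patel"},
]

# Sort so the most critical systems come first (1 = most critical),
# since audit trail review is resource-intensive.
systems.sort(key=lambda row: row["criticality"])

with open("system_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(systems[0]))
    writer.writeheader()
    writer.writerows(systems)
```

The point is not the tooling but the ordering: the review effort follows the criticality column.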

Do a brief, informal review with a regular user of the system. See if you can easily access the audit trail from the main screen of the program or if you need to log into a separate module such as a data manager, security module, or report mode.

Audit trail functionality may have been documented in the validation. Check the validation paperwork for hints on how to access it. Check the manufacturer’s materials as well.

Consider the feasibility and resource-intensiveness of audit trail review for each system. Does reviewing it mean booting someone off the terminal? Is there a limited number of user licenses? All these considerations will influence the requirements you define in your data integrity SOP.

Take notes and write down any questions to address during a more detailed review to follow. Update the spreadsheet you started.

An assessment of each system is needed

This is a more detailed assessment: a member of the Quality unit sits down for about an hour with a subject matter expert in the software or instrument. In advance, you will want to prepare a template Word document with items to address. The idea is to gather accurate information, including screenshots, to help with training and with crafting the SOP. Topics may include the following:

What type of audit trail is it?

• Is it a dynamic event log, capturing every action automatically? Or does it just display the parameters of the individual run?
• Can the reviewer select a date range?
• Can he or she search by user, batch or project number?
• Are there any error messages for the subject matter expert to look into? Are there error messages that qualify as red flags for the reviewer?
• Does the audit trail capture when a user edits a sample name or lot number? When critical parameters are changed? When background scans are re-run? When the results from standards are rejected?
• What do manipulations and deletions look like? Is there a way to compare versions of a file?
• In analytical chemistry, is there an audit trail for acquisition of data, but no audit trail for processing of data? In chromatography, is there a way to compare integrations of peaks? What does excessive integration look like, and is this something the Quality unit should review?
• Compare records from a recent project or run to what you see onscreen in the audit trail. Go step by step and write down some typical audit trail events.

Don’t duplicate the effort of the original validation. But do look at the system through a data integrity mindset.

You may find that routine audit trail review opens new conversations about a system. Although an audit trail cannot be “corrected,” review of the audit trail may point to deficiencies elsewhere in the project’s documentation.

For example, if one person signed the batch record, lab notebook or worksheet, but the audit trail shows a different user (such as a trainer) performing that procedure, then both signatures should be in the documentation.

Another example is retesting or reprocessing: if the audit trail clearly shows a retest, but this was not documented, then this should be addressed in the documentation before Quality unit approval of the record.

For legacy systems, make sure any deficiencies are documented and that alternative data integrity measures are in place

You may look at an older instrument or software and find it is not 21 CFR Part 11 compliant, or not compliant in the way you thought it was. You can add a brief risk assessment to the validation paperwork, with a memo referencing the new audit trail review procedure that is being put in place.

Look to the manufacturer’s compliance software

If the software is older, ask the vendor about updates; they may have released a data security module that you can add to the software. If there is no audit trail functionality, the system may still be acceptable: under the FDA’s Part 11 scope-and-application guidance, the agency exercises enforcement discretion over audit trail requirements, based on a justified and documented risk assessment.

Get access for the QA reviewer for each system

Once this assessment is complete, get the Quality reviewer access to each system.

Some software will have a reviewer (read-only) mode. This is ideal because they will not be able to accidentally delete a file or alter a method. If the Quality reviewer is a standard user, that’s fine too!

Efficiency side note: to avoid duplication of effort, the periodic review can be reduced somewhat in scope.

Although audit trails will now be reviewed routinely, the periodic review is still important because this is where failed login attempts, periodic vendor maintenance, and changes to the overall method are captured. Keep in mind the risk-based approach.

Write the SOP!

Reference information technology SOPs, validation SOPs, ethics and training SOPs, and Quality review SOPs. Make sure it has caveats that address the range of software at your company. This procedure should involve organization-wide training. Having an executive or director champion it would be very valuable. Everyone should know what is expected of them with respect to data integrity.

Revisit this procedure in a year

To follow up, continue reviewing FDA warning letters for the agency’s thinking on data integrity matters. Distribute the pertinent letters to your team. Connect the audit trail reviewers with those involved with equipment/software validations so that audit trails are set up and understood proactively. Even better, ask for the reviewers’ input on the next set of compliance software that the vendor is trying to sell you.

A year after the effective date, revisit this procedure and see how it’s going:

• Is it too impractical? Are reports being delayed because the reviewer can’t get into the system while others are logged on?
• Is system access an issue? Does only one person in the Quality unit have the needed access?
• If you have technical data reviewers and a Quality reviewer, are they duplicating each other’s review? It can be hard to separate a technical review from a procedural review. Perhaps only one group should review.
• Look at the overall value of the project. If you find that reviewing audit trails in the way recommended in the FDA’s draft guidance is not a value-added step, let the FDA know! Now is the time to comment before the draft guidance becomes a requirement.

Lastly, take the long and broad view. Consider audit trail review to be one of many tools in your organization’s data integrity efforts. Keep in mind that other organizations are grappling with these issues as well, and there are no experts out there who have all the answers. You will have to treat data integrity as an ongoing commitment, with every data integrity procedure open to change, optimization and improvement.

If you have had successes, failures or questions in your audit trail efforts, I’d love to hear about them!

[Photo: Mississippi River, St. Paul, February 2017]

Explore further

Data Integrity and Compliance With CGMP Guidance for Industry (draft guidance). The FDA published this in April 2016 as draft guidance. But as we know, you need to get ahead of the guidance!

(https://www.fda.gov/downloads/drugs/guidances/ucm495891.pdf)

 

This FDA slideshow, released a year later, provides some helpful elaboration on the guidance:

(https://www.fda.gov/downloads/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDER/UCM561491.pdf)

 

This article provides a concise summary of the challenges of starting an audit trail review process, and of the importance of a risk-based approach grounded in an assessment of each system. The link opens a PDF:

(http://www.ivtnetwork.com/printpdf/article/audit-trails-reviews-data-integrity)

How to validate a spreadsheet

Over the years, the spreadsheets circulating in a testing or manufacturing company can become a real zoo. This presents a problem for both regulators and clients. It is important that you take a risk-based approach and validate your spreadsheets according to their use.

A spreadsheet used for dosing, reporting data, or in dilutions may be a “legacy document”: a digital file developed years earlier by someone who may no longer even work for you. It may have been developed in a previous version of Excel. Its workings may include macros that the current users only vaguely understand. Or it may have functions and hidden rows that have not been examined in years.

Although configurable off-the-shelf spreadsheet software such as Excel has many good security and data integrity features, you must validate spreadsheets according to a defined procedure. Regulators and clients increasingly expect this. This article will show you how to get started.

Define the three responsibilities involved: developer, verifier, and quality assurance

Developer

The developer is the person who creates the spreadsheet or who initiates the validation of an existing spreadsheet. This person will be familiar with the use of the spreadsheet and with Excel (or whatever spreadsheet software you use).

The developer is responsible for understanding the requirements in the SOP that defines the development, use and control of spreadsheets. He or she will complete a form, ideally a checklist, that refers to each of these requirements. He or she will also document the spreadsheet’s inputs, processing, and outputs for verification (see below).

This is the lengthiest and most involved step of the three. Making it easier is your job. You can help by making sure the checklist is user-friendly and flows nicely with the SOP!

Verifier

Like the developer, the verifier will also be someone who is familiar with Excel and with the context the spreadsheet will be used in. For example, this person may be a technician who uses the spreadsheet every day, a peer of the developer, or an Excel expert within the company.

To save space on the form and to show the verification clearly, you can simply include an additional column where the verifier adds a check mark indicating each requirement was met.

Another term for verifier is simply technical reviewer.

QA

When this verification step is done, QA can serve as the final sign-off. This may include spot-checks, format review, a wiping of personal identifying information and comment history, and administering a final document password that will only be distributed to authorized users.

If problems come up during the spot check, the QA reviewer will take a closer look and possibly reject the document, kicking it back to the developer, who must address it and get the spreadsheet verified again. The QA reviewer should not sign until all issues have been resolved.

When these three people have done their work, you have a validated spreadsheet. And the overall process is called the spreadsheet validation. (You may use different terms than what I have described, of course.)

Break it down into three steps: inputs, processing, and outputs

I recommend this approach because it reminds you to visualize the data flow in three parts. On the left are the inputs: raw data, such as a column of animal weights in grams.

On the right are the outputs, such as the required dose of a test article extract in milliliters.

In the middle is the processing. In our example, the individual animal’s weight is multiplied by a specific factor (according to the protocol or SOP) and then presented in the correct unit. The spreadsheet may also provide the group’s average weight, highlight any animals whose weight is outside a defined range, and present the highest and lowest weights for inclusion in the final report.

What goes on during the processing is like a black box. The spreadsheet validation explains, verifies and documents this processing so that independent reviewers can understand the spreadsheet’s purpose and be assured of the quality of the data that comes out. It brings the black box out into the light!
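To make the three parts concrete, here is the animal-weight example as a sketch; the dose factor, weights, and acceptable range are hypothetical values, not from any protocol:

```python
# Inputs: raw data, e.g. animal weights in grams (hypothetical values).
weights_g = [200.0, 198.0, 210.0, 188.0]

# Processing: a protocol-defined factor and range (both hypothetical here).
DOSE_FACTOR_ML_PER_G = 0.01          # mL of test article extract per gram
ACCEPTABLE_RANGE_G = (190.0, 215.0)  # flag animals outside this range

doses_ml = [round(w * DOSE_FACTOR_ML_PER_G, 2) for w in weights_g]

# Outputs: per-animal doses, group average, extremes, and range flags.
mean_weight_g = sum(weights_g) / len(weights_g)
highest_g, lowest_g = max(weights_g), min(weights_g)
out_of_range = [w for w in weights_g
                if not ACCEPTABLE_RANGE_G[0] <= w <= ACCEPTABLE_RANGE_G[1]]
```

A spreadsheet validation documents exactly this flow, so a reviewer can verify each cell’s formula against the documented inputs, processing, and outputs.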

Define any additional requirements

The above are very general requirements that apply to all spreadsheets. Depending on use, some more specific requirements might include:

  • All cells except those needed to enter data or sample information are locked down. This one is very important!
  • Data validation is enabled. In other words, nonsensical entries are not permitted or will result in an error message.
  • Conditional formatting is appropriate and user-friendly. It may help in the study interpretation or trigger validity issues per protocol, so getting correct conditional formatting is important. Note that conditional formatting rules are not displayed in Excel’s Show Formulas function.
  • Drop-down lists are in the desired order.
  • Rounding, significant figures, and display of digits are appropriate. Only the final value is rounded (no averaging of averages, for example).
  • Input cells display the value exactly as entered. This helps reviewers who must check transcription.
  • There are places to enter sample information. There is a place to enter the user’s name and date (or, better, a function that does this automatically).
  • The spreadsheet has been checked for robustness. Expected and extreme values were tried. You attempted to “break” it.
  • Personal identifying information has been wiped using the “inspect document” function in Office 2016.
  • The print preview was inspected. Saving and copying and pasting data works fine. The font matches the final report table where the results will be copied and pasted, if applicable.
  • Pagination, logos and the form identifier are included and correct.
  • Printouts are included in the validation paperwork with all the formulas, row numbers, and column numbers displayed.
  • This is optional: an additional sheet has been included that helps users understand and use the spreadsheet. (e.g. What raw data do I enter and when? When do I print it? What secure network folder do I save it in?) We like to think every user will read and understand the SOP. But in reality this sheet may be what they rely on!
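The data validation and robustness items above lend themselves to a quick sketch: a guard that rejects nonsensical entries, exercised with expected and extreme values. The plausible range here is hypothetical:

```python
def validate_weight(value):
    """Mimic spreadsheet data validation: reject nonsensical weight entries."""
    try:
        weight = float(value)
    except (TypeError, ValueError):
        raise ValueError(f"not a number: {value!r}")
    if not 0 < weight < 1000:  # hypothetical plausible range, in grams
        raise ValueError(f"weight outside plausible range: {weight}")
    return weight

# Robustness check: try expected values, then try to "break" it.
assert validate_weight("200.5") == 200.5
for bad in ("abc", None, -5, 10**6):
    try:
        validate_weight(bad)
        raise AssertionError(f"accepted nonsensical entry: {bad!r}")
    except ValueError:
        pass  # rejected, as it should be
```

In Excel itself the equivalent control is a Data Validation rule on the input cells; the idea of deliberately feeding in extreme values is the same.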

Create a form where the three responsible groups document their work

There should be a space for free text where the developer can describe the inputs, processing, and outputs. There should be a checklist for any additional requirements, and space to address the ones that are not applicable. There should be a clear separation of the developing from the checking.

The form should be comprehensive in case the developer and verifier do not actually read the spreadsheet validation SOP (we all know this happens). Finally, in exchange for asking busy people to do this validation, you should make sure the form is easy to use!

[Photo: Hawk Ridge, September 11, 2016]

Some further reading

Surprisingly, the FDA provides very little guidance on spreadsheet validation:

(https://www.fda.gov/ScienceResearch/FieldScience/ucm174286.htm)

 

However, the FDA has issued warning letters that cite spreadsheet deficiencies, including spreadsheets whose functions are not locked down:

(https://www.fda.gov/iceci/enforcementactions/warningletters/2013/ucm369409.htm)

 

Two more good resources:

(http://www.cbinet.com/sites/default/files/files/Workshop%20B_Soto_pres.pdf)

(https://learnaboutgmp.com/validation-of-ms-excel-spreadsheets/)

Independence: the importance of separating the checking from the doing

Independence is a prominent theme in the field of Quality. At its heart is the separation of doing and checking.

The example of the Hubble Space Telescope

Why is independence important? Consider the Hubble Space Telescope. It is a major achievement of science. But the first images it returned were found to be blurry! After a costly spacewalk and replacement, the problem was fixed. A fascinating 1990 NASA report pointed to why:

 

“The Perkin-Elmer plan for fabricating the primary mirror placed complete reliance on the reflective null corrector as the only test to be used in both manufacturing and verifying the mirror’s surface with the required precision.”

 

Basically, the instrument used to test the mirror was the same one used to manufacture it. In fact, cruder (but more independent) tests had detected the same error, but these results were disregarded.

 

This report clearly implicates the lack of independent checking. The instrument, although exquisitely precise, contributed to this costly error because it was used for both manufacturing the mirror and for verifying it.

 

The report touches on other Quality concepts as it goes on to say, “the engineering unit responsible for the mirror was insulated from review or technical supervision and were unaware that discrepant data existed, and were subject to great concern about cost and schedule, which further inhibited consideration of independent tests.”

Reasons for independent review

Building a space telescope can seem simple compared to the dynamics of a typical workplace. The following are some more “human-factors” reasons for independent review.

Dumb mistakes

An independent reviewer can catch dumb mistakes. In drafting a scientific report, for example, the writer can become blind to errors that are readily apparent to a reviewer. Each draft is so subtly different from the previous one that the final draft may have drifted, become too wordy, or been dragged along by the author’s interpretation so far that the protocol and predefined endpoints are underemphasized or not addressed at all.

 

The reviewer will also notice more prosaic omissions right before finalization such as a missing company watermark, a missing part of the header, or the wrong formatting.

Context-free understanding

Reviewers in scientific and technical fields can have varying levels of independence from the author. Some may be within the same company but in a different department. Some may be contracted. Some may be peers with the same title as the author but a different specialty.

 

In each of these arrangements, the reviewer has the advantage of looking at the report or product from a perspective that is closer to that of the customer. The reviewer will question confusing technical jargon. The reviewer will avoid “filling in the gaps” with their own knowledge.

 

Most importantly, an independent reviewer will apply a standard (such as a checklist) during their review instead of relying on their expertise, however extensive it may be, in the subject. If this standard is valid (see below) and matches the customer’s requirements and any other pertinent requirements, then problems will be identified well in advance of the product being released to the customer.

The independent reviewer will be more familiar with the requirements

Understanding the customer’s requirements

Members of a well-organized Quality program will regularly review client feedback, client audits, and returned product. Ideally they will also be involved in requirements elicitation, in which sales and design people work out exactly what the customer wants. These customer requirements are then translated into the standard that the product is checked against during the review.

Understanding business requirements

Ideally, scientists and technical personnel involved in testing and interpretation of results will be insulated from customer pressures. This should apply to the reviewer as well. But the reviewer may be better versed in the business environment in which the research, testing or manufacturing is occurring.

Understanding other requirements such as regulations and standards

Although scientists are usually enthusiastic about learning the best techniques and methods, there can be a great amount of drift from best practices. To obtain consistent results in a particular assay, an entire lab may rely on a technique developed 30 or 40 years earlier and defined in a single published paper. Without pressure from a Quality group or other reviewers, they may never undertake updating their methods to current standards.

 

In fact, in many organizations the Quality program includes a regulatory affairs unit. This is because Quality reviewers check the product against the standard. The regulatory affairs unit keeps the correct and current standards on file and available to the scientists. When there is a gap between the current methods and the standards, members of the Quality group document this and ask for justification. If the gap is too large, the physical products, validations, protocols, standard operating procedures, and final reports may be rejected, and not accepted until those gaps are closed.

Applying a validated standard

This idea is crucial and warrants its own article. Suffice it to say that an independent reviewer, because they are approaching someone else’s work, is more likely than the originator/author/producer to use a standard that has been demonstrated to match requirements.

 

I will expand on the role of standards in Quality later.

An independent reviewer is also insulated from customer pressures.

This is not to say they will impose undue delay, rather that they are less likely to be influenced by pressures outside the predetermined standards they use during review.

 

You will find something similar in the editorial policy of a news organization: the sales team, which sells advertising space to businesses such as car dealerships, will not have extensive involvement with the editorial team, which may be very vocal about air pollution, traffic deaths and climate change.

 

If the independent reviewer adheres to clear, written requirements, and the originator is familiar with these requirements, then problems are more likely to be fixed upstream, in advance of the review. Even problems that the originator could have easily hidden from the customer will have been addressed, because of the likelihood that they will be discovered during review.

Suggestions for your organization

Independence is difficult to define and implement. Here are some open-ended questions to ask of your organization.

Is your Quality unit too involved in the nitty-gritty of things?

Do you remain the final check on most processes before release or finalization? When you review something, are things mostly polished and complete? Or does your Quality role involve identifying minute corrections, which then require rework? Keep in mind that the more errors there are in the product submitted to the Quality group, the more likely there is to be further rework and further review handoffs.

 

To fix errors further upstream, at lower cost, consider introducing a technical data review unit that does these checks. QA can implement spot checks and other verifications that the technical data reviewers’ processes are in place and that the defined reviews were done. Consider doing this for the most error-prone processes first. For other processes that are found to be running smoothly, the number of checks and reviews can be reduced.

Are you poised to adapt, wherever the company goes in terms of growth?

You are responsible for timely reviews so you do not want to be a bottleneck at the end of the process. If the Quality review often triggers investigations, rework and holds that delay the release of the product, then it is time to add a technical review upstream.

 

You want to be able to grow with the organization. If your business expands 10% more than predicted in one year, the Quality group should be able to absorb that, and decide later on whether it should expand too. An independent approach to review allows management to control how much the reviewers’ work will grow and change with the growth of testing or production.

Do your Quality people have routine, repetitive work that can be delegated to a technician instead?

An example may be temperature and humidity monitoring, or sampling for bacteria in the water system. If these are routine and repetitive, a technician can quickly be trained to do them. The QA personnel can then sign off on these checks and provide management with assurance that the process is under control. While checking these logs and reports, the Quality group may find they have the time to define new processes for better trending and reporting as well.

Do you have a way to address drift?

Drift must be addressed periodically. There is no tried and true way to do this, but a couple of things may help: periodic reviews, regular looks at industry best practices, and the normal churn that results from new people being hired from other companies and longtime employees departing.

 

That last one – churn – is important. Often a longstanding but flawed practice goes unquestioned until a new employee says, “You know, at my old job we did it this way…”

 

When someone says this, listen!

Some further reading

A costly error partly caused by checking with the same instrument that was used for doing:

The Hubble Space Telescope Optical Systems Failure Report

(https://www.ssl.berkeley.edu/~mlampton/AllenReportHST.pdf)

What is a quality assurance unit?

 

[This article pertains to nonclinical laboratory studies.]

 

Suppose you were just hired in a Quality role (QA specialist, tester, auditor, inspector, reviewer, technical data specialist, etc.). During your orientation and training people bandy about such acronyms and initialisms as the GLPs, the OECD, the QAU, and 21 CFR 58.

 

You probably have questions. Are you supposed to know all these regulations and guidance documents in addition to training on your organization’s protocols and standard operating procedures? Not to mention the underlying science of the testing your company does? What is the history of this kind of role? Where does your role fit in with the overall business and with the science and technology going on around you?

 

First, take a deep breath and remind yourself that you don’t need to figure it out on your own. This article will give you an overview of the QAU from the United States regulatory perspective, the international guidance, and some organizational particulars. Finally, you’ll find links to help you learn further on your own.

US FDA regulations

Take a few minutes to read the Quality Assurance Unit (QAU) section of 21 CFR 58. Do it. It’s several concise paragraphs that are free of jargon:

 

(https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?fr=58.35)

 

What you find is that the QAU is one of three very important groups of personnel defined in the regulations:

 

  • Test facility management
  • The study director
  • The QAU

 

The section on the QAU is the lengthiest of the three.

 

Five things stand out. The QAU:

 

  1. Monitors each study to assure management that facilities, equipment, personnel, methods, practices, records, and controls are in conformance with the regulations in this Part.
  2. Inspects each nonclinical laboratory study at intervals adequate to assure the integrity of the study.
  3. Determines that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation.
  4. Reviews the final study report to assure that the methods and results are true to the standard operating procedures and raw data.
  5. Prepares and signs a statement to be included with the final study report which specifies the dates inspections were made and findings reported to management and to the study director.

 

Scanning the above for keywords, you see such action terms as monitoring, inspecting, reviewing, and reporting, all of which you will do as a Quality professional.

 

Consider this a touchpoint. As you learn and grow in Quality and come to take on decision-making responsibilities, revisit 21 CFR 58 occasionally. Read the Guidance for Industry document (linked below) for questions raised by other professionals and answers provided directly by the FDA.

International guidance

The Organisation for Economic Co-operation and Development (OECD) is not itself a government; it is an intergovernmental organization composed of representatives of member governments.

 

The OECD published its own GLP principles, which align for the most part with the FDA regulations. Foreign governments can hold organizations to these OECD standards. Reading them can give you an international perspective on the sometimes jargon-filled US regulatory environment.

 

Some specific differences between the FDA and OECD GLP requirements regarding the QAU:

  • The OECD principles contain more specific language on how the Quality Assurance Programme should be carried out by individuals who are designated by and directly responsible to management, and who are familiar with the test procedures.
  • They contain more specific language on maintaining clear lines of communication in a multi-site study between the Study Director, Principal Investigator(s), the Quality Assurance Programme, and study personnel.
  • They include language on ensuring that Quality Assurance personnel receive a copy of the study plan and any amendments in a timely manner, and that the Study Director communicates effectively with Quality Assurance personnel as required during the conduct of the study.

Particulars of an organization

How your organization meets these QAU requirements will vary quite a bit.

 

Take the example of “inspecting the study.” You may decide that every study gets inspected at an interval of one week. You might decide that these inspections will occur during critical phases such as preparation or dosing, where mix-ups and mistakes are highly consequential.

 

The reporting of the inspection to the study director and management could be a paper form, or an online tracking system. You may decide that only deviations from the protocol get reported. Or you may inspect the raw data accumulated so far very thoroughly, with the goal of finding and fixing problems as far upstream as possible.

 

After you have gotten comfortable with your procedures, an executive, a Lean project team, or a consultant may suddenly arrive with new ideas and revolutionize the way you do things.

 

Although the particulars may vary, every company is under constant pressure from regulators, clients, and their own leadership to improve their QAU’s effectiveness and continue the upward spiral of quality.

New Quality software and other techniques

Your QAU will likely use third-party software to do its job. Some examples of quality management software include:

  • MasterControl
  • uniPoint Quality Management
  • TrueERP
  • QT9 Quality Management
  • Verse Solutions

 

This list will only grow. You will need to master one or more of these systems at the user level. You may need to train others in your organization to use them. You may need to work closely with the software vendor to get the specific functionality your organization needs. Finally, as you gain decision-making authority, you will need to be a smart buyer and decide which quality management software is worth the price.

What next?

The particulars of the QAU are constantly changing as business and technology change. But the underlying principles of independence, inspection, thorough review, documentation and reporting will remain the same. Some of the GLP regulations we follow go back to 1979. But the software we use to assure quality may have just been invented a year ago.

 

Be prepared to continually question the way your QAU operates and grows with the rest of the organization. As always, if you found this article insightful (or found it to be the opposite), let me know!

Some further reading

Read about the astonishing scientific fraud scandal that led to the development of the US GLP regulations:

(http://planetwaves.net/contents/faking_it.html)

 

FDA 21CFR58, Good Laboratory Practice for Nonclinical Laboratory Studies

(https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?CFRPart=58)

 

FDA Comparison Chart of FDA and EPA Good Laboratory Practice (GLP) Regulations and the OECD Principles of GLP

(https://www.fda.gov/ICECI/EnforcementActions/BioresearchMonitoring/ucm135197.htm)

 

FDA Guidance for Industry Good Laboratory Practices Questions and Answers

(https://www.fda.gov/downloads/ICECI/EnforcementActions/BioresearchMonitoring/ucm133748.pdf)

 

OECD Quality Assurance and GLP guidance

(http://www.oecd.org/officialdocuments/publicdisplaydocumentpdf/?cote=env/jm/mono(99)20&doclanguage=en)

 

Why work in Quality?

Quality is a profession at the heart of our advanced, industrialized, information society. It is also a field that is full of opportunity. Job boards are peppered with postings for quality managers, quality assurance specialists, quality auditors, quality control technicians, and other roles. Although academic coursework in Quality is in its infancy, opportunities in business are abundant.

 

Read on to find out more about the field of Quality. As you read, think about where you and your mix of interests and skills might fit in.

Everyday examples

Reflect for a moment on some everyday things: the marvelous automotive machinery that gets you around town. The safe and effective pharmaceuticals that you take for common or more serious ailments. The flawless email, word processing and photo backup provided by Google free of charge. The item you ordered online (after Amazon suggested it to you) with a click or two that arrived the next morning.

 

These consumer objects are at the customer-facing end of highly complex systems. Underlying it all are quality systems that are built and continuously reinvented in order to meet the customer’s requirements and to deliver delight.

Delight

This last term – delight – might be unexpected. But it provides perspective on the endgame here. In a competitive marketplace, a company gets an advantage by delivering customer delight, by doing it consistently, and by doing it for a lower price than its competitors.

 

Amazon again provides a good example. The online giant has continually improved its shipping times while keeping prices low. It adds new perks to its Prime loyalty program regularly. And it has a customer-focused return and refund policy that continues to earn praise from consumer advocates and commentators.

 

It’s no wonder then that 49% of US households hold an Amazon Prime subscription. Amazon’s astonishing sales and membership numbers will continue to grow as long as the company keeps delighting its customers.

 

Things can go wrong

On the other hand, we have all had negative experiences with these same companies and their peers. I once read with astonishment how a man lost almost all his personal data when his Google (!) cloud backup somehow failed. I have had frustrating and fruitless email exchanges with third-party Amazon sellers when my packages never arrived (Amazon eventually reimbursed me). I received a surprise bill from my dentist when they performed an x-ray I wasn’t due for under my insurance plan. And I have gotten scary letters from at least two large financial companies apologizing for a hacking breach of my personal and account information.

 

Sometimes it’s small things and near-misses that spook you. A slip-up can shake your confidence in a company you normally trust and cause you to permanently take your money elsewhere. I recall with horror a time when I worked in a hospital emergency room and the nurse read and took notes in a patient’s paper chart for several minutes before realizing it was the wrong one. The nurse apologized. The patient expressed his totally justified dismay. In response the nurse said, “Yeah, it happens” and continued in the correct chart. I would not blame the patient if he chose to write a negative Yelp review of the hospital, bring the incident to hospital management, and never use that clinic and hospital system again. After all, what if he had been given drugs or another invasive treatment based on someone else’s chart?

 

Consider some other high-profile quality issues: the Takata airbag deaths and subsequent recalls, the Equifax security breach, the Wells Fargo unauthorized account scandal, etc. These are all quality issues, some of them extremely serious.

 

Clearly, even the big players, with the slick websites, high-tech equipment and highly paid staff, don’t have it completely figured out.

 

Bringing it back to you personally

So how does this relate to you, as a student, a job seeker, a technician, a supervisor, or whatever role you hold?

 

Start by asking yourself if any of what I described above appealed to you. Does a commitment to continuous improvement match your values? Are you interested in eliminating waste (wasted resources, wasted time, and wasted human potential)? Are you interested in systems thinking? Do you consider yourself perceptive, thorough, candid, courageous, and helpful to others?

 

This mix of values, traits and talents is the foundation of a career in Quality.

 

Read through my other articles and contact me with questions and comments.