How to validate a spreadsheet

Over the years, the spreadsheets circulating in a testing or manufacturing company can become a real zoo. This presents a problem for regulators and clients alike. It is important to take a risk-based approach and validate your spreadsheets according to their use.

A spreadsheet used for dosing, dilutions or data reporting may be a “legacy document”: a digital file developed years earlier by someone who may no longer be employed with you. It may have been built in a previous version of Excel. Its workings may include macros that the current users only vaguely understand, or functions and hidden rows that have not been examined in years.

Although configurable off-the-shelf spreadsheet software such as Excel has many good security and data integrity features, you must validate spreadsheets according to a defined procedure. Regulators and clients increasingly expect this. This article will show you how to get started.

Define the three responsibilities involved: developer, verifier, and quality assurance

Developer

The developer is the person who creates the spreadsheet or who initiates the validation of an existing spreadsheet. This person will be familiar with the use of the spreadsheet and with Excel (or whatever spreadsheet software you use).

The developer is responsible for understanding the requirements in the SOP that defines the development, use and control of spreadsheets. He or she will complete a form, ideally a checklist, that refers to each of these requirements. He or she will also document the spreadsheet’s inputs, processing, and outputs for verification (see below).

This is the lengthiest and most involved step of the three. Making it easier is your job. You can help by making sure the checklist is user-friendly and flows nicely with the SOP!

Verifier

Like the developer, the verifier will also be someone who is familiar with the context the spreadsheet will be used in and with Excel. For example, this person may be a technician who uses the spreadsheet every day, a peer of the developer, or an Excel expert within the company.

To save space on the form and to show the verification clearly, you can simply include an additional column where the verifier adds a check mark indicating each requirement was met.

Another term for verifier is simply technical reviewer.

QA

When this verification step is done, QA can serve as the final sign-off. This may include spot checks, a format review, removal of personal identifying information and comment history, and application of a final document password that is distributed only to authorized users.

If problems come up during the spot check, the QA reviewer will take a closer look and possibly reject the document, kicking it back to the developer, who must address the issues and have the spreadsheet verified again. The QA reviewer should not sign until all issues have been resolved.

When these three people have done their work, you have a validated spreadsheet, and the overall process is the spreadsheet validation. (Your organization may use different terms than what I have described, of course.)

Break it down into three steps: inputs, processing, and outputs

I recommend this approach because it reminds you to visualize the data flow in three parts. On the left is raw data such as a column of animal weights in grams.

On the right are the outputs, such as the required dose of a test article extract in milliliters.

In the middle is the processing. In our example, the individual animal’s weight is multiplied by a specific factor (according to the protocol or SOP) and then presented in the correct unit. The spreadsheet may also provide the group’s average weight, highlight any animals whose weight is outside a defined range, and present the highest and lowest weights for inclusion in the final report.

What goes on during the processing is like a black box. The spreadsheet validation explains, verifies and documents this processing so that independent reviewers can understand the spreadsheet’s purpose and be assured of the quality of the data that comes out. It brings the black box out into the light!
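
To make the black box concrete, here is a minimal sketch in Python of the processing described above. The dose factor and acceptable weight range are hypothetical placeholders; your protocol or SOP defines the real values. The point is simply that every step the spreadsheet performs can be written down, checked, and documented.

    # Minimal sketch of the inputs-processing-outputs flow described above.
    # The dose factor and weight range are hypothetical placeholders;
    # the governing protocol or SOP defines the real values.

    DOSE_FACTOR_ML_PER_G = 0.01   # hypothetical: mL of test article extract per gram of body weight
    WEIGHT_RANGE_G = (250, 400)   # hypothetical acceptable weight range, in grams

    def dose_ml(weight_g: float) -> float:
        """Return the dose in milliliters; only the final displayed value is rounded."""
        return round(weight_g * DOSE_FACTOR_ML_PER_G, 2)

    def summarize(weights_g: list[float]) -> dict:
        """Group statistics and out-of-range flags for the final report."""
        low, high = WEIGHT_RANGE_G
        return {
            "mean_weight_g": sum(weights_g) / len(weights_g),
            "min_weight_g": min(weights_g),
            "max_weight_g": max(weights_g),
            "out_of_range": [w for w in weights_g if not low <= w <= high],
        }

    weights = [312.0, 298.5, 405.2, 330.1]   # inputs: the raw data on the left
    doses = [dose_ml(w) for w in weights]    # outputs: the doses on the right
    print(doses)                             # [3.12, 2.98, 4.05, 3.3]
    print(summarize(weights))                # 405.2 g is flagged as out of range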

Define any additional requirements

The above are very general requirements that apply to all spreadsheets. Depending on use, some more specific requirements might include:

  • All cells except those needed to enter data or sample information are locked down. This one is very important!
  • Data validation is enabled. In other words, nonsensical entries are not permitted or will result in an error message. (A minimal sketch of both of these settings follows this list.)
  • Conditional formatting is appropriate and user-friendly. It may help in the study interpretation or trigger validity issues per protocol, so getting correct conditional formatting is important. Note that conditional formatting rules are not displayed in Excel’s Show Formulas function.
  • Drop-down lists are in the desired order.
  • Rounding, significant figures, and the display of digits are appropriate. Only the final value is rounded (there is no averaging of averages, for example).
  • Input cells display the value exactly as entered. This helps reviewers who must check transcription.
  • There are places to enter sample information. There is a place to enter the user’s name and date (or, better, a function that does this automatically).
  • The spreadsheet has been checked for robustness. Expected and extreme values were tried. You attempted to “break” it.
  • Personal identifying information has been wiped using the “inspect document” function in Office 2016.
  • The print preview was inspected. Saving and copying and pasting data works fine. The font matches the final report table where the results will be copied and pasted, if applicable.
  • Pagination, logos and the form identifier are included and correct.
  • Printouts are included in the validation paperwork with all formulas and the row and column headings displayed.
  • This is optional: an additional sheet has been included that helps users understand and use the spreadsheet. (e.g. What raw data do I enter and when? When do I print it? What secure network folder do I save it in?) We like to think every user will read and understand the SOP. But in reality this sheet may be what they rely on!
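
If you build or check your templates programmatically, the first two items above can even be set up (or verified) in code. Below is a minimal sketch using the third-party openpyxl library; the cell ranges, weight limits, messages, and password are hypothetical placeholders, and a template built by hand in Excel accomplishes the same thing.

    # Minimal sketch: lock every cell except the data-entry range and add
    # data validation, using the third-party openpyxl library.
    # The cell range, limits, messages, and password are hypothetical placeholders.
    from openpyxl import Workbook
    from openpyxl.styles import Protection
    from openpyxl.worksheet.datavalidation import DataValidation

    wb = Workbook()
    ws = wb.active
    ws["A1"] = "Animal ID"
    ws["B1"] = "Weight (g)"

    # Reject nonsensical entries in the weight column.
    dv = DataValidation(type="decimal", operator="between",
                        formula1="250", formula2="400",
                        showErrorMessage=True)
    dv.error = "Weight must be between 250 and 400 g"
    ws.add_data_validation(dv)
    dv.add("B2:B21")

    # Cells are locked by default; unlock only the data-entry cells, then
    # protect the sheet so every other cell (formulas included) stays locked.
    for row in ws["A2:B21"]:
        for cell in row:
            cell.protection = Protection(locked=False)
    ws.protection.sheet = True
    ws.protection.password = "change-me"   # placeholder; manage real passwords per your SOP

    wb.save("dosing_template.xlsx")

Whether the settings are made in code or by hand, the verifier can confirm them in Excel itself (sheet protection under the Review tab, validation rules under the Data tab) and record the check on the form.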

Create a form where the three responsible groups document their work

There should be a space for free text where the developer can describe the inputs, processing, and outputs. There should be a checklist for any additional requirements, and space to address the ones that are not applicable. There should be a clear separation of the developing from the checking.

The form should be comprehensive in case the developer and verifier do not actually read the spreadsheet validation SOP (we all know this happens). Finally, in exchange for asking busy people to do this validation, you should make sure the form is easy to use!


Some further reading

Surprisingly, the FDA provides very little guidance on spreadsheet validation:

(https://www.fda.gov/ScienceResearch/FieldScience/ucm174286.htm)

 

However, the FDA has issued warning letters that cite spreadsheet deficiencies, including spreadsheets whose functions are not locked down:

(https://www.fda.gov/iceci/enforcementactions/warningletters/2013/ucm369409.htm)

 

Two more good resources:

(http://www.cbinet.com/sites/default/files/files/Workshop%20B_Soto_pres.pdf)

(https://learnaboutgmp.com/validation-of-ms-excel-spreadsheets/)

Independence: the importance of separating the checking from the doing

Independence is a prominent theme in the field of Quality. At its heart is the separation of doing and checking.

The example of the Hubble Space Telescope

Why is independence important? Consider the Hubble Space Telescope. It is a major achievement of science. But the first images it returned were found to be blurry! After a costly spacewalk and replacement, the problem was fixed. A fascinating 1990 NASA report pointed to why:

 

“The Perkin-Elmer plan for fabricating the primary mirror placed complete reliance on the reflective null corrector as the only test to be used in both manufacturing and verifying the mirror’s surface with the required precision.”

 

Basically, the instrument used to test the mirror was the same one used to manufacture it. In fact, cruder (but more independent) tests had detected the same error, but these results were disregarded.

 

This report clearly implicates the lack of independent checking. The instrument, although exquisitely precise, contributed to this costly error because it was used for both manufacturing the mirror and for verifying it.

 

The report touches on other Quality concepts as it goes on to say, “the engineering unit responsible for the mirror was insulated from review or technical supervision and were unaware that discrepant data existed, and were subject to great concern about cost and schedule, which further inhibited consideration of independent tests.”

Reasons for independent review

Building a space telescope can seem simple compared to the dynamics of a typical workplace. The following are some more “human-factors” reasons for independent review.

Dumb mistakes

An independent reviewer can catch dumb mistakes. In drafting a scientific report, for example, the writer can become blind to errors that are readily apparent to a reviewer. Each draft differs only subtly from the previous one, so the final draft may have drifted: it may have become too wordy, or been pulled so far along by the author’s interpretation that the protocol and predefined endpoints are underemphasized or not addressed at all.

 

The reviewer will also notice more prosaic omissions right before finalization, such as a missing company watermark, an incomplete header, or incorrect formatting.

Context-free understanding

Reviewers in scientific and technical fields can have varying levels of independence from the author. Some may be within the same company but in a different department. Some may be contracted. Some may be peers with the same title as the author but a different specialty.

 

In each of these arrangements, the reviewer has the advantage of looking at the report or product from a perspective that is closer to that of the customer. The reviewer will question confusing technical jargon. The reviewer will avoid “filling in the gaps” with their own knowledge.

 

Most importantly, an independent reviewer will apply a standard (such as a checklist) during their review instead of relying on their subject-matter expertise, however extensive it may be. If this standard is valid (see below) and matches the customer’s requirements and any other pertinent requirements, then problems will be identified well in advance of the product being released to the customer.

The independent reviewer will be more familiar with the requirements

Understanding the customer’s requirements

Members of a well-organized Quality program will regularly review client feedback, client audits, and returned product. Ideally, they will be involved in requirements elicitation, working alongside the sales and design people who elicit and define exactly what the customer wants. These customer requirements are then translated into the standard that the product is checked against during the review.

Understanding business requirements

Ideally, scientists and technical personnel involved in testing and interpretation of results will be insulated from customer pressures. This should apply to the reviewer as well. But the reviewer may be better versed in the business environment in which the research, testing or manufacturing is occurring.

Understanding other requirements such as regulations and standards

Although scientists are usually enthusiastic about learning the best techniques and methods, there can be a great deal of drift from best practices. To obtain consistent results in a particular assay, an entire lab may rely on a technique developed 30 or 40 years earlier and defined in a single published paper. Without pressure from a Quality group or other reviewers, they may never undertake updating their methods to current standards.

 

In fact, in many organizations the Quality program includes a regulatory affairs unit. This is because Quality reviewers check the product against the standard. The regulatory affairs unit keeps the correct and current standards on file and available to the scientists. When there is a gap between the current methods and the standards, members of the Quality group document this and ask for justification. If the gap is too large, the physical products, validations, protocols, standard operating procedures, and final reports may be rejected, and not accepted until those gaps are closed.

Applying a validated standard

This idea is crucial and warrants its own article. Suffice it to say that an independent reviewer, because they are approaching someone else’s work, is more likely than the originator/author/producer to use a standard that has been demonstrated to match requirements.

 

I will expand on the role of standards in Quality later.

An independent reviewer is also insulated from customer pressures.

This is not to say they will impose undue delay; rather, they are less likely to be influenced by pressures outside the predetermined standards they use during review.

 

You will find something similar in the editorial policy of a news organization: the sales team, which sells advertising space to businesses such as car dealerships, will not have extensive involvement with the editorial team, which may be very vocal about air pollution, traffic deaths and climate change.

 

If the independent reviewer adheres to clear, written requirements, and the originator is familiar with these requirements, then problems are more likely to be fixed upstream, in advance of the review. Even problems that the originator could have easily hidden from the customer will have been addressed, because of the likelihood that they will be discovered during review.

Suggestions for your organization

Independence is difficult to define and implement. Here are some open-ended questions to ask of your organization.

Is your Quality unit too involved in the nitty-gritty of things?

Do you remain the final check on most processes before release or finalization? When you review something, are things mostly polished and complete? Or does your Quality role involve identifying minute corrections, which then require rework? Keep in mind that the more errors there are in the product submitted to the Quality group, the more rework and review handoffs there will be.

 

To fix errors further upstream, at lower cost, consider introducing a technical data review unit that does these checks. QA can then use spot checks and other verifications to confirm that the technical data reviewers’ processes are in place and that the defined reviews were done. Consider doing this for the most error-prone processes first. For other processes that are found to be running smoothly, the number of checks and reviews can be reduced.

Are you poised to adapt, wherever the company goes in terms of growth?

You are responsible for timely reviews, so you do not want to be a bottleneck at the end of the process. If the Quality review often triggers investigations, rework and holds that delay the release of the product, then it is time to add a technical review upstream.

 

You want to be able to grow with the organization. If your business expands 10% more than predicted in one year, the Quality group should be able to absorb that, and decide later on whether it should expand too. An independent approach to review allows management to control how much the reviewers’ work will grow and change with the growth of testing or production.

Do your Quality people have routine, repetitive work that can be delegated to a technician instead?

An example may be temperature and humidity monitoring, or sampling for bacteria in the water system. If these are routine and repetitive, a technician can quickly be trained to do them. The QA personnel can then sign off on these checks and provide management with assurance that the process is under control. While checking these logs and reports, the Quality group may find they have the time to define new processes for better trending and reporting as well.
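
As a small illustration of what better trending and reporting can look like, here is a minimal sketch that flags out-of-range readings in a temperature log. The acceptance limits, file name, and column names are hypothetical placeholders; your monitoring system and SOP define the real ones.

    # Minimal sketch: flag temperature excursions in a monitoring log.
    # The limits, file name, and CSV column names are hypothetical placeholders.
    import csv

    LIMITS_C = (15.0, 25.0)   # hypothetical acceptable room-temperature range

    def excursions(csv_path: str) -> list[dict]:
        """Return the log rows whose temperature falls outside the defined limits."""
        low, high = LIMITS_C
        flagged = []
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):   # expects columns: timestamp, temperature_c
                if not low <= float(row["temperature_c"]) <= high:
                    flagged.append(row)
        return flagged

    # A short report the technician can generate and QA can sign off on.
    for row in excursions("room_101_temperature_log.csv"):
        print(f"{row['timestamp']}: {row['temperature_c']} C is outside {LIMITS_C}")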

Do you have a way to address drift?

Drift must be addressed periodically. There is no tried and true way to do this, but a couple of things may help: periodic reviews, regular looks at industry best practices, and the normal churn that results from new people being hired from other companies and longtime employees departing.

 

That last one – churn – is important. Often a longstanding but flawed practice goes unquestioned until a new employee says, “You know, at my old job we did it this way…”

 

When someone says this, listen!

Some further reading

A costly error partly caused by checking with the same instrument that was used for doing:

The Hubble Space Telescope Optical Systems Failure Report

(https://www.ssl.berkeley.edu/~mlampton/AllenReportHST.pdf)

What is a quality assurance unit?

 

[This article pertains to nonclinical laboratory studies.]

 

Suppose you were just hired in a Quality role (QA specialist, tester, auditor, inspector, reviewer, technical data specialist, etc.). During your orientation and training people bandy about such acronyms and initialisms as the GLPs, the OECD, the QAU, and 21 CFR 58.

 

You probably have questions. Are you supposed to know all these regulations and guidance documents in addition to training on your organization’s protocols and standard operating procedures? Not to mention the underlying science of the testing your company does? What is the history of this kind of role? Where does your role fit in with the overall business and with the science and technology going on around you?

 

First, take a deep breath and remind yourself that you don’t need to figure it out on your own. This article will give you an overview of the QAU from the United States regulatory perspective, the international guidance, and some organizational particulars. Finally, you’ll find links to help you learn further on your own.

US FDA regulations

Take a few minutes to read the Quality Assurance Unit (QAU) section of 21 CFR 58. Do it. It’s several concise paragraphs that are free of jargon:

 

(https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?fr=58.35)

 

What you find is that the QAU is one of three very important groups of personnel defined in the regulations:

 

  • Test facility management
  • The study director
  • The QAU

 

The section on the QAU is the lengthiest of the three.

 

Five things stand out. The QAU:

 

  1. Monitors each study to assure management that facilities, equipment, personnel, methods, practices, records, and controls are in conformance with the regulations in this Part.
  2. Inspects each nonclinical laboratory study at intervals adequate to assure the integrity of the study.
  3. Determines that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation.
  4. Reviews the final study report to assure that the methods and results are true to the standard operating procedures and raw data.
  5. Prepares and signs a statement to be included with the final study report which specifies the dates inspections were made and findings reported to management and to the study director.

 

Scanning the above for keywords, you see such action terms as monitoring, inspecting, reviewing, and reporting, all of which you will do as a Quality professional.

 

Consider this a touchpoint. As you learn and grow in Quality and come to take on decision-making responsibilities, revisit 21 CFR 58 occasionally. Read the Guidance for Industry document (linked below) for questions raised by other professionals and answers provided directly by the FDA.

International guidance

The Organisation for Economic Co-operation and Development (OECD) is not a governmental body, but it is composed of representatives of governments.

 

The OECD has published its own GLP principles, and these align for the most part with the FDA regulations. Governments outside the United States can hold organizations to these OECD standards. Reading them can provide you with an international perspective on the sometimes jargon-filled US regulatory environment.

 

Some individual differences between the FDA and the OECD GLP requirements regarding the QAU follow:

  • There is more specific language on how the Quality Assurance Programme should be carried out by individuals designated by and directly responsible to management and who are familiar with the test procedures.
  • There is more specific language on how there must be clear lines of communication in a multi-site study between the Study Director, Principal Investigator(s), the Quality Assurance Programme and study personnel.
  • There is language on ensuring that the Quality Assurance personnel have a copy of the study plan and any amendments in a timely manner and that the study director will communicate effectively with the Quality Assurance personnel as required during the conduct of the study.

Particulars of an organization

How your organization meets these QAU requirements will vary quite a bit.

 

Take the example of “inspecting the study.” You may decide that every study gets inspected at an interval of one week. You might decide that these inspections will occur during critical phases such as preparation or dosing, where mix-ups and mistakes are highly consequential.

 

The reporting of the inspection to the study director and management could be a paper form, or an online tracking system. You may decide that only deviations from the protocol get reported. Or you may inspect the raw data accumulated so far very thoroughly, with the goal of finding and fixing problems as far upstream as possible.

 

After getting comfortable with your procedures, you might suddenly find that an executive, a Lean project team, or a consultant comes in with new ideas and revolutionizes the way you do things.

 

Although the particulars may vary, every company is under constant pressure from regulators, clients, and their own leadership to improve their QAU’s effectiveness and continue the upward spiral of quality.

New Quality software and other techniques

Your QAU will use third-party software to do its job. Some examples of quality management software include:

  • MasterControl
  • uniPoint Quality Management
  • TrueERP
  • QT9 Quality Management
  • Verse Solutions

 

This list will only grow. You will need to master one or more of these systems at the user level. You may need to train others in your organization in how to use them. You may need to work closely with the software vendor to get the specific functionality your organization needs. Finally, as you gain decision-making authority, you will need to be a smart buyer and decide which quality management software is worth the price.

What next?

The particulars of the QAU are constantly changing as business and technology change. But the underlying principles of independence, inspection, thorough review, documentation and reporting will remain the same. Some of the GLP regulations we follow go back to 1979. But the software we use to assure quality may have just been invented a year ago.

 

Be prepared to continually question the way your QAU operates and grows with the rest of the organization. As always, if you found this article insightful (or found it to be the opposite), let me know!

Some further reading

Read about the astonishing scientific fraud scandal that led to the development of the US GLP regulations:

(http://planetwaves.net/contents/faking_it.html)

 

FDA 21CFR58, Good Laboratory Practice for Nonclinical Laboratory Studies

(https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?CFRPart=58)

 

FDA Comparison Chart of FDA and EPA Good Laboratory Practice (GLP) Regulations and the OECD Principles of GLP

(https://www.fda.gov/ICECI/EnforcementActions/BioresearchMonitoring/ucm135197.htm)

 

FDA Guidance for Industry Good Laboratory Practices Questions and Answers

(https://www.fda.gov/downloads/ICECI/EnforcementActions/BioresearchMonitoring/ucm133748.pdf)

 

OECD Quality Assurance and GLP guidance

(http://www.oecd.org/officialdocuments/publicdisplaydocumentpdf/?cote=env/jm/mono(99)20&doclanguage=en)

 

Why work in Quality?

Quality is a profession at the heart of our advanced, industrialized, information society. It is also a field that is full of opportunity. Job boards are peppered with postings for quality managers, quality assurance specialists, quality auditors, quality control technicians, and other roles. Although academic coursework in Quality is in its infancy, opportunities in business are abundant.

 

Read on to find out more about the field of Quality. As you read, think about where you and your mix of interests and skills might fit in.

Everyday examples

Reflect for a moment on some everyday things: the marvelous automotive machinery that gets you around town. The safe and effective pharmaceuticals that you take for common or more serious ailments. The flawless email, word processing and photo backup provided by Google free of charge. The item you ordered online with a click or two (after Amazon suggested it to you) that arrived the next morning.

 

These consumer objects are at the customer-facing end of highly complex systems. Underlying them all are quality systems that are built and continuously reinvented in order to meet the customer’s requirements and to deliver delight.

Delight

This last term – delight – might be unexpected. But it provides perspective on the endgame here. In a competitive marketplace, a company gets an advantage by delivering customer delight, by doing it consistently, and by doing it for a lower price than its competitors.

 

Amazon again provides a good example. The online giant has continually improved its shipping times while keeping prices low. It adds new perks to its Prime loyalty program regularly. And it has a customer-focused return and refund policy that continues to earn praise from consumer advocates and commentators.

 

It’s no wonder then that 49% of US households hold an Amazon Prime subscription. Amazon’s astonishing sales and membership numbers will continue to grow as long as the company keeps delighting its customers.

 

Things can go wrong

On the other hand, we have all had negative experiences with these same companies and their peers. I once read with astonishment how a man lost almost all his personal data when his Google (!) cloud backup somehow failed. I have had frustrating and fruitless email exchanges with third-party Amazon sellers when my packages never arrived (Amazon eventually reimbursed me). I received a surprise bill from my dentist when they performed an x-ray I wasn’t due for under my insurance plan. And I have gotten scary letters from at least two large financial companies apologizing for a hacking breach of my personal and account information.

 

Sometimes it’s small things and near-misses that spook you. A slip-up can shake your confidence in a company you normally trust and cause you to permanently take your money elsewhere. I recall with horror a time when I worked in a hospital emergency room and the nurse read and took notes in a patient’s paper chart for several minutes before realizing it was the wrong one. The nurse apologized. The patient expressed his totally justified dismay. In response the nurse said, “Yeah, it happens” and continued in the correct chart. I would not blame the patient if he chose to write a negative Yelp review of the hospital, bring the incident to hospital management, and never use that clinic and hospital system again. After all, what if he had been given drugs or another invasive treatment based on someone else’s chart?

 

Consider some other high-profile quality issues: the Takata airbag deaths and subsequent recalls, the Equifax security breach, the Wells Fargo unauthorized account scandal, etc. These are all quality issues, some of them extremely serious.

 

Clearly, even the big players, with their slick websites, high-tech equipment and highly paid staff, don’t have it completely figured out.

 

Bringing it back to you personally

So how does this relate to you, as a student, a job seeker, a technician, a supervisor, or whatever?

 

Start by asking yourself if any of what I described above appealed to you. Does a commitment to continuous improvement match your values? Are you interested in eliminating waste (wasted resources, wasted time, and wasted human potential)? Are you interested in systems thinking? Do you consider yourself perceptive, thorough, candid, courageous, and helpful to others?

 

This mix of values, traits, and talents is the foundation of a career in Quality.

 

Read through my other articles and contact me with questions and comments.