Quality assurance professionals in the biotech and research sector must hold technical staff to exacting standards. However, few things in life are truly black and white. In this article I will outline several tools and mental habits that support prioritization and compromise.
Follow your written procedures
At the level of forms and work instructions, your written procedures should be fairly clear and unambiguous, to the point where a person from outside your organization could follow along and perform the task at hand.
At the higher level of SOPs and policy documents, the written instructions are more general because they must apply to the wide scope of work being done. This general language, even when technical in nature, introduces room for drift and ambiguity.
Below are two examples from my experience in which technical staff followed the immediate procedures exactly as written, but their analysis and treatment of data fell outside the higher-level SOPs that guide the work of the organization.
Example 1: Inappropriate rounding of numerical results
At my company, we had an SOP on data analysis. Having this SOP aligns with the requirement of ISO 13485 that states, “The organization shall document procedures to determine, collect and analyse appropriate data to demonstrate the suitability, adequacy and effectiveness of the quality management system. The procedures shall include determination of appropriate methods, including statistical techniques and the extent of their use.”
One requirement of this SOP was that intermediate rounding be avoided. Instead, all digits are to be retained and only the final result will be rounded to the appropriate number of significant digits.
This requirement is straightforward; it is taught as early as high school. It's easy to demonstrate how intermediate rounding (e.g., averaging an average) can lead to erroneous results. For a standardized ISO study, such as a cell genetic toxicity assay with quantitative cutoffs between passing and failing, inappropriate rounding could lead to reporting an erroneous scientific conclusion about the test article.
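A short demonstration makes the hazard concrete. The replicate values below are hypothetical stand-ins (not data from any study discussed here), chosen so that rounding intermediate means shifts the final reported result by a full digit:

```python
from decimal import Decimal, ROUND_HALF_UP

def round_1dp(x):
    """Round a Decimal to one decimal place, half away from zero (a typical lab convention)."""
    return x.quantize(Decimal("0.1"), rounding=ROUND_HALF_UP)

# Hypothetical replicate readings from two phases of testing.
run1 = [Decimal("3.15"), Decimal("3.17")]   # mean 3.16
run2 = [Decimal("3.11"), Decimal("3.13")]   # mean 3.12

# Correct: retain all digits and round only the final result.
final = round_1dp(sum(run1 + run2) / 4)     # overall mean 3.14 -> 3.1

# Incorrect: round each run's mean, then average the rounded means.
m1 = round_1dp(sum(run1) / 2)               # 3.16 -> 3.2
m2 = round_1dp(sum(run2) / 2)               # 3.12 -> 3.1
biased = round_1dp((m1 + m2) / 2)           # 3.15 -> 3.2

print(final, biased)                        # 3.1 vs 3.2
```

If a quantitative pass/fail cutoff sat anywhere between 3.1 and 3.2, the two approaches would report different study conclusions from identical raw data.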
In practice however, this rule is not straightforward to explain or implement throughout a study, especially a study involving handoffs and sequences of testing. In my reviews of scientific reports and assay validations, I found that scientists were wedded to their way of doing things. This way of doing things often involved rounding a set of results from an initial set of testing, and later rounding the numbers again after the next set of testing was complete.
For the most part, this was inconsequential and did not change the study conclusion. But borderline cases could be problematic. And the practice went against the statistical methods SOP.
When I identified this problem and pushed for adherence to the SOP I was met with resistance. The scientists involved acknowledged that clients had questioned the practice in the past. They said, “We have always done it this way.” And they said that rounding intermediate numbers made sense because the rounding was done at discrete phases of the study, and the dropped digits are so small that their removal had only a small effect on the final results.
I took a step back and considered the wider context. It is hard for people to change the way they do things the moment a problem is identified. I examined the spreadsheets used for this analysis. They were ancient, had never been validated, and were retained in an uncontrolled folder maintained by the scientists responsible for each study. Anyone could edit these Excel files; Excel is notorious for its lack of attributability for changes and for relying on laborious, error-prone, manual change control. The spreadsheets were often inherited from another scientist years earlier and had been created in a long-obsolete version of Excel.
On top of that, our spreadsheet validation process at the time was labor-intensive, unclear, and uninviting to the scientists. They saw it as taking away their flexibility: an exercise in documentation with no benefit to their studies or to the quality and integrity of the data. In a sense they were correct, in that a validation is said to "lock down" a spreadsheet. They were also correct that the validation process was labor-intensive and involved multiple handoffs. (I detailed my team's efforts to correct this in another article.)
Ultimately, I weighed this issue against many competing priorities and moved it down the list. Rather than taking it up urgently, I acknowledged its importance, accepted its relative lack of urgency, and chose to address the root cause.
Addressing the root cause entailed updating the statistical methods outlined in the SOP on analysis of data. In consultation with my manager, I added clear examples and further clarified what intermediate rounding is and why the practice is not allowed.
I also updated the spreadsheet validation process to make it easier for a spreadsheet owner to get their spreadsheet validated. I added a revalidation form to ensure that they could easily update the spreadsheet (with the appropriate checks) instead of having to redo the entire validation.
Example 2: Incorrect use of a statistical formula in Excel
Scientists in my organization were using a variant of a common descriptive statistic in a way that overstates variability.
Take the following table of pH values and three standard deviation calculations:
Entering the STDEV.S formula into Excel gives you the standard deviation for a sample. This formula is used when you have a subset of values but not the values for the entire population. This got us 0.41.
Entering the STDEV.P formula gives you the standard deviation for the entire population. This formula is used when you have values for the entire population, as we do above (i.e. all 4 pH results). This got us 0.35.
The third value comes from another of Excel's standard deviation formulas, one provided only for compatibility with earlier versions of Excel; Excel's formula drop-down flags it in yellow to indicate this. It too returns the sample standard deviation: 0.41 again.
The sample standard deviation in this comparison is 17% higher than the population standard deviation. This is an overestimate of the variation in the original pH readings.
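Python's standard library draws the same distinction, so the effect is easy to reproduce outside Excel. The pH readings below are hypothetical stand-ins for the table above, not the original data:

```python
import math
import statistics

# Hypothetical pH readings (n = 4) standing in for the table above.
ph = [6.9, 7.2, 7.6, 7.7]

sample_sd = statistics.stdev(ph)       # divides by n - 1, like Excel's STDEV.S
population_sd = statistics.pstdev(ph)  # divides by n, like Excel's STDEV.P

# For n = 4, the sample SD always exceeds the population SD
# by a factor of sqrt(4 / 3), i.e. about 15.5%.
ratio = sample_sd / population_sd
print(round(sample_sd, 2), round(population_sd, 2), round(ratio, 4))
```

The gap between the two estimates shrinks as n grows, which is why the choice matters most for the small data sets typical of bench work.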
The problem at my organization was the use of the sample standard deviation even when all the values in the population were available. It arose from the use of ancient spreadsheets and a lack of awareness that there was more than one method of calculating a standard deviation.
Firstly, the ancient spreadsheets defaulted to this sample standard deviation even when the population standard deviation was the appropriate method.
Secondly, the scientists who "owned" the spreadsheets were sometimes unaware that, because they were reporting a complete set of pH values, animal weights, or rates of cell death, they possessed the entire population and should therefore use the population standard deviation.
Excel has many historical quirks. This is one of them.
I took the same approach as above: I compromised and chose to educate. I addressed the root cause by paying close attention to this formula when reviewing spreadsheet validations, and I clarified the SOP cited above with examples of the two formula types and a clear explanation of when to use each.
Three tools of incremental improvement
Compromise when appropriate
Follow your written procedures. This is a must.
However, consider the broader context.
Divide the problems of the day into four quadrants, as in the Eisenhower matrix:

1. Urgent and important
2. Important but not urgent
3. Urgent but not important
4. Neither urgent nor important
You must address the problems in quadrant 1 because you have little choice in a typical day. But doing this is reactive, not proactive, in nature. Your long-term goal is to reduce the severity and frequency of these issues.
You do this by spending more and more of your time working on the problems in quadrant 2. This is where most of your time is (ideally) spent. It is the realm of proactiveness and prioritization.
Problems in quadrant 3 might include answering office messaging notifications or fixing small safety lapses you come across. You should optimize them out of your life as much as possible, when there is time.
Quadrant 4 problems should be eliminated from your work and your mind entirely.
After using a tool or heuristic like the ones above, decide which problems to drive toward a resolution immediately, and which ones to go further upstream to fix. Decide what to escalate and what to analyze further.
Run a dialectical thinking exercise
“Dialectical thinking refers to the ability to view issues from multiple perspectives and to arrive at the most economical and reasonable reconciliation of seemingly contradictory information and postures.”
This approach is an antidote to black-and-white thinking.
You start with a statement or rule that is putatively “always true.” You then conceive an instance where this rule is not true. You then conceive a rule or principle that can help you decide when the original statement is true and when you are dealing with an exception.
The three steps (the always true statement, followed by an exception, followed by a rule for distinguishing) are also referred to as thesis, antithesis, and synthesis.
An example is the following:
Thesis: In my company, raw data must always be printed and then signed and dated in wet ink by the originator and then placed in the study folder.
Antithesis: However, some analytical instruments produce prodigious amounts of data. A large study, or one with retests, would swamp the archives, generate waste and other impracticalities, and require off-site storage.
Synthesis: For certain analytical instruments that produce prodigious amounts of data, the originator will instead sign and date (in wet ink) a controlled record indicating the electronic repository where such data is retained. Use of the repository will meet all the data integrity regulations and standards the company has incorporated into its record retention SOPs.
At the end of this exercise, you will have developed more flexible thinking. And you may recast the original statement as less of an ironclad rule and more of a precept (a general rule intended to regulate behavior or thought).
You will more clearly see when it applies and when it does not. You will have done the hard work of discriminating and deciding. And you will be better positioned to get that distinguishing guidance firmly into your written procedures so that others can follow the rule without ambiguity and uncertainty.
Offer a grace period when change is difficult
In a formal, complex, or protracted change, you may wish to maintain the status quo but notify analysts/operators of the “washout” period after which you expect full adherence to the new or existing standard. Do this only after fully educating and training personnel on the requirement.
From an auditor’s perspective, you could notify personnel (informally or formally) that the gap will be written up in audits once the washout period is done. Until then, you will note the gap in your observations but not as an audit finding.
Again, make sure personnel are educated and trained on the requirement. If needed, explain why it was not sufficiently followed in the past, how the requirement was “rediscovered,” and why it is important.
Address the root cause that led to the problem
Push yourself to identify and address the root cause.
Explore the rich, exhaustive root cause analysis resources from FDA, ASQ and elsewhere. I will publish an article detailing root cause analysis and my personal experience with these powerful tools.
Uphold exacting standards while thinking flexibly. This will foster continuous improvement.
Do you have an example of compromising? Of taking a wider perspective and letting something go? Let me know in the comments.
A technical discussion from the National Institute of Standards and Technology on rounding
Another resource on the difference between rounding (for a final result) and truncating (for reporting an intermediate value)
An Excel resource on the two types of standard deviation formulas discussed above
One definition of dialectical thinking
The relevant FDA regulation on investigating the cause of quality problems
A useful outline of root cause analysis
About the photo
Oswald West State Park is a fantastic natural area on the upper Oregon coast. On a recent weekend I enjoyed hiking trails, a beach, and incredible ocean vistas with two friends and a three-month-old puppy. The puppy got tired and needed frequent carrying in a doggy backpack during her first trip to the Pacific coast (how adorable!).