The “Single Source of Truth” principle in biotech quality assurance

Quality assurance professionals use the single source of truth principle to ensure everyone in the company acts with confidence on the most current and accurate data.

The single source of truth principle (SSOT), stated in a few ways

SSOT is a data storage principle that originated in business software. The principle directs you to always source a particular piece of information from one place. It means that everyone in the company agrees that one view of data is the real, trusted number.

Stakeholders in a biotech or laboratory environment follow SSOT to make all scientific and quality decisions based on a single, definitive record that they all have appropriate access to.

SSOT also refers to the practice of aggregating the data from many systems within an organization to a single location. This does not mean all data is stored in this location, but that this hub contains accurate references to the definitive location of this data.

What are the advantages of an SSOT approach?

An SSOT approach will help achieve:

  • Clarity
  • Accessibility
  • A definitive record and a definitive analysis (i.e. avoiding duplicate records and analyses)
  • Accuracy (i.e. avoiding inconsistencies)

Where did this principle originate?

The field of quality has a long and laudable history of adapting successful practices from other industries and companies and promoting their spread.

The most celebrated example is the adoption of improved production processes from Japanese automakers after American automakers suffered widespread quality problems in the decades after World War II.

Another example is the spread of checklists from aviation into surgery and other medical procedures. These checklists helped to reduce the frequency of “never events” (so called because they are completely unacceptable) such as leaving instruments inside a person or operating on the wrong body part.

SSOT’s origin is in enterprise software. Large companies especially deal with copious data from multiple siloed sources. They manage multiple computerized systems with software from various vendors. This leads to inefficiencies, redundancies, inconsistencies and the like between systems and individual data points.

A classic example is when the sales team uses one set of software and supply chain/order fulfillment uses another. When the two systems do not communicate with each other, and staff implement workarounds to bridge the gap for routine work, the system can break down and botch a large order or any non-routine work.

In recent years vendors have promoted solutions to fix this issue with (you guessed it) more software. These systems, one level above the rest in data abstraction, promise to aggregate all relevant computerized data in one single reference point. This solution sometimes works and sometimes amounts to an expensive extra layer of data and computerized workflows.

In this article I will explain how QA professionals in biotech can adopt the SSOT principle in their work. I do not necessarily promote the cloud computing solutions proffered on the market. I endorse continuing the spirit of adopting successful practices from other industries. I would like you to picture not just software that further aggregates data, but the broader application of the SSOT principle. Think of examples in your company where a single definitive record would solve problems. I will provide several below.

SSOT is a data integrity principle

Data integrity is the overall accuracy, consistency and completeness of data. Data integrity also refers to the safety of data with respect to regulatory compliance, such as GMP compliance.

This data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).

You won’t find SSOT referenced in any regulation. But language on data integrity requirements regarding “original or true copies” is found in 21 CFR Part 211:

“Records required under this part may be retained either as original records or as true copies such as photocopies, microfilm, microfiche, or other accurate reproductions of the original records.”

21 CFR Part 211.194 (Laboratory records) also states:

“Laboratory records shall include complete data derived from all tests necessary to assure compliance with established specifications and standards, including examinations and assays” 

And

“A statement of each method used in the testing of the sample. The statement shall indicate the location of data that establish that the methods used in the testing of the sample meet proper standards of accuracy and reliability as applied to the product tested.”

The requirement on accuracy is found in:

211.68

“Appropriate controls shall be exercised over computer or related systems to assure that changes in master production and control records or other records are instituted only by authorized personnel. Input to and output from the computer or related system of formulas or other records or data shall be checked for accuracy. The degree and frequency of input/output verification shall be based on the complexity and reliability of the computer or related system. A backup file of data entered into the computer or related system shall be maintained except where certain data, such as calculations performed in connection with laboratory analysis, are eliminated by computerization or other automated processes. In such instances a written record of the program shall be maintained along with appropriate validation data. Hard copy or alternative systems, such as duplicates, tapes, or microfilm, designed to assure that backup data are exact and complete and that it is secure from alteration, inadvertent erasures, or loss shall be maintained.”

211.188

“Batch production and control records shall be prepared for each batch of drug product produced and shall include complete information relating to the production and control of each batch. These records shall include an accurate reproduction of the appropriate master production or control record, checked for accuracy, dated, and signed.”

Side note: What is a principle?

QA is a technical field but it is nonetheless guided by principles. With the right set of principles, you can develop the specific practices (or tools or techniques) to fit any situation.

To go back a step further, principles flow from values. In biotech our values might include the promotion of human health through safe and effective devices and drugs. Principles that flow from this value might include, “All data we produce or make decisions on should be accurate and complete.” And a technique or requirement that flows from this principle might include, “All study-related raw data will be automatically stored in a central, read-only repository with full audit trail capability.”

SSOT is a principle because it guides our behavior without specifying the practices we need to do to accomplish a goal. If you understand SSOT, you will be able to apply it flexibly in your audits/inspections, SOPs and validations.

Simple examples

A problem with a hard-to-see asset tag

During routine lab activities an analyst must frequently document the asset number and calibration due date for various equipment such as vortexers, incubators and spectrophotometers.

If the asset tag and calibration sticker are located on the back of the equipment in a hard-to-see place, the analyst may invent a workaround: writing this info on a piece of lab tape with a marker and placing it on the front of the equipment where it is more easily referenced, saving several moments of searching each time this info must be recorded.

To the analyst, this is a good example of problem solving and efficiency. But it compromises the equipment data recorded on the form. Suppose the true asset tag were replaced but the improvised sticky note containing old information continued to be used to record the equipment identity.

A solution to a hard-to-see asset tag

The SSOT perspective tells us that the improvised sticker is an unreliable copy. The true source of the equipment info is a little hard to define. In this case, reliability exists along a continuum.

The least reliable source is the asset tag and calibration sticker. These may be placed by a single equipment/calibration coordinator without a verifier. They may be misplaced, handwritten, difficult to read, etc. The calibration sticker might have been placed by an outside vendor in an inconsistent format across equipment.

The next most reliable source may be the equipment database. An Excel spreadsheet is unreliable. An actual database in Access might be better because of the ability to pull reports. A piece of software such as Qualcy Calibration/PM Software will provide much greater functionality and accuracy.

But even these databases must be fed with data, often transcribed, from the calibration certificate, equipment validation record, and asset number log. This leaves open the question of what the single source of truth is.

To address all these difficulties, apply the SSOT principle: The calibration coordinator must place the equipment tags in an easy-to-see place for the analyst to reference. The coordinator should convert the outside vendor’s calibration sticker into an internally developed sticker that is consistent across equipment with respect to date format, placement, and required fields. Transcription from certificates, manuals and validations into the equipment database should be verified and reviewed wherever possible. And the analyst’s SOPs and forms should clearly indicate where they get required equipment information, and who to notify when it appears to be missing.

The problem of duplicate entries

In the various spreadsheets and databases of biotech quality assurance, duplicate entries are a classic problem. 

A solution to duplicate entries

A database can get around this problem by assigning each row a primary key: a unique identifier that refers to all the information in that row.
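This behavior can be sketched with SQLite, which ships with Python. This is a minimal illustration, not a validated system; the `equipment` table and its columns are invented for the example:

```python
import sqlite3

# In-memory database as a minimal sketch; the table and column names
# are illustrative, not taken from any real quality system.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE equipment ("
    "asset_id TEXT PRIMARY KEY, "  # the unique identifier for each row
    "description TEXT, "
    "cal_due TEXT)"
)
conn.execute("INSERT INTO equipment VALUES ('EQ-0042', 'Vortexer', '2025-06-01')")

# A second row with the same primary key is rejected outright, so a
# duplicate entry can never silently enter the record.
try:
    conn.execute("INSERT INTO equipment VALUES ('EQ-0042', 'Vortexer', '2025-09-01')")
    print("duplicate accepted")  # never reached
except sqlite3.IntegrityError:
    print("duplicate rejected")
```

The database engine enforces uniqueness itself, so the control does not depend on any user remembering to check for an existing entry.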

A lot number assignment problem

Laboratory staff must often assign lot numbers for in-process components or for a testing procedure. Lot numbers are often assigned in a way that encodes the date of the procedure. If several people are at work at once, they risk duplicating the lot number.

A lot number assignment solution

Apply the SSOT principle to establish a definitive record of lot number assignment. SOPs should define in detail how lot numbers are assigned and explain any prohibited practices.

You may choose to establish a single paper logbook, centrally located, for assigning lot numbers. This avoids the problem of two remote users trying to create a new record at once and duplicating the new lot number entry.

You may detail procedural controls: the operator or analyst must assign the lot number before entering the lab and beginning work, not during or after producing a component successfully, and not “when they have a moment.”

You may establish a QA-issued lot number assignment log. All lots are issued by QA, handwritten or printed on a QA-issued batch record. Controlled work may only be conducted on these forms. The forms will be clearly marked when they are QA-issued so that analysts will easily notice when they are using a form that was not issued appropriately.
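To see why a single central log prevents duplicates, here is a hedged sketch in Python backed by SQLite. The `lot_log` table and the `LOT-YYYYMMDD-NNN` format are illustrative assumptions, not a prescribed scheme:

```python
import sqlite3
from datetime import date

# Minimal sketch of a central lot-number log; schema and number format
# are assumptions for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lot_log (lot_number TEXT PRIMARY KEY, assigned_to TEXT)")

def assign_lot(analyst: str, on: date) -> str:
    """Issue the next sequential lot number for the given date.

    Because lot_number is the table's primary key, the database refuses
    a duplicate even if two analysts request a number at once. A real
    system would also wrap the count-and-insert in one transaction.
    """
    prefix = f"LOT-{on:%Y%m%d}"
    count = conn.execute(
        "SELECT COUNT(*) FROM lot_log WHERE lot_number LIKE ?", (prefix + "-%",)
    ).fetchone()[0]
    lot = f"{prefix}-{count + 1:03d}"
    conn.execute("INSERT INTO lot_log VALUES (?, ?)", (lot, analyst))
    return lot

first = assign_lot("Analyst A", date(2019, 3, 1))
second = assign_lot("Analyst B", date(2019, 3, 1))
print(first, second)  # two distinct, sequential lot numbers for the same date
```

A single paper logbook accomplishes the same thing physically: one book, one next blank line, one next number.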

More complex examples of the problem

Problem of a complicated but critical spreadsheet

Supplier qualification produces voluminous records. As your quality system grows, you may have implemented a spreadsheet as a quick fix to track the file location and contents for each supplier. 

As this system grows, the spreadsheet will inevitably become more unstable and inaccurate, especially if it was never validated for this purpose.

One of the most important bits of information in this supplier information repository is the status of the supplier: is it low-risk, medium-risk, or high-risk, and is it approved? If this information exists in a spreadsheet accessed by any staff member who may submit a purchase order, by purchasing, by quality, and others, then the risk of inadvertent changes to these supplier status cells is too great to leave to chance.

A solution to a complicated but critical spreadsheet

You may consider clearly defining the spreadsheet as a document that only references the actual supplier qualification record. You may insist in your procedural controls that staff reference only the original records when purchasing a component from a supplier.

But even with these controls, people will naturally use the accessible, easy-to-use spreadsheet instead of pulling up scanned or archived records containing the original supplier designations.

To address this, apply the SSOT principle and provide easy access to the supplier qualification records and the true, current supplier status. Ensure that anyone who orders can see these records without hassle. When a supplier’s status is changed, the previous records should be clearly marked so or obsolesced. Detail in an SOP that the reviewer of the purchase order will verify the supplier status before signing off. Validate the spreadsheet or supplier management software and audit it regularly (perhaps once a year) for accuracy against the supplier records. Ensure that no one feels compelled to use a second-hand file when ordering raw materials.
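The periodic accuracy audit suggested above can be sketched in a few lines of Python. The supplier names and statuses are invented for illustration; in practice the “records” side would be drawn from the original qualification files:

```python
# Minimal sketch of the audit: compare the convenience spreadsheet
# against the definitive qualification records. Both data sets here are
# illustrative dictionaries standing in for real files.
definitive_records = {
    "Acme Reagents": "approved",
    "Beta Plastics": "not approved",
}
spreadsheet = {
    "Acme Reagents": "approved",
    "Beta Plastics": "approved",  # a stale or inadvertently edited cell
}

def audit_statuses(records, sheet):
    """Return suppliers whose spreadsheet status disagrees with the record."""
    return sorted(
        name for name, status in sheet.items()
        if records.get(name) != status
    )

print(audit_statuses(definitive_records, spreadsheet))  # ['Beta Plastics']
```

Any mismatch the audit surfaces points to exactly the kind of inadvertent change that a shared, unprotected spreadsheet invites.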

The problem of paper copies

Biotech and research are paper-dominated industries. We are far from replacing handwritten records, although we are getting there slowly. Consequently every firm must decide how to treat copies of paper records.

For example, how does one mark a copy as a copy? What does “exact copy” or “working copy” mean? Can you write on the copy? Can you retain a contaminated original in the lab and continue working on a copy outside the lab for review and approval? Is there a record of how many copies were made? If a significant change was made to the original, is there a way to track down all the copies and make sure the change is reflected in each one of them, or that they are replaced with the current, updated copies? For which purpose should I look into the original data set and which process can safely use data from the copy?

A solution to paper copies

Apply the SSOT principle to insist that the original be used wherever possible.

Implement procedural controls such as the following by detailing them in your SOP that covers good documentation practices:

  • Copies must be clearly marked as such by the person who made the copy.
  • All copies will be made on paper of an unmistakable color (pink or green, for example).
  • All copies will have an expiration date.
  • Beyond this expiration date, the copy must be shredded.
  • Copies can be used only for analysis and are not considered raw data for scientific, quality or business decisions.
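The expiration-date control above reduces to a simple rule, sketched here in Python. The 30-day lifetime is an illustrative assumption; set the actual window in your good documentation practices SOP:

```python
from datetime import date, timedelta

# Assumed copy lifetime for illustration; your SOP defines the real value.
COPY_LIFETIME = timedelta(days=30)

def copy_expired(made_on: date, today: date) -> bool:
    """True when a working copy is past its expiration date and must be shredded."""
    return today > made_on + COPY_LIFETIME

print(copy_expired(date(2019, 1, 1), date(2019, 2, 15)))  # True
```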

Prohibit copies from being filed in additional physical folders when a reference to the original would suffice.

Ensure that each original record pertaining to a quality subsystem (e.g. CAPA, nonconformance, validations, internal audits) has a numbering system that allows for a unique identifier, which allows easy and unambiguous reference. 

Ensure that the record can be pulled up easily in a repository of scans by those who may need to reference them. Ensure that all scans have a watermark superimposed on them to clearly indicate they are not originals.

The problem of multiple document control statuses

Your company has two separate document control systems, one for controlled procedures and one for pilot procedures. Which is the one for released products? Are they totally segregated? Do they feed into each other regularly or with specific exceptions? Do pilot documents sometimes accidentally get used as controlled documents, and vice versa?

A solution to multiple document control statuses

A biotech company must segregate design and development work from controlled production. Design and development benefit from having more flexible procedures. But the experimental components must not mix inadvertently with the controlled outputs. One way to do this is to unambiguously separate them and implement procedural controls to prevent this. 

Components generated under pilot status must be marked unambiguously. The documents used in these procedures must be clearly marked as pilot procedures. These designations must carry through to any components generated using these pilot components as starting materials.

A problem of inaccessible records

Quality system records such as nonconformance reports, planned deviations and CAPAs are locked in a repository that only QA has access to. Production and research staff need the specifics in these folders in order to do their job (for example, to carry out the planned deviation or to understand the corrective action prescribed after a nonconformance). If the finalized document is in a physical folder on someone’s desk or in archives, and laboratory staff is working off of a Word draft or a copy of the in-process record from before it was finalized, then they may be using incorrect or incomplete information.

You may even have the problem of individual staff running their own version of archives right in their office. This is unacceptable, but it happens. An individual finds they are working on so many files so frequently that they no longer abide by the archival procedure, and they set up their own cabinet brimming with records. People who need access to originals find they must search the controlled archives and also send scattershot messages to an array of individuals who might have the file.

A solution to inaccessible records

Apply an SSOT approach. Staff need access to the finalized folder or an accurate, current version of it. They should not rely on drafts and incomplete versions of the plan when doing their work. They could access this in a network repository where the scan of the finalized record is housed. They could reference only the original record, if the building is small and the record can be physically brought into the lab.

Ideally, you will migrate your quality system procedures into a third party software or one that was developed in-house. This software will make these procedural controls inherent in the design. For example, instead of relying on a user to remember to copy stamp a sheet of paper, the software will imprint the sheet with a copy watermark and a printed expiration date.

Software will not fix all problems, and it may create new ones. For example, if your document software comes with a limited set of licenses, and staff cannot access the forms they need at the right time, they may resort to hoarding forms or accessing unfinished copies that are less reliable. This brings you back to the original problem.

There is a continuum of truth and reliability

Often you must proceed with imperfect data to avoid not proceeding at all. In these instances you should assess where along the reliability continuum your information lies.

Suppose your company produces vast quantities of data in support of regulatory approval for a medical device. To finalize your application for approval, you must assemble data from many sources of varying reliability.

At the bottom of the reliability continuum is information floating around in email, Microsoft Teams messages and Slack. Next come various spreadsheets, PDFs and statistical software containing analyses. Then come memos and internal reports generated without a formal review process. Then reports from outside labs (which must be interpreted within their predefined scope). At the highest level of reliability are reports and other documents generated internally according to a written procedure for review and approval.

What does ISO 13485 say about SSOT?

The international standard ISO 13485 (2016 revision) on quality management systems for medical device makers states that:

  • The company will ensure that relevant versions of applicable documents are available at points of use.
  • The organization shall document procedures to define the controls needed for the identification, storage, security and integrity, retrieval, retention time and disposition of records.
  • Records shall remain legible, readily identifiable and retrievable. Changes to a record shall remain identifiable.

Think about how you can meet these requirements using an SSOT approach. The concepts of a definitive record, an SSOT, and document control are closely intertwined and they feed off each other in the upward spiral of quality.

Example from a regulation

FDA’s GMP regulations state:

  • Each manufacturer shall establish and maintain a DHF for each type of device. The DHF shall contain or reference the records necessary to demonstrate that the design was developed in accordance with the approved design plan and the requirements of this part.
  • Each manufacturer shall maintain a quality system record (QSR). The QSR shall include, or refer to the location of, procedures and the documentation of activities required by this part.
  • The equipment identification, calibration dates, the individual performing each calibration, and the next calibration date shall be documented. These records shall be displayed on or near each piece of equipment or shall be readily available to the personnel using such equipment and to the individuals responsible for calibrating the equipment.

As you can see, the required documents such as the device history file can reference the original without having to contain a copy in the file.

Calibration records must be “readily available” to users. “Readily available” is an extension of the ALCOA principles cited above and sometimes stated as ALCOA+: “complete, consistent, enduring and available.” These last four principles extend beyond the immediate recording and disposition of data and indicate it must be available for inspection over the lifetime of the record.

Example from an FDA warning letter

In this warning letter to a large American device maker, the FDA cited the following:

“Your firm failed to follow its CAPA procedures when evaluating a third party report, in that your firm released a product risk assessment, an updated risk assessment and its corresponding corrective action, before approving the CAPA request for this issue. Your firm conducted a risk assessment and a corrective action outside of your CAPA system. Your firm did not confirm all required corrective and preventive actions were completed, including a full root cause investigation and the identification of actions to correct and prevent recurrence of potential cybersecurity vulnerabilities, as required by your CAPA procedures. Additionally, your firm did not confirm that verification or validation activities for the corrective actions had been completed, to ensure the corrective actions were effective and did not adversely affect the finished device.”

In this case, the company perhaps paid big bucks to a third party to conduct a risk assessment in response to a CAPA. When the risk assessment was completed, the company failed to fold it into the quality system. Perhaps it was locked in a lengthy report in unwieldy PDF format. 

To meet the requirements of 21 CFR 820.100(a), and their own CAPA procedures, the company should have conducted this CAPA within the quality system in a closed-loop manner and not allowed the outside risk assessment to sit there independently and feed into product release decisions independently of the CAPA system.

Last word

The single source of truth is a principle from enterprise software that applies to biotech quality assurance. Apply this principle in biotech QA to ensure your people act confidently from a single, agreed-upon source of information.

Resources 

From other industries

This fascinating article explains why Google stores billions of lines of code in a single repository.

From the medical device industry

This article from a Greenlight Guru leader extolls following SSOT principles for medical device CAPA systems and promotes their software for doing so. One reason among several cited is that a CAPA seldom affects only one quality subsystem.

About the photo

My birding friend and I visited Mirror Lake in the Mount Hood National Forest in Oregon. Gray jays are bold and inquisitive. They approached us and ate our peanuts as we snacked next to this frozen alpine lake.

Roaches, Zika virus, and vaping-addicted kids: FDA warning letters are back!

Yesterday, US Food and Drug Administration (FDA) warning letters returned, and they show the agency acting on novel and critical public health risks. The recent federal government shutdown appeared to slow the weekly issuance of FDA warning letters. I am glad these inspections and enforcement actions have recommenced. Read below for a few highlights.

Zika

A reproductive health clinic (i.e. fertility clinic) in Miami received a warning for failures in screening for Zika virus and other infectious diseases. Inspectors noted inadequate labeling of specimens (including semen). In addition, the clinic gave some donors an “eligible” approval before receiving the results of infectious disease testing.

There is also a worrying observation about issues at the clinic that go back four years, to 2015:

In response to Observations 3 and 4, you stated that your corrective action was to review and sign the appropriate forms in a timely manner based on the results of donor screening and testing. However, we note that these two observations were also cited during the previous inspection of your establishment conducted from November 19, 2015 to December 2, 2015. Your corrective actions, if any, after the previous inspection were not adequate to prevent recurrence of the same violations, as they were cited again during the current inspection.

Reading this letter I get the impression that basic laboratory controls are missing. This is unfortunate, especially since the clinic is affiliated with a university. The letter mentioned Zika virus, West Nile virus, and a prion disorder called Creutzfeldt-Jakob disease. I sometimes wonder if we are failing to learn the required lessons from these emerging diseases. A few years on from the worst of the Zika outbreak, we still have inadequate screening and testing, even in a high-risk region such as south Florida.

Underage vaping

The FDA warned an online store for vaping juice and devices after the store sold these nicotine products to a minor.

If you are curious about vaping and teenagers, check out this recent New York Times article that paints a sad portrait of addicted kids. It also cites a statistic that “3.6 million middle and high school students are now vaping regularly.” I happen to believe that a teenager is better off vaping than smoking. However, the addictive potential and long-term health risks of vaping are unknown. In addition, the kids can vape anywhere, they can easily conceal these small devices, and they have no clear path to quitting.

It is in this context that the FDA has warned this and other companies that sell electronic nicotine delivery system (ENDS) products. Get used to that ENDS acronym because vaping is an issue that will not go away any time soon.

Also, I have to point out the cake, candy and fruit-flavored nicotine e-juice this company is selling. Some of them are even in a fruit juice box-style packaging! It’s hard to deny that they are marketing to kids:

Imported over-the-counter drugs

A manufacturer in China was cited for producing over-the-counter drugs (which are sold in the US) in the same equipment as non-drug products. This practice is unacceptable under the Good Manufacturing Practice (GMP) regulations.

The inspector cited other ongoing violations, including inadequate stability-indicating methods on the drugs and distributing drug batches without the quality unit’s review of failing test results. Interestingly, the company’s previous response was deemed inadequate because it did not include a root cause analysis. To me this shows the importance of thorough investigations when a failure occurs.

Pest-ridden food distributors

This one is gross. At a food storage and distribution facility in Florida, an inspector observed:

· One live cockroach on the wall along the east side of your freezer, under racking storing various food products.

· Three dead cockroaches in between packages of “United Sugars, Extra Fine Granulated, Pure Sugar Cane” packaged in 5lb paper bags. These packages were on a pallet along the wall on the west storage area, near the north end of the facility.

· An apparent birds’ nest along the ceiling in a center aisle, in the east storage area, across from the freezer and cooler.

· Apparent bird excreta on several boxes of “Loty Chicken Flavor Bouillon,” which were stored on a pallet near the wall shared with the south end of the freezer.

· Five dead cockroaches in Cooler #1 along the wall on both sides of the door. This cooler is used to store rice and other products which require a cool environment.

· Apparent cockroach parts (e.g., wings and exoskeleton parts) too numerous to count throughout your facility, along the east and center aisles in between racking systems, and along the walls.

· A powder, identified by your firm as a mix of corn flour and ground-up rodent bait blocks, throughout your entire facility along the walls, and in between racking systems of stored packaged food products. The bait was identified to be “Bell Contrac All Weather Blox” used to kill rats, mice and meadow voles.

· Your three dock doors used for shipping and receiving along the north end of the facility are open during business hours and are not screened to exclude any type of rodent or pest from entering the facility.

I appreciate the inspector’s thoroughness! They counted the roaches, poop and other animal signs, and noted their location. An audit finding should always be specific so that a follow-up inspection can verify whether corrective actions were effective.

Another food processing facility in Florida was warned over a repeated failure to follow a specific pathogen reduction process in their fruit juices. Because of the repeated nature of the violations, the letter included an interesting note about how the FDA is authorized “to assess and collect fees to cover FDA’s costs for certain activities, including re-inspection-related costs.”

If they have to keep coming back to help you get your act together, you might have to pay!

Final comments

FDA warning letters are a treasure trove of information to help you understand the agency’s enforcement priorities on a wide array of medical and consumer products. I think it’s just a coincidence that three of the five entities cited here are in Florida.

The FDA website noted early on in the shutdown that “agency operations continue to the extent permitted by law, such as activities necessary to address imminent threats to the safety of human life and activities funded by carryover user fee funds.” I am not sure if this included the inspections referenced in this letter. However, I wouldn’t be surprised if there is a gap in warning letter postings three to six months from now.

The letter to the fertility clinic highlights a real risk: what if you or your fetus contracted the Zika virus from an anonymous sperm donor? For its part the underage vaping issue is big and it will only get bigger. The only apparent requirement for getting this nicotine vape juice from the site above is a credit card and a mailing address. The over-the-counter drugs could be part of a recall if the company’s response is deemed inadequate. This means American consumers could have to search their medicine cabinets for the bad pills. Finally, the pest issues at the food facilities are just gross.

Links

Centers for Disease Control and Prevention:

“No instances of Zika virus transmission during fertility treatment have been documented, but transmission through donated gametes or embryos is theoretically possible.”

FDA:

Recently Posted Warning Letters

New York Times:

“Addicted to Vaped Nicotine, Teenagers Have No Clear Path to Quitting”

Nexus Vapour:

Check out these kid-friendly nicotine vaping juices

Business book review: “Essentialism: The Disciplined Pursuit of Less” by Greg McKeown

Essentialism: The Disciplined Pursuit of Less is a business bestseller from 2014. This slim paperback is lively and readable in describing the principles of separating the essential from the nonessential in work and life. The author provides the briefest possible guidance on how to hone your skills of judgement and discrimination in order to focus on what is essential and eliminate the rest.

I picked up the Kindle version of Essentialism after a webinar host recommended it. In addition, Tim Ferriss read some passages from it on his podcast that resonated with me. Lastly, I was attracted to the message of decluttering and simplifying in general, and this book promised another writer’s perspective on this issue. Essentialism is not specific to any particular industry or profession.

In this review I will explain why Quality professionals should consider picking up this book. If you don’t have time to read it, I hope this review will give you a couple of key takeaways and quotes. I will also direct you to some good YouTube summaries, because that is an excellent way to absorb quick messages from the huge yearly tide of business books.

McKeown argues that hardworking, smart people are overextended and distracted by many projects and tasks, and that many of these tasks are unimportant. Part of what drives this may be technology. But there are other factors, including the pattern of success distracting a person from what made them successful in the first place (such as a technically skilled engineer who becomes a manager and finds himself doing less and less of the skilled work). Another pattern is the many unimportant choices that a person faces in the typical day.

By applying a few simple methods of sorting the essential from the nonessential, one can eliminate many unimportant tasks and focus on what truly matters. Part of this means getting better at saying no (though this is not at all the bulk of the book).

A simple tactic for how to say no is the following:

Use the words “You are welcome to X. I am willing to Y.” For example, “You are welcome to borrow my car. I am willing to make sure the keys are here for you.” By this you are also saying, “I won’t be able to drive you.” You are saying what you will not do, but you are couching it in terms of what you are willing to do. This is a particularly good way to navigate a request you would like to support somewhat but cannot throw your full weight behind.

Another chapter goes into greater detail about the Pareto Principle, which suggests that in any given problem, 80 percent of consequences come from 20 percent of the causes. McKeown often restates this to say that 90 percent of consequences come from 10 percent of the causes. I like this latter, starker ratio because it encourages you to be even more ruthless in eliminating the nonessential and focusing on that essential 10 percent.
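To make the ratio concrete, here is a minimal Python sketch of a Pareto tally. The deviation causes and their counts are invented for illustration; any real analysis would start from your own defect or deviation data.

```python
# Hypothetical tally of quality-deviation causes (invented numbers).
causes = {
    "mislabeled sample": 48,
    "data entry error": 31,
    "equipment drift": 9,
    "training gap": 5,
    "missing signature": 4,
    "other": 3,
}

total = sum(causes.values())  # 100 occurrences in this toy example
running = 0
vital_few = []
# Walk causes from most to least frequent until ~80% of the
# consequences are accounted for.
for cause, count in sorted(causes.items(), key=lambda kv: kv[1], reverse=True):
    running += count
    vital_few.append(cause)
    if running / total >= 0.8:
        break

print(vital_few)  # the "vital few" worth focusing on
```

In this toy data set, three of the six causes account for 88 percent of the deviations; an essentialist (or a classic Pareto chart) would direct your energy there first.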

For the most part the author accomplishes the stated purpose of the book: defining essentialism and providing some tactics for getting there. In addition, he does not overstate the advantages of this approach. Since it is an all-encompassing outlook on life and business, essentialism requires constant discipline, as stated in the book’s title.

A strength of the book is the short examples of how to make a small essentialist win (such as the story of saying no above). Techniques and methods that bridge the gap between a proposed way of thinking and day-to-day workplace tasks are helpful. Another thing I like is the numerous small tables that highlight the differences between an essentialist and a non-essentialist with respect to the chapter’s topic.

I also like the simple black-and-white art. One illustration in particular from early in the book asks the reader: Do you want to make a millimeter of progress in all directions or a mile of progress in one direction? Accompanying this challenge is a diagram to keep in mind when you are considering what you want to “go big on.”

One weakness of the book (as with much business literature) is the anecdotes. As a business consultant the author has a wealth of illustrative stories to draw on and most of them are memorable. However, some do not support the underlying arguments of essentialism in the way the author intends.

For example, a highly practical part of the book provides hints on how to avoid lengthy, pointless meetings and do the work via an email instead. But a later chapter opens with a Silicon Valley CEO who insists on a three-hour, unskippable meeting every Monday that includes every executive in the global company. In fact, they are required to schedule their travel so it never conflicts with this meeting.

This example is meant to illustrate how a firm routine can “eliminate the mental costs involved in planning the meeting or thinking about who will or won’t be there.” However, in my view it was another example of a large chunk of time that each meeting participant will subtract from the finite number of productive hours in his or her workweek.

A second criticism is the book’s short length. I wish it were longer! This makes it an ambivalent criticism, because it shows that the author managed to light a fire with readers like me. I want more examples, more diagrams, more how-to’s. I want a book on essentialism in daily life. I want a book on essentialism in relationships. I want a book on essentialism in higher education and job training.

I hope this kind of guidance is forthcoming. It’s now 2019, so five years have passed since the book’s publication. To continue honing essentialist skills, one may have to look elsewhere. A few resources I have found in this strain of thought are Cal Newport’s book Deep Work, Tim Ferriss’s podcast (and his somewhat dated book The 4-Hour Workweek) and Marie Kondo’s rapidly expanding corpus of decluttering guidance.

Another book echoing this philosophy (in my opinion) is Your Money or Your Life by Vicki Robin and Joe Dominguez. Why? Because it addresses the concept of enough. In my opinion, enough is tied conceptually to essentialism. I would recommend this personal finance gem to anyone who felt a spark from Essentialism. I am always happy when I come across a book that helps tie together a few different authors in the same way of thinking.

At this point I would like to highlight two connections between essentialism and Quality.

First, Quality professionals will instantly recognize the Pareto principle. This principle is in the quality curriculum and it’s worth revisiting again and again. As much as I like essentialism, it may be no more than a sexy rebranding of the Pareto principle. By adhering to the view that a small number of causes are truly important, and disregarding or de-prioritizing the rest, one can make a lot of progress. This would conform to my view that all the winning principles of Quality are known; they just need to be implemented and maintained.

Second, Quality professionals need to be constantly vigilant about straying from their core role of checking. The purpose of a Quality group within a company boils down to checking the work of the operations/production part of the company. Too often, due to misunderstandings, mission creep, and the good nature of individuals, Quality professionals take on projects that belong to operations. Sure, they might do a good job. But the question is, who will check their work? The better arrangement is the traditional one: Quality, in the course of their work, identifies a gap. They recommend (or require) operations to fix it. Quality checks the fix and reports back, continuing the cycle.

Thus it is the checking that is essential. Quality professionals should keep that in mind. Reading Essentialism will help them maintain vigilance at the dividing line between the doing and the checking.

Quality professionals should read this book. I would also recommend it to technical people who feel they spend too much time on meetings, email and other ancillary projects (isn’t that everyone?).

The image is from Columbia Park in Portland, Oregon today (20 Jan 2019). The quote from Lao-tzu is from one of the chapter headings in Essentialism.

Lastly for a taste of the language of Essentialism and a few nuggets, see my Kindle highlights below.

“I have a vision of everyone – children, students, mothers, fathers, employees, managers, executives, world leaders – learning to better tap into more of their intelligence, capability, resourcefulness, and initiative to live more meaningful lives.”

“Essentialists see trade-offs as an inherent part of life, not as an inherently negative part of life. Instead of asking, “What do I have to give up?” they ask, “What do I want to go big on?” The cumulative impact of this small change in thinking can be profound.”

“Essentialists spend as much time as possible exploring, listening, debating, questioning, and thinking. But their exploration is not an end in itself. The purpose of the exploration is to discern the vital few from the trivial many.”

“Richard S. Westfall has written: “In the age of his celebrity, Newton was asked how he had discovered the law of universal gravitation. ‘By thinking on it continually’ was the reply.… What he thought on, he thought on continually, which is to say exclusively, or nearly exclusively.”3 In other words, Newton created space for intense concentration, and this uninterrupted space enabled him to explore the essential elements of the universe.”

“You can think of this as the 90 Percent Rule, and it’s one you can apply to just about every decision or dilemma. As you evaluate an option, think about the single most important criterion for that decision, and then simply give the option a score between 0 and 100. If you rate it any lower than 90 percent, then automatically change the rating to 0 and simply reject it. This way you avoid getting caught up in indecision, or worse, getting stuck with the 60s or 70s. Think about how you’d feel if you scored a 65 on some test. Why would you deliberately choose to feel that way about an important choice in your life?”

“In the first pattern, the team becomes overly focused on winning the attention of the manager. The problem is, when people don’t know what the end game is, they are unclear about how to win, and as a result they make up their own game and their own rules as they vie for the manager’s favor. Instead of focusing their time and energies on making a high level of contribution, they put all their effort into games like attempting to look better than their peers, demonstrating their self-importance, and echoing their manager’s every idea or sentiment. These kinds of activities are not only nonessential but damaging and counterproductive.”

“As Ralph Waldo Emerson said, “The crime which bankrupts men and states is that of job-work;—declining from your main design to serve a turn here or there.””

“Nonessentialists say yes because of feelings of social awkwardness and pressure. They say yes automatically, without thinking, often in pursuit of the rush one gets from having pleased someone. But Essentialists know that after the rush comes the pang of regret. They know they will soon feel bullied and resentful—both at the other person and at themselves. Eventually they will wake up to the unpleasant reality that something more important must now be sacrificed to accommodate this new commitment.”

“Use the words “You are welcome to X. I am willing to Y.” For example, “You are welcome to borrow my car. I am willing to make sure the keys are here for you.” By this you are also saying, “I won’t be able to drive you.” You are saying what you will not do, but you are couching it in terms of what you are willing to do. This is a particularly good way to navigate a request you would like to support somewhat but cannot throw your full weight behind.”

“In a reverse pilot you test whether removing an initiative or activity will have any negative consequences.”

“As Alan D. Williams observed in the essay “What Is an Editor?” there are “two basic questions the editor should be addressing to the author: Are you saying what you want to say? and, Are you saying it as clearly and concisely as possible?”7 Condensing means saying it as clearly and concisely as possible.”

“This may seem a little counterintuitive. But the best editors don’t feel the need to change everything. They know that sometimes having the discipline to leave certain things exactly as they are is the best use of their editorial judgment. It is just one more way in which being an editor is an invisible craft. The best surgeon is not the one who makes the most incisions; similarly, the best editors can sometimes be the least intrusive, the most restrained.”

“The way of the Essentialist is different. Instead of trying to accomplish it all—and all at once—and flaring out, the Essentialist starts small and celebrates progress. Instead of going for the big, flashy wins that don’t really matter, the Essentialist pursues small and simple wins in areas that are essential.”

“Research has shown that of all forms of human motivation the most effective one is progress. Why? Because a small, concrete win creates momentum and affirms our faith in our further success.”

“On the basis of these hundreds of thousands of reflections, Amabile and Kramer concluded that “everyday progress—even a small win” can make all the difference in how people feel and perform. “Of all the things that can boost emotions, motivation, and perceptions during a workday, the single most important is making progress in meaningful work,” they said.”

“My experience has taught me this about how people and organizations improve: the best place to look is for small changes we could make in the things we do often. There is power in steadiness and repetition.”

“Similarly, we can adopt a method of “minimal viable progress.” We can ask ourselves, “What is the smallest amount of progress that will be useful and valuable to the essential task we are trying to get done?” I used this practice in writing this book. For example, when I was still in the exploratory mode of the book, before I’d even begun to put pen to paper (or fingers to keyboard), I would share a short idea (my minimal viable product) on Twitter. If it seemed to resonate with people there, I would write a blog piece on Harvard Business Review. Through this iterative process, which required very little effort, I was able to find where there seemed to be a connection between what I was thinking and what seemed to have the highest relevancy in other people’s lives.”

“The way of the Nonessentialist is to think the essentials only get done when they are forced. That execution is a matter of raw effort alone. You labor to make it happen. You push through. The way of the Essentialist is different. The Essentialist designs a routine that makes achieving what you have identified as essential the default position. Yes, in some instances an Essentialist still has to work hard, but with the right routine in place each effort yields exponentially greater results.”

“And routines can indeed become this—the wrong routines. But the right routines can actually enhance innovation and creativity by giving us the equivalent of an energy rebate. Instead of spending our limited supply of discipline on making the same decisions again and again, embedding our decisions into our routine allows us to channel that discipline toward some other essential activity.”

“In an interview about his book The Power of Habit Charles Duhigg said “in the last 15 years, as we’ve learned how habits work and how they can be changed, scientists have explained that every habit is made up of a cue, a routine, and a reward.”

“The question is, “Which is your major and which is your minor?” Most of us have a little Essentialist and a little Nonessentialist in us, but the question is, Which are you at the core?”

“I still fight the urge to impulsively check my phone; on my worst days I have wondered if my tombstone will read, “He checked e-mail.” I’ll be the first to admit, the transition doesn’t happen overnight.”

“This story captures the two most personal learnings that have come to me on the long journey of writing this book. The first is the exquisitely important role of my family in my life. At the very, very end, everything else will fade into insignificance by comparison. The second is the pathetically tiny amount of time we have left of our lives. For me this is not a depressing thought but a thrilling one. It removes fear of choosing the wrong thing. It infuses courage into my bones. It challenges me to be even more unreasonably selective about how to use this precious—and precious is perhaps too insipid of a word—time. I know of someone who visits cemeteries around the world when he travels. I thought this was odd at first, but now I realize that this habit keeps his own mortality front and center.”

“If you take one thing away from this book, I hope you will remember this: whatever decision or challenge or crossroads you face in your life, simply ask yourself, “What is essential?” Eliminate everything else.”

Links

Essentialism (Amazon)

Your Money or Your Life (Amazon)

The Life-Changing Magic of Tidying Up (Amazon)

Tim Ferriss Podcast episode with Essentialism quotes and reaction

How to Say “No” Gracefully and Uncommit (#328)

The 4-Hour Workweek (Amazon)

Deep Work (Amazon)

A couple of YouTube summaries of Essentialism:

Software Requirements Specification: How to contribute as a Quality representative

Every employee wants their teammates to value their contributions. This is a universal desire. In this article I will describe how to contribute effectively to a specific kind of project that I know: the software requirements specification.

Specifically I will describe how to help craft this document as a Quality representative on the project team. The Internet abounds with resources on everything software-related. So in this article I hope to convey my own unique experience and voice on this topic. If you have ever been the Quality representative on any kind of software team, I invite you to read on.

What is a software requirements specification?

The software requirements specification (SRS) is an important document in software development. It is crafted near the beginning of a software development project. Its structure includes:

  • A complete description of the software’s function
  • A definition, in advance, of how users will use the software
  • All quality and engineering requirements and applicable standards
  • Other supporting sections

Here is a good definition from Wikipedia: “Software requirements specification is a rigorous assessment of requirements before the more specific system design stages, and its goal is to reduce later redesign.”

Its purpose is to:

  • Prevent failure
  • Reduce redesign
  • Meet and exceed requirements

The SRS is important because it helps prevent project failure. Simply put, it is useful. A well-crafted one will save time and money during development and serve as the basis for verification and validation later on.

Behind the SRS is a complex process of requirements elicitation, made possible through effective communication and documentation. The person who writes it is generally a business analyst. This individual is not a developer, not a user and not a manager, but someone trained in the software lifecycle and in technical writing.

Where Quality fits into this effort

The SRS is fundamentally a Quality process, even though it is not driven by the Quality unit of the company.

Note the terms requirements elicitation, verification, validation, redesign, standards, etc. The SRS is a quality planning document, with the intent of building quality into the software instead of reworking it toward the end of the project. It is a tool for establishing explicit, agreed-upon standards by which to judge the outcome (which parallels the Quality unit’s role of checking). It prevents redesign, which is analogous to scrap but is probably even more labor-intensive to address.

In addition, the Quality unit may be an internal customer that will use the software. As a representative on the team you thus help to provide user stories and convey what is important to you in using the planned software to do your job.

All engineering involves trade-offs. The information you provide, once it is explicitly documented in the SRS, helps the software engineers to prioritize wants and needs and incorporate them into the final product.

My own experience

I had a great opportunity to be on such a team. The goal was very worthy: to overhaul our clients’ ordering process.

Leaders within my company had heard client feedback about our ordering process and not all of it was positive. In addition they had their own experience to draw on: they compared our clunky, web-based form with the frictionless experience of ordering on Amazon.com or other e-commerce apps, and they realized we could do better. If we redesigned our ordering system to be like Amazon’s, our clients would avoid the occasional frustrations, would be more likely to come back, and would be more likely to order the additional testing and manufacturing services that we recommend.

I joined this team after the initial project charter was created but before the SRS was created. The charter reflected these worthy goals and it had inspiring language created by the project manager. The charter also conveyed the full support of executives (including the “project champion” from the executive leadership).

Once my name was added to the charter and my responsibilities listed, I felt eager to contribute toward making the project a success. Helping with the SRS is one of the ways I did so.

Key components of an SRS and my own experience

In general an SRS will contain the following:

1. Purpose

2. Scope

3. System overview

4. References

5. Definitions

6. Use cases (user stories)

7. Functional requirements

8. Non-functional requirements

I won’t go into detail because you are quite capable of Googling this stuff. But I would like to highlight the Quality unit’s role in the SRS’s purpose, use cases (user stories), and functional and non-functional requirements sections.

The Purpose section of the SRS is worth your attention. You should scrutinize it (and the project charter) to understand the goals of the software and how you will use it.

In my case (the client ordering system I mentioned above), I would use the client information entered in this new ordering system during my reviews and audits. Therefore it was very important that the project team incorporated my Quality input on regulations regarding handling and reporting of client-provided information.

For instance, the proposed system needed to preserve client-provided sample names, lot numbers, surface areas, extraction ratios, preparation instructions, etc. with exact fidelity. In the event of a change to this client-provided information, Quality would need to review each change. Each change also needed to be attributable and time-stamped in order for the Quality unit to meet its Food and Drug Administration-mandated objective of reviewing these changes.

The Use cases (user stories) section is where Quality (and other future users) convey to the team how they will interact with the software.

In my project, the most important user of the client ordering system is the client. The project team I worked with had extensive communication with several of our clients who had expressed willingness to “pilot” the new system once it was online. These were loyal clients who had mastered the old system and all its kinks, but were eager to use a new and improved one.

In addition to the client, Quality would interface with this system daily and hourly when reviewing reports, inspecting manufacturing processes, and inspecting in-process scientific studies. We would also pull up this client order record during audits, during failure investigations, and before approving reports and certificates. As Quality representative, I and another auditor collaborated and described to the team how we would use the new software. With the business analyst’s help, these user stories were formalized and refined and they became part of the SRS.

Lastly, the functional and non-functional requirements sections help project stakeholders understand and agree on what the software will do and how it will work. In software engineering there is a distinction between the two kinds of requirements. You don’t need to understand this distinction or separate your input into the two categories. More important as a Quality representative is to communicate on an ongoing basis with the team to convey your must-haves and your wants. You will commit to being a good-faith negotiator on needs versus wants. You will make a full effort to understand, and then you will be flexible on your wants but clear on your needs.

Here are some examples. In my project to create a new client ordering system, the following requirements surfaced:

  • The ordering system must communicate with full fidelity/exactitude with our Laboratory Information Management System (which is used in data capture and reporting).
  • The system must have full audit trails, allowing investigations to capture the who, what and when of every change to client-supplied information.
  • The system should flag any changes to client-supplied information made by company technicians and scientists after the sample submission with strikethrough text or an asterisk, to allow the Quality unit to review these changes.
  • The system should allow Quality to pull up a few useful reports, including a report for inspections of ongoing studies, and a report to pull up during client visits.
  • The system should be clear about where the definitive record is, so that there is no disagreement or confusion for a reviewer contemporaneously or months or years down the line.

The examples above are both functional requirements and non-functional requirements (in software engineering terms). They are also divided into wants and needs. In this project I was explicit with the team about the audit trail need because regulations require that the Quality unit review changes to critical study data. We needed to ensure that a sample submitted under one name would not be reported under another name, without the proper authorization and review.
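As a sketch of what that audit-trail need might look like once implemented (the class and field names here are hypothetical, not those of any actual system), consider a minimal version in Python:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ChangeRecord:
    """One immutable audit-trail entry: who changed what, and when."""
    user: str            # who made the change
    field_name: str      # what was changed
    old_value: str
    new_value: str
    timestamp: datetime  # when, in UTC

class SampleRecord:
    """Client-supplied sample data with an append-only change log."""

    def __init__(self, sample_name: str, lot_number: str):
        self._data = {"sample_name": sample_name, "lot_number": lot_number}
        self.audit_trail: list[ChangeRecord] = []

    def update(self, user: str, field_name: str, new_value: str) -> None:
        old = self._data[field_name]
        self._data[field_name] = new_value
        # Every change is attributable and time-stamped so that
        # Quality can review the who, what and when later.
        self.audit_trail.append(ChangeRecord(
            user=user,
            field_name=field_name,
            old_value=old,
            new_value=new_value,
            timestamp=datetime.now(timezone.utc),
        ))
```

The point of the sketch is the shape of the requirement: changes never silently overwrite history; each one lands in an append-only log that a reviewer can query during an investigation.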

When it came to “flagging” these changes, I considered such a feature a want because such system flags would help us zoom in more efficiently on changes, deletions and insertions that needed Quality review. In discussions with the team my Quality partner and I made it clear that this and the reporting features were a couple of “nice-to-haves” so that the team could prioritize while still understanding our desires.

The last bullet point is borderline between wants and needs. On one hand, computerized systems that talk to each other are bound to have redundant data. Written procedures can compensate for this by pointing to the definitive record. On the other hand, any hint of contradiction in two computerized systems could be problematic if regulators examine the record years later and cannot reconstruct what was actually done in the lab or manufacturing area.

There is one last SRS component I would like to highlight: diagrams and pictures.

Busy people (doesn’t that include all of us?) and adult learners in general will scan the document for images. If you can create a Microsoft Visio process chart that illustrates clearly and simply how Quality will use the software, then this image may feature prominently in the SRS, and it will help the team and others in your company to understand your role and build software that will fully equip and enable that function.

Microsoft Visio is a fantastic tool that warrants an article of its own. (Alexa: remind me in 3 months to write it!). If you use Visio to map out the Quality interface with the planned software, review this diagram with the technical writer and your unit, make it pretty and then finalize it in a way people can agree on, you will be a hero. And your work will feature prominently in the SRS.

How to contribute as Quality representative

Help the software team out at this crucial stage by being available, by reading the project updates thoroughly, by relaying needed info back to the Quality team (while cc’ing the project manager and observing any other project confidentiality ground rules), and by bringing questions you can’t answer to Quality management.

Below is a list of further tips:

  • As a Quality representative you can help the writer of the SRS to fully describe:
    • System Interfaces
    • User Interfaces
    • Hardware Interfaces
    • Software Interfaces
    • Communication Interfaces
    • User characteristics
    • Reliability
    • Availability

For instance, you will learn how the proposed software interfaces with your already-existing systems. You will convey to the team how often and where you will log in (at your desk, in the labs, on the manufacturing floor, on a portable tablet, etc.). You will be able to request reviewer-only access, or superuser or Quality administrator privileges for your group.

You may convey to the writer of the SRS how the typical Quality employee goes about their job. Are they all tech-savvy? Will they know how to use a tablet or stylus? Are they accustomed to pulling up the proposed reports and queries?

You may need to convey how and where Quality auditors and specialists do their jobs. If the software needs to be accessed while in labs or on the manufacturing floor, it might require a tablet. Will the WiFi connection be fast enough? Will the tablet be durable? Will it be wheeled about on a cart, or carried by hand? Will it be navigable and reliable if a client wants to see the system? Or does it require an employee to print out a manual copy of a report for the client to review?

  • To continue in this vein, the Quality representative may contribute quality attribute general scenarios:
    • A modifiability general scenario
    • A performance general scenario
    • An availability general scenario
    • A security general scenario
    • A usability general scenario
    • A reliability general scenario

Don’t be intimidated by the term “quality attribute general scenarios.” You are not responsible for putting Quality requirements into engineering terms that will be used by developers. Rather what this means is that the Quality representatives will convey system-independent requirements, and the writer of the SRS will draft them into system requirements.

As an example, Quality may require that auditors have read-only access to the system, so that they fulfill their role of checking but not altering or “fixing” things themselves, which would muddy the waters of measurement and accountability. This is a system-independent requirement. The writer of the SRS will translate this requirement into technical terms, and it will thereby become an architectural pattern and a set of measurable, specific quality attribute requirements. See the Carnegie Mellon University link below for an interesting discussion of these terms.
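To show how a system-independent requirement like “auditors are read-only” might eventually become something testable, here is a minimal sketch; the role names and permission sets are hypothetical, and the real mapping would come out of the SRS negotiation, not from Quality alone:

```python
# Hypothetical role-to-permission mapping for illustration only.
ROLE_PERMISSIONS = {
    "auditor": {"read"},  # checking only, never altering
    "analyst": {"read", "write"},
    "quality_admin": {"read", "write", "grant_access"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role may perform the given action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# Auditors can look but not touch: they check, they don't "fix."
assert is_allowed("auditor", "read")
assert not is_allowed("auditor", "write")
```

Expressed this way, the requirement stops being a wish and becomes something the developers can enforce and the testers can verify.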

Five final tips

Consult with your team for additional perspectives. Explain the goal of the project, and then provide specifics. Give an open invitation to join for those who want a greater voice in a software project that will affect them.

Escalate to Quality management if you encounter a question that you don’t know the answer to. Consult Regulatory for requirements you may not be aware of that may need to be reflected in software requirements. Ask a manager to help you distinguish between wants and needs. Ask a manager to join you in project meetings if you need to press a particular point more firmly.

Read the project charter in the beginning and re-read it occasionally or when it is amended. Share it with the Quality unit if it will help them understand the goals and benefits of the project.

There are development methodologies where an SRS is not central. Consider the following forum response (I edited for clarity) from https://cofounderslab.com/discuss/are-spec-documents-still-necessary-for-software-development. The author is referencing the fact that you can’t know everything in advance, no matter how much you plan:

“Recently developers have articulated why Big Up-Front Design (BUFD) is the hallmark of failure. BUFD is typically an exercise in make-believe, where you pretend you fully understand the problem and how precise code will solve it before you even begin development. The design document is treated as a contract, and all kinds of pain ensues. Been there, done that.”

This means you should be open to other, novel means of capturing requirements. You should be as agile as the rest of the team!

Lastly, various companies will try to sell you software to create SRS’s. Some of these services may work great. However a vendor cannot do the hard work of talking to people, following through, clarifying, etc. A senior product manager at Cisco said, “Understanding what is actually required is both a science and an art.” (See the smartsheet.com reference below). A piece of software on its own cannot achieve one of the main objectives of the SRS, which is to foster consensus. Do a CTRL+F on my article and note how many times I used the word “craft.” A skilled communicator often drives the SRS process and makes it so valuable to the software project. You can make this person’s job easier through your contributions.

Research further on your own with these resources

SRS templates:

Examples of actual or mock SRS’s:

YouTube videos:

  • Some YouTube videos on SRS’s are of poor quality or are simply incorrect. Some are from third-party companies that want to convince you that their software/outsourcing operation is the only way to do it correctly. I won’t provide a list of videos because the assortment is constantly evolving. Since you are a Quality professional, I trust you to use your well-practiced discernment!

Articles and papers:

Final thought

I see a bright future for individuals in Quality who are able to interface effectively with software teams. If you can communicate with business analysts and developers and bring that technology back to your Quality unit, you will benefit both groups and you will grow in credibility and prestige.

I hope you have felt a spark from reading this article. I enjoyed writing it. If you have a question or comment for me, let me know below!

p.s. The photo is from Saint Johns Bridge in Portland, Oregon. I took it on a fine sunny day in Cathedral Park.

How to use Microsoft Access for your company’s Quality needs

This year I had a great opportunity to learn Microsoft Access.

MS Access is a database application that comes with Microsoft Office (Word, PowerPoint, Excel, et cetera). To my surprise, I found it was packaged with my home edition of MS Office. If you are paying for a home subscription of MS Office, you already have it for as long as you keep paying the monthly fee ($8).

In this article I will show you the basics of what I learned. I will describe how you can use this software to solve problems in your Quality unit or in the larger organization. I will show you how you can gain the highly advantageous moniker of “the software person” on your quality team within a short timeframe. And I will highlight some great opportunities to pursue once you get a handle on this powerful database tool.

What is MS Access?

You can easily read the Wikipedia page, visit the Microsoft tutorial site, or watch a couple of Youtube videos on what Access does, so I won’t restate all that information here. In a nutshell, an Access database helps you go beyond Excel spreadsheets in robustness and functionality.

Think of the biggest, oldest, most bloated spreadsheet you use at work. Perhaps it is a list of approved suppliers. Perhaps it is a team tracking spreadsheet. Perhaps it is a spreadsheet for tracking defects and corrective actions in outgoing products.

Now think of the problems involved: individuals are often locked out as another person edits the spreadsheet. Multiple versions are floating around, and there is confusion over which one is definitive. They constantly need to be reconciled. The spreadsheet is old, so it is opened in compatibility mode and some macros have broken down, resulting in extra clicks with every opening and closing of the file. The file is large and slow. Individuals can easily delete, replace, or fat-finger their entries without much traceability. When it comes to analyzing the data, you have to make extra sheets and you question whether you are drawing from the correct source cells. Redundant and ambiguous values proliferate.

A database helps you get around these problems. For instance, instead of navigating to the correct sheet and entering data in an individual row, the user can fill out an attractive, easy-to-use form. Instead of relying on finicky charts, you can export a standardized report with parameters you have specified. Finally, there is the extremely important advantage of unambiguous (unique) values. A properly designed Access database, unlike a spreadsheet, will not give you ambiguous answers when you ask, “What is client X’s current email address?”

The four main components of Access

Access has four pillars: tables, forms, queries and reports.

Tables

An Access table is basically a spreadsheet. In fact, you can enter data directly in each table just like you would with an Excel spreadsheet. Early on, however, you will assign a primary key to each table that ensures unique records. Access tables can also be linked in very versatile and customizable ways that allow you to answer tailored questions and perform trending and other analysis.
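Access builds tables through its visual designer rather than with code, but the underlying idea of a primary key — the database itself refusing to accept duplicate records — is the same in any relational database. Here is a minimal sketch using Python’s built-in sqlite3 module; the table and field names are hypothetical:

```python
import sqlite3

# In-memory database for demonstration; Access stores its data in an .accdb file.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# SupplierID is the primary key: uniqueness is enforced by the database,
# not by the discipline of whoever happens to be typing.
cur.execute("""
    CREATE TABLE Suppliers (
        SupplierID INTEGER PRIMARY KEY,
        Name       TEXT NOT NULL,
        Status     TEXT NOT NULL
    )
""")
cur.execute("INSERT INTO Suppliers VALUES (1, 'Acme Labs', 'Approved')")

# A second record with the same key is rejected, not silently duplicated
# the way a pasted row in a spreadsheet would be.
try:
    cur.execute("INSERT INTO Suppliers VALUES (1, 'Acme Labs', 'Pending')")
except sqlite3.IntegrityError as e:
    print("Rejected duplicate:", e)
```

This is exactly why a properly designed database gives one unambiguous answer to “What is this supplier’s current status?” — there can only be one record per key.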

Forms

Access forms are exactly that: an organized way for the user to enter data. Instead of typing into a row on a spreadsheet, the user can follow the prompts in a pretty, organized form. The data goes precisely to the table or tables specified by the designer. Invalid data entries can be rejected.

Queries

A database can grow to huge proportions. A query allows a user to ask a simple question, filter data down to an answer, combine data from multiple tables, or add, alter, or delete records.
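In Access you build queries with a drag-and-drop designer, but under the hood a query is a question expressed against one or more tables. A sketch of the “combine data from multiple tables” case, again with sqlite3 and hypothetical table names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE Suppliers (SupplierID INTEGER PRIMARY KEY, Name TEXT);
    CREATE TABLE Audits (AuditID INTEGER PRIMARY KEY,
                         SupplierID INTEGER, Result TEXT);
    INSERT INTO Suppliers VALUES (1, 'Acme Labs'), (2, 'Beta Corp');
    INSERT INTO Audits VALUES (10, 1, 'Pass'), (11, 2, 'Fail');
""")

# One query joins both tables and filters to answer a tailored question:
# "Which suppliers failed an audit?"
rows = cur.execute("""
    SELECT s.Name, a.Result
    FROM Suppliers AS s
    JOIN Audits AS a ON a.SupplierID = s.SupplierID
    WHERE a.Result = 'Fail'
""").fetchall()
print(rows)  # → [('Beta Corp', 'Fail')]
```

The same join that takes careful VLOOKUP gymnastics in a spreadsheet is a one-step, repeatable question in a database.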

Reports

Reports are fun. They allow you to analyze and visualize vast amounts of database data based on your specific parameters. You can customize the report’s header and graphics.

Assistance for the learner

All of the above components are accompanied by a “wizard.” This is Microsoft’s module for helping you efficiently select the correct option when constructing your database.

In addition, you will not be writing code unless you want to. If you have a giant database that needs to do something highly specialized, you can learn macros and Visual Basic to solve the problem. But you don’t need to learn to write code, or google snippets of code, to get the job done. Instead, Access is populated with tools, drop-down lists, drag-and-drop functionality, and modules to help. Access also provides “views” such as Design View, Layout View, Datasheet View, Form View, et cetera that display essentially the same information, but in a different way. This allows you to get the job done in a very visual manner or in a spreadsheet-like way. And you can switch back and forth between views at any time to understand what you are looking at.

Three uses for Access in a Quality unit

Do you have an Approved Suppliers List? This important, often regulated document is sometimes nothing more than an Excel spreadsheet. Auditors may ask to see it and question whether it is locked down with the correct editing permissions. They may ask if it is the definitive record, or just a notation for the paper copies, which are definitive. An Access database can help you meet and surpass these expectations. Of course, you will have to validate the use of the database. But in the end it will be much less slippery than a spreadsheet (and less pricey than paying for supplier management software!)

Another use is for tracking your team’s work. Suppose you are trying to standardize the various tasks the typical Quality unit does. This often begins with timing studies. For a defined period, each Quality person can use the database to enter their daily tasks, along with the time it took to complete them. The form they use can even include a box for entering unexpected problems (Pareto events) that led to extra time beyond the standard/expected/takt time. When analyzing this timing data, management can pull up customized reports that zero in on outlier tasks and the corresponding problems. This allows management to focus on the 20% of the issues that are causing 80% of the problems.
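Behind an Access report like this sits a simple aggregation: group the timing entries by task, total the minutes, and sort so the biggest time sinks surface first. A minimal sketch with sqlite3 (the task names and numbers are invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE TaskLog (EntryID INTEGER PRIMARY KEY,
                          Task TEXT, Minutes REAL);
    INSERT INTO TaskLog VALUES
        (1, 'Batch record review', 95),
        (2, 'Batch record review', 110),
        (3, 'Label check', 10),
        (4, 'Label check', 12);
""")

# Total time per task, largest first: the Pareto view management
# needs in order to focus on the tasks dominating the workload.
report = cur.execute("""
    SELECT Task, SUM(Minutes) AS Total
    FROM TaskLog
    GROUP BY Task
    ORDER BY Total DESC
""").fetchall()
print(report)  # → [('Batch record review', 205.0), ('Label check', 22.0)]
```

In Access you would express the same grouping through the report wizard and refresh it on demand, instead of rebuilding pivot tables by hand.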

Finally, if you are a preclinical testing laboratory, the GLP regulations require that QA maintain a Master Schedule of all studies. An Access database, validated of course, is a great place for this. QA can edit while still making it available on a view-only basis to anyone in the organization who needs to see it.

An example from my own life: a personal finance tool

Early on while learning Access I put one of my personal finance spreadsheets (my net worth tracker) into Access. I had been relying on a Google Drive spreadsheet for this task. However, as some accounts are opened and some are closed, it became more and more unwieldy. The auto-updating charts and tables I had created were becoming inaccurate and drawing from the wrong data sources.

So, I exported some of this data into Excel, and then into Access. After tweaking some of the reporting functions, I found this to be a more robust way of tracking my financial goals. I also created a better form for entering my biweekly update on payday. Currently my financial info is small enough to manage with spreadsheets. But over the course of decades, a database may be the way to go. If I ever start a business, I will be well-positioned to track this kind of info because of what I have learned to do with Access.

Growth and value to your company

I would encourage you to learn Access and teach it to others in your group. Don’t just preach, though: actually solve a problem. If you identify one or more people struggling with a bloated spreadsheet or with aging tracking and trending reports that are becoming less relevant, create a user-friendly database for them with a beautiful splash page. Show them the possibilities and then once they get the hang of it, offer to go all the way and import the source data with the support of management.

You will benefit immensely from gaining a reputation as “the software person” in Quality. People who can solve software issues in their group without expensive in-house IT and vendor involvement are greatly valued.

Once you have gained proficiency in Access, consider moving on to more advanced database software such as Microsoft SQL Server. With SQL Server, also from Microsoft, you can create advanced databases that integrate with web-based apps facing customers or internal groups. The reporting functionality is correspondingly advanced.

Where you can go to learn the software

Start with the training at Microsoft Support. Watch Youtube videos for every level of expertise.

Visit Lynda.com for an excellent, comprehensive guide to getting proficient. This guide is by an Access expert named Adam Wilbert. The course is 5 hours total and has been viewed 330,000 times. It is an excellent resource. My only complaint is that the guy talks very fast. I got around this by playing the tutorial at 0.75x speed! The course is filed under Business > Databases > Access 2016. I am fortunate to have free access to this learning suite through my metropolitan library. Check whether you have such a perk. If not, the monthly fee may still be worth it. Consider it a career expense.

Lastly, take an in-person course. It’s hard to learn from a screen alone. Although community education systems (through public schools or libraries) may offer Excel classes, they are unlikely to offer Access classes. So, look to community colleges and universities. My manager and I took an 8-hour, one-day course at Normandale Community College in Bloomington, Minnesota. Since we had acquainted ourselves already with the basics, we skipped to the intermediate level course. It was the right choice! I would recommend this to save some money. Again, I am fortunate that my employer covered this excellent career development opportunity.

My challenge to you

I would like to challenge you to a short-term goal of learning the basics of Microsoft Access. Create several tables with data relating to something you are into, such as sports, personal finance info, or something from work. Create a form, a query, and a report. Identify a problem at work that you might be able to help with. Lastly, let me know how it goes. I would love to hear from someone who has used Access to make the job easier for themselves and their colleagues. I would also like to hear from those who progressed successfully from Access to SQL or another web-integrated database software.

This FDA warning letter is appalling

Thank goodness for diligent regulators. Read on, and be horrified.

The US Food and Drug Administration (FDA) recently inspected a testing facility outside Los Angeles and found serious concerns about the quality and integrity of the data produced there. In its warning letter, which is publicly posted online, the agency’s inspectors outlined the objectionable conditions they found.

Anyone interested in the role of this powerful agency in protecting the American public health should read this letter. Prepare for a glimpse into the “wild west” of preclinical research. Prepare to be appalled!

Background

Drugs and medical devices undergo extensive testing before touching human beings in clinical trials. These studies are called non-clinical (or preclinical) research. In the United States these studies are regulated under the Good Laboratory Practice regulations. As part of its Bioresearch Monitoring Program, the FDA inspects testing facilities to ensure the data generated is scientifically accurate and valid.

These inspections can result in a range of actions. For a well-run facility, management actually benefits from such an inspection. They get feedback on weak areas and they better understand the agency’s thinking on certain scientific or organizational issues they face.

For a poorly run facility, the inspection could result in a warning letter at best or an immediate shutdown of the facility, followed by legal prosecution, at worst.

This letter indicates the facility is poorly run. Appallingly so. As such, all the scientific data it has generated on the investigational drug is in question.

Some choice passages from this warning letter

There were no records indicating the presence of a functional QAU, or records of any QA activities at the facility. In addition, you indicated during the FDA inspection that you were the study director as well as the head of the QAU. However, this practice undermines the QAU from performing its required functions separate from and independent of the personnel engaged in the conduct of nonclinical laboratory studies. You also indicated that a veterinarian for the study from (b)(4), was a member of the QAU; however, that veterinarian signed a statement denying that he was a QAU member.

It’s going to be hard to walk that one back!

Considering your test facility management’s overall lack of responsibility to implement basic essential elements of GLP compliance, the quality and integrity of the study data cannot be assured because there was no QAU oversight of the nonclinical laboratory studies conducted at your testing facility.

This is pretty bad. The QAU must monitor the conduct and reporting of the organization’s studies. When it comes to life-impacting drugs and devices, the integrity and acumen of the researcher alone is not enough.

Your testing facility failed to establish standard operating procedures (SOPs) in writing setting forth nonclinical laboratory study methods that are adequate to insure [sic] the quality and integrity of the data generated in the course of the studies, as well as to ensure appropriate handling and care of animals.

During the inspection, you indicated that as a surgeon, you knew all operation procedures, and therefore you did not agree that SOPs needed to be established.

Written SOPs, with appropriate approvals and control, are a bedrock of quality testing. It’s hard to overstate this.

For a reader who does not have a scientific background, let me make an analogy: Have you ever been in a court of law? What if you were on trial for a capital crime, and your lawyer showed up with no computer and no documents. You ask him, “What’s up, did you not prepare at all?” In response, he taps his temple and grins, saying, “Don’t worry, it’s all up here!”

I am impressed with the incredible capabilities of the scientists that I work with every day. But they and I know the importance of independent checks. The surgeon the FDA interviewed is saying his authority and expertise puts him above independent checks. He is saying we should trust his data because of his skill and expertise. Sorry, but this is not a medieval guild. Mastery is not a validated method, and modern drugs and devices require independent examination and comparison with a standard.

As the study director, you must ensure that all experimental data, including observations of unanticipated responses of the test system, are accurately recorded and verified.

You failed to ensure that all experimental data were accurately recorded and verified.

For the toxicity study in dogs, dose calculation worksheets for nine dogs show body weight measurements and test article administration in June and July 2016; however, purchase records indicate that eight of the nine dogs were purchased in August 2016, which makes the data entered before August 2016 invalid. In fact, during the FDA inspection, you acknowledged that the data entered before the dosing in August was inaccurate and did not reflect real-time data entry.

For all data changes, you failed to ensure: (1) that the original entries were not obscured; (2) that the reason for all changes was indicated; and (3) that all changes were dated and signed or identified at the time of the changes.

The study director’s role in a preclinical study is extremely important. He or she serves as the single point of control for the study. The passages above speak for themselves. They may even point to falsified data, though this is not explicitly suggested in the warning letter.

Basic data integrity principles and methods are taught to undergraduate science students within the first week of their laboratory courses. These include correcting handwritten entries without obscuring the original entry, signing and dating, documenting concurrently, et cetera. The warning letter shows these principles were not followed at this facility.

Because you failed to ensure that all experimental data were accurately recorded and verified, and that all entries and changes in entries were properly documented, FDA has concerns about the integrity of the data generated from the nonclinical toxicity studies conducted at your testing facility.

The investigators use professional-sounding understatement.

You failed to comply with the above requirements to retain and to provide archives for storage and retrieval of all raw data and specimens.

You were unable to provide all data requested during the inspection; you indicated that you were unsure where the data was, and that it might be at your home.

The absence of raw data and specimens collected during these nonclinical toxicity studies, and the absence of proper storage of raw data and specimens, raises significant concerns about the integrity of the study records and data.

Again, understatement. Archival procedures, controlled storage conditions, data completeness, and traceability are a bedrock of quality. These procedures would be in the SOPs, but as noted above, the facility’s SOPs were inadequate or absent.

Testing to determine the stability of the test and control articles in the mixture must be conducted, either before the study initiation or concomitantly, according to written SOPs that provide for periodic analysis of the test and control articles in the mixture. However, none of these analyses were done.

Does this sound a bit esoteric to you? It’s not. Before testing a compound in preclinical toxicity trials, the compound must be well-characterized. This ensures that the compound that eventually is dosed in humans matches the compound that was given to animals or to in vitro test systems. Without this characterization, you are putting humans in those safety trials at risk, or generating poor-quality data.

The protocols of nonclinical toxicity studies in mice, hamsters, rabbits, and dogs… did not have a date of approval by the sponsor and did not contain sufficient details, including but not limited to:

a. A statement of the purpose of the specific study

b. The name and address of the testing facility at which the study is being conducted

c. The procedure for identification of the test system

d. The methods for the control of bias

e. The frequency of tests, analyses, and measurements to be made

f. The records to be maintained

The study protocol is a central document. To name just two possible consequences of not having one: How do you identify a deviation from the study plan, if no plan was documented? How do you evaluate the outcome, if no predetermined endpoints were established? Not having an adequate protocol leads to weak scientific data and conclusions.

As evidenced by FDA’s inspection findings, your testing facility management failed to fulfill the primary responsibilities to establish appropriate policies and procedures intended to ensure the quality and integrity of nonclinical data for FDA submission. Furthermore, the deficiencies found in your oversight as the study director and the absence of an independent QAU indicate that your testing facility failed to fulfill the core responsibilities to remain GLP-compliant. As a result, FDA is concerned about the validity of nonclinical data generated by your testing facility.

Failure to address the violations noted above adequately and promptly may result in regulatory action without further notice.

More understatement. I haven’t checked, but this facility may already have been shut down! The FDA has the power to walk in and immediately halt operations. Especially since this is not their first warning letter.

Some caveats

I don’t wish to pile on. I don’t want to be part of an Internet mob. And I always assume good intent instead of nefarious fabrication and fraud. But people should be aware of the risks to health out there. The FDA cannot inspect every facility all the time. They cannot catch all the shoddy science out there. They can’t prevent people from wasting money on ineffective treatments.

I also feel sorry for the animals in these studies, which include mice, hamsters, rabbits and dogs. The letter indicates there were no SOPs describing their care and handling. What if some of them suffered from poor treatment? What if their treatment was fine, but the data from their studies was completely invalidated? It’s as if they underwent testing for nothing. If an animal was used in research, something good should come out of it. This may not have happened here.

I will write an article about how to use FDA warning letters and other materials to be proactive in your organization

In a future article I will show you how to use the FDA’s warning letter database to your advantage. The point is not just to gawk at the egregious ones. You can use the warning letters to get an edge and understand the agency’s thinking on many biotech and pharmaceutical subjects.

What do I mean by “getting an edge?” Let me explain. Have you noticed how regulations change very slowly? The GLP regulations that we follow were finalized in the 1970s and 1980s! There is nothing in those regulations mentioning cloud-based data capture, gene editing, or artificial organs. The FDA occasionally releases non-binding guidance documents that help industry, with its cutting-edge techniques, avoid falling afoul of these older regulations. But even those guidelines quickly become dated and cumbersome.

FDA warning letters, on the other hand, reveal findings and enforcement actions from inspections that occurred only weeks or months previously. The letters also indicate the reasoning behind the actions. A useful FDA inspection finding includes three things: a detailed description of the circumstance, the applicable standard (usually a regulation), and the possible consequences of not meeting the standard.

Read closely and you will find all three components in the letter I linked above. In my upcoming article I will break down the warning letters in a way that will make you a better auditor who will be even more valued by your organization or auditee.

Final comments

Visit the FDA warning letter database regularly. Stay tuned to warning letter news. Read my upcoming article on warning letters. Let me know if this article benefited you. And let me know if your jaw dropped as much as mine did!

Image credit

The image is from http://www.cancertreatmentus.org/

How I boosted my knowledge and credibility with the Certified Quality Improvement Associate certification

This past Saturday I sat for the Certified Quality Improvement Associate (CQIA) exam. I passed and now I can draw on this body of knowledge and cite this certification for the rest of my career in quality.

In this article I will explain what this certification is. I will describe my experience in preparing for and taking the exam. I will share what I have learned about certification pathways in quality. Finally I will offer some advice on whether you should pursue the certification yourself.

The Certified Quality Improvement Associate

The American Society for Quality (ASQ) introduced the CQIA certification in 2000 as a way to introduce the basics of quality to individuals and organizations that are not already in traditional quality roles. Getting such employees certified is a way to integrate quality throughout the company. Those already in a quality role can obtain the certification as a first step toward more intensive certifications such as Certified Quality Engineer.

The body of knowledge covered in the materials and coursework touches all aspects of quality. Some examples of learning objectives follow:

“Define quality and use this term correctly in various circumstances.”

“Define and distinguish between common and special cause variation in relation to quality measures.”

“Identify supplier performance measures, including quality, price, delivery and level of service.”

Succeeding on the exam requires remembering, understanding, and applying your learning from the materials I reference below. The exam reflects the body of knowledge presented in the CQIA Primer. Everything you need to know is in this textbook. There are additional materials that will enhance your learning and give you an edge. Actively participating in a prep course, doing as many practice questions as possible, and connecting with your instructors will also boost your chances and make your investment of time worthwhile.

Preparing for the exam

To prepare, I attended four three-hour courses presented by subject matter experts in quality. Each of the instructors was a Certified Quality Engineer. At each session we:

  • Did a practice quiz
  • Went over two or three chapters from the CQIA Primer
  • Did practice exam questions as a class
  • Addressed any questions from our readings and end-of-chapter exams

You could prepare for the exam without this kind of course, but that would mean doing a lot more solo work. Our class included at least one person who had failed on a first attempt. So, do whatever you can to get an edge!

Outside of class I read the Primer, took the end-of-chapter exams, and drilled myself again and again with the practice exam software.

It’s this last part that was especially valuable. The software is simple but has two particularly useful functions: it allows you to select the length of your exam and it breaks down your scores by section. This means you can practice for the exact length of the CQIA exam (110 questions in 3.5 hours). Once you are done you can see what areas you are weak in and revisit the Primer.

I read the Primer, highlighted key definitions, charts, figures and tables, and went over them again and again. I did not read the full Certified Quality Improvement Associate Handbook, but I visited it for some longer exposition of a couple of topics from the Primer. It also came in handy during the exam (more on that later).

During my many practice exam drills, I kept aiming for the 80% correct rate needed to pass the exam. I did some drills open book (as the official exam is) and some without. Both ways are valuable because you want to know the information and you want to know where to find it.

I had one classmate who did not have a college degree and was unaccustomed to the test-taking environment. Note-taking, reviewing, and computer-based exams were all unfamiliar to her. She was very anxious about what the exam would be like and she felt she wasn’t retaining anything during the course. If this sounds like you, anything you can do to become accustomed to that environment will help. Tell the instructor about your background. Take the practice exams. Create flashcards or other learning tools that work for you. Set up a timer and get used to how it feels to be timed while answering questions.

The exam itself

My exam was administered by Prometric, a third-party testing company, on a Saturday morning. I had a good experience and found it much like Pearson Vue, where I took my EMT-B exam several years ago.

Read all the exam rules in advance and arrive with what you need. Make sure you have removed the practice exam questions from your CQIA Primer and that all your materials are bound.

The exam was the exact level of difficulty as the practice exam questions. As I said above, that program was very, very helpful for focusing on areas where my recall was weak.

You will spend lots of time flipping through the book looking for key tables and definitions. If you can find an answer within two minutes using the index or your own sticky flags (which are permitted), you will succeed on the exam.

There is plenty of time to finish the exam. I left with an hour to spare even though I went back to look up many flagged questions.

I used the exam software features to my advantage: a strikethrough to eliminate distracter choices, a timer to keep an eye on my progress, and a flag function for marking questions to revisit. I did not use the physical dry-erase board or the calculator that Prometric provided.

I did not take a break, but this is an option if you want one.

There was no math. At all.

I used the glossary of the CQIA Handbook several times. It was faster and sometimes more detailed than the index of the Quality Council of Indiana book. When a question was difficult and the choice depended on wording, I consulted the relevant section of both the Primer and the handbook.

ASQ certification pathways

My aim is for further certification. This graphic is pretty ugly but it gives you an idea of how to build on previous certifications as you grow more advanced and more focused on your area of quality.

Don’t take this chart as the final word. If you have questions, attend a chapter meeting of ASQ and talk to the presenter or another quality person. They will have firsthand knowledge of certifications. Tell them your situation and goals and ask for suggestions. Bring the same information to your manager and get their opinion as well, before committing to a specific certification.

As you can see, the CQIA is an initial certification that can lead to several professional ones.

Should you get the certification?

 

Yes, you should. Even if you are not in a traditional quality role, you may benefit from the CQIA. It may open the door to Six Sigma, lean, team dynamics skills, statistical process control, etc. Each of these is just a small section of the Primer but a vast field of practice in its own right.

I am fortunate to be in a company that paid for my materials and for the exam as part of their dollars-and-sense interest in furthering my development. I hope you are in the same situation. If not, perhaps you can convince your manager to come around. If they won’t cover the cost, just pay for it on your own. The total will run you about $300. That might hurt a bit, but it will strengthen your future earnings potential and pay off in the end.

Once you have the certification, you have it for life. You can keep the materials at your desk and share your learnings with coworkers. And if you progress to further, more expensive or labor-intensive certifications, you will walk in with greater confidence.


Final comments

What do I mean when I say I “boosted my credibility”? In quality we are not just people with sharp eyes or attention to detail. People who work in quality do not have a certain personality type. Rather, we follow rigorous methods developed toward the end of the Industrial Revolution to ensure safe products, reliable service, unexpected delight and longer, healthier lives.

To study quality is to apply the scientific method in your day-to-day work. Getting a certification such as the CQIA helps you learn these methods and meet the experts who practice them. It thereby signals your credibility. It shows others in your organization that you are on the path toward expertise and that your work is based on the methods of a true profession.

As you consider the CQIA (or if you are preparing for the exam), feel free to reach out to me for more details. I have plenty of study tactics to share and some ideas about quality certifications that I didn’t cover in this article.

I wish you success!

 

Resources

Certified Quality Improvement Associate Primer

Certified Quality Improvement Associate Exam USB Flash

Find both at http://www.qualitycouncil.com/cqia/

Get both of these! The Primer is absolutely required. The practice exam software is a near-necessity.

 

The Certified Quality Improvement Associate Handbook, Third Edition

This is not a necessity, but I would recommend it. It is more attractive and better organized than the Primer, and it will be useful during the exam.

 


Career Book Review: “The New Rules of Work”

The New Rules of Work: The Modern Playbook for Navigating Your Career, by Alexandra Cavoulacos and Kathryn Minshew (2017)

I stumbled upon this book after reading a positive review and was very impressed. It is a well-organized and focused presentation of some of the content of themuse.com, an online career resource.

The authors understand the great challenge of the modern career: with more options and tools than ever, many people find themselves without a playbook for this complicated, non-linear career game.

This book is exactly what the subtitle states – a playbook. This means you can select from it what you need without having to read the whole thing. You can also use it as a reference from which to dive deliberately into the content at themuse.com. This may be helpful if you are like me and sometimes lose focus in glossy, bottomless websites brimming with multimedia and links.

First, read “What Color Is Your Parachute?”

If you have not read the classic career book “What Color Is Your Parachute?,” read that first. Originally published in 1970, it has been revised every year since by its extremely dedicated author, Richard Bolles. Make sure you are reading the most recent edition! Alternatively, read one of the many spinoffs (http://www.jobhuntersbible.com/books) that may fit you more precisely.

“What Color” is wide-ranging in scope and covers pretty much every practical thing you need to do in the job hunt or career pivot. This includes self-assessments, a guide to interviewing, and many other resources. Read this and then return to “The New Rules of Work.”

Read only the sections that are most relevant to you (or, read only the gray emphasis boxes)

Just like “What Color is Your Parachute,” some sections will be more relevant than others. Read only these. For an even more abridged experience, read only the gray emphasis boxes. These boxes contain the most concise, bulleted content. In fact, some of them are simply brief articles from themuse.com. But they are the curated ones.

You’ll find highly actionable and practical advice

The authors drive home the point that you must develop your own personal brand to present to potential employers. This brand will stay with you much longer than your average employment stint. I appreciate this point. In fact, this blog is my way of developing a brand and sharing knowledge with people in my exact situation and with potential career connections. However, this blog was originally inspired by “What Color Is Your Parachute?” and by my 11 years writing a personal blog.

The résumé writing section stood out to me as particularly practical and concise. This section also points you to resources (such as templates) at the website that you may find useful. It may motivate you to replace your Microsoft Office standard templated résumé (guilty!). One tip: get rid of the “References available upon request” line.

Tap into the resources at themuse.com

This site is rich with new career content each day from a network of freelancers and from paid staff. There are portals to paid courses and coaching. And the site partners with employers who recruit through the site.

One problem is distinguishing between sponsored content and professional advice. Since much of this site is free, you are not necessarily the customer – the advertisers and employers are. The site is rife with embedded content. If you find a way to use this site in a focused way, or if you have a success story from one of their paid modules, I would love to hear about it!

One criticism: the emphasis on LinkedIn

Authors such as Cavoulacos and Minshew seem to believe that having a detailed, up-to-date LinkedIn profile, and being an active user of the site, is a necessity for being considered for an interview. As members of the recruitment/talent sphere, they are no doubt heavy users of LinkedIn. They seem to think that since it’s free, there is no downside.

But there are many reasons not to join LinkedIn, including the following:

  • It will mine your email addresses and spam your contacts.
  • It was hit with at least one major data breach (in 2012). Your information there may not be secure and is definitely not private.
  • Garbage content, meaningless endorsements, and fake profiles abound.
  • Most important for me: it is a potentially bottomless timesuck! With a career blog (such as the one I am writing right now), I know how much effort I put in and I can see the results. With LinkedIn, you might feel you need to actively cruise the site and hit up your network for many hours a month. You may never know how much is enough, and you may never know what benefit you are getting from it.

If you don’t want to join LinkedIn, don’t! It is not the necessity that so many authors claim it is. And keep in mind that you can access a lot of articles and other content on LinkedIn without creating an account.

Some great advice that I highlighted:

  1. Chapter 4 covers the absolutely essential task of building your personal brand. The authors walk you through the five steps for building a successful brand: determine your brand attributes, draft your branding statement, refine your profiles, create your personal website, and activate your brand.
  2. The résumé editing checklist in Chapter 7. “Does this sell you as the perfect candidate for the types of roles you’re seeking? Does the top third of your résumé serve as a hook to get the hiring manager to read more? Could anything benefit from examples? Does the page look visually appealing?” Lastly, submit the résumé to the employer as a PDF, not a Word document!
  3. Chapter 7 addresses how to rework not-so-relevant experience into something tailored to the job.
  4. Chapter 8 provides tried-and-true interviewing advice, including behavioral interview questions and an expanded section on video/Skype interviewing, along with a worksheet and checklist. I find that tactile activities like these are the ones that help me to reflect and to retain information the best.
  5. Chapter 9 goes into detail about salary negotiation, which is a mysterious and anxiety-inducing topic for many people. The authors detail tactics as well as other considerations besides salary such as scheduling flexibility, job title, continuing education, vacation time, lifestyle perks, and moving expenses.
  6. The last four chapters are a guide to the modern workplace. The content mirrors what you will find on the website: “The Trick to Communicating Hard Messages,” “21 Unwritten (New) Rules of Running a Meeting,” “Really Struggling to Cross Off Those To-Dos? Use Your Feelings (Yep, Seriously).” Many of these are just blog-like articles and listicles from the website. But I happen to love this stuff and I added the blog to my Feedly news feed. And each topic, such as inbox management, skill development, and collaboration, is a world unto itself. You know where to go for more: the million YouTube videos, blogs and books that flesh out these concepts in exhaustive detail.

What to read next

If you have a suggestion for my next career or Quality book, let me know! Or stay tuned for my next review.


How to review a scientific report

Reviewing a scientific report is a great privilege. You are among the first to see new knowledge without having to do the sometimes-tedious investigative work itself. And if you’re like me, you might feel the joy of learning something new.

When reviewing, I encourage you to do two things. Firstly, commit to applying a method, just as the investigators applied a method when conducting their study. Secondly, consider the review, your comments, and the investigator’s response as a conversation between the two of you. This conversation has a subject. It may go off in an unexpected direction. And it may expand to include other people.

In this article I will describe how to approach the important role of reviewer.

Who you are

My primary assumptions about you are twofold:

  • You do this for work (you are not a scholar doing peer review for free).
  • You follow your organization’s standard operating procedures (SOPs) and national or international regulatory guidance.

I also assume that you are the only reviewer. If you are the Quality reviewer, but the report has already undergone a thorough technical review described in your SOPs, then your review will be greatly reduced in scope (and probably much easier!).

Lastly I assume you are properly trained. You must not only be familiar with the procedures of your reviewer or Quality unit, but also with the procedures the investigators followed and with the science underlying their methods. If some calculations were done in Excel, you should be familiar with Excel. If some statistics were done in Minitab, you should be familiar with Minitab.

Gather your materials

You should have four things at a minimum:

  • The draft report
  • The protocol (plus applicable SOPs)
  • The raw data
  • Your checklists and procedures and your form or software for documenting findings

Organize

The following setup is optimal to me: the draft report is on the computer screen in front of me. The raw data and the protocol are on my left. My materials (such as a checklist, inspections from during the study, and any other Quality forms) are on the right. These Quality materials may be software-based, in which case the software should be open onscreen so you can document findings in real time while reviewing.

Check for completeness

Before beginning, make sure everything is there so that you can plan for a thorough review with minimal handoffs (handoffs are where mistakes often happen).

Read the intro/summary/abstract

Even if this section is at the end of the report, it is a good thing to check before getting deep in the weeds.

Read the body of the report and make sure all the sections specified in the protocol/SOPs are present

Title Page/Cover Page

This should include a unique identifier, the name of the study, and the sample name and lot number (if applicable). Your organization’s logo or letterhead may be required as well. The investigators, the dates of Quality inspections, the date of finalization, and the Quality reviewer (if applicable) may appear here or on a signature page. The client (if applicable) and testing facility are listed here as well.

Table of Contents

Depending on the length of the report, a table of contents may be helpful. Ideally this will automatically update in Word. If changes are anticipated, consider verifying the table of contents at finalization to avoid duplicating your effort.

Abstract

Peer-reviewed reports will require an abstract. The requirements are very specific to the journal. Your organization probably does not require one.

Introduction/Purpose

The introduction should basically be a high-level summary of what is in the protocol. If the writer briefly stated the scientific background, the goal of the current study, and the methods in simple language while still being descriptive, he or she has done a great job!

Some studies may state the overall conclusion in this section as well. If so, check it against the conclusion section below. If this section is labeled Purpose instead of Introduction it should correspond to the Purpose section of the protocol.

Materials and Methods

This section should mirror the protocol because the protocol is basically a promise or agreement about what will be done. The materials, such as reagents, instruments, and suppliers, may be in their own section or in tables interspersed with the text. Check each against the raw data as you go.

Pull up applicable SOPs as you review. Although these documents often read like work instructions, they may provide greater detail than the protocol about what was done. Try to avoid “filling in the blanks” though. The report should stand on its own.

Read for flow and for accuracy against what is described in the lab notebooks or worksheets. Check that all specifics, such as temperatures, weights, and incubation durations, match what was recorded. If there are any deviations, these may need to be addressed in the report. The need for a retest or an investigation may first be discovered here.

Raw data and report disposition

This section is important! The retention period of the report and raw data must follow regulations of the FDA, EPA, or other agencies. The report should describe where the raw data will be held so that clients or regulators can compare the report to it during an audit or investigation.

In general, the Quality review documentation you are completing right now will be stored with this package and for the same retention period.

Validity criteria

I like to see validity as a standalone section because it is so important. This and the following (evaluation/acceptance) deserve your full attention.

Validity is the soundness of the entire study. In an animal test for irritation, a valid determination may require that animals in a positive control group developed irritation as expected. In a chromatography study, a valid identification of a compound may require that the signal-to-noise ratio be above a certain threshold. In an ELISA test, a valid assay may require that the calibration curve is linear and not just a jumble of dots.

In routine assays that seldom have validity problems, this section can be easy to overlook. The text for this section may be canned. In your review, take a moment to look at validity. There may be one validity criterion or there may be several.

Without a valid assay, the investigator cannot make an evaluation (below). Or, he or she cannot make an evaluation without a compelling justification.
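To make the ELISA example concrete, here is a minimal sketch in Python of what a linearity-based validity check might look like. The function names and the 0.98 threshold are purely illustrative; your protocol and SOPs define the actual criteria.

```python
def calibration_r_squared(concentrations, responses):
    """R-squared of a least-squares line through the calibration points."""
    n = len(concentrations)
    mean_x = sum(concentrations) / n
    mean_y = sum(responses) / n
    sxx = sum((x - mean_x) ** 2 for x in concentrations)
    syy = sum((y - mean_y) ** 2 for y in responses)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(concentrations, responses))
    return (sxy ** 2) / (sxx * syy)

def assay_is_valid(concentrations, responses, threshold=0.98):
    """Hypothetical validity criterion: the calibration curve must be linear."""
    return calibration_r_squared(concentrations, responses) >= threshold

# A clean standard curve passes; a jumble of dots does not.
linear = assay_is_valid([1, 2, 4, 8], [2.1, 3.9, 8.2, 15.8])
jumble = assay_is_valid([1, 2, 4, 8], [5.0, 1.0, 9.0, 2.0])
```

A real assay may have several validity criteria layered on top of this one (blank limits, control recoveries, and so on).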

Evaluation/acceptance criteria

Evaluation/acceptance criteria are very important as well. They might be something specific such as “to pass, the white blood cell count must be within 4,500-10,000 white blood cells per cubic millimeter.”

In a larger study, the evaluation may be much broader. The investigator may look at the totality of health observations, bloodwork, feed consumption, histopathology, and necropsy observations. In this case, a lot of judgement is involved on the part of the investigator. As a reviewer comparing the raw data to the report, you are verifying that the evaluation broadly matches what you see in the raw data.

In both cases, the evaluation should be based on what was described in the protocol. Because of its deciding nature, to alter it or introduce other evaluation criteria requires a protocol amendment or a deviation.
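A specific acceptance criterion like the white blood cell example lends itself to a simple, mechanical check. The sketch below is illustrative only; the parameter names and the allowed range come from your protocol, not from me.

```python
# Hypothetical acceptance criteria from the protocol:
# each parameter maps to an allowed (low, high) range.
CRITERIA = {"wbc_per_mm3": (4500, 10000)}

def evaluate(results, criteria=CRITERIA):
    """Return a per-parameter pass/fail against the protocol's ranges."""
    return {
        name: "pass" if criteria[name][0] <= value <= criteria[name][1] else "fail"
        for name, value in results.items()
    }
```

The broader, judgement-based evaluation in a larger study cannot be reduced to a check like this, which is exactly why the reviewer compares the investigator’s narrative against the raw data instead.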

Results

Reviewing results may take up the bulk of your review. Depending on your SOPs you may check 100% of transcription or just do a spot check. The report may present every data point collected, or just summary tables. You should check that all of it is traceable to the correct place in the raw data, that any gaps are labeled as such in the report tables, and that the text matches what is in the tables and what is in the raw data.

If there are voluminous data tables to check, you may choose to review these last so that you can read the report in one sitting while it is fresh in your mind.

Discussion

The discussion section is where the results are interpreted. It should connect to the introduction or purpose of the study and then move beyond. You should check it against everything that is applicable: the protocol, the results you have just reviewed, the evaluation and validity criteria, and the relevant citations. This section is free to be lengthy but it should not restate too much from elsewhere in the report. Instead, it should lean heavily toward context, synthesis and interpretation.

Depending on your training and role in the organization, you may be responsible for assessing the scientific rationale of the report. But a typical Quality reviewer will not be responsible for this. Instead, questions of scientific judgement are left to the investigator and his or her peers, who will have already reviewed the report or at least the associated raw data.

Conclusion

The conclusion is best kept brief! If it is lengthy, suggest in your comments that some of it be moved to the Discussion section.

Both short-and-simple and long-and-complex studies are free to have a one-sentence conclusion. As a reviewer, you should double-check that the conclusion matches the interpretation arrived at in the Discussion section and does not go beyond it.

Even when the conclusion is one sentence, a word on scope and a qualifier are always scientifically justified. Consider the following one-sentence conclusion:

“Under the conditions of this protocol, the levels of endotoxin in the sample were found to be below the limit of detection.”

The investigator referenced the scope of the study (the current protocol and its limitations), provided a scientifically cautious qualifier (there could be endotoxin, but not enough to detect), and clearly stated the conclusion, all in one sentence.

References

For a non-peer-reviewed study, this is not as important. It is the responsibility of the investigators to base their study on sound science. However, you should check that the resources cited exist, are cited correctly, and are accessible. If your organization is issuing reports based on papers from obscure journals from the distant past that can’t be found online or in your reference library, you might have a problem.

Check transcription

A 100% transcription check is an unambiguous requirement: it is clear for the reviewer, and it is clear to the investigator that their transcription will be verified. After establishing a 100% transcription check requirement, you will start getting higher-quality raw data by the time of review. This means less back-and-forth and less reprocessing of data (data transcribed incorrectly into Excel or statistical software produces incorrect results).
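If the raw data and the report values are both available electronically, the mechanical part of the transcription check can be sketched in a few lines. This is only an illustration (the dictionary-of-values format and the wording of the findings are assumptions); any automated comparison would still need to follow your validated procedures.

```python
def transcription_findings(raw, report):
    """Compare 100% of transcribed values in the report against the raw data.

    raw and report are dicts mapping a sample/entry ID to the recorded value.
    Returns a list of findings for the reviewer's write-up.
    """
    findings = []
    for key, raw_value in raw.items():
        if key not in report:
            findings.append(f"{key}: missing from report")
        elif report[key] != raw_value:
            findings.append(
                f"{key}: report has {report[key]!r}, raw data has {raw_value!r}"
            )
    for key in report:
        if key not in raw:
            findings.append(f"{key}: in report but not traceable to raw data")
    return findings
```

Note that the check runs in both directions: every raw value must appear correctly in the report, and every report value must be traceable back to the raw data.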

Spot-check calculations

Your organization may require spot-checks or that a certain percentage of calculations be checked, or that 100% be checked. Generally, you will not need to check calculations done within statistical software. If possible, you should have a digital copy of any Excel spreadsheets used so that you can display the functions and check the calculations the easy way.
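For a simple reported statistic, a spot check amounts to recomputing the value from the raw data and allowing for rounding in the report. A sketch, with an illustrative tolerance:

```python
def spot_check_mean(raw_values, reported_mean, tolerance=0.05):
    """Recompute a reported mean from the raw data.

    The tolerance allows for rounding in the report; both the function
    name and the tolerance value are illustrative, not from any SOP.
    """
    recomputed = sum(raw_values) / len(raw_values)
    return abs(recomputed - reported_mean) <= tolerance, recomputed
```

A mismatch outside the tolerance is worth tracing: it may be a transcription error upstream rather than an arithmetic one.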

See the investigator if needed

Consulting the investigator before your review is finished is a matter of judgement. Here’s what I mean: if you can’t find what you are looking for or can’t understand a calculation, you will definitely want to ask the investigator to clarify. However, if you identify a major deviation or some other issue, you want to ensure that this is documented and winds up in your quality system. You want each major finding to be documented and to have a documented response.

The same goes for minor errors. You don’t want to hand the report and raw data back and forth, for multiple review cycles, while they address issues that come up during your review. Instead, there should be one efficient but thorough review of the complete package, followed by a write-up. Subsequent reviews should be brief, involving only signing and sending.

You might also consult a third party such as a statistician or manager.

Write it up

Organize your findings cohesively and fill out any checklists you may have. If applicable, categorize your findings as major or minor. If possible, each of your findings should describe what was found, reference the standard it is being compared to (the protocol, an SOP, etc.), and state the consequences of not meeting the standard.

If the report is not acceptable or not acceptable without additional experimental work, make this clear. If a major deviation was identified, you should let the investigator know in person so they can resolve it quickly. Any required investigations should be closed and addressed in the report when the report comes back to you for finalization.

At finalization, see that all findings are addressed

When the report comes back to you for finalization, you will ideally only be reviewing minor changes and the addition of finalization dates and cover pages, logos, etc. You will want to ensure that all investigations are closed and that all your findings are addressed. Each finding should be closed out individually.

Do a final look-over

Finally, check the boring stuff! Make sure the pagination is correct and that all the pages are there. Be ready for third and fourth reviews as minor printing or electronic signature issues are sorted out.

Sign off and then thank the investigator for their hard work.


Some further reading

An excellent and much-shared article geared toward peer-reviewed papers. There is good advice in the comments section as well:

https://violentmetaphors.com/2013/12/13/how-to-become-good-at-peer-review-a-guide-for-young-scientists/

 

A well-organized breakdown of the elements of a scientific report from the University of Waikato (New Zealand):

http://www.waikato.ac.nz/library/study/guides/write-scientific-reports

 

A detailed guide to writing a discussion section from the University of Southern California:

http://libguides.usc.edu/writingguide/discussion

How to review audit trails

 

The FDA regularly issues warning letters regarding data integrity. Now, the agency is encouraging companies involved in GMP activities to get on top of these issues with regular audit trail review. You may have read some FDA or EU publications relating to data integrity and audit trails and wondered how they might apply to your organization. This article will demystify audit trail review and provide you with an outline for your organization’s data integrity initiative.

What is an audit trail? What are some other data integrity terms?

Audit trail means “a secure, computer-generated, time-stamped electronic record that allows for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record. An audit trail is a chronology of the ‘who, what, when, and why’ of a record.”

The usefulness and power of the audit trail can vary quite a bit by system. In an older hemocytometer for example, you may find that there is no audit trail – the printout containing the time, date, result, and sample name is the extent of the data produced by the machine.

In the case of a newer analytical instrument such as a mass spectrometer, you will find that the audit trail is exhaustive and essentially unalterable. It will be a dynamic, all-inclusive record that displays information about every user and instrument action, in a human-readable format. It may be searchable by date, user, type of action, etc. It may even be in its own view-only, data manager module for ease of use. It may be exportable as an Excel file or printable as a PDF.

Another instrument may be intermediate in user-friendliness: you can display a page or two of printable, static information surrounding a specific file or sample, but cannot easily search or scroll.

Audit trails are part of a broader concept called metadata. “Metadata is the contextual information required to understand data.” For example, a set of data may be a list of compounds and their concentrations in a sample. The metadata surrounding this data would include the username, time and date, the data path (where the files are automatically saved), any error messages that came up, and any changes to the sample name. As outlined above, the extent of the metadata collected by the system can vary quite a bit.
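As a rough sketch of the distinction, here is how the data and its surrounding metadata from that example might be represented (all field names and values are hypothetical):

```python
# The data: the scientific result itself.
data = {"compound": "caffeine", "concentration_ppm": 12.4}

# The metadata: the context required to understand and trust the data.
metadata = {
    "username": "jdoe",
    "acquired_at": "2018-03-14T09:26:00",
    "data_path": "D:/projects/lot42/run003.raw",   # where files are auto-saved
    "error_messages": [],
    "sample_name_changes": [("SAMPLE-03", "LOT42-03")],
}

record = {"data": data, "metadata": metadata}
```

Strip away the metadata and the concentration value becomes uninterpretable: you no longer know who produced it, when, or whether anything went wrong along the way.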

An even broader concept is data integrity. This term refers to “the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).”

A concrete analogy

To make data integrity a little more concrete, consider how you treat crossouts on a paper record. Most crossouts are justified (mistakes happen). But when someone does make a crossout to update a raw data entry, your organization’s SOPs require them to address it. They do this by drawing a single line through the original entry (leaving it still legible). They then need to provide an explanation (such as “incorrect entry”) along with the identity of the person who did the correction, and the date of the correction.

In this way, the date and recorder of the original entry, the date and recorder of the correction, and the reason for the correction are all documented. In addition, the original entry remains legible for comparison to the corrected entry.

Data integrity refers to these same controls but in a digital format. And reviewing audit trails helps the Quality unit to review these deletions and changes in the same way they review them on paper.

Everyone in your organization plays a part in ensuring data integrity. Reviewing audit trails is part of that effort.
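The paper crossout has a direct digital analogue: an append-only log that captures the who, what, when, and why of each change while leaving the original entry intact. A minimal sketch, with illustrative field names:

```python
import datetime

audit_trail = []  # append-only: entries are never edited or deleted

def record_change(record_id, field, old_value, new_value, user, reason):
    """Log a correction the way a paper crossout would: the original
    entry stays legible, and the corrector and reason are documented."""
    audit_trail.append({
        "record_id": record_id,
        "field": field,
        "old_value": old_value,   # the "still legible" original entry
        "new_value": new_value,
        "user": user,
        "timestamp": datetime.datetime.now().isoformat(),
        "reason": reason,         # e.g. "incorrect entry"
    })

record_change("NB-0042", "weight_g", 10.21, 10.12, "jdoe", "incorrect entry")
```

A real system adds the security controls this sketch lacks (authenticated users, protection against tampering), but the who/what/when/why structure is the same.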

Where do I start?

Start by looking at the list of all computerized systems in your organization. If you have no such list, make one! A spreadsheet is adequate.

Classify each system by the regulated activity it is used for (GMP, GLP, etc.) and by its criticality. Document who the system administrator is (i.e. who controls access and sets up and deactivates users). Identify the subject matter expert who can answer your questions (see below). Sort the systems by criticality. Audit trail review is resource-intensive, so you will want to start with the critical ones.
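That starting inventory can be as simple as a sorted list. Here is an illustrative sketch (the system names, the 1-to-3 criticality scale, and the roles are all made up):

```python
# Minimal computerized-systems inventory: one entry per system.
systems = [
    {"name": "Plate reader", "activity": "GLP", "criticality": 2,
     "admin": "Lab manager", "sme": "B. Biologist"},
    {"name": "LIMS", "activity": "GMP", "criticality": 1,
     "admin": "IT department", "sme": "A. Analyst"},
    {"name": "Balance logger", "activity": "GMP", "criticality": 3,
     "admin": "Lab manager", "sme": "C. Chemist"},
]

# Review the most critical systems first (1 = most critical here).
review_order = sorted(systems, key=lambda s: s["criticality"])
```

A plain spreadsheet with the same columns serves just as well; the point is that the fields and the sort order exist at all.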

Do a brief, informal review with a regular user of the system. See if you can easily access the audit trail from the main screen of the program or if you need to log into a separate module such as a data manager, security module, or report mode.

Audit trail functionality may have been documented in the validation. Check the validation paperwork for hints on how to access it. Check the manufacturer’s materials as well.

Consider the feasibility and resource-intensiveness of audit trail review for each system. Does reviewing it mean booting someone off the terminal? Is there a limited number of user licenses? All these considerations will influence the requirements you define in your data integrity SOP.

Take notes and write down any questions to address during a more detailed review to follow. Update the spreadsheet you started.

An assessment of each system is needed

This is a more detailed assessment. A member of the Quality unit will sit down for about an hour with one subject matter expert in the software/instrument. In advance, you will want to come up with a template Word document of items to address. The idea is to gather accurate information, including screenshots, to help with training and with crafting the SOP. Topics may include the following:

What type of audit trail is it?

• Is it a dynamic event log, capturing every action automatically? Or does it just display the parameters of the individual run?
• Can the reviewer select a date range?
• Can he or she search by user, batch or project number?
• Are there any error messages for the subject matter expert to look into? Are there error messages that qualify as red flags for the reviewer?
• Does the audit trail capture when a user edits a sample name or lot number? When critical parameters are changed? When background scans are re-run? When the results from standards are rejected?
• What do manipulations and deletions look like? Is there a way to compare versions of a file?
• In analytical chemistry, is there an audit trail for acquisition of data, but no audit trail for processing of data? In chromatography, is there a way to compare integrations of peaks? What does excessive integration look like, and is this something the Quality unit should review?
• Compare records from a recent project or run to what you see onscreen in the audit trail. Go step by step and write down some typical audit trail events.

Don’t duplicate the effort of the original validation. But do look at the system through a data integrity mindset.

You may find that routine audit trail review opens new conversations about a system. Although an audit trail cannot be “corrected,” review of the audit trail may point to deficiencies elsewhere in the project’s documentation.

For example, if one person signed the batch record, lab notebook or worksheet, but the audit trail shows a different user (such as a trainer) performing that procedure, then both signatures should be in the documentation.

Another example is retesting or reprocessing: if the audit trail clearly shows a retest, but this was not documented, then this should be addressed in the documentation before Quality unit approval of the record.

For legacy systems, make sure any deficiencies are documented and that alternative data integrity measures are in place

You may look at an older instrument or software and find it is not 21 CFR Part 11 compliant, or not compliant in the way you thought it was. You can add a brief risk assessment to the validation paperwork with a memo referencing the new audit trail review procedure that is being put in place.

Look to the manufacturer’s compliance software

If the software is older, ask the vendor about updates. They may have released a data security module that you can add to the software. Even without audit trail functionality, the system may still be acceptable: under the FDA's 2003 scope-and-application guidance, the agency exercises enforcement discretion over the Part 11 audit trail requirement and expects a risk-based justification in its place.

Get access for the QA reviewer for each system

Once this assessment is complete, get the Quality reviewer access to each system.

Some software will have a reviewer (read-only) mode. This is ideal because they will not be able to accidentally delete a file or alter a method. If the Quality reviewer is a standard user, that’s fine too!
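The reason read-only mode is ideal can be pictured as a role-permission table. A sketch with hypothetical role and permission names (real instrument software will define its own):

```python
# Hypothetical role-permission model for an instrument's software.
PERMISSIONS = {
    "reviewer": {"view_data", "view_audit_trail"},
    "standard": {"view_data", "view_audit_trail", "edit_method", "delete_file"},
    "admin":    {"view_data", "view_audit_trail", "edit_method",
                 "delete_file", "manage_users"},
}

def can(role, action):
    """True if the role is granted the action."""
    return action in PERMISSIONS.get(role, set())

# A reviewer can read the audit trail but cannot accidentally delete a file.
print(can("reviewer", "view_audit_trail"), can("reviewer", "delete_file"))
```

A standard-user account works too; the reviewer simply carries the extra responsibility of not altering anything.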

Efficiency side note: to avoid duplication of effort, the periodic review can be reduced somewhat in scope.

Although audit trails will now be reviewed routinely, the periodic review is still important because this is where failed login attempts, periodic vendor maintenance, and changes to the overall method are captured. Keep in mind the risk-based approach.
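One way to keep this division of labor explicit is to record, per system, which event categories the routine review covers and which remain with the periodic review. A sketch with hypothetical system names and event categories:

```python
# Hypothetical assignment of audit trail event categories to review cycles.
REVIEW_SCOPE = {
    "hplc_cds": {
        "routine":  {"sample_edits", "reprocessing", "result_rejections"},
        "periodic": {"failed_logins", "vendor_maintenance", "method_changes"},
    },
    "ftir": {
        "routine":  {"sample_edits"},
        "periodic": {"failed_logins", "vendor_maintenance"},
    },
}

def cycle_for(system, category):
    """Which review cycle covers this event category for this system?"""
    for cycle, categories in REVIEW_SCOPE[system].items():
        if category in categories:
            return cycle
    return "unassigned"  # a gap the risk assessment should address

print(cycle_for("hplc_cds", "failed_logins"))  # prints periodic
```

Anything that comes back "unassigned" is exactly the kind of gap the risk-based approach is meant to surface.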

Write the SOP!

Reference information technology SOPs, validation SOPs, ethics and training SOPs, and Quality review SOPs. Make sure it has caveats that address the range of software at your company. This procedure should involve organization-wide training. Having an executive or director champion it would be very valuable. Everyone should know what is expected of them with respect to data integrity.

Revisit this procedure in a year

To follow up, continue reviewing FDA warning letters for the agency’s thinking on data integrity matters. Distribute the pertinent letters to your team. Connect the audit trail reviewers with those involved with equipment/software validations so that audit trails are set up and understood proactively. Even better, ask for the reviewers’ input on the next set of compliance software that the vendor is trying to sell you.

A year after the SOP becomes effective, revisit this procedure and see how it's going:

• Is it too impractical? Are reports being delayed because the reviewer can’t get into the system while others are logged on?
• Is system access an issue? Does only one person in the Quality unit have the needed access?
• If you have technical data reviewers and a Quality reviewer, are they duplicating each other’s review? It can be hard to separate a technical review from a procedural review. Perhaps only one group should review.
• Look at the overall value of the project. If you find that reviewing audit trails in the way recommended in the FDA’s draft guidance is not a value-added step, let the FDA know! Now is the time to comment before the draft guidance becomes a requirement.
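Judging overall value at the one-year mark is easier with simple counts: reviews performed, findings raised, and what kinds of findings recur. A sketch over a hypothetical log of a year's reviews:

```python
from collections import Counter

# Hypothetical log of one year of routine audit trail reviews.
reviews = [
    {"system": "hplc_cds", "finding": "undocumented retest"},
    {"system": "hplc_cds", "finding": None},
    {"system": "ftir", "finding": "signature mismatch"},
    {"system": "ftir", "finding": None},
    {"system": "ftir", "finding": None},
]

total = len(reviews)
findings = Counter(r["finding"] for r in reviews if r["finding"])
print(f"{total} reviews, {sum(findings.values())} findings")
for finding, n in findings.most_common():
    print(f"  {n}x {finding}")
```

If a year of reviews produces no findings at all, that is itself useful data for the value-added discussion (and for any comment you submit on the draft guidance).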

Lastly, take the long and broad view. Consider audit trail review to be one of many tools in your organization’s data integrity efforts. Keep in mind that other organizations are grappling with these issues as well, and there are no experts out there who have all the answers. You will have to treat data integrity as an ongoing commitment, with every data integrity procedure open to change, optimization and improvement.

If you have had successes, failures or questions in your audit trail efforts, I’d love to hear about them!

[Photo: Mississippi River, St. Paul, February 2017]

Explore further

Data Integrity and Compliance With CGMP Guidance for Industry (draft guidance). The FDA published this in April 2016 as draft guidance. But as we know, you need to get ahead of the guidance!

(https://www.fda.gov/downloads/drugs/guidances/ucm495891.pdf)


This FDA slideshow, released a year later, provides some helpful elaboration on the guidance:

(https://www.fda.gov/downloads/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDER/UCM561491.pdf)


This article provides a concise summary of the challenges of starting an audit trail review process and the importance of a risk-based approach grounded in an assessment of each system. The link opens a PDF:

(http://www.ivtnetwork.com/printpdf/article/audit-trails-reviews-data-integrity)