Career Book Review: “The New Rules of Work”

The New Rules of Work: The Modern Playbook for Navigating Your Career, by Alexandra Cavoulacos and Kathryn Minshew (2017)

I stumbled upon this book after reading a positive review and was very impressed. It is a well-organized and focused presentation of some of the content of the authors' online career resource.

The authors understand the great challenge of the modern career: with more options and tools than ever, many people find themselves without a playbook for this complicated, non-linear career game.

This book is exactly what the subtitle states – a playbook. This means you can select from it what you need without having to read the whole thing. You can also use it as a reference from which to dive deliberately into the content at the authors' website. This may be helpful if you are like me and sometimes lose focus in glossy, bottomless websites brimming with multimedia and links.

First, read “What Color is your Parachute?”

If you have not read the classic career book "What Color is Your Parachute," read that first. Originally published in 1970, it has been revised every year since by its extremely dedicated author, Richard Bolles. Make sure you are reading the most recent edition! Alternatively, read one of the many spinoffs that may fit you more precisely.

“What Color” is wide-ranging in scope and covers pretty much every practical thing you need to do in the job hunt or career pivot. This includes self-assessments, a guide to interviewing, and many other resources. Read this and then return to “The New Rules of Work.”

Read only the sections that are most relevant to you (or, read only the gray emphasis boxes)

Just like "What Color is Your Parachute," some sections will be more relevant than others. Read only these. For an even more abridged experience, read only the gray emphasis boxes. These boxes contain the most concise, bulleted content. In fact, some of them are simply brief articles from the authors' website – but they are the curated ones.

You’ll find highly actionable and practical advice

The authors drive home the point that you must develop your own personal brand to present to potential employers. This brand will stay with you much longer than your average employment stint. I appreciate this point. In fact, this blog is my way of developing a brand and sharing knowledge with people in my exact situation and with potential career connections. However, this blog was originally inspired by "What Color is Your Parachute" and by my 11 years writing a personal blog.

The résumé writing section stood out to me as particularly practical and concise. This section also points you to resources (such as templates) at the authors' website that you may find useful. It may motivate you to replace your standard Microsoft Office templated résumé (guilty!). One tip: get rid of the "References available upon request" line.

Tap into the resources at the authors' website

This site is rich with new career content each day from a network of freelancers and from paid staff. There are portals to paid courses and coaching. And the site partners with employers who recruit through the site.

One problem is separating sponsored content from professional advice. Since much of this site is free, you are not necessarily the customer – the advertisers and employers are. The site is rife with embedded content. If you find a way to use this site in a focused way, or if you have a success story from one of their paid modules, I would love to hear about it!

One criticism: the emphasis on LinkedIn

Authors such as Cavoulacos and Minshew seem to believe that having a detailed, up-to-date LinkedIn profile, and being an active user of the site, is a necessity for being considered for an interview. As members of the recruitment/talent sphere, they are no doubt heavy users of LinkedIn. They seem to think that since it's free, there is no downside.

But there are many reasons not to join LinkedIn, including the following:

  • It will mine your email addresses and spam your contacts.
  • It was hit with at least one major data breach (in 2012). Your information there may not be secure and is definitely not private.
  • Garbage content, meaningless endorsements, and fake profiles abound.
  • Most important for me: it is a potentially bottomless timesuck! With a career blog (such as the one I am writing right now), I know how much effort I put in and I can see the results. With LinkedIn, you might feel you need to actively cruise the site and hit up your network for many hours a month. You may never know how much is enough, and you may never know what benefit you are getting from it.

If you don’t want to join LinkedIn, don’t! It is not the necessity that so many authors claim it is. And keep in mind that you can access a lot of articles and other content on LinkedIn without creating an account.

Some great advice that I highlighted:

  1. Chapter 4 covers the absolutely essential task of building your personal brand. The authors walk you through the five steps for building a successful brand: determine your brand attributes, draft your branding statement, refine your profiles, create your personal website, and activate your brand.
  2. The résumé editing checklist in Chapter 7. “Does this sell you as the perfect candidate for the types of roles you’re seeking? Does the top third of your résumé serve as a hook to get the hiring manager to read more? Could anything benefit from examples? Does the page look visually appealing?” Lastly, submit the résumé to the employer as a PDF, not a Word document!
  3. Chapter 7 addresses how to rework not-so-relevant experience into something tailored to the job.
  4. Chapter 8 provides tried-and-true interviewing advice, including guidance on behavioral interview questions and an expanded section on video/Skype interviewing, along with a worksheet and checklist. I find that the tactile activities are the ones that help me to reflect and to retain information the best.
  5. Chapter 9 goes into detail about salary negotiation, which is a mysterious and anxiety-inducing topic for many people. The authors detail tactics as well as other considerations besides salary such as scheduling flexibility, job title, continuing education, vacation time, lifestyle perks, and moving expenses.
  6. The last four chapters are a guide to the modern workplace. The content mirrors what you will find on the website: “The Trick to Communicating Hard Messages,” “21 Unwritten (New) Rules of Running a Meeting,” “Really Struggling to Cross Off Those To-Dos? Use Your Feelings (Yep, Seriously).” Many of these are just blog-like articles and listicles from the website. But I happen to love this stuff and I added the blog to my Feedly news feed. And each topic, such as inbox management, skill development, and collaboration, is a world unto itself. You know where to go for more: the million YouTube videos, blogs and books that flesh out these concepts in exhaustive detail.

What to read next

If you have a suggestion for my next career or Quality book, let me know! Or stay tuned for my next review.


How to review a scientific report

Reviewing a scientific report is a great privilege. You are among the first to see new knowledge without having to do the sometimes-tedious investigative work itself. And if you’re like me, you might feel the joy of learning something new.

When reviewing, I encourage you to do two things. Firstly, commit to applying a method, just as the investigators applied a method when conducting their study. Secondly, consider the review, your comments, and the investigator’s response as a conversation between the two of you. This conversation has a subject. It may go off in an unexpected direction. And it may expand to include other people.

In this article I will describe how to approach the important role of reviewer.

Who you are

My primary assumptions about you are twofold:

  • You do this for work (you are not a scholar doing peer review for free).
  • You follow your organization’s standard operating procedures (SOPs) and national or international regulatory guidance.

I also assume that you are the only reviewer. If you are the Quality reviewer, but the report has already undergone a thorough technical review described in your SOPs, then your review will be greatly reduced in scope (and probably much easier!).

Lastly I assume you are properly trained. You must not only be familiar with the procedures of your reviewer or Quality unit, but also with the procedures the investigators followed and with the science underlying their methods. If some calculations were done in Excel, you should be familiar with Excel. If some statistics were done in Minitab, you should be familiar with Minitab.

Gather your materials

You should have four things at a minimum:

  • The draft report
  • The protocol (plus applicable SOPs)
  • The raw data
  • Your checklists and procedures and your form or software for documenting findings


The following setup is optimal for me: the draft report is on the computer screen in front of me. The raw data and the protocol are on my left. My materials (such as a checklist, inspections performed during the study, and any other Quality forms) are on my right. These Quality materials may be software-based, in which case they should be open onscreen so you can document findings in real time, while reviewing.

Check for completeness

Before beginning, make sure everything is there so that you can plan for a thorough review with minimal handoffs (handoffs are where mistakes often happen).

Read the intro/summary/abstract

Even if this section is at the end of the report, it is a good thing to check before getting deep in the weeds.

Read the body of the report and make sure all the sections specified in the protocol/SOPs are present

Title Page/Cover Page

This should include a unique identifier, the name of the study, and the sample name and lot number (if applicable). Your organization’s logo or letterhead may be required as well. The investigators, the dates of Quality inspections, the date of finalization, and the Quality reviewer (if applicable) may appear here or on a signature page. The client (if applicable) and testing facility are listed here as well.

Table of Contents

Depending on the length of the report, a table of contents may be helpful. Ideally this will automatically update in Word. If changes are anticipated, consider verifying the table of contents at finalization to avoid duplicating your effort.


Abstract

Peer-reviewed reports will require an abstract. The requirements are very specific to the journal. Your organization probably does not require one.


Introduction

The introduction should basically be a high-level summary of what is in the protocol. If the writer briefly stated the scientific background, the goal of the current study, and the methods in simple language while still being descriptive, he or she has done a great job!

Some studies may state the overall conclusion in this section as well. If so, check it against the conclusion section below. If this section is labeled Purpose instead of Introduction it should correspond to the Purpose section of the protocol.

Materials and Methods

This section should mirror the protocol because the protocol is basically a promise or agreement about what will be done. The materials, such as reagents, instruments, and suppliers, may be in their own section or in tables interspersed with the text. Check each against the raw data as you go.

Pull up applicable SOPs as you review. Although these documents often read like work instructions, they may provide greater detail than the protocol about what was done. Try to avoid “filling in the blanks” though. The report should stand on its own.

Read for flow and for accuracy against what is described in the lab notebooks or worksheets. Check that all specifics, such as temperatures, weights, and incubation durations, match what was recorded. If there are any deviations, these may need to be addressed in the report. The need for a retest or an investigation may first be discovered here.

Raw data and report disposition

This section is important! The retention period of the report and raw data must follow regulations of the FDA, EPA, or other agencies. The report should describe where the raw data will be held so that clients or regulators can compare the report to it during an audit or investigation.

In general, the Quality review documentation you are completing right now will be stored with this package and for the same retention period.

Validity criteria

I like to see validity as a standalone section because it is so important. This and the following (evaluation/acceptance) deserve your full attention.

Validity is the soundness of the entire study. In an animal test for irritation, a valid determination may require that animals in a positive control group developed irritation as expected. In a chromatography study, a valid identification of a compound may require that the signal-to-noise ratio be at or above a certain threshold. In an ELISA test, a valid assay may require that the calibration curve is linear and not just a jumble of dots.

In routine assays that seldom have validity problems, this section can be easy to overlook. The text for this section may be canned. In your review, take a moment to look at validity. There may be one validity criterion or there may be several.
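To make this concrete, here is a minimal Python sketch of the idea that every validity criterion, whether there is one or several, must pass before the study is valid. The criteria names are invented for illustration; a real study's criteria come from its protocol.

```python
# Invented validity criteria for illustration only.
validity_criteria = {
    "positive_control_responded": True,
    "calibration_curve_linear": True,
    "signal_to_noise_ok": True,
}

# The study is valid only if every criterion passes.
study_valid = all(validity_criteria.values())
print(study_valid)  # True
```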

Without a valid assay, the investigator cannot make an evaluation (below). Or, he or she cannot make an evaluation without a compelling justification.

Evaluation/acceptance criteria

Evaluation/acceptance criteria are very important as well. They might be something specific such as “to pass, the white blood cell count must be within 4,500-10,000 white blood cells per cubic millimeter.”
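A specific criterion like that reads directly as a small check. The function below is a hypothetical sketch of the white-blood-cell example, not part of any real protocol:

```python
def wbc_count_passes(count_per_mm3):
    """Hypothetical acceptance criterion from the example above:
    4,500-10,000 white blood cells per cubic millimeter."""
    return 4500 <= count_per_mm3 <= 10000

print(wbc_count_passes(7200))   # True
print(wbc_count_passes(12000))  # False
```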

In a larger study, the evaluation may be much broader. The investigator may look at the totality of health observations, bloodwork, feed consumption, histopathology, and necropsy observations. In this case, a lot of judgement is involved on the part of the investigator. As a reviewer comparing the raw data to the report, you are verifying that the evaluation broadly matches what you see in the raw data.

In both cases, the evaluation should be based on what was described in the protocol. Because of its deciding nature, to alter it or introduce other evaluation criteria requires a protocol amendment or a deviation.


Results

Reviewing results may take up the bulk of your review. Depending on your SOPs you may check 100% of transcription or just do a spot check. The report may present every data point collected, or just summary tables. You should be checking that all of it is traceable to the correct place in the raw data, that any gaps are labeled so in the report tables, and that the text matches what is in the tables and what is in the raw data.

If there are voluminous data tables to check, you may choose to review these last so that you can read the report in one sitting while it is fresh in your mind.


Discussion

The discussion section is where the results are interpreted. It should connect to the introduction or purpose of the study and then move beyond. You should check it against everything that is applicable: the protocol, the results you have just reviewed, the evaluation and validity criteria, and the relevant citations. This section is free to be lengthy but it should not restate too much from elsewhere in the report. Instead, it should lean heavily toward context, synthesis and interpretation.

Depending on your training and role in the organization, you may be responsible for assessing the scientific rationale of the report. But a typical Quality reviewer will not be responsible for this. Instead, questions of scientific judgement are left to the investigator and his or her peers, who will have already reviewed the report or at least the associated raw data.


Conclusion

The conclusion is best kept brief! If it is lengthy, suggest in your comments that some of it be moved to the Discussion section.

Both short-and-simple and long-and-complex studies are free to have a one-sentence conclusion. As a reviewer, you should double-check that the conclusion matches the interpretation arrived at in the Discussion section and does not go beyond it.

Although the conclusion is one sentence, a word on scope and a qualifier are always scientifically justified. Consider the following one-sentence conclusion:

“Under the conditions of this protocol, the levels of endotoxin in the sample were found to be below the limit of detection.”

The investigator referenced the scope of the study (the current protocol and its limitations), provided a scientifically cautious qualifier (there could be endotoxin, but not enough to detect), and clearly stated the conclusion, all in one sentence.


References

For a non-peer-reviewed study, this section is not as important. It is the responsibility of the investigators to base their study on sound science. However, you should check that the resources cited exist, are cited correctly, and are accessible. If your organization is issuing reports based on papers from obscure journals from the distant past that can’t be found online or in your reference library, you might have a problem.

Check transcription

A 100% transcription check is an unambiguous requirement: it is clear to the reviewer what must be checked, and clear to the investigator that their transcription will be verified. After establishing a 100% transcription check requirement, you will start receiving higher-quality raw data by the time of review. This means less back-and-forth and less reprocessing of data (data transcribed incorrectly into Excel or statistical software produces incorrect results).
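As a sketch of what a transcription check verifies, assuming the values have already been pulled into lists (any real check happens against the raw records themselves, not lists like these):

```python
def transcription_mismatches(raw, transcribed):
    """Return (index, raw_value, transcribed_value) for every
    position where the transcription differs from the raw data."""
    return [
        (i, r, t)
        for i, (r, t) in enumerate(zip(raw, transcribed))
        if r != t
    ]

raw_values = [4.81, 5.02, 4.97]
typed_values = [4.81, 5.20, 4.97]  # 5.02 mistyped as 5.20
print(transcription_mismatches(raw_values, typed_values))  # [(1, 5.02, 5.2)]
```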

Spot-check calculations

Your organization may require spot-checks, or that a certain percentage of calculations be checked, or that 100% be checked. Generally, you will not need to check calculations done within statistical software. If possible, obtain a digital copy of any Excel spreadsheets used so that you can display the functions and check the calculations the easy way.
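A spot check amounts to recomputing a value independently and comparing it to the reported value within a tolerance. Here is a hedged Python sketch with an invented dilution calculation as the example:

```python
import math

def spot_check(reported, raw_reading, dilution_factor, rel_tol=1e-9):
    """Recompute a dilution-corrected result independently and
    compare it to the value reported in the spreadsheet or report."""
    expected = raw_reading * dilution_factor
    return math.isclose(reported, expected, rel_tol=rel_tol)

print(spot_check(24.6, 2.46, 10))  # True
print(spot_check(25.0, 2.46, 10))  # False
```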

See the investigator if needed

Consulting the investigator before your review is finished is a matter of judgement. Here’s what I mean: if you can’t find what you are looking for or can’t understand a calculation, you will definitely want to ask the investigator to clarify. However, if you identify a major deviation or some other issue, you want to ensure that this is documented and winds up in your quality system. You want each major finding to be documented and to have a documented response.

The same goes for minor errors. You don’t want to hand the report and raw data back and forth, over multiple review cycles, as the investigator addresses issues that come up during your review. Instead, there should be one efficient but thorough review of the complete package, followed by a write-up. Subsequent reviews should be brief, involving only signing and sending.

You might also consult a third party such as a statistician or manager.

Write it up

Organize your findings cohesively and fill out any checklists you may have. If applicable, categorize your findings by major and minor. If possible, each of your findings should describe what was found, reference the standard it is being compared to (the protocol, an SOP, etc.), and the consequences of not meeting the standard.

If the report is not acceptable or not acceptable without additional experimental work, make this clear. If a major deviation was identified, you should let the investigator know in person so they can resolve it quickly. Any required investigations should be closed and addressed in the report when the report comes back to you for finalization.

At finalization, see that all findings are addressed

When the report comes back to you for finalization, you will ideally only be reviewing minor changes and the addition of finalization dates and cover pages, logos, etc. You will want to ensure that all investigations are closed and that all your findings are addressed. Each finding should be closed out individually.

Do a final look-over

Finally, check the boring stuff! Make sure the pagination is correct and that all the pages are there. Be ready for third and fourth reviews as minor printing or electronic signature issues are sorted out.

Sign off and then thank the investigator for their hard work.


Some further reading

An excellent and much-shared article geared toward peer-reviewed papers. There is good advice in the comments section as well:


A well-organized breakdown of the elements of a scientific report from the University of Waikato (New Zealand):


A detailed guide to writing a discussion section from the University of Southern California:

How to review audit trails


The FDA regularly issues warning letters regarding data integrity. Now, the agency is encouraging companies involved in GMP to get on top of these issues with regular audit trail review. You may have read some FDA or EU publications relating to data integrity and audit trails and wondered how they may apply to your organization. This article will demystify audit trail review and provide you with an outline for your organization’s data integrity initiative.

What is an audit trail? What are some other data integrity terms?

Audit trail means “a secure, computer-generated, time-stamped electronic record that allows for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record. An audit trail is a chronology of the ‘who, what, when, and why’ of a record.”

The usefulness and power of the audit trail can vary quite a bit by system. In an older hemocytometer for example, you may find that there is no audit trail – the printout containing the time, date, result, and sample name is the extent of the data produced by the machine.

In the case of a newer analytical instrument such as a mass spectrometer, you will find that the audit trail is exhaustive and essentially unalterable. It will be a dynamic, all-inclusive record that displays information about every user and instrument action, in a human-readable format. It may be searchable by date, user, type of action, etc. It may even be in its own view-only, data manager module for ease of use. It may be exportable as an Excel file or printable as a PDF.

Another instrument may be intermediate in user-friendliness: you can display a page or two of printable, static information surrounding a specific file or sample, but cannot easily search or scroll.

Audit trails are part of a broader concept called metadata. “Metadata is the contextual information required to understand data.” For example, a set of data may be a list of compounds and their concentrations in a sample. The metadata surrounding this data would include the username, time and date, the data path (where the files are automatically saved), any error messages that came up, and any changes to the sample name. As outlined above, the extent of the metadata collected by the system can vary quite a bit.
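As a toy illustration of the distinction (every field name and value here is invented), the data and its surrounding metadata might be pictured like this:

```python
# Invented example: the "data" is the analytical result;
# the "metadata" is the context required to understand it.
record = {
    "data": {"caffeine_ug_per_ml": 12.4, "theobromine_ug_per_ml": 3.1},
    "metadata": {
        "username": "jdoe",
        "acquired": "2017-06-02T14:31:05",
        "data_path": "D:/projects/1234/run07/",
        "errors": [],
        "sample_name_changes": [("Sample 7", "Sample 07")],
    },
}
```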

An even broader concept is data integrity. This term refers to “the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).”

A concrete analogy

To make data integrity a little more concrete, consider how you treat crossouts on a paper record. Most crossouts are justified (mistakes happen). But when someone does make a crossout to update a raw data entry, your organization’s SOPs require them to address it. They do this by drawing a single line through the original entry (leaving it still legible). They then need to provide an explanation (such as “incorrect entry”) along with the identity of the person who did the correction, and the date of the correction.

In this way the date and recorder of the original entry, the date and recorder of the correction, and the reason for the correction are all documented. In addition the original entry remains legible for comparison to the corrected entry.
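The paper correction above maps naturally onto the "who, what, when, and why" of an audit trail entry. A sketch of that structure (illustrative only; real systems differ):

```python
# Illustrative audit trail entry mirroring a paper crossout.
audit_entry = {
    "who": "jdoe",
    "what": {"field": "incubation_temp_c", "old": 36.0, "new": 37.0},
    "when": "2017-06-02T14:31:05",
    "why": "incorrect entry",
}
# The "old" value stays in the trail, just as a single-line
# crossout leaves the original entry legible on paper.
```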

Data integrity refers to these same controls but in a digital format. And reviewing audit trails helps the Quality unit to review these deletions and changes in the same way they review them on paper.

Everyone in your organization plays a part in ensuring data integrity. Reviewing audit trails is part of that effort.

Where do I start?

Start by looking at the list of all computerized systems in your organization. If you have no such list, make one! A spreadsheet is adequate.

Classify each system by what regulated activity it is used for (GMP, GLP, etc.) and by its criticality. Document who the system administrator is (i.e., who controls access and sets up and deactivates users). Identify the subject matter expert who can answer your questions (see below). Sort the systems by criticality. Audit trail review is resource-intensive, so you will want to start with the critical ones.
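The inventory itself can be as simple as a list of rows. Here is a minimal Python sketch, with invented systems and an assumed three-level criticality scale, showing the sort that puts critical systems first:

```python
# Invented inventory rows; a spreadsheet with these columns works too.
systems = [
    {"system": "Balance logger", "use": "GMP", "criticality": "low",
     "admin": "IT", "sme": "Metrologist"},
    {"system": "HPLC data system", "use": "GMP", "criticality": "high",
     "admin": "IT", "sme": "Lab analyst A"},
    {"system": "Plate reader", "use": "GLP", "criticality": "medium",
     "admin": "Lab manager", "sme": "Lab analyst B"},
]

# Review the most critical systems first.
order = {"high": 0, "medium": 1, "low": 2}
systems.sort(key=lambda s: order[s["criticality"]])
print([s["system"] for s in systems])
```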

Do a brief, informal review with a regular user of the system. See if you can easily access the audit trail from the main screen of the program or if you need to log into a separate module such as a data manager, security module, or report mode.

Audit trail functionality may have been documented in the validation. Check the validation paperwork for hints on how to access it. Check the manufacturer’s materials as well.

Consider the feasibility and resource-intensiveness of audit trail review for each system. Does reviewing it mean booting someone off the terminal? Is there a limited number of user licenses? All these considerations will influence the requirements you define in your data integrity SOP.

Take notes and write down any questions to address during a more detailed review to follow. Update the spreadsheet you started.

An assessment of each system is needed

This involves a more detailed assessment. A member of the Quality unit will sit down for about an hour with one subject matter expert in the software/instrument. In advance, you will want to come up with a template Word document with items to address. The idea is to get accurate information, including screenshots, to help with training and crafting the SOP. Topics may include the following:

What type of audit trail is it?

• Is it a dynamic event log, capturing every action automatically? Or does it just display the parameters of the individual run?
• Can the reviewer select a date range?
• Can he or she search by user, batch or project number?
• Are there any error messages for the subject matter expert to look into? Are there error messages that qualify as red flags for the reviewer?
• Does the audit trail capture when a user edits a sample name or lot number? When critical parameters are changed? When background scans are re-run? When the results from standards are rejected?
• What do manipulations and deletions look like? Is there a way to compare versions of a file?
• In analytical chemistry, is there an audit trail for acquisition of data, but no audit trail for processing of data? In chromatography, is there a way to compare integrations of peaks? What does excessive integration look like, and is this something the Quality unit should review?
• Compare records from a recent project or run to what you see onscreen in the audit trail. Go step by step and write down some typical audit trail events.

Don’t duplicate the effort of the original validation. But do look at the system through a data integrity mindset.

You may find that routine audit trail review opens new conversations about a system. Although an audit trail cannot be “corrected,” review of the audit trail may point to deficiencies elsewhere in the project’s documentation.

For example, if one person signed the batch record, lab notebook or worksheet, but the audit trail shows a different user (such as a trainer) performing that procedure, then both signatures should be in the documentation.

Another example is retesting or reprocessing: if the audit trail clearly shows a retest, but this was not documented, then this should be addressed in the documentation before Quality unit approval of the record.

For legacy systems, make sure any deficiencies are documented and that alternative data integrity measures are in place

You may look at an older instrument or software and find it is not 21CFR11 compliant, or not compliant in the way you thought it was. You can add a brief risk assessment to the validation paperwork with a memo referencing the new audit trail review procedure that is being put in place.

Look to the manufacturer’s compliance software

If the software is older, ask the vendor about updates. They may have released a data security module that you can add to the software. If there is no audit trail functionality, the system may still be Part 11 compliant. Don’t worry – audit trail functionality is not an absolute Part 11 requirement.

Get access for the QA reviewer for each system

Once this assessment is complete, get the Quality reviewer access to each system.

Some software will have a reviewer (read-only) mode. This is ideal because the reviewer will not be able to accidentally delete a file or alter a method. If the Quality reviewer is a standard user, that’s fine too!

Efficiency side note: to avoid duplication of effort, the periodic review can be reduced somewhat in scope.

Although audit trails will now be reviewed routinely, the periodic review is still important because this is where failed login attempts, periodic vendor maintenance, and changes to the overall method are captured. Keep in mind the risk-based approach.

Write the SOP!

Reference information technology SOPs, validation SOPs, ethics and training SOPs, and Quality review SOPs. Make sure it has caveats that address the range of software at your company. This procedure should involve organization-wide training. Having an executive or director champion it would be very valuable. Everyone should know what is expected of them with respect to data integrity.

Revisit this procedure in a year

To follow up, continue reviewing FDA warning letters for the agency’s thinking on data integrity matters. Distribute the pertinent letters to your team. Connect the audit trail reviewers with those involved with equipment/software validations so that audit trails are set up and understood proactively. Even better, ask for the reviewers’ input on the next set of compliance software that the vendor is trying to sell you.

A year after effectivity, revisit this procedure and see how it’s going:

• Is it too impractical? Are reports being delayed because the reviewer can’t get into the system while others are logged on?
• Is system access an issue? Does only one person in the Quality unit have the needed access?
• If you have technical data reviewers and a Quality reviewer, are they duplicating each other’s review? It can be hard to separate a technical review from a procedural review. Perhaps only one group should review.
• Look at the overall value of the project. If you find that reviewing audit trails in the way recommended in the FDA’s draft guidance is not a value-added step, let the FDA know! Now is the time to comment before the draft guidance becomes a requirement.

Lastly, take the long and broad view. Consider audit trail review to be one of many tools in your organization’s data integrity efforts. Keep in mind that other organizations are grappling with these issues as well, and there are no experts out there who have all the answers. You will have to treat data integrity as an ongoing commitment, with every data integrity procedure open to change, optimization and improvement.

If you have had successes, failures or questions in your audit trail efforts, I’d love to hear about them!


Explore further

Data Integrity and Compliance With CGMP Guidance for Industry (draft guidance). The FDA published this in April 2016 as draft guidance. But as we know, you need to get ahead of the guidance!



This FDA slideshow, released a year later, provides some helpful elaboration on the guidance:



This article provides a concise summary of the challenges of starting an audit trail review process and the importance of a risk-based approach based on an assessment of each system. The link opens a PDF:


How to validate a spreadsheet

Over the years, the spreadsheets circulating in a testing or manufacturing company can become a real zoo. This presents a problem that regulators and clients alike will notice. It is important to take a risk-based approach and validate your spreadsheets according to their use.

A spreadsheet used for dosing, dilutions, or reporting data may be a “legacy document”: a digital file developed years earlier by someone who may no longer be employed with you. It may have been developed in a previous version of Excel. Its workings may include macros that the current users only vaguely understand. Or it may have functions and hidden rows that have not been examined in years.

Although configurable off-the-shelf spreadsheet software such as Excel has many good security and data integrity features, you must validate spreadsheets according to a defined procedure. Regulators and clients increasingly expect this. This article will show you how to get started.

Define the three responsibilities involved: developer, verifier, and quality assurance


The developer is the person who creates the spreadsheet or who initiates the validation of an existing spreadsheet. This person will be familiar with the use of the spreadsheet and with Excel (or whatever spreadsheet software you use).

The developer is responsible for understanding the requirements in the SOP that defines the development, use and control of spreadsheets. He or she will complete a form, ideally a checklist, that refers to each of these requirements. He or she will also document the spreadsheet’s inputs, processing, and outputs for verification (see below).

This is the lengthiest and most involved step of the three. Making it easier is your job. You can help by making sure the checklist is user-friendly and flows nicely with the SOP!


Like the developer, the verifier will be someone who is familiar with Excel and with the context in which the spreadsheet will be used. For example, this person may be a technician who uses the spreadsheet every day, a peer of the developer, or an Excel expert within the company.

To save space on the form and to show the verification clearly, you can simply include an additional column where the verifier adds a check mark indicating each requirement was met.

Another term for verifier is simply technical reviewer.


When this verification step is done, QA can serve as the final sign-off. This may include spot-checks, format review, a wiping of personal identifying information and comment history, and administering a final document password that will only be distributed to authorized users.

If problems come up during the spot check, the QA reviewer will take a closer look and possibly reject the document, kicking it back to the developer, who must address the issues and have the spreadsheet verified again. The QA reviewer should not sign until all issues have been resolved.

When these three people have done their work, you have a validated spreadsheet. And the overall process is called the spreadsheet validation. (You may use different terms than what I have described, of course.)

Break it down into three steps: inputs, processing, and outputs

I recommend this approach because it reminds you to visualize the data flow in three parts. On the left is raw data such as a column of animal weights in grams.

On the right are the outputs, such as the required dose of a test article extract in milliliters.

In the middle is the processing. In our example, the individual animal’s weight is multiplied by a specific factor (according to the protocol or SOP) and then presented in the correct unit. The spreadsheet may also provide the group’s average weight, highlight any animals whose weight is outside a defined range, and present the highest and lowest weights for inclusion in the final report.

What goes on during the processing is like a black box. The spreadsheet validation explains, verifies and documents this processing so that independent reviewers can understand the spreadsheet’s purpose and be assured of the quality of the data that comes out. It brings the black box out into the light!
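The weight-to-dose example above can be sketched in a few lines of Python. This is a minimal illustration of the inputs, processing, and outputs model, not a replacement for validating the spreadsheet itself; the dose factor and acceptable weight range are invented for the example, and in practice your protocol or SOP defines the real values.

```python
# Hypothetical values for illustration only -- your protocol defines the real ones.
DOSE_FACTOR_ML_PER_G = 0.05   # assumed: 0.05 mL of extract per gram of body weight
WEIGHT_RANGE_G = (250, 400)   # assumed acceptable weight range in grams

def process_weights(weights_g):
    """Inputs: raw animal weights in grams.
    Outputs: per-animal doses (mL), group average, extremes, out-of-range flags."""
    # Only the final values are rounded, per the rounding requirement.
    doses_ml = [round(w * DOSE_FACTOR_ML_PER_G, 2) for w in weights_g]
    average_g = round(sum(weights_g) / len(weights_g), 1)
    out_of_range = [w for w in weights_g
                    if not WEIGHT_RANGE_G[0] <= w <= WEIGHT_RANGE_G[1]]
    return {
        "doses_ml": doses_ml,
        "average_g": average_g,
        "min_g": min(weights_g),
        "max_g": max(weights_g),
        "out_of_range": out_of_range,
    }

result = process_weights([300, 320, 280, 410])
print(result["doses_ml"])      # [15.0, 16.0, 14.0, 20.5]
print(result["out_of_range"])  # [410]
```

Writing the processing out this plainly is exactly what the validation form's free-text description should achieve for an independent reviewer.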

Define any additional requirements

The above are very general requirements that apply to all spreadsheets. Depending on use, some more specific requirements might include:

  • All cells except those needed to enter data or sample information are locked down. This one is very important!
  • Data validation is enabled. In other words, nonsensical entries are not permitted or will result in an error message.
  • Conditional formatting is appropriate and user-friendly. It may aid study interpretation or flag validity issues per the protocol, so correct conditional formatting is important. Note that conditional formatting rules are not displayed by Excel’s Show Formulas function.
  • Drop-down lists are in the desired order.
  • Rounding, significant figures, and display of digits are appropriate. Only the final value is rounded (there is no averaging of averages, for example).
  • Input cells display the value exactly as entered. This helps reviewers who must check transcription.
  • There are places to enter sample information. There is a place to enter the user’s name and date (or, better, a function that does this automatically).
  • The spreadsheet has been checked for robustness. Expected and extreme values were tried. You attempted to “break” it.
  • Personal identifying information has been wiped using the “inspect document” function in Office 2016.
  • The print preview was inspected. Saving, copying, and pasting data work correctly. The font matches the final report table where the results will be pasted, if applicable.
  • Pagination, logos and the form identifier are included and correct.
  • Printouts are included in the validation paperwork with all the formulas, row numbers, and column numbers displayed.
  • This is optional: an additional sheet has been included that helps users understand and use the spreadsheet. (e.g. What raw data do I enter and when? When do I print it? What secure network folder do I save it in?) We like to think every user will read and understand the SOP. But in reality this sheet may be what they rely on!
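The “try to break it” robustness check in the list above can be illustrated with a short sketch. Here the processing step is a hypothetical C1V1 = C2V2 dilution calculation written as a function; in practice you would exercise the spreadsheet itself with the same mix of expected, extreme, and nonsensical inputs and confirm it fails loudly rather than returning garbage.

```python
# Hypothetical dilution calculation for illustration -- not from any SOP.
def dilution_volume_ml(stock_mg_per_ml, target_mg_per_ml, final_volume_ml):
    """Volume of stock needed: C1*V1 = C2*V2, solved for V1."""
    if stock_mg_per_ml <= 0 or target_mg_per_ml <= 0 or final_volume_ml <= 0:
        raise ValueError("all inputs must be positive")
    if target_mg_per_ml > stock_mg_per_ml:
        raise ValueError("target concentration exceeds stock concentration")
    return target_mg_per_ml * final_volume_ml / stock_mg_per_ml

# Expected value: 10 mg/mL stock diluted to 1 mg/mL in 100 mL needs 10 mL.
assert dilution_volume_ml(10, 1, 100) == 10.0

# Extreme and nonsensical inputs should be rejected, not silently computed.
for bad in [(0, 1, 100), (10, -1, 100), (10, 20, 100)]:
    try:
        dilution_volume_ml(*bad)
        print("NOT CAUGHT:", bad)
    except ValueError:
        pass  # rejected as expected
```

This is the same spirit as Excel's data validation requirement above: nonsensical entries should produce an error, never a plausible-looking wrong number.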

Create a form where the three responsible groups document their work.

There should be a space for free text where the developer can describe the inputs, processing, and outputs. There should be a checklist for any additional requirements, and space to address the ones that are not applicable. There should be a clear separation of the developing from the checking.

The form should be comprehensive in case the developer and verifier do not actually read the spreadsheet validation SOP (we all know this happens). Finally, in exchange for asking busy people to do this validation, you should make sure the form is easy to use!


Some further reading

Surprisingly, the FDA provides very little guidance on spreadsheet validation:



However, the FDA has issued warning letters that cite spreadsheet deficiencies, including spreadsheets whose functions are not locked down:



Two more good resources:



Independence: the importance of separating the checking from the doing

Independence is a prominent theme in the field of Quality. At its heart is the separation of doing and checking.

The example of the Hubble Space Telescope

Why is independence important? Consider the Hubble Space Telescope. It is a major achievement of science. But the first images it returned were found to be blurry! After a costly spacewalk and replacement, the problem was fixed. A fascinating 1990 NASA report pointed to why:


“The Perkin-Elmer plan for fabricating the primary mirror placed complete reliance on the reflective null corrector as the only test to be used in both manufacturing and verifying the mirror’s surface with the required precision.”


Basically, the instrument used to test the mirror was the same one used to manufacture it. In fact, cruder (but more independent) tests had detected the same error, but these results were disregarded.


This report clearly implicates the lack of independent checking. The instrument, although exquisitely precise, contributed to this costly error because it was used for both manufacturing the mirror and for verifying it.


The report touches on other Quality concepts as it goes on to say, “the engineering unit responsible for the mirror was insulated from review or technical supervision and were unaware that discrepant data existed, and were subject to great concern about cost and schedule, which further inhibited consideration of independent tests.”

Reasons for independent review

Building a space telescope can seem simple compared to the dynamics of a typical workplace. The following are some more “human-factors” reasons for independent review.

Dumb mistakes

An independent reviewer can catch dumb mistakes. In drafting a scientific report, for example, the writer can become blind to certain errors that are readily apparent to a reviewer. Each draft is so subtly different from the previous one that the final draft may have drifted: it may have become too wordy, or been dragged along by the author’s interpretation so far that the protocol and predefined endpoints are underemphasized or not addressed at all.


The reviewer will also notice more prosaic omissions right before finalization such as a missing company watermark, a missing part of the header, or the wrong formatting.

Context-free understanding

Reviewers in scientific and technical fields can have varying levels of independence from the author. Some may be within the same company but in a different department. Some may be contracted. Some may be peers with the same title as the author but a different specialty.


In each of these arrangements, the reviewer has the advantage of looking at the report or product from a perspective that is closer to that of the customer. The reviewer will question confusing technical jargon. The reviewer will avoid “filling in the gaps” with their own knowledge.


Most importantly, an independent reviewer will apply a standard (such as a checklist) during their review instead of relying on their expertise, however extensive it may be, in the subject. If this standard is valid (see below) and matches the customer’s requirements and any other pertinent requirements, then problems will be identified well in advance of the product being released to the customer.

The independent reviewer will be more familiar with the requirements

Understanding the customer’s requirements

Members of a well-organized Quality program will regularly review client feedback, client audits, and returned product. Ideally they will be involved in requirements elicitation, where sales and design people define exactly what the customer wants. These customer requirements are then translated into the standard that the product is checked against during the review.

Understanding business requirements

Ideally, scientists and technical personnel involved in testing and interpretation of results will be insulated from customer pressures. This should apply to the reviewer as well. But the reviewer may be better versed in the business environment in which the research, testing or manufacturing is occurring.

Understanding other requirements such as regulations and standards

Although scientists are usually enthusiastic about learning the best techniques and methods, there can be a great amount of drift from best practices. To obtain consistent results in a particular assay, an entire lab may rely on a technique developed 30 or 40 years previously and defined in a single published paper. Without pressure from a Quality group or other reviewers, they may never get around to updating their methods to current standards.


In fact, in many organizations the Quality program includes a regulatory affairs unit. This is because Quality reviewers check the product against the standard. The regulatory affairs unit keeps the correct and current standards on file and available to the scientists. When there is a gap between the current methods and the standards, members of the Quality group document this and ask for justification. If the gap is too large, the physical products, validations, protocols, standard operating procedures, and final reports may be rejected, and not accepted until those gaps are closed.

Applying a validated standard

This idea is crucial and warrants its own article. Suffice it to say that an independent reviewer, because they are approaching someone else’s work, is more likely than the originator/author/producer to use a standard that has been demonstrated to match requirements.


I will expand on the role of standards in Quality later.

An independent reviewer is also insulated from customer pressures

This is not to say they will impose undue delay, rather that they are less likely to be influenced by pressures outside the predetermined standards they use during review.


You will find something similar in the editorial policy of a news organization: the sales team, which sells advertising space to businesses such as car dealerships, will not have extensive involvement with the editorial team, which may be very vocal about air pollution, traffic deaths and climate change.


If the independent reviewer adheres to clear, written requirements, and the originator is familiar with these requirements, then problems are more likely to be fixed upstream, in advance of the review. Even problems that the originator could have easily hidden from the customer will have been addressed, because of the likelihood that they will be discovered during review.

Suggestions for your organization

Independence is difficult to define and implement. Here are some open-ended questions to ask of your organization.

Is your Quality unit too involved in the nitty-gritty of things?

Do you remain the final check on most processes before release or finalization? When you review something, are things mostly polished and complete? Or does your Quality role involve identifying minute corrections, which then require rework? Keep in mind that the more errors there are in the product submitted to the Quality group, the more likely there is to be further rework and further review handoffs.


To fix errors further upstream, at lower cost, consider introducing a technical data review unit that does these checks. QA can implement spot checks and other verifications that the technical data reviewers’ processes are in place and that the defined reviews were done. Consider doing this for the most error-prone processes first. For other processes that are found to be running smoothly, the number of checks and reviews can be reduced.

Are you poised to adapt, wherever the company goes in terms of growth?

You are responsible for timely reviews so you do not want to be a bottleneck at the end of the process. If the Quality review often triggers investigations, rework and holds that delay the release of the product, then it is time to add a technical review upstream.


You want to be able to grow with the organization. If your business expands 10% more than predicted in one year, the Quality group should be able to absorb that, and decide later on whether it should expand too. An independent approach to review allows management to control how much the reviewers’ work will grow and change with the growth of testing or production.

Do your Quality people have routine, repetitive work that can be delegated to a technician instead?

An example may be temperature and humidity monitoring, or sampling for bacteria in the water system. If these are routine and repetitive, a technician can quickly be trained to do them. The QA personnel can then sign off on these checks and provide management with assurance that the process is under control. While checking these logs and reports, the Quality group may find they have the time to define new processes for better trending and reporting as well.
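As a minimal sketch of the kind of trending check a Quality group might layer on top of a technician's log once the routine recording is delegated, here is a simple excursion flag. The 2–8 °C refrigerator specification and the readings are assumed for illustration.

```python
# Hypothetical storage specification -- your equipment qualification defines the real one.
TEMP_LIMITS_C = (2.0, 8.0)

def flag_excursions(readings_c):
    """Return (index, value) pairs for readings outside the specified limits."""
    lo, hi = TEMP_LIMITS_C
    return [(i, t) for i, t in enumerate(readings_c) if not lo <= t <= hi]

daily_log_c = [4.1, 4.3, 5.0, 8.4, 4.9]  # assumed technician-recorded values
print(flag_excursions(daily_log_c))       # [(3, 8.4)]
```

A check like this frees the Quality reviewer to look at trends and near-limit drift rather than re-reading every entry by hand.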

Do you have a way to address drift?

Drift must be addressed periodically. There is no tried and true way to do this, but a couple of things may help: periodic reviews, regular looks at industry best practices, and the normal churn that results from new people being hired from other companies and longtime employees departing.


That last one – churn – is important. Often a longstanding but flawed practice goes unquestioned until a new employee says, “You know, at my old job we did it this way…”


When someone says this, listen!

Some further reading

A costly error partly caused by checking with the same instrument that was used for doing:

The Hubble Space Telescope Optical Systems Failure Report


What is a quality assurance unit?


[This article pertains to nonclinical laboratory studies.]


Suppose you were just hired in a Quality role (QA specialist, tester, auditor, inspector, reviewer, technical data specialist, etc.). During your orientation and training people bandy about such acronyms and initialisms as the GLPs, the OECD, the QAU, and 21 CFR 58.


You probably have questions. Are you supposed to know all these regulations and guidance documents in addition to training on your organization’s protocols and standard operating procedures? Not to mention the underlying science of the testing your company does? What is the history of this kind of role? Where does your role fit in with the overall business and with the science and technology going on around you?


First, take a deep breath and remind yourself that you don’t need to figure it out on your own. This article will give you an overview of the QAU from the United States regulatory perspective, the international guidance, and some organizational particulars. Finally, you’ll find links to help you learn further on your own.

US FDA regulations

Take a few minutes to read the Quality Assurance Unit (QAU) section of 21 CFR 58. Do it. It’s several concise paragraphs that are free of jargon:




What you find is that the QAU is one of three very important groups of personnel defined in the regulations:


  • Test facility management
  • The study director
  • The QAU


The section on the QAU is the lengthiest of the three.


Five things stand out. The QAU:


  1. Monitors each study to assure management that facilities, equipment, personnel, methods, practices, records, and controls are in conformance with the regulations in this Part.
  2. Inspects each nonclinical laboratory study at intervals adequate to assure the integrity of the study.
  3. Determines that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation.
  4. Reviews the final study report to assure that the methods and results are true to the standard operating procedures and raw data.
  5. Prepares and signs a statement to be included with the final study report which specifies the dates inspections were made and findings reported to management and to the study director.


Scanning the above for keywords, you see such action terms as monitoring, inspecting, reviewing, and reporting, all of which you will do as a Quality professional.


Consider this a touchpoint. As you learn and grow in Quality and come to take on decision-making responsibilities, revisit 21 CFR 58 occasionally. Read the Guidance for Industry document (linked below) for questions raised by other professionals and answers provided directly by the FDA.

International guidance

The Organisation for Economic Co-operation and Development (OECD) is not a governmental body, but it is composed of representatives of governments.


The OECD published its own GLP principles, and these align for the most part with the FDA regulations. Foreign governments can hold organizations to these OECD standards. Reading them can provide you with an international perspective on the sometimes jargon-filled US regulatory environment.


Some individual differences between the FDA and the OECD GLP requirements regarding the QAU follow:

  • There is more specific language on how the Quality Assurance Programme should be carried out by individuals designated by and directly responsible to management and who are familiar with the test procedures.
  • There is more specific language on how there must be clear lines of communication in a multi-site study between the Study Director, Principal Investigator(s), the Quality Assurance Programme and study personnel.
  • There is language on ensuring that the Quality Assurance personnel have a copy of the study plan and any amendments in a timely manner and that the study director will communicate effectively with the Quality Assurance personnel as required during the conduct of the study.

Particulars of an organization

How your organization meets these QAU requirements will vary quite a bit.


Take the example of “inspecting the study.” You may decide that every study gets inspected at an interval of one week. You might decide that these inspections will occur during critical phases such as preparation or dosing, where mix-ups and mistakes are highly consequential.


The reporting of the inspection to the study director and management could be a paper form, or an online tracking system. You may decide that only deviations from the protocol get reported. Or you may inspect the raw data accumulated so far very thoroughly, with the goal of finding and fixing problems as far upstream as possible.


After getting comfortable with your procedures, you might suddenly have an executive, Lean project team, or consultant come in who brings new ideas and revolutionizes the way you do things.


Although the particulars may vary, every company is under constant pressure from regulators, clients, and their own leadership to improve their QAU’s effectiveness and continue the upward spiral of quality.

New Quality software and other techniques

Your QAU will use third-party software to do its job. Some examples of quality management software include:

  • MasterControl
  • uniPoint Quality Management
  • TrueERP
  • QT9 Quality Management
  • Verse Solutions


This list will only grow. You will need to master one, two, or more of these systems at the user level. You may need to train others in your organization in how to use them. You may need to work closely with the software vendor to get the specific functionality your organization needs. Finally, as you gain decision-making authority, you will need to be a smart buyer and decide which quality management software is worth the price.

What next?

The particulars of the QAU are constantly changing as business and technology change. But the underlying principles of independence, inspection, thorough review, documentation and reporting will remain the same. Some of the GLP regulations we follow go back to 1979. But the software we use to assure quality may have just been invented a year ago.


Be prepared to continually question the way your QAU operates and grows with the rest of the organization. As always, if you found this article insightful (or found it to be the opposite), let me know!

Some further reading

Read about the astonishing scientific fraud scandal that led to the development of the US GLP regulations:



FDA 21 CFR 58, Good Laboratory Practice for Nonclinical Laboratory Studies



FDA Comparison Chart of FDA and EPA Good Laboratory Practice (GLP) Regulations and the OECD Principles of GLP



FDA Guidance for Industry Good Laboratory Practices Questions and Answers



OECD Quality Assurance and GLP guidance



Why work in Quality?

Quality is a profession at the heart of our advanced, industrialized, information society. It is also a field that is full of opportunity. Job boards are peppered with postings for quality managers, quality assurance specialists, quality auditors, quality control technicians, and other roles. Although academic coursework in Quality is in its infancy, opportunities in business are abundant.


Read on to find out more about the field of Quality. As you read, think about where you and your mix of interests and skills might fit in.

Everyday examples

Reflect for a moment on some everyday things: the marvelous automotive machinery that gets you around town. The safe and effective pharmaceuticals that you take for common or more serious ailments. The flawless email, word processing, and photo backup provided by Google free of charge. The item you ordered online (after Amazon suggested it to you) with a click or two that arrived the next morning.


These consumer objects are at the customer-facing end of highly complex systems. Underlying it all are quality systems that are built and continuously reinvented in order to meet the customer’s requirements and to deliver delight.


This last term – delight – might be unexpected. But it provides perspective on the endgame here. In a competitive marketplace, a company gets an advantage by delivering customer delight, by doing it consistently, and by doing it for a lower price than its competitors.


Amazon again provides a good example. The online giant has continually improved its shipping times while keeping prices low. It adds new perks to its Prime loyalty program regularly. And it has a customer-focused return and refund policy that continues to earn praise from consumer advocates and commentators.


It’s no wonder then that 49% of US households hold an Amazon Prime subscription. Amazon’s astonishing sales and membership numbers will continue to grow as long as the company keeps delighting its customers.


Things can go wrong

On the other hand, we have all had negative experiences with these same companies and their peers. I once read with astonishment how a man lost almost all his personal data when his Google (!) cloud backup somehow failed. I have had frustrating and fruitless email exchanges with third-party Amazon sellers when my packages never arrived (Amazon eventually reimbursed me). I received a surprise bill from my dentist when they performed an x-ray I wasn’t due for under my insurance plan. And I have gotten scary letters from at least two large financial companies apologizing for a hacking breach of my personal and account information.


Sometimes it’s small things and near-misses that spook you. A slip-up can shake your confidence in a company you normally trust and cause you to permanently take your money elsewhere. I recall with horror a time when I worked in a hospital emergency room and the nurse read and took notes in a patient’s paper chart for several minutes before realizing it was the wrong one. The nurse apologized. The patient expressed his totally justified dismay. In response the nurse said, “Yeah, it happens” and continued in the correct chart. I would not blame the patient if he chose to write a negative Yelp review of the hospital, bring the incident to hospital management, and never use that clinic and hospital system again. After all, what if he had been given drugs or another invasive treatment based on someone else’s chart?


Consider some other high-profile quality issues: the Takata airbag deaths and subsequent recalls, the Equifax security breach, the Wells Fargo unauthorized account scandal, etc. These are all quality issues, some of them extremely serious.


Clearly, even the big players, with the slick websites, high-tech equipment and highly paid staff, don’t have it completely figured out.


Bringing it back to you personally

So how does this relate to you, as a student, a job seeker, a technician, a supervisor, or whatever?


Start by asking yourself if any of what I described above appealed to you. Does a commitment to continuous improvement match your values? Are you interested in eliminating waste (wasted resources, wasted time, and wasted human potential)? Are you interested in systems thinking? Do you consider yourself perceptive, thorough, candid, courageous, and helpful to others?


This mix of values, traits, and talents is the foundation of a career in Quality.


Read through my other articles and contact me with questions and comments.