How to review a scientific report

Reviewing a scientific report is a great privilege. You are among the first to see new knowledge without having to do the sometimes-tedious investigative work itself. And if you’re like me, you might feel the joy of learning something new.

When reviewing, I encourage you to do two things. Firstly, commit to applying a method, just as the investigators applied a method when conducting their study. Secondly, consider the review, your comments, and the investigator’s response as a conversation between the two of you. This conversation has a subject. It may go off in an unexpected direction. And it may expand to include other people.

In this article I will describe how to approach the important role of reviewer.

Who you are

My primary assumptions about you are twofold:

  • You do this for work (you are not a scholar doing peer review for free).
  • You follow your organization’s standard operating procedures (SOPs) and national or international regulatory guidance.

I also assume that you are the only reviewer. If you are the Quality reviewer, but the report has already undergone a thorough technical review described in your SOPs, then your review will be greatly reduced in scope (and probably much easier!).

Lastly I assume you are properly trained. You must not only be familiar with the procedures of your reviewer or Quality unit, but also with the procedures the investigators followed and with the science underlying their methods. If some calculations were done in Excel, you should be familiar with Excel. If some statistics were done in Minitab, you should be familiar with Minitab.

Gather your materials

You should have four things at a minimum:

  • The draft report
  • The protocol (plus applicable SOPs)
  • The raw data
  • Your checklists and procedures and your form or software for documenting findings


The following setup works best for me: the draft report is on the computer screen in front of me. The raw data and the protocol are on my left. My materials (such as a checklist, inspections from during the study, and any other Quality forms) are on the right. These Quality materials may be software-based, in which case the software should be open onscreen so you can document findings in real time while reviewing.

Check for completeness

Before beginning, make sure everything is there so that you can plan for a thorough review with minimal handoffs (handoffs are where mistakes often happen).

Read the intro/summary/abstract

Even if this section is at the end of the report, it is a good thing to check before getting deep in the weeds.

Read the body of the report and make sure all the sections specified in the protocol/SOPs are present

Title Page/Cover Page

This should include a unique identifier, the name of the study, and the sample name and lot number (if applicable). Your organization’s logo or letterhead may be required as well. The investigators, the dates of Quality inspections, the date of finalization, and the Quality reviewer (if applicable) may appear here or on a signature page. The client (if applicable) and testing facility are listed here as well.

Table of Contents

Depending on the length of the report, a table of contents may be helpful. Ideally this will automatically update in Word. If changes are anticipated, consider verifying the table of contents at finalization to avoid duplicating your effort.


Abstract

Peer-reviewed reports will require an abstract. The requirements are very specific to the journal. Your organization probably does not require one.


Introduction

The introduction should basically be a high-level summary of what is in the protocol. If the writer briefly stated the scientific background, the goal of the current study, and the methods in simple language while still being descriptive, he or she has done a great job!

Some studies may state the overall conclusion in this section as well. If so, check it against the conclusion section below. If this section is labeled Purpose instead of Introduction it should correspond to the Purpose section of the protocol.

Materials and Methods

This section should mirror the protocol because the protocol is basically a promise or agreement about what will be done. The materials, such as reagents, instruments, and suppliers, may be in their own section or in tables interspersed with the text. Check each against the raw data as you go.

Pull up applicable SOPs as you review. Although these documents often read like work instructions, they may provide greater detail than the protocol about what was done. Try to avoid “filling in the blanks” though. The report should stand on its own.

Read for flow and for accuracy against what is described in the lab notebooks or worksheets. Check that all specifics, such as temperatures, weights, and incubation durations, match what was recorded. If there are any deviations, these may need to be addressed in the report. The need for a retest or an investigation may first be discovered here.

Raw data and report disposition

This section is important! The retention period of the report and raw data must follow regulations of the FDA, EPA, or other agencies. The report should describe where the raw data will be held so that clients or regulators can compare the report to it during an audit or investigation.

In general, the Quality review documentation you are completing right now will be stored with this package and for the same retention period.

Validity criteria

I like to see validity as a standalone section because it is so important. This and the following (evaluation/acceptance) deserve your full attention.

Validity is the soundness of the entire study. In an animal test for irritation, a valid determination may require that animals in a positive control group developed irritation as expected. In a chromatography study, a valid identification of a compound may require that the signal-to-noise ratio be at or above a certain threshold. In an ELISA test, a valid assay may require that the calibration curve is linear and not just a jumble of dots.
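A linearity criterion like the ELISA example can be checked numerically. The sketch below fits a calibration line by least squares in plain Python and compares the coefficient of determination against an assumed acceptance threshold; the concentrations, absorbances, and threshold are all illustrative, not from any real protocol.

```python
# Hypothetical validity check: linearity of an ELISA calibration curve.
# All numbers and the threshold are illustrative.

def r_squared(xs, ys):
    """Coefficient of determination for a simple linear fit of ys on xs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Standard concentrations vs. measured absorbance (illustrative values)
conc = [0.0, 25.0, 50.0, 100.0, 200.0]
absorbance = [0.05, 0.22, 0.41, 0.79, 1.58]

LINEARITY_THRESHOLD = 0.99  # assumed validity criterion
valid = r_squared(conc, absorbance) >= LINEARITY_THRESHOLD
```

A real system would pull these values from the raw data; the point is only that a validity criterion should be concrete enough to check.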

In routine assays that seldom have validity problems, this section can be easy to overlook. The text for this section may be canned. In your review, take a moment to look at validity. There may be one validity criterion or there may be several.

Without a valid assay, the investigator cannot make an evaluation (below). Or, he or she cannot make an evaluation without a compelling justification.

Evaluation/acceptance criteria

Evaluation/acceptance criteria are very important as well. They might be something specific, such as “to pass, the white blood cell count must be between 4,500 and 10,000 cells per cubic millimeter.”

In a larger study, the evaluation may be much broader. The investigator may look at the totality of health observations, bloodwork, feed consumption, histopathology, and necropsy observations. In this case, a lot of judgement is involved on the part of the investigator. As a reviewer comparing the raw data to the report, you are verifying that the evaluation broadly matches what you see in the raw data.

In both cases, the evaluation should be based on what was described in the protocol. Because of its deciding nature, to alter it or introduce other evaluation criteria requires a protocol amendment or a deviation.
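As a toy sketch, a specific criterion like the white-blood-cell range mentioned earlier can be expressed as a simple pass/fail check. The range, sample names, and values here are illustrative:

```python
# Hypothetical acceptance check against a protocol-specified range.
# The range and sample results are made up for illustration.

WBC_RANGE = (4500, 10000)  # cells per cubic millimeter (assumed criterion)

def passes(value, low_high):
    """Return True if the measured value falls within the inclusive range."""
    low, high = low_high
    return low <= value <= high

results = {"sample-01": 7200, "sample-02": 11300}  # illustrative raw data
dispositions = {name: passes(v, WBC_RANGE) for name, v in results.items()}
```

Broad, judgement-based evaluations cannot be reduced to a check like this, which is exactly why they demand the reviewer's comparison of the narrative to the raw data.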


Results

Reviewing results may take up the bulk of your review. Depending on your SOPs, you may check 100% of transcription or just do a spot check. The report may present every data point collected, or just summary tables. You should be checking that all of it is traceable to the correct place in the raw data, that any gaps are labeled as such in the report tables, and that the text matches what is in the tables and what is in the raw data.

If there are voluminous data tables to check, you may choose to review these last so that you can read the report in one sitting while it is fresh in your mind.


Discussion

The discussion section is where the results are interpreted. It should connect to the introduction or purpose of the study and then move beyond. You should check it against everything that is applicable: the protocol, the results you have just reviewed, the evaluation and validity criteria, and the relevant citations. This section is free to be lengthy but it should not restate too much from elsewhere in the report. Instead, it should lean heavily toward context, synthesis and interpretation.

Depending on your training and role in the organization, you may be responsible for assessing the scientific rationale of the report. But a typical Quality reviewer will not be responsible for this. Instead, questions of scientific judgement are left to the investigator and his or her peers, who will have already reviewed the report or at least the associated raw data.


Conclusion

The conclusion is best kept brief! If it is lengthy, suggest in your comments that some of it be moved to the Discussion section.

Both short-and-simple and long-and-complex studies are free to have a one-sentence conclusion. As a reviewer, you should double-check that the conclusion matches the interpretation arrived at in the Discussion section and does not go beyond it.

Even when the conclusion is a single sentence, a statement of scope and a qualifier are always scientifically justified. Consider the following one-sentence conclusion:

“Under the conditions of this protocol, the levels of endotoxin in the sample were found to be below the limit of detection.”

The investigator referenced the scope of the study (the current protocol and its limitations), provided a scientifically cautious qualifier (there could be endotoxin, but not enough to detect), and clearly stated the conclusion, all in one sentence.


References

For a non-peer-reviewed study, this is not as important. It is the responsibility of the investigators to base their study on sound science. However, you should check that the resources cited exist, are cited correctly, and are accessible. If your organization is issuing reports based on papers from obscure journals from the distant past that can’t be found online or in your reference library, you might have a problem.

Check transcription

A 100% transcription check is an unambiguous requirement. It is clear for the reviewer. It is clear to the investigator that their transcription will be verified. After establishing a 100% transcription check requirement, you will start getting higher-quality raw data by the time of review. This means less back-and-forth and less reprocessing of data (data transcribed incorrectly into Excel or statistical software causes incorrect results).
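If both the raw data and the report tables are available electronically, the transcription check itself can even be scripted. This sketch assumes both sides can be read into dictionaries keyed by sample position; the identifiers and values are made up:

```python
# Sketch of a scripted transcription check: compare values transcribed
# into the report against the raw-data record, entry by entry.
# Identifiers and values are illustrative.

raw_data = {"A1": 0.512, "A2": 0.498, "A3": 0.505}
report   = {"A1": 0.512, "A2": 0.489, "A3": 0.505}  # A2 was mistyped

def transcription_errors(raw, transcribed):
    """Return identifiers whose transcribed value differs from the raw value."""
    return sorted(k for k in raw if transcribed.get(k) != raw[k])

errors = transcription_errors(raw_data, report)  # → ["A2"]
```

Even when the check is manual, thinking of it this way, every entry compared against its source, keeps the requirement unambiguous.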

Spot-check calculations

Your organization may require spot-checks or that a certain percentage of calculations be checked, or that 100% be checked. Generally, you will not need to check calculations done within statistical software. If possible, you should have a digital copy of any Excel spreadsheets used so that you can display the functions and check the calculations the easy way.
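Recomputing a simple statistic is often the fastest spot check. The sketch below recomputes a mean and percent relative standard deviation from hypothetical replicate values and compares them to figures as they might be printed in the report, allowing for rounding; all numbers are illustrative.

```python
# Spot-checking a reported calculation by recomputing it from the raw
# replicate values. Numbers are illustrative; the tolerances allow for
# rounding in the report.

def mean(values):
    return sum(values) / len(values)

def percent_rsd(values):
    m = mean(values)
    # sample standard deviation (n - 1 in the denominator)
    variance = sum((v - m) ** 2 for v in values) / (len(values) - 1)
    return (variance ** 0.5) / m * 100

replicates = [10.1, 10.3, 9.9]           # raw replicate measurements
reported_mean, reported_rsd = 10.1, 2.0  # values as printed in the report

mean_ok = abs(mean(replicates) - reported_mean) < 0.05
rsd_ok = abs(percent_rsd(replicates) - reported_rsd) < 0.1
```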

See the investigator if needed

Consulting the investigator before your review is finished is a matter of judgement. Here’s what I mean: if you can’t find what you are looking for or can’t understand a calculation, you will definitely want to ask the investigator to clarify. However, if you identify a major deviation or some other issue, you want to ensure that this is documented and winds up in your quality system. You want each major finding to be documented and to have a documented response.

The same goes for minor errors. You don’t want to hand the report and raw data back and forth, for multiple review cycles, while they address issues that come up during your review. Instead, there should be one efficient but thorough review of the complete package, followed by a write-up. Subsequent reviews should be brief, involving only signing and sending.

You might also consult a third party such as a statistician or manager.

Write it up

Organize your findings cohesively and fill out any checklists you may have. If applicable, categorize your findings as major and minor. If possible, each of your findings should describe what was found, reference the standard it is being compared against (the protocol, an SOP, etc.), and state the consequences of not meeting that standard.

If the report is not acceptable or not acceptable without additional experimental work, make this clear. If a major deviation was identified, you should let the investigator know in person so they can resolve it quickly. Any required investigations should be closed and addressed in the report when the report comes back to you for finalization.

At finalization, see that all findings are addressed

When the report comes back to you for finalization, you will ideally only be reviewing minor changes and the addition of finalization dates and cover pages, logos, etc. You will want to ensure that all investigations are closed and that all your findings are addressed. Each finding should be closed out individually.

Do a final look-over

Finally, check the boring stuff! Make sure the pagination is correct and that all the pages are there. Be ready for third and fourth reviews as minor printing or electronic signature issues are sorted out.

Sign off and then thank the investigator for their hard work.


Some further reading

An excellent and much-shared article geared toward peer-reviewed papers. There is good advice in the comments section as well:


A well-organized breakdown of the elements of a scientific report from the University of Waikato (New Zealand):


A detailed guide to writing a discussion section from the University of Southern California:

How to review audit trails


The FDA regularly issues warning letters regarding data integrity, and the agency is encouraging companies involved in GMP work to get on top of these issues through regular audit trail review. You may have read FDA or EU publications relating to data integrity and audit trails and wondered how they apply to your organization. This article will demystify audit trail review and provide you with an outline for your organization’s data integrity initiative.

What is an audit trail? What are some other data integrity terms?

Audit trail means “a secure, computer-generated, time-stamped electronic record that allows for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record. An audit trail is a chronology of the ‘who, what, when, and why’ of a record.”

The usefulness and power of the audit trail can vary quite a bit by system. In an older hemocytometer, for example, you may find that there is no audit trail: the printout containing the time, date, result, and sample name is the extent of the data produced by the machine.

In the case of a newer analytical instrument such as a mass spectrometer, you will find that the audit trail is exhaustive and essentially unalterable. It will be a dynamic, all-inclusive record that displays information about every user and instrument action, in a human-readable format. It may be searchable by date, user, type of action, etc. It may even be in its own view-only, data manager module for ease of use. It may be exportable as an Excel file or printable as a PDF.

Another instrument may be intermediate in user-friendliness: you can display a page or two of printable, static information surrounding a specific file or sample, but cannot easily search or scroll.

Audit trails are part of a broader concept called metadata. “Metadata is the contextual information required to understand data.” For example, a set of data may be a list of compounds and their concentrations in a sample. The metadata surrounding this data would include the username, time and date, the data path (where the files are automatically saved), any error messages that came up, and any changes to the sample name. As outlined above, the extent of the metadata collected by the system can vary quite a bit.

An even broader concept is data integrity. This term refers to “the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).”

A concrete analogy

To make data integrity a little more concrete, consider how you treat crossouts on a paper record. Most crossouts are justified (mistakes happen). But when someone does make a crossout to update a raw data entry, your organization’s SOPs require them to address it. They do this by drawing a single line through the original entry (leaving it still legible). They then need to provide an explanation (such as “incorrect entry”) along with the identity of the person who did the correction, and the date of the correction.

In this way the date and recorder of the original entry, the date and recorder of the correction, and the reason for the correction are all documented. In addition the original entry remains legible for comparison to the corrected entry.

Data integrity refers to these same controls but in a digital format. And reviewing audit trails helps the Quality unit to review these deletions and changes in the same way they review them on paper.
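As a sketch, those paper-record controls map naturally onto the fields of a single audit trail entry. The field names and values below are illustrative, not taken from any particular system:

```python
# A minimal sketch of what one audit trail entry captures: the
# "who, what, when, and why" of a change. Field names and values
# are illustrative, not from any real system.

from dataclasses import dataclass

@dataclass
class AuditEvent:
    who: str        # user account that made the change
    what: str       # action, e.g. "modified sample name"
    when: str       # timestamp applied by the system, not the user
    why: str        # reason for change, often a required free-text field
    old_value: str  # original entry, preserved like a single-line crossout
    new_value: str

event = AuditEvent(
    who="jdoe",
    what="modified sample name",
    when="2017-02-14T09:31:05",
    why="incorrect entry",
    old_value="LOT-0421",
    new_value="LOT-0412",
)
```

Note how the old value stays legible alongside the new one, just as a single-line crossout leaves the original entry readable on paper.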

Everyone in your organization plays a part in ensuring data integrity. Reviewing audit trails is part of that effort.

Where do I start?

Start by looking at the list of all computerized systems in your organization. If you have no such list, make one! A spreadsheet is adequate.

Classify each system by the regulated activity it is used for (GMP, GLP, etc.) and by its criticality. Document who the system administrator is (i.e., who controls access and sets up and deactivates users). Identify the subject matter expert who can answer your questions (see below). Sort the systems by criticality. Audit trail review is resource-intensive, so you will want to start with the critical ones.
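The inventory itself can be as simple as a list of records sorted by criticality. The system names, rankings, and roles below are made up for illustration:

```python
# Illustrative inventory of computerized systems, sorted so the most
# critical systems are assessed first. All entries are made up.

systems = [
    {"name": "HPLC data system", "activity": "GMP", "criticality": 1,
     "admin": "lab IT", "sme": "senior analyst"},
    {"name": "Plate reader",     "activity": "GLP", "criticality": 2,
     "admin": "lab IT", "sme": "study director"},
    {"name": "Balance printer",  "activity": "GMP", "criticality": 3,
     "admin": "metrology", "sme": "metrology tech"},
]

# Lowest criticality number = highest priority for assessment
review_order = sorted(systems, key=lambda s: s["criticality"])
first = review_order[0]["name"]  # → "HPLC data system"
```

A spreadsheet holds exactly the same columns; the structure, not the tool, is what matters.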

Do a brief, informal review with a regular user of the system. See if you can easily access the audit trail from the main screen of the program or if you need to log into a separate module such as a data manager, security module, or report mode.

Audit trail functionality may have been documented in the validation. Check the validation paperwork for hints on how to access it. Check the manufacturer’s materials as well.

Consider the feasibility and resource-intensiveness of audit trail review for each system. Does reviewing it mean booting someone off the terminal? Is there a limited number of user licenses? All these considerations will influence the requirements you define in your data integrity SOP.

Take notes and write down any questions to address during a more detailed review to follow. Update the spreadsheet you started.

An assessment of each system is needed

Next comes a more detailed assessment. A member of the Quality unit sits down for about an hour with a subject matter expert in the software or instrument. In advance, prepare a template Word document with items to address. The idea is to get accurate information, including screenshots, to help with training and with crafting the SOP. Topics may include the following:

What type of audit trail is it?

• Is it a dynamic event log, capturing every action automatically? Or does it just display the parameters of the individual run?
• Can the reviewer select a date range?
• Can he or she search by user, batch or project number?
• Are there any error messages for the subject matter expert to look into? Are there error messages that qualify as red flags for the reviewer?
• Does the audit trail capture when a user edits a sample name or lot number? When critical parameters are changed? When background scans are re-run? When the results from standards are rejected?
• What do manipulations and deletions look like? Is there a way to compare versions of a file?
• In analytical chemistry, is there an audit trail for acquisition of data, but no audit trail for processing of data? In chromatography, is there a way to compare integrations of peaks? What does excessive integration look like, and is this something the Quality unit should review?
• Compare records from a recent project or run to what you see onscreen in the audit trail. Go step by step and write down some typical audit trail events.
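Once you know what typical events look like, a reviewer can triage an exported audit trail by filtering for red-flag actions. This sketch assumes the export can be read as a list of records with an "action" field; the events and keywords are illustrative, and any real keyword list would come from your own system assessment:

```python
# Sketch: flag audit trail entries whose action text suggests a change
# the reviewer should examine. Events and keywords are illustrative.

RED_FLAGS = ("delet", "renam", "reprocess", "reject", "overwrit")

events = [
    {"user": "jdoe",   "action": "acquired data",                "file": "run-014"},
    {"user": "jdoe",   "action": "reprocessed peak integration", "file": "run-014"},
    {"user": "asmith", "action": "deleted injection",            "file": "run-015"},
]

def flagged(event_list):
    """Return events whose action contains any red-flag keyword."""
    return [e for e in event_list
            if any(term in e["action"].lower() for term in RED_FLAGS)]

to_review = flagged(events)  # the reprocessing and deletion events
```

A flagged event is not automatically a problem; it is a prompt to check that the change was justified and documented.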

Don’t duplicate the effort of the original validation, but do look at the system through a data integrity lens.

You may find that routine audit trail review opens new conversations about a system. Although an audit trail cannot be “corrected,” review of the audit trail may point to deficiencies elsewhere in the project’s documentation.

For example, if one person signed the batch record, lab notebook or worksheet, but the audit trail shows a different user (such as a trainer) performing that procedure, then both signatures should be in the documentation.

Another example is retesting or reprocessing: if the audit trail clearly shows a retest, but this was not documented, then this should be addressed in the documentation before Quality unit approval of the record.

For legacy systems, make sure any deficiencies are documented and that alternative data integrity measures are in place

You may look at an older instrument or software package and find it is not 21 CFR Part 11 compliant, or not compliant in the way you thought it was. You can add a brief risk assessment to the validation paperwork, with a memo referencing the new audit trail review procedure being put in place.

Look to the manufacturer’s compliance software

If the software is older, ask the vendor about updates. They may have released a data security module that you can add to the software. If there is no audit trail functionality, the system may still be Part 11 compliant. Don’t worry – audit trail functionality is not an absolute Part 11 requirement.

Get access for the QA reviewer for each system

Once this assessment is complete, get the Quality reviewer access to each system.

Some software will have a reviewer (read-only) mode. This is ideal because the reviewer cannot accidentally delete a file or alter a method. If the Quality reviewer is a standard user, that’s fine too!

Efficiency side note: to avoid duplication of effort, the periodic review can be reduced somewhat in scope.

Although audit trails will now be reviewed routinely, the periodic review is still important because this is where failed login attempts, periodic vendor maintenance, and changes to the overall method are captured. Keep in mind the risk-based approach.

Write the SOP!

Reference information technology SOPs, validation SOPs, ethics and training SOPs, and Quality review SOPs. Make sure it has caveats that address the range of software at your company. This procedure should involve organization-wide training. Having an executive or director champion it would be very valuable. Everyone should know what is expected of them with respect to data integrity.

Revisit this procedure in a year

To follow up, continue reviewing FDA warning letters for the agency’s thinking on data integrity matters. Distribute the pertinent letters to your team. Connect the audit trail reviewers with those involved with equipment/software validations so that audit trails are set up and understood proactively. Even better, ask for the reviewers’ input on the next set of compliance software that the vendor is trying to sell you.

A year after effectivity, revisit this procedure and see how it’s going:

• Is it too impractical? Are reports being delayed because the reviewer can’t get into the system while others are logged on?
• Is system access an issue? Does only one person in the Quality unit have the needed access?
• If you have technical data reviewers and a Quality reviewer, are they duplicating each other’s review? It can be hard to separate a technical review from a procedural review. Perhaps only one group should review.
• Look at the overall value of the project. If you find that reviewing audit trails in the way recommended in the FDA’s draft guidance is not a value-added step, let the FDA know! Now is the time to comment before the draft guidance becomes a requirement.

Lastly, take the long and broad view. Consider audit trail review to be one of many tools in your organization’s data integrity efforts. Keep in mind that other organizations are grappling with these issues as well, and there are no experts out there who have all the answers. You will have to treat data integrity as an ongoing commitment, with every data integrity procedure open to change, optimization and improvement.

If you have had successes, failures or questions in your audit trail efforts, I’d love to hear about them!


Explore further

Data Integrity and Compliance With CGMP Guidance for Industry (draft guidance). The FDA published this in April 2016 as draft guidance. But as we know, you need to get ahead of the guidance!



This FDA slideshow, released a year later, provides some helpful elaboration on the guidance:



This article provides a concise summary of the challenges of starting an audit trail review process, and the importance of a risk-based approach based on an assessment of each system. The link opens a PDF: