Illustration of a sheet of paper with many holes punched out of it, by Vahram Muradyan.
Precise feedback: During the course, students learned to offer concrete suggestions for improving an experiment or the manuscript section under review.

Let’s teach neuroscientists how to be thoughtful and fair reviewers

Blanco-Suárez revamped the traditional journal club by developing a course in which students peer review preprints alongside the published papers that evolved from them.

By Elena Blanco-Suárez
6 March 2026 | 6 min read

I used to joke, “Someone, somewhere, is trashing my paper right now at a journal club.” All joking aside, I often felt that it was uncomfortably plausible. Years of journal clubs had taught me that when you put on the “reviewer hat” and analyze other people’s work, the main goal is to find flaws. Or at least that’s how it felt we had been trained: to look for shortcomings and all but ignore a study’s strengths and beauty. Critique was currency; appreciation, optional.

Not long ago, though, I was reviewing a paper and found myself spending a stupid amount of time trying to poke holes in it. Almost out of pure exhaustion, I asked, “What if it’s just a pretty good paper?” 

It made me think back to when I started as a faculty member five years ago and had the chance to run a journal club course. The students were apathetic; they would show up, do a sloppy presentation, paraphrase the paper and move on. And no one seemed particularly inspired, including me. So when I took a new job at San Diego State University last year, I saw it as an opportunity to start over. I made it my mission to develop a journal club course to teach students how to become fair reviewers.

As researchers, far too often we receive comments from reviewers that aren’t helpful. I still remember the single comment from Reviewer 1 on my very first manuscript submission as a sole senior author: “This research has limited translational potential.” That devastating comment not only sealed the fate of our manuscript at that journal; it was also useless. It offered no direction, no solution, no path forward. It was not constructive criticism. The reviewer might as well have said, “I don’t like it.” We have plenty of writing courses available to us during our training and even as faculty, but there are virtually no courses that teach us how to be reviewers.

So I thought: What if I asked students to do a full peer review of preprints that have peer-reviewed published versions? This way, students could evaluate the original work and how the authors initially envisioned the manuscript, and then compare that (side by side, scars and all) with the version that was eventually published in a journal. By doing so, students could see how much manuscripts change in response to reviewers’ suggestions and editors’ decisions. 

I knew that I, too, would benefit from this exercise, something that I rarely make time for outside of research and teaching. Seeing the evolution of a manuscript could help the students and me to grasp the value of peer review in a concrete, almost visceral way.

As I began designing the course, I asked the scientific community on social media to share their preprints and the published, peer-reviewed counterparts. I crowdsourced the preprints because I wanted to avoid selecting only manuscripts in my own scientific wheelhouse. I was afraid that no one would volunteer their work as a guinea pig for this exercise, but to my surprise, my peers happily stepped forward.

Screenshot of the author’s social-media post asking researchers to volunteer their preprints for her course on peer review.
Community call: Blanco-Suárez took to social media to ask researchers to volunteer their preprints for her course. Her colleagues happily obliged.

Every week, students read the first posted version of a preprint and performed their own peer review. The students then talked through the reviews as a group and read the journal version to see how their suggestions compared with the changes that had happened after peer review.

One study we particularly enjoyed discussing was a 2023 preprint led by Andrew Boyce from Roger Thompson’s lab at the University of Calgary, alongside its peer-reviewed version published in Nature Communications. Students noted how Boyce and his colleagues had refined the figures and made other stylistic changes to make the paper more straightforward. The researchers had also improved the paper’s title, making it more concise and specific. The peer-review process had enhanced the paper’s readability and clarity while preserving the study’s purpose and conclusions.

The preprint offered my students a fantastic example of the amount of work that goes into preparing a manuscript, involving not just the experiments but also framing, storytelling and countless other micro-decisions. It went through four posted versions, which revealed both the subtle and the more dramatic changes a manuscript undergoes before its final published iteration.

Students’ peer reviews improved substantially over the semester, as did their discussions. When we talked about the peer reviews in class, the students who gave harsher reviews were often receptive to comments from those who were more lenient. Almost every time, the first group softened their opinion, turning sharp criticism into something more constructive, along with concrete ideas on how to improve an experiment or the manuscript section in question. It reminded me of the dynamics that might arise in a study section.

One of the most interesting things I witnessed was how students had intrinsic reviewing styles. Some instinctively highlighted flaws and shortcomings, whereas others focused on strengths and innovation above everything else. These students all had minimal background in neuroscience and almost no experience reading papers critically, which made me wonder whether some of us are just genetically encoded to be Reviewer 1. I often found myself playing devil’s advocate, explaining, for instance, why it might be unfair to ask for experiments in human tissue samples from a lab that simply doesn’t have access to them. Asking is free, but are those requests reasonable? I would ask the students to consider whether the limitations identified in their reviews invalidated the study’s conclusions.

Courses such as this one are essential to teach students how to be objective and fair reviewers, enforce rigor and reproducibility, and facilitate the dissemination and accessibility of scientific progress. I was thrilled to see that in the final course evaluation, 100 percent of the responding students said they felt more confident in their ability to critically evaluate a manuscript and to provide actionable, constructive feedback. Students agreed that this format had helped them in ways a traditional journal club simply could not.

From the perspective of an author, it was illuminating for me to see what motivates some of the harsher reviews. But rest assured, Reviewer 1 will always find something to dislike.
