How to Critique An Article
A science article is a piece of research written in the form of an academic paper; it can be either published or unpublished, and published articles are usually peer-reviewed first. Critiquing science articles is important for critical thinking and for understanding science more broadly.
This blog post will discuss how to critique science articles, covering sample size, publication bias, methodological limitations, the hierarchy of evidence, and the subject-specific knowledge and relevant expertise that critique requires. It also discusses the importance of good critical thinking skills when critiquing any kind of article.
Keep in mind that this largely comes from my own experience and my research methodology class.
Important Factors For Critiquing
When critiquing a science article, it is important to consider the sample size. A small sample size reduces statistical power, making it difficult to detect real effects and to draw meaningful conclusions from the data. It is also important to be aware of publication bias: journals tend to publish studies that show positive results, while studies with negative results are less likely to be published. This can lead to a distorted view of the evidence on a given topic.
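To make the sample-size point concrete, here is a minimal sketch of a power calculation for a two-group comparison, using a normal approximation to the two-sample test. The effect size (Cohen's d = 0.5, a "medium" effect) and the conventional 0.80 power target are standard benchmarks I'm assuming for illustration, not values from any particular study.

```python
from statistics import NormalDist

def two_sample_power(n_per_group, effect_size, alpha=0.05):
    """Approximate power of a two-sided, two-sample test (normal approximation)."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)            # critical value, ~1.96 for alpha=0.05
    shift = effect_size * (n_per_group / 2) ** 0.5          # expected shift of the test statistic
    return NormalDist().cdf(shift - z_crit)

# A "medium" effect (d = 0.5) with only 25 subjects per group:
print(round(two_sample_power(25, 0.5), 2))   # well below the conventional 0.80 target
# The textbook rule of thumb of ~64 per group reaches roughly 80% power:
print(round(two_sample_power(64, 0.5), 2))
```

The takeaway: a study with a few dozen subjects per group has less than a coin-flip's chance of detecting even a medium-sized real effect, which is why small samples so often produce "no significant difference."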
Another important consideration when critiquing science articles is methodological limitations. A study may be well designed and carried out, but if the data collected are not analyzed correctly, the conclusions drawn from them may be inaccurate. This problem is exacerbated when the authors of the study are unaware of their mistakes. I mentioned measurements on Instagram, which ties directly into methodological limitations: inappropriate measurements may not give scientists the results they are looking for.
Another issue that arises in science is reproducibility or lack thereof. If other researchers cannot replicate an experiment, it casts doubt on both the original findings and how they were analyzed.
This all points to a hierarchy of scientific evidence: systematic reviews and meta-analyses (which pool data from multiple studies) are considered the strongest evidence. Randomized controlled trials (RCTs) come next, followed by observational studies that show a strong correlation between variables; both are generally more reliable than case-control designs. Case-control studies are still useful for studying rare diseases, but they are limited because they cannot establish a causal relationship.
However, here is a word of caution: the hierarchy of evidence is not rigid. There are times where certain study designs are more appropriate than others for answering certain questions. More will be said about this later.
- Jae
In science, proper peer review is also critical for catching as many errors as possible. If a study was not properly vetted by other experts in the field, or if it was published but failed to gain consensus, there is probably something wrong with it. Because science builds so heavily on itself, relying on the work of others, bad science can propagate and lead to erroneous conclusions.
It's important to be critical when reading scientific papers, not just of the study itself but also of the journal it was published in. Publication bias means that journals are more likely to publish positive results than negative ones, so you need to take that into account when evaluating a study. If every study on a given topic shows positive results, that itself is a good indication that something is probably off.
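A toy simulation shows why a literature of all-positive results should raise suspicion. Here I assume a treatment with a true effect of exactly zero, simulate many small studies, and "publish" only the positive, statistically significant ones; every number here is an assumption chosen for illustration, not data from any real field.

```python
import random

random.seed(42)

TRUE_EFFECT = 0.0              # assume the intervention truly does nothing
N_STUDIES = 10_000             # hypothetical studies, each estimating the effect
N_SUBJECTS = 20                # subjects per study (unit-variance outcomes)
SE = 1 / N_SUBJECTS ** 0.5     # standard error of each study's estimate

estimates = [random.gauss(TRUE_EFFECT, SE) for _ in range(N_STUDIES)]

# "Publish" only positive, statistically significant results (z > 1.96).
published = [e for e in estimates if e / SE > 1.96]

print(f"mean effect, all studies:       {sum(estimates) / len(estimates):+.3f}")
print(f"mean effect, published studies: {sum(published) / len(published):+.3f}")
```

Averaged over all studies, the estimated effect is essentially zero, as it should be; averaged over only the "published" studies, a substantial positive effect appears out of nothing. This is the distortion a naive reading of the published literature inherits.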
A great way to vet whether a journal is reputable is to look at its impact factor. Impact factor measures how often a journal's articles are cited within a given field, which serves as a rough proxy for how prestigious and selective the journal is. But impact factor isn't always accurate, as some journals attempt to game the system.
Overview
When critiquing an article:
- Be familiar with the topic and relevant expertise related to it (If I were evaluating cardiac studies, for example, I would want someone who is knowledgeable about cardiology). You don't need to be an expert on the study itself, but you should know what questions to ask.
- Look at the sample size: was it large enough to detect a difference if one exists?
- Check for publication bias: has this study been published in a peer-reviewed journal? If not, why not?
- Consider the hierarchy of evidence: what is the study's level of evidence? Are there any meta-analyses or systematic reviews that have been conducted on this topic?
- Look at the methods: were they appropriate for the question being asked? How well did they adhere to the principles of good experimental design?
Expertise Is Important
These are just a few things to consider when critiquing a scientific article. Keep in mind that there is no single "right" answer, and everyone will have their own opinion on how well a study was conducted. The important thing is to be able to articulate your reasons for critiquing an article, to do so in a respectful way, and to back up your reasons with clear logic and strong justification.
Also, consider your own expertise. There are some scientific subjects you may know very little about and may not feel comfortable critiquing. In those cases, it is best to avoid commenting on an article, or else to defer to expert consensus and opinion when it is given.
Final Tips
Here are some tips for how to critique an article:
- If a study uses a small sample size (fewer than 50) and finds no statistical significance (p > .05), it is probably underpowered. Also be wary of too much consistency among the samples used in research studies.
- Look for publication bias, the tendency for studies that show a positive result to be published. This can be due to researchers' own biases or because journals are more likely to publish positive results.
- Check the methodological limitations of the study. Are there any potential biases in how the data was collected or analyzed? What measurements are they using? How strong are these measurements?
- The hierarchy of evidence provides a guide for ranking the quality and strength of different types of scientific studies. It ranges from systematic reviews and meta-analyses (highest) to case reports and expert opinion (lowest).
- Bring in relevant expertise: critiquing an article requires subject-specific knowledge, so if you lack it, lean on someone who has it.