"Do your own research." It's a commonly repeated line, perhaps today more than ever.

But when it comes to research, what should you be looking out for, and are there any best practices for deciding which studies to trust?

Let's start with bias, the most obvious one. If you read "eating five eggs a day will give you super strength," and then see that the study was commissioned by a farmer, you'd likely be a little sceptical. Most academics will disclose any conflict of interest involved in their research, but sometimes you need to do a cursory Google search yourself.

Robot Shortcuts

ChatGPT: this robot brain has stuck its oar in and is causing problems too. As more content is regurgitated by our burgeoning robot overlords, less rigour is being applied to the sources used. In our experience so far, many of the studies it cites either don't exist or don't actually conclude what the AI suggests they do.

Suffice to say, unfortunately, just because you see a citation in an article, it doesn't mean the paper actually backs up the claim the writer suggests it does. Check the source yourself if you can: do a quick Ctrl+F search in the document for the quotes used, or for any surprising statistics you doubt.
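If you've saved a paper as plain text, the same Ctrl+F check can be scripted. A minimal sketch in Python (the paper text and quote below are made-up placeholders, not from any real study):

```python
# Check whether a quoted claim actually appears in a saved paper's text.
# The example text and quote are hypothetical placeholders.

def quote_appears(text: str, quote: str) -> bool:
    """Case-insensitive substring check, ignoring extra whitespace."""
    def normalise(s: str) -> str:
        return " ".join(s.lower().split())
    return normalise(quote) in normalise(text)

paper_text = "We observed a modest association between egg intake and strength."
print(quote_appears(paper_text, "super strength"))       # False: claim isn't there
print(quote_appears(paper_text, "modest association"))   # True: this phrase is
```

It's a blunt instrument (paraphrased claims won't match), but it catches quotes that simply aren't in the source.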

Half-Baked Hacks

It's not just the robots that misattribute, though. It's more common than we'd like for writers to base articles on a paper's abstract alone, because budget constraints limit how much can be spent accessing full research. That said, if a paper is locked behind a paywall and you're not a researcher yourself, you'll face the same dilemma.

Sample size is another thing to look out for. It'd be nice if all studies could follow thousands of individuals over a decade, but that's not financially possible. Think about those shampoo commercials that claim all sorts of restorative power, only to mention, in tiny text, "78% of 9 people agree." If the writer doesn't outline the sample size, it's usually a bad sign. In this instance, size really does matter.
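To see why nine people tell you very little, you can work out a rough margin of error yourself. A sketch using the standard normal-approximation formula for a proportion (the numbers mirror the hypothetical shampoo ad; with a sample this small the formula is itself only a rough guide):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# "78% of 9 people agree" -> p = 0.78, n = 9
moe = margin_of_error(0.78, 9)
print(f"±{moe:.0%}")  # roughly ±27 percentage points
```

In other words, that "78%" could plausibly be anywhere from about 51% to 100%, which is why such claims belong in tiny text.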

Citations are a sign of quality. Historically, the more cited a paper is, the more trustworthy it tends to be. If other researchers are willing to plop it in their own papers, it's a good sign that its findings have merit.

Birdseye View

Meta-Analyses / Reviews: these are the sturdiest sources to lean on. A meta-analysis gathers multiple scientific papers on a question and draws a conclusion from all of them together. Wherever possible, if you can find one of these rather than an individual study, you can be more confident that the findings hold water.

BUT. With all that said, there's only so much a layman can do to assure themselves that their sources are legitimate.

Academics are people too, and they have their own egos to battle. Whilst it'd be nice for them to be eternally rigorous, there are many instances of individuals overblowing their findings. What do you think makes for a more interesting read, a journal article that finds something or one that finds nothing?

Intentions Shmintentions.

And finally, just to throw a spanner in the works, a recent analysis (Serra-Garcia & Gneezy, 2021) found that social science papers that were non-replicable were cited, on average, 153 more times than those that replicated successfully.

It's a frustrating hodgepodge of information, misinformation, bias, ego and sensationalism.

But, as we continue to drift towards a world that requires us to be our own arbiters of truth, we need to be armed with the tools to give it our best shot.

Keep these tips in mind next time you see something outlandish with a hyperlink that just seems too good to be true.
