5 Reasons It's So Hard To Think Like A Scientist



IDEA EXTRACTED FROM:

5 Reasons It's So Hard To Think Like A Scientist

https://digest.bps.org.uk/2017/06/20/5-reasons-its-so-hard-to-think-like-a-scientist/

4 Key Ideas

We’re swayed by anecdotes

Most of us are influenced more powerfully by personal testimony from a single person than by impersonal ratings or outcomes averaged across many people. This is the power of anecdote to dull our critical faculties. 

Anecdotal stories can undermine our ability to make scientifically driven judgements in real-world contexts.

We’re overconfident

We overestimate our comprehension of the science. 

Part of the problem seems to be that we infer our understanding of a scientific text from how well we have comprehended its language. This “fluency bias” can also apply to science lectures when they are delivered by an engaging speaker.

We’re seduced by graphs

It doesn’t take a lot to dazzle the average newspaper or magazine reader using the superficial props of science, be that formulas, graphics or jargon. 

One study found that participants were far more likely to support new evidence when it had a graphic visualisation of the correlational evidence than if they had read the same evidence without a graphic.

Being smart isn’t enough

Even expert researchers suffer from the human foibles that undermine scientific thinking. 

This is why the open science revolution occurring in psychology is so important: when researchers make their methods and hypotheses transparent, and they pre-register their studies, it makes it less likely that they will be diverted by confirmation bias (seeking out evidence to support their existing beliefs).


SIMILAR ARTICLES & IDEAS:

The backfire effect

A cognitive bias: showing people evidence that proves they are wrong is often ineffective, and can actually backfire, causing them to support their original stance even more strongly.

Why the backfire effect appears

People experience the backfire effect as a result of the process they go through when they encounter information that contradicts their preexisting beliefs.

When people argue strongly enough against unwelcome information, they end up, in their mind, with more arguments that support their original stance.

Reducing other people’s backfire effect

If you’re trying to explain to someone the issues with their stance, you can mitigate the backfire effect by presenting new information in a way that encourages the other person to consider and internalize that information, instead of rejecting it outright.


Healthy skepticism

Healthy skepticism does not mean dismissing everything as false; it simply means remembering that the things you hear or read in the media could be false, but could also be true.

Find out who is making the claim

When you encounter a new claim, look for conflicts of interest. Ask: Do they stand to profit from what they say? Are they affiliated with an organization that could be swaying them? Other questions to consider: What makes the writer or speaker qualified to comment on the topic? What statements have they made in the past?

The halo effect

A cognitive bias that makes our feelings towards someone affect how we judge their claims. If we dislike someone, we are far more likely to disagree with them; if we like them, we are biased to agree.


Intellectual humility

It means being actively curious about your blind spots. It's not about lacking confidence or self-esteem. It's about entertaining the possibility that you may be wrong and being open to learning f...

Why we need more intellectual humility
  1. Our culture promotes and rewards overconfidence and arrogance.
  2. At the same time, when we are wrong, out of ignorance or error, and realize it, our culture doesn't make it easy to admit it. Humbling moments can too easily turn into moments of humiliation.
Our reality will always be an interpretation

Even if we tell ourselves that our experience of the world is the truth, our interpretations of reality are often arbitrary, yet we remain stubborn about them. Light enters our eyes, sound waves enter our ears, chemicals waft into our noses, and it's up to our brains to make a guess about what it all is.
