anti-vaxxers and other conspiracy theorists exposed, lmaooooo [study]

rl⠀

New psychology research reveals the “bullshit blind spot”

Is there a bullshit blind spot? A series of two studies recently found that people who were the worst at detecting bullshit not only grossly overestimated their detection ability, but also overestimated their ability compared to other people. In other words, they not only believe that they are better at detecting BS than they actually are, they also believe that they are better at it than the average person.

At the same time, those who were best at detecting BS not only underestimated their own performance but also believed that they were slightly worse at detecting BS than the average person. This research was published in Thinking & Reasoning.

“Broadly, I’m interested in figuring out why relatively smart people believe dumb things (and I include myself sometimes in that category!). So, this includes trying to understand what characteristics are common among people who fall for misinformation as well as what characteristics are common in the misleading messages that make them appealing and persuasive to some people (such as the features of the message itself, how it is delivered, etc.),” said Shane Littrell, PhD (@MetacogniShane), a postdoctoral research associate at the University of Miami.

“My co-authors and I recently published a study examining whether people who spread misinformation are also more likely to fall for it – that is, whether one can ‘bullshit a bullshitter’ (open-access version) – and one of the main implications of that work suggests that people who intentionally spread misinformation in some situations can also unintentionally spread it without realizing it in other situations. To me, this seemed to suggest that some people who knowingly spread bullshit are unaware of the fact that they often fall for it themselves, possibly because they think they’re better at detecting it than everyone else.”

“And, on a certain level, that makes intuitive sense. A con man might not think he can be conned because he ‘knows all the tricks,’ so to speak. So, our next set of studies set out to test that idea by examining how confident people who fall for bullshit are in their own bullshit detection skills, and what cognitive processes they use when they evaluate misleading information.”

Across two studies, the researchers recruited 412 participants to examine the link between bullshit detection, overconfidence in one’s abilities, and the perceived thinking processes people engage in when they encounter and evaluate potentially misleading information. In Study 1, the bullshit detection task involved rating 20 statements as profound or not profound.

Half of the statements were real quotes from famous public figures that are typically judged to be profound (e.g., “A river cuts through a rock, not because of its power but its persistence”). The other half were randomly generated by an algorithm to have proper grammatical structure but also be nonsensical and inherently meaningless (e.g., “Wholeness quiets infinite phenomena”).

A bullshit detection score was derived for each person based on the number of real (profound) and fake (not profound) statements that they were able to correctly classify. Participants also estimated their own performance as well as others’ performance on this task, which provided confidence metrics. They also provided a confidence rating for their bullshit detection ability in general.
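
To make that scoring concrete, here is a minimal sketch in Python, using made-up data and variable names of my own choosing; it is not the authors' analysis code, only an illustration of how a detection score, an overconfidence measure, and a better-than-average measure could be derived from this kind of task:

```python
# Illustrative sketch only -- not the study's actual scoring code.
# Assumes each participant rated 20 statements (10 genuinely profound quotes,
# 10 algorithmically generated nonsense) as profound / not profound, and also
# estimated their own score and the average person's score out of 20.

def detection_score(ratings, is_real):
    """Count correct classifications: real statements rated profound,
    fake statements rated not profound."""
    return sum(rated == real for rated, real in zip(ratings, is_real))

def overconfidence(actual, self_estimate):
    """Positive = overestimates own performance; negative = underestimates."""
    return self_estimate - actual

def better_than_average(self_estimate, estimate_of_others):
    """Positive = believes they beat the average person at BS detection."""
    return self_estimate - estimate_of_others

# Hypothetical participant: 11/20 correct, but guesses 17 for themselves
# and only 12 for the average person -- the "blind spot" pattern.
ratings = [True] * 8 + [False] * 2 + [True] * 7 + [False] * 3  # their answers
is_real = [True] * 10 + [False] * 10                           # ground truth
actual = detection_score(ratings, is_real)

print(actual)                          # 11 correct out of 20
print(overconfidence(actual, 17))      # +6 -> overconfident
print(better_than_average(17, 12))     # +5 -> thinks they beat the average
```

The point of separating the last two measures is that overestimating your own score and believing you outperform the average person are two different comparisons, which is why the study reports both.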

Past research has suggested that some people fall for bullshit because they are more likely to rely on fast, intuitive thinking rather than slower, reflective thinking. Thus, to test whether this is true, Study 2 examined individual differences in the types of thinking processes people perceive that they engage in when trying to detect bullshit. In other words, did participants feel that they were able to spot bullshit immediately or did they need to reflect on it before making a determination?

To find out, the researchers had participants complete measures that assessed their perceptions of the thinking processes they used when evaluating potentially misleading information (i.e., intuitive versus reflective thinking). To ensure that the perceived speed of their thinking process (faster intuition vs slower reflection) aligned with the actual speed of their evaluations, participants’ subjective ratings of their thinking process were compared with objective measures of their evaluation speed (i.e., time spent evaluating statements), revealing that the two were positively correlated.
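
As a rough illustration of that sanity check (again a sketch with invented numbers, not the study's data or analysis pipeline), one could correlate each participant's self-reported reflectiveness with the time they actually spent per statement:

```python
# Sketch: checking that self-reported thinking style tracks actual speed.
# All numbers below are invented for illustration.
from statistics import correlation  # Pearson's r, available in Python 3.10+

# 1 = "judged it instantly / intuitively" ... 7 = "reflected carefully"
self_reported_reflection = [1, 2, 2, 3, 4, 5, 5, 6, 7, 7]
# seconds spent evaluating each statement (hypothetical)
evaluation_time = [3.1, 4.0, 3.8, 5.2, 6.0, 7.5, 6.9, 8.1, 9.4, 10.2]

r = correlation(self_reported_reflection, evaluation_time)
print(f"Pearson r = {r:.2f}")  # positive r: slower responders report more reflection
```

A positive correlation here simply means that people who say they reflected longer really did take longer, which is the alignment the authors report.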

Overall, Study 2 found that both intuitive and reflective thinking processes are involved in detecting – and falling for – bullshit, rather than one particular thinking process being dominant.

“Our main finding was that the people who are the most susceptible to falling for bullshit are not only very overconfident in their ability to detect it, but they also think that they’re better at detecting it than the average person. This applied whether they evaluated the BS quickly/intuitively or spent more time reflecting on it,” said Littrell.

“This is kind of a double-whammy in terms of bullshit susceptibility that we call the ‘bullshit blind spot.’ The other interesting finding was that the people who are best at detecting BS are actually underconfident in their detection skills and think they’re worse at it than the average person (i.e., they have a bullshit ‘blindsight’),” he added.

“It’s objectively worse to be a person who is not only bad at spotting BS but thinks they’re awesome at it than it is to be a person who is good at spotting BS but underconfident at it. So, I think the most important thing to take away from our findings is that everyone would be better off practicing more intellectual humility and skepticism. This is tough for most people, because we all like to believe that we’re smart, and in control of what we think and believe, and that we aren’t easily fooled. Unfortunately, many people who believe this are quite wrong.”

With regard to study limitations, the researcher explained that the pseudo-profound bullshit stimuli used in this work are the kind one might encounter in conversation, on social media, or from the self-help and inspirational-guru industries.

“It could be that people evaluate or otherwise react to bullshit in other types of contexts (e.g., organizational, consumer marketing) differently or that the effect sizes would be different. Past research suggests that our findings would probably generalize to other types of BS and misinformation, but that needs to be empirically tested for us to be sure. Also, we used a Western, English-speaking sample of participants, so we can’t draw any firm conclusions on whether these results would replicate in other types of cultures and languages.”

Are there other lessons we can take from this work? According to Dr. Littrell, “I think our findings underscore the simple truth that all of us not only can be fooled (some more than others), but all of us likely have been fooled at some point in our lives, either by misinformation on social media, biased media coverage, flashy consumer marketing, or even/especially by someone we know bullshitting us.”

“By being more intellectually humble in our day-to-day lives, we’ll be better prepared to resist bullshit and other misinformation by being more mindful of our own cognitive vulnerabilities which will hopefully encourage us to be more attentive to and skeptical of the information we’re exposed to. The phrase, ‘what if I’m wrong?’ can be an incredibly liberating and protective mantra to live by.”

 
Pity reply so your thread can get some traction
 
Pity reply so your thread can get some traction

it's too boring and dry for a discussion here, i'll stick with insulting other users
 
don't these kind of go hand in hand? if you believe you're very good at something, doesn't it imply that you're better at it than most people?
 
I break the rules. I have an insane bullshit detector (much better than the avg person) and I know it
 
Conspiracy theorists have the best BS detectors...
 
Anti-vaxxers and conspiracy theorists are so stupid. Just STFU and get your daily booster shot while eating ze bugs, okay?
 
