Key Messages
How To Exercise Skepticism And Spot Bullshit: Different Types Of Misinformation Examined
When it comes to spotting and interpreting information, it pays to be aware of the possibility of bullshit.
There are more sources than ever that we rely on for information, so it’s up to us to keep our wits about us and practice nuanced skepticism.
In Calling Bullshit: How To Stay Sane in a World Full of Misinformation, you can learn how to recognize bullshit in scientific information and statistical data.
The book delves into some of the most pervasive types of bullshit out there – for example, is it true that criminals have distinct facial features? Or that rising house prices actually lead to lower fertility levels? By exploring these topics in detail, you’ll be better equipped to see through any false claims or anecdotal evidence.
You’ll also understand why publication bias in science can influence results and skew the truth.
Ultimately, by reading Calling Bullshit: How To Stay Sane in a World Full of Misinformation you’ll gain much-needed insight into just how easy it is to spot and refute bullshit – once you know what to look for!
Don’t Believe Everything You See – The Crisis Of Modern Bullshit And How To Fight It
The fact is, we are all vulnerable to the spread of bullshit.
We’ve seen it before with the 1998 study published in the medical journal The Lancet that falsely claimed a link between the widely used MMR vaccine and autism.
It’s easy to be taken in by things which sound logical, but when you look closely are based on flawed evidence or unexamined assumptions.
We must stay alert to the dangers of such misinformation and be willing to question things that we see online.
Social media enabled a false story about the 2013 Boston Marathon bombing – which claimed that an 8-year-old girl from Sandy Hook Elementary had been killed, complete with a photo – to be shared by over 92,000 people before it turned out to be not only false but easily disprovable.
All these factors have created a crisis of sorts when it comes to bullshit: hyperpartisan news networks, fake news factories and advances in image-manipulation technology have made it easier than ever for bullshitters to spread their lies far and wide.
That’s why it’s more important than ever that we all do our part in staying vigilant against this kind of information and taking action if necessary.
With truth under attack from all sides, now is the time to raise our guard against bullshit so we can keep ourselves and others safe from false information that can cause harm.
How To Disprove Bullshit: Looking Past The Data To See The Reality
Bullshitters don’t care about the facts; they just want to convince people of something.
They will often use language, statistics, and graphics to bombard the audience with an excess of information in order to get the desired result.
While a lie is simply something that isn’t true, bullshitting relies on creating something that looks as much like the truth as possible.
One type of bullshitting technique is known as black boxes.
This involves collecting data and running it through a complicated algorithm or other scientific process, which then produces a result that many people assume is true despite having no knowledge of what happened inside the “black box”.
A classic example comes from a 2016 experiment that set out to prove that criminals’ faces were shaped differently from those of non-criminals.
The researchers used an algorithm to produce their results, but failed to consider that the criminals’ photos were ID shots taken by governments, while the people without criminal records had professional headshots – making the non-criminals far more likely to be smiling in their photos than those photographed by law enforcement!
As such, even without understanding exactly how the algorithm ran its calculations, we can conclude that this study was complete nonsense.
The authors may not have intended it, but it goes to show: if you don’t really care about evidence and only want people to believe something, you’re most likely engaging in bullshit.
The Danger Of Bullshit Correlation: When Statistical Findings Are Overstated By The Media
It’s important to understand the concept of correlation not necessarily implying causation when analyzing a statistic or scientific study.
Think of it this way: Just because two things are related in some way, doesn’t mean that one causes the other.
An example of this is with a 2018 report by Zillow, which showed that cities with rising house prices also tended to have lower fertility rates for women in their late twenties.
The report noted that it couldn’t confirm that the rising house prices caused the lower fertility, but rather suggested that underlying concerns about money and career development might be influencing both housing and parenting decisions.
Yet when the media reported on these findings, they were more careless – using words like “cause” and “effect” as if there was a direct link between high house prices and low fertility rates among young women.
It’s important to remember that just because something has been reported in such a manner, doesn’t mean that it is true.
The same can be said of when two things have a strong correlation, yet don’t have any causal link at all – such as autism prevalence and organic food sales, which rose sharply around the same time due to unrelated factors.
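A quick simulation makes this concrete: when a hidden common cause drives two variables, they correlate strongly even though neither causes the other. This is a hypothetical sketch with made-up numbers, not data from the Zillow report – the confounder here stands in for something like underlying financial anxiety.

```python
import random

random.seed(0)

# A hidden confounder drives both A and B, so A and B correlate
# even though neither causes the other.
n = 10_000
confounder = [random.gauss(0, 1) for _ in range(n)]
a = [c + random.gauss(0, 1) for c in confounder]  # e.g. house prices
b = [c + random.gauss(0, 1) for c in confounder]  # e.g. delayed parenthood

def corr(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    vx = sum((xi - mx) ** 2 for xi in x)
    vy = sum((yi - my) ** 2 for yi in y)
    return cov / (vx * vy) ** 0.5

print(round(corr(a, b), 2))  # a strong positive correlation, with zero causal link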
Knowing how to distinguish apparent causation from true causation is an essential skill for understanding statistical studies and drawing meaningful conclusions from data – without getting caught up in fallacies born of media sensationalism!
Beware Of Being Misled By Numbers: How To Identify Bullshit With Data Designed To Deceive
It’s alarmingly easy to manipulate numbers to get the message you want across.
Just look at the “99.9 percent caffeine free” cocoa packet that co-author Carl Bergstrom encountered in a hotel lobby one evening.
The claim may be technically true, but regular brewed coffee is itself only about 0.075 percent caffeine – meaning ordinary coffee is also roughly 99.9 percent caffeine-free.
By providing no point of comparison, the label puts a misleading spin on the truth, making an insignificant achievement seem bigger than it really is.
Something even more alarming is how numbers can be used to provide misinformation on critical issues such as immigration.
In 2017, Breitbart wrote that 2,139 people with DACA status had been convicted or accused of crimes – a number that sounds terrifying until you compare it to the roughly 700,000 total DACA recipients; then you see that fewer than one in every 300 has been convicted or accused of a crime.
When talking about percentage increases or differences, be sure to consider not just the percentages cited but also the corresponding differences in percentage points.
In one example from The Lancet regarding alcohol consumption and health problems, drinking just one alcoholic beverage per day increased the risk of those problems by 0.5 percent – yet the actual difference was an incredibly small 0.005 percentage points (the risk went from 1 percent to 1.005 percent).
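Both calculations above can be checked in a few lines of Python, using the figures as given in the text – the point here is the arithmetic, not the underlying data:

```python
# 1. Breitbart's DACA figure: 2,139 accused or convicted out of ~700,000 recipients.
accused = 2_139
recipients = 700_000
rate = accused / recipients
print(round(1 / rate))  # roughly 1 in 327 – i.e. fewer than 1 in every 300

# 2. The Lancet alcohol example: a "0.5 percent" relative increase in risk.
baseline = 1.0                          # baseline risk, in percent
new_risk = baseline * (1 + 0.5 / 100)   # 0.5% relative increase
points = new_risk - baseline            # absolute change, in percentage points
print(round(points, 3))  # 0.005 percentage points – tiny in absolute terms
```

The same relative increase can describe a huge or a trivial absolute change, which is exactly why headlines quote the relative number.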
Beware Of Selection Bias: How Wrong Data Skews Statistics
The problem with relying on data to draw conclusions is that the data you’re working with might not be a true representation of the population.
This is known as selection bias, and it can have serious implications when analyzing statistics.
To demonstrate this, imagine a graph which plots attractiveness against niceness among all men.
The dots are randomly scattered across the graph, showing there’s no actual correlation between the two parameters.
But if you filter out all those who you’d never consider dating from the sample, then suddenly there’ll be an apparent correlation – because of the way you’ve adjusted the sample – even though hot guys aren’t necessarily jerks in general.
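This filtering effect (sometimes called Berkson’s paradox) is easy to simulate. The sketch below uses made-up, independent “attractiveness” and “niceness” scores, then keeps only people whose combined score clears a dating threshold – an illustrative assumption, not data from the book:

```python
import random

random.seed(1)

# Attractiveness and niceness are independent in the full population,
# but conditioning on "dateable" (high combined score) induces a
# negative correlation in the filtered sample.
n = 20_000
people = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

def corr(pairs):
    """Pearson correlation coefficient over a list of (x, y) pairs."""
    xs, ys = zip(*pairs)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Keep only "good enough" matches: combined score above a threshold.
dateable = [(a, b) for a, b in people if a + b > 1.5]

print(round(corr(people), 2))    # ≈ 0.0 – no relationship in the population
print(round(corr(dateable), 2))  # clearly negative – the filter alone created it
```

No causal story is needed: cutting off the low-scoring corner of the scatterplot is enough to manufacture the “hot guys are jerks” pattern.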
This type of selection bias also occurs in clinical trials, where people drop out for various reasons and their experiences are not recorded in the data.
As a result, relying on test results from a non-random sample will likely lead to skewed results and misleading conclusions.
In short, it’s crucial to ensure that any tests conducted or samples used are representative of the population being studied.
If they are not, selection bias can lead to false assumptions and incorrect conclusions being drawn from the data!
The Danger Of ‘Big Data’: Don’t Be Duped By Machine Learning And Elaborate Diagrams
Big data and machine learning can create impressively intricate diagrams or “subway maps” of varying topics.
However, this does not mean these presentations are automatically accurate, nor should it overshadow the need to make sure the underlying data being processed is sound and reliable.
This means asking questions – Does the chart include data points below zero on its y-axis? Were there any unforeseen variables that aren’t captured in the dataset? Was a wide enough range of sources used when constructing the information or did technological advances allow for assumptions to be made without careful research?
Ultimately, the authors’ message is clear: don’t let impressive visuals distract you from questioning their validity, and always double-check the facts presented before trusting them as absolute truths.
To ensure you’re making informed decisions, it pays to do your own research rather than simply trusting machine learning algorithms.
The potential of machine learning is great, but so is the possibility of error – it often takes a human eye to recognize when incorrect assumptions have been made or when results rest on false information.
The Imperfections Of Modern Science Make It Easy For Bullshit To Creep In
The imperfections of modern science can make it hard to differentiate fact from fiction.
From biased published studies to faulty correlation measurements, the scientific system we have today is filled with issues that could lead to misleading information being spread as “truth”.
For instance, positive results are often the ones that get published in journals, while failed experiments usually go unreported.
This phenomenon is known as publication bias, and it affects the reliability of the data available to scientists and laypeople alike.
Another problem is the reliance on statistical measures like p-values to determine whether a correlation is significant.
This may seem like a sufficient measure, but Goodhart’s Law warns us that people will always try to game the system when there’s an incentive for doing so; in this case, getting published!
Finally, there’s also an issue with how new findings are reported in the media – only a small fraction of scientific research gets covered, and what does often gets sensationalized for better headlines.
All of this means that bullshit can easily creep into modern science – be skeptical about any big claims without proper proof!
Use Simple Techniques To Spot And Call Out Bullshit
Are you tired of being inundated with false information? Want to stop being bamboozled by unfounded claims and inaccurate reports? Calling Bullshit offers a few simple tips to help equip you in the fight against misleading claims.
The authors suggest asking three key questions, as a journalist would: who is behind this information, how did they get it, and what are they trying to sell?
Knowing the source can be incredibly powerful when attempting to separate fact from bullshit.
Another suggestion for fighting false information is making Fermi estimations.
Essentially these are rough calculations specifically used to analyze scale.
For example, you can judge the credibility of the claim that 121,000 John Smiths exist in the UK by estimating the population at roughly 100 million, guessing that about one in a hundred people is named ‘John’, and assuming the same for the surname ‘Smith’.
That works out to around 10,000 John Smiths – an order of magnitude below the claimed figure, which tells you the claim is almost certainly wrong.
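This Fermi estimate takes only a few lines of Python. The population and name frequencies are the rough guesses given above, not real census data:

```python
# Back-of-the-envelope check of the claim that 121,000 John Smiths live in the UK.
population = 100_000_000   # rough order-of-magnitude population
p_john = 1 / 100           # guess: about 1 in 100 people is named John
p_smith = 1 / 100          # guess: about 1 in 100 is surnamed Smith

estimate = population * p_john * p_smith
print(int(estimate))       # 10000 – an order of magnitude below the claim

claim = 121_000
print(claim / estimate)    # the claim is ~12x larger than plausible
```

Fermi estimation doesn’t aim for precision – only for the right order of magnitude, which is usually enough to sink a bogus figure.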
However, it is also important to remember one thing: confirmation bias.
This is where we unconsciously believe something because it fits what we already know or think – even if it isn’t true!
So always be wary of any information which validates pre-existing opinions.
Finally, be cautious with unreliable online sources such as Twitter, which can easily lead a person astray from reality.
When dealing with bullshit, don’t forget to call out inaccuracies politely but firmly.
No one thinks fake news is acceptable, so arm yourself with facts and don’t hesitate to make your voice heard!
Wrap Up
The Calling Bullshit book is a great resource for those wanting to better equip themselves when it comes to recognizing and debunking bullshit.
The important takeaway is that correlation does not imply causation, so apply extra scrutiny to datasets that are trying to draw connections between things.
It’s also wise to consider the context of any numbers presented and be skeptical about what they really mean.
Last but not least, if you call out someone on their bullshit, double-check your own facts before doing so – and if you make a mistake, admit it.
Following all of these steps will help ensure accuracy when trying to separate truth from fiction.