Can you spot when an argument has holes bigger than Swiss cheese? What are the tricks people use to spin their arguments and make them seem plausible? Faulty reasoning, circular arguments, hidden messages, false dichotomies. These are just some of the strategies used by politicians, the media, conspiracy theorists, white supremacist groups, and many others, to undermine our ability to separate fact from fiction and evaluate arguments rationally. Read on to find out about some of the tricks used and how they work.
Using a false dichotomy
A ‘false dichotomy’ is where only two options are presented, as if they’re the only choices available. Take the argument, “If you’re anti-racist then you’re anti-white”. This puts people in the position of having to choose one or the other, and they can forget that there are other options. For example, a person could be accepting of other races AND completely accepting of white people.
Over-generalising
Over-generalising occurs when somebody reasons, “It happened to me, so it must be true for everyone”. Sometimes this might just happen to be the case, but it’s a very big assumption to make without looking into it further. For example, a man gets bitten by a dog and then over-generalises, assuming that all dogs bite and are dangerous to humans. Most people would query this ‘truth’ and recognise that a range of factors could have led the dog to bite. Maybe the dog was a guard dog and the man was trespassing? Other examples of over-generalising include: “I got bashed by someone from a different race, so all people from that race hate white people like me”, and “A man stole my car, so all men are thieves”.
Implied messages and the dog whistle effect
This strategy uses implied messages hidden within explicit messages. It’s known as the ‘dog whistle effect’ because the implied messages are only heard by those who are attuned to hear them. Politicians are good at using this trick. For example, they might say something like “We want people to feel safe in their homes”. On the surface this message seems quite nice, and most people would agree with it. But the dog whistle implied message is: “There’s something to be afraid of, and we’re the people who are going to protect you – so vote for us”. This implied message can influence people and make them feel afraid without them even realising why.
Consider the explicit message “We don’t want Sharia law in Australia”. With the dog whistle effect, the implied message is that “Muslims are trying to impose Sharia law on all Australians”. So the first message, “We don’t want Sharia law in Australia”, subtly influences people and creates a fear that Muslims will forcibly impose Sharia law in Australia, and that Australians need to be worried about this. Because people aren’t even aware that they’ve heard a second message, it’s very hard for anyone to challenge it or present other information.
Inferring causation from correlation
Correlation means there is a link between things. Sometimes this link is incorrectly interpreted to mean that one thing causes the other. However, the reality is that changes in both things are sometimes caused by other factors. Basically, correlation does not equal causation.
Example 1: The rates of violent crime and murder have been known to increase when ice cream sales do. But ice cream doesn’t cause people to become murderers. It’s much more likely that both are affected by a rise in temperature.
Example 2: There is a link between being Aboriginal and having a high likelihood of involvement in the criminal justice system. However, being Aboriginal does not cause a person to commit a crime. There are other factors that are more likely to be the cause, such as poverty and lack of social opportunities, which Aboriginal people are particularly likely to experience.
Logically consistent nonsense
Just because a theory seems to fit the facts and sounds logical doesn’t mean it’s true. In fact, it can even be complete nonsense.
For example, even though Al-Qaeda terrorists claimed responsibility for the attacks that occurred on September 11, 2001, some conspiracy theorists claim that the US government either planned the attack or aided in the destruction of the twin towers to fuel hatred and to justify the wars in Afghanistan and Iraq.
Meanwhile, another theory put forward by conspiracy theorists is that Israel carried out the attacks on the towers in the hope that the West would then initiate an attack on rogue Islamic states.
When considering a theory, think about the likelihood that it offers the best possible explanation. If it looks like a horse and sounds like a horse, consider that it’s probably a horse, not a zebra. This doesn’t mean the truth can’t be more complex than it first appears, but you need to apply your critical thinking skills to conspiracy theories as well as to ‘official’ explanations. The following questions can act as a starting point: How reliable is the evidence the theory uses? What evidence is there against the theory? Who is promoting the theory – do they benefit from making people believe it?
Immunisation against error
With some arguments you can end up in a double bind where it’s a case of “heads the argument wins, tails you lose”. These arguments are ‘immunised’ against being proven wrong by claims that anyone who disputes them is ignorant, brainwashed, in denial, and so on.
Example 1: “Aliens exist, but the reason we haven’t seen any is that the Government covers it up”. If you dispute this claim, someone who believes it may argue that you’re falling for the Government’s lies.
Example 2: “Jews are secretly ruling the world by getting others to do their dirty work”. If you argue against this claim, someone who believes it could propose that you’re a brainwashed, weak-minded follower who can’t see the truth.
The power of a bold assertion
Sometimes people get away with making false claims simply by being bold enough to stand up and make them. Goebbels (the Nazi propaganda minister) once said, “If you tell a lie big enough and keep repeating it, people will eventually come to believe it”. This trick works because, unless we have a reason to doubt a claim, most of us will accept it at face value and assume it’s right – especially if we admire or support the person making the statement. So the lesson is: don’t just believe everything you read or hear. Check your facts, and think carefully about whether the source of those facts is credible. It’s really easy to publish information that isn’t true, particularly online.
Ask yourself: is the author reputable (e.g. from a recognised research institution or government department)? If the claim uses statistics, is there information about how the research was carried out?
Confirmation bias
Finally, our ability to think critically and assess arguments impartially is affected by our own confirmation bias. We all pay more attention to information that fits with our prior convictions, and we’re more likely to discount information that clashes with our existing beliefs.
Example 1: If you like McDonald’s, you are less likely to pay attention to health warnings about the dangers of diets high in sugar and fat. You may also be more likely to believe a news story about someone who’s eaten McDonald’s every day of their life and is now 95 years old.
Example 2: If we think multiculturalism increases crime, we will notice when someone from a different culture commits a crime and give this more emphasis. But we won’t notice the times when a white person commits a crime, or we’ll discount that information.
Everyone gets caught in confirmation bias, and it can be hard to notice when you’re doing it. One way of testing for it: if you can’t even consider another point of view, you’re most likely caught in confirmation bias. Ask yourself whether you only pay attention to facts that confirm your opinion, and challenge yourself to try to take a different viewpoint.