We all want to hold a solid opinion. We all want people to listen when we speak. And if not, we at least want to make sense of it all without making a big deal about it.
The truth is that, more often than not, we take opinions from other people, friends, or authority figures, and mistakenly believe they are opinions we derived ourselves after careful consideration and unbiased judgment.
We also tend to view ourselves and our opinions as rational, somehow immune to prejudice, stereotypes, and snap judgments, while we view people who disagree with us as biased, brainwashed, and manipulated by the powers that be, victims of their upbringing. And even in the instances when we realize our error and admit that we were duped, we are quick to forgive ourselves. We say “I was wrong because I was under pressure”, while when we see other people being wrong we say “They have poor judgment because they are fools or uneducated”.
The above behavior, besides being a psychological phenomenon known as trait ascription bias, is also partly shaped by our environment and by the digital spaces where we spend our time.
We engage with the world, and with the rise of the abundance of information provided by digital mediums, we have evolved into machines that only partly filter their inputs and magnify all of their outputs. The conviction that we are informed is ever-growing.
Swimming in this ocean of information, and wanting to make sense of it and draw logical, meaningful conclusions about the state of the world, we fall into patterns in order to cope with all this abundance and complexity.
These patterns of thinking and behavior give us the impression that we are well informed when in fact we are not. The tricky part is becoming aware of these patterns, the environments where they flourish, and how they affect us.
This is not to say that digital spaces or the Internet are inherently bad. These patterns exist out in the real world as well; being online simply makes it easier for them to flourish. Like a virus, they exist everywhere but thrive and become more infectious in certain places under the right conditions.
The problem is that the more informed you want to be, the more exposed to all of this you will be, and the more well-read you are, the more susceptible you are to the traps mentioned. No one is immune. And once you are in one of these bubbles, you can’t know that you are in one. This is due to our fallibility as humans and our inability to recognize it. The solution lies partly in accepting that fallibility.
But what are all these environments, and just how are we fooled by our own minds?
I’m going to lay some of them out here.
If you are well-educated, chances are that you are all about the facts. Whatever the subject of debate, you want the peer-reviewed study, the professor who wrote that article, or the well-trusted media source your university professor recommended. All this is fine. It makes sense to say “I will trust the experts” since they are better informed than me.
(Defining an expert: not a person who is always right, and not someone you should trust blindly. Experts are just better curators of the findings that a complex field produces.)
But the problem is that all of this is based on trust, or faith. You have peer-reviewed studies coming from two or more opposing sides, all of them produced by experts. In the end, your opinion, if rushed, will be formed based on either emotion or blind trust. “I will trust the experts” is an emotional way of getting informed. Experts are not to be blindly believed, but they are certainly to be listened to. They too, like all of us, are vulnerable to Gell-Mann amnesia. Otherwise you end up having put experts on a pedestal and treating them as gods, which carries quite a religious connotation.
“For every PhD, there is an equal and opposite PhD.”
In matters of law & policy, anyone can find a subject-matter expert who supports their view, because having a PhD doesn’t necessarily make someone right, it often just makes them more skilled at being wrong.
— Gibson’s Law, Gurwinder (@G_S_Bhogal), February 11, 2022
(The above tweet is part of Gurwinder’s mega-thread. High-value alert!⚠️)
Years ago, experts were talking about the food pyramid and advocating strongly for it. It was perceived as settled science and was the way to go if you wanted to be healthy. The problem is that it was never really true, with little science backing it up. This is not to say they are all liars, but to realize that experts, too, can be wrong and change their minds.
Every single expert you trust was once totally ignorant, only to learn and improve gradually. Ideas and knowledge don’t grow in a cocoon. At some point, all the experts were talking without knowing, and even when they do know, they are still ignorant when it comes to creating new knowledge. The battlefield of ideas is where new lands are discovered, and you must be prepared to make a lot of mistakes.
All knowledge starts as conjecture and is tested through criticism; it is not prophesied by higher entities or written in some textbook. A theory doesn’t need any special credentials to enter the scientific realm, but once it’s in, it should be ruthlessly criticized.
Quite often, true enough is good enough. Quite often, less wrong is the best we have at the moment.
Assign Me an Opinion
Having a solid opinion does not happen overnight. It takes years and a lot of thought. Often in academia, the experts are not necessarily the best communicators, because it is not their job to find an easy and catchy way to explain a complex topic to the masses (if such a thing can even be done). Their job is to come up with new ideas where possible, conduct research, and criticize ideas so the best ones win. That is how science makes progress.
The masses want an opinion, but they have neither the time to form one nor the inclination to acquire all the necessary knowledge. Furthermore, one’s opinions have become the online world’s native way of valuing a person’s character, making everyone a customer of the opinion-making market: the media.
(If you want more, this piece from Gurwinder brilliantly describes the situation)
The link between these two worlds, the public and the experts, is the commentators, and the platforms where they are active are the ones that get hijacked the most. Commentators take sides on ideological issues and twist information until a compelling narrative is formed and served to you, the critical thinker.
If you don’t have some kind of expertise in a field and you have a strong opinion, then it is likely assigned to you rather than organically formed. Once assigned, you are redirected to the proper echo chamber.
A large part of our opinion is shaped by listening to and reading people with whom we agree and are ideologically aligned. In the world of social media, where echo chambers shine, we follow people who think like us, who reinforce our misconceptions, and who tell us what we wish were true. It is not uncommon for people to publicly criticize their peers for following so-called controversial figures on social media or for being exposed to different opinions. To follow means you like it and thus automatically endorse it.
If an echo chamber gets organized, it is often called a mob. We have yet to realize how much we are affected by this, since it creates the safest environment for us: our opinions are never challenged, only further affirmed. Anyone who even raises a question asking for nuance is part of the problem, and if a member of the group raises concerns about the group’s practices, they are accused of being a traitor, all the while you continue to live in a prison so perfect you don’t even know you are locked up.
Confirmation bias is interpreting information in a way that supports existing beliefs.
If you start searching for just about anything, looking to arrive at a certain conclusion, you will surely find it. You can check facts all day long only for your bias to keep growing, since we — usually unintentionally — ignore inconsistent information.
I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do — Charlie Munger
Under the hood, what really happens is that we recognize patterns that neatly fit into our framework of thinking, so that what we read conveniently aligns with what we already believe.
We can’t process everything on the spot every time we encounter new information; we don’t have the time for that. So we are forced to engage in this pattern recognition, which is a strength but also our biggest defect.
Pattern recognition will help you call the police when you hear a weird noise in the middle of the night, which is good; weird noises at night are suspicious. Conversely, pattern recognition can make you think that everyone of a certain ethnicity is a criminal because of three unfortunate events, making you a racist, which is bad.
Cognitive dissonance is the feeling of discomfort when real-world evidence conflicts with existing beliefs.
Ideally, beliefs are then revised to fit the evidence; too often, the evidence is reinterpreted to remove the discomfort instead.
Cognitive dissonance is a natural part of rethinking and gaining new knowledge. The problem is when, instead of reevaluating, we take a defensive stance and double down, even to the extent of fabricating data or resorting to personal attacks. Yes, reactive devaluation is real.
When we are experiencing cognitive dissonance we are demolishing a part of what we thought was true. It’s not a pleasant experience and no one wants to do it.
Each opinion of ours is like a brick in a building, each additional brick placed atop the ones laid before. Changing opinions or reevaluating is so hard because we don’t want to tear down part of that building.
“Our opinions are like bricks in masonry; each supports & is supported by others. Changing a belief means tearing down all beliefs atop it. Such demolition is hard to bear (easier to live with a skewed building) so people will rarely let that 1 brick budge.”
— Belief Perseverance, Gurwinder (@G_S_Bhogal), February 11, 2022
(The above tweet is part of Gurwinder’s mega-thread. High-value alert!⚠️)
In her book The Scout Mindset, Julia Galef describes motivated reasoning as the phenomenon of believing what is convenient. It is far easier to accept as true what seems true on the surface than to make the effort to do the research without wanting a particular result at the end. It is very hard to sit with not knowing for a while and suspend judgment, rather than trying to prove your point without being aware of the whole picture and filling in the blanks with a convenient narrative.
This kind of thinking, as it is beautifully described in the book, is called the soldier mindset: you pick a side and then defend it at all costs. Collecting data and conducting research to build a more accurate map of understanding, and only then forming an opinion, is called the scout mindset, as the book’s title suggests.
With motivated reasoning, we believe what is convenient, since it saves us the hassle of looking at the evidence and considering edge cases.
One Screen, Two Movies
Oftentimes we see what we want to see, partly out of motivated reasoning or blind conviction. This typically happens with the news, for example: you have a story, and people conclude different things because the facts are passed through an ideological or religious filter. This only adds fogginess. A seemingly random event like an asteroid might seem merely random to a non-religious person, whereas for a believer it might carry great significance. If a certain law is about to be passed by the government, both the right and the left will conclude different things about its application and effect. The event is the same, yet two different worlds unfold between the ears of the individuals.
The result is that seemingly straightforward events get twisted even without malicious intent.
One screen, two movies is a term coined by Scott Adams, the creator of Dilbert.
Believing is Seeing/Groupthink
We don’t see or read about an event and then rationally interpret it. Rather, our beliefs shape the lens through which we interpret it.
Being part of an organized ideological group is a guarantee that your own judgment and opinions are going to be diluted. Conforming to the supposed universal truths the group espouses is a one-way street if you want to remain part of it.
The result of this groupthink is following a set of rules while witch-hunting everyone who diverges even slightly from the doctrine. Your beliefs are shaped by your ideology’s template without your even realizing it: a package of beliefs on how to act and what to say.
If you see people marching in the street for some cause, they almost seem like fine-tuned robots executing commands. Likewise, if you have ever talked to a friend fully immersed in a political ideology, they never seem to listen to a different opinion, since they are part of a group, and groups are notorious for never admitting they are wrong. The individual’s mind gets destroyed so the ideology survives.
You might wonder why all of the above matters. Well, the bubble is moving, it is well sealed, and we are in it. If we become what we think, then how we think controls what we become.
Whether your model of getting informed is right or wrong, you always have a way to improve further and become a bit more aware of all the aforementioned situations. Being a soldier and feeling attacked when your ideas are challenged won’t get you far. If you want to be informed in the truest way, you will embrace the complexity, value your ignorance, and allow thoughts to sink in before forming an opinion, if any, so that all the above fallacies can be diminished. At least partially.