What are my biased thinking patterns?

Being too critical

Jurij Fedorov
15 min read · Jun 11, 2020

I’m constantly thinking about how other people miscalculate things; spotting those failures has become my area of expertise. So once in a while I need to write about something else to get out of those negative thinking patterns. Being critical is great, but it’s a negative state I don’t want to be in constantly.

This led me to think about how few academics openly admit to their biases, and how I could at least be open about my own to make the world a tiny bit more transparent. It would be better if all thinkers were fair and transparent. And thinking about my own thinking for a change makes me calm instead of irritated.

What if you are not ideological?

The second reason for looking at my biases is that I’m not really a person who joins single-focus groups to fight other groups. I’m critical of all ideologies and don’t belong to any one group or culture. I’m as critical of my own groups as I am of most outside groups. So I tend to be critical of everything and not develop one set of rigid ideological in-group biases.

Such rationalist thinkers without a group often don’t have huge ideological biases forcing worldviews onto them. Yet rationalists are still biased in what they focus on, what they remember about a topic, what they assume is basic fairness, and what quick critical calculations they do in their head when looking into an issue. They don’t lie and deceive for selfish or ideological reasons, and they are not afraid to accept new evidence or reject old evidence given time. But no one can know everything, even without a clear ideological bias. So clearly I have biases too! They are just harder to describe, as I can’t simply say that my thinking is communist, fascist, libertarian or progressive. It’s a bit of everything, as my morals are a mix of various ideas and concepts that constantly adapt and change to new evidence.

I’ll go over social science and news article examples and show how my focused thinking may lead to wrong conclusions. I always end up getting to the truth at some point, but my initial thinking may cause me to overlook facts or lead me astray, because the fast heuristics I use to throw out bad evidence sometimes fail and throw out good evidence too.

How I think

It’s hard to clearly describe how I think in a short blog post, but hopefully my explanation will be clear by the end. It’s a 2-step thinking manual I apply automatically. I will go over the 2 steps a bit further down.

First I spot new information. In the first seconds of looking at new info my brain stays biased, as I have not yet had time to think over alternative explanations for the story. Then I apply logic and try to remember what science and history I have read on the topic to see whether this new info is realistic. It’s not ideal: in my field, psychology, you may have read 5 books on a topic and 10 years later none of the studies the books were based on can be replicated successfully. This means that everything you know about an area may be false, and using past knowledge to explain new info may just lead you further into the ignorant darkness. But what can you do besides staying critical?

I try to control all the info I see and think about it without passion or emotion. But initially I do need to keep my emotional state activated to stay motivated and engaged with what I study. So I let some emotions and jokes get to the surface at times.

Here is my 2-step line of thinking:

Step 1

In the first 20 seconds I mostly apply fast instinctual heuristics. So any news article will be emotionally engaging and make me feel something. A death will make me feel initial sadness. A vandal will make me feel initial anger. Any fact will be judged in the moment. With studies I’m much less emotional but still let my engagement lead me somewhere.

Step 2 — morals

As a social scientist I can never just react after reading an emotional story. If emotions are what you react to, you are not a critical thinker, and your ideas and concepts are not worth anything in social science. You need to think deeper. So this step is needed if the story is an emotional anecdote or a social policy opinion.

This is where I apply my logical, simplified “society morals”. They are close to being automatic, but slower and less dependable than inborn instincts, as I had to learn them from the ground up as an adult to solve logical societal moral issues. I picked them up because they seemed to me like universally fair morals. Among these “social science moral thinking” thought patterns are: personal liberty, utilitarianism and the veil of ignorance. I will explain them later. They are very crude tools and seldom solve detailed or small problems, but they are great at “solving” most of the logical moral problems one encounters. As news stories are often written to create irrational shock and anger, logical moral thinking becomes essential to escape that trickery.

For example, when I see a police shooting story I may be repulsed for a few minutes, but then I see the episode from a greater societal, top-down perspective and hopefully figure out that the shooting was not easy to avoid because of how criminals act towards the police in that city. The police may be on edge and fear for their lives. This means that what at first glance seems horrible and evil may have been less so in real life if you consider the larger circumstances instead of only the local ones.

If the step 2 logic is applied, it also means that I avoid most emotional debates, as I get over the emotional phase faster than people who don’t have the same need to be critical. I plan and “solve” nearly all moral issues in my head, even as an extremely emotional person. Often I remain angry, but 90% of the time my anger turns towards the real problem behind the issue or fades away. This top-down thinking also means that single sad anecdotes don’t interest me as much as the larger issues causing the problem. This may make me seem cold and calculating. Holistic and ideological thinkers may see this as a big problem or even a moral fallacy, as they may feel the need to use their emotional state to get to the core of issues. I did that too when I started off in psychology. But in my experience, and in my opinion, it’s a lousy way to think about academic topics, as confirmation bias will completely dominate your thinking.

Step 2 — logic

While I’m applying my step 2 moral thinking to figure out what to think about the case morally, I am at the very same time trying to figure out the details of the case and how valid the story is. Call it step 2, point 2. This is basic rational thinking that many programmers and STEM people are uniquely good at. Most intellectuals are not as good at large-scale moral thinking, but that isn’t essential in most real-life settings anyhow, unless you are a manager.

My own critical thinking is so strong that it even becomes a bias, as I tend not to believe things unless there is plenty of corroborating evidence. You learn to think this way in social science because you can easily find a study to support any type of claim. If you relax too much you let confirmation bias dominate, in which case you can easily “prove” anything in social science. This means that I question even very simple and obvious things at times. Are we sure good teachers are better than just okay teachers? Most people would just believe it is so without questioning it, because it feels correct. I feel it’s true too. But logically I don’t fully support that claim, because I just have not seen convincing proof for that hypothesis. So it may feel true, but if you can’t prove it you probably ought to assume the effect is at most very small. If the effect were consistent and large it would have been uncovered many years ago by the millions of researchers studying this stuff. We would have figured out how to use it and apply it to schools to better the world and outcompete other schools and countries. Yet this hasn’t happened in any systematic way. Lack of proof is often the best evidence we have in social science. But it doesn’t conclusively prove the effect is not there.

This is how I’m critical of news articles:

Check facts. Are the facts clearly supported by evidence? Are the statements made by people who would suffer greatly if caught in a lie, or by, say, the mother of the victim, who is emotional and biased? Some smaller pop papers or low-quality papers don’t mind getting stuff wrong at times either, as they don’t have any noble image to uphold. Besides that, I always look at what sources the articles use. Anonymous sources are pretty iffy to me. No source for the data used is a huge red flag too.

Bias. What bias does the paper and the writer have? The bias of the story will influence what info they include or exclude, so make sure you avoid any biased site! MediaBiasFactCheck.com will tell you how a news company is biased politically. Make sure to avoid all sites that are not near the middle politically. If they are very ideological they will mislead you daily, even if you agree with their political view.

Also, beware of biased fact-checkers and bias checkers. Ad Fontes Media, another media bias site, had CNN in the middle politically. Most people know that CNN is less left-leaning than MSNBC, but from that ranking they may conclude that MSNBC is just slightly left-biased and that CNN is centrist. From the site’s own ideological point of view it does seem that way.

I avoid reading most stuff from pop sites such as The Guardian, The Independent, Huffington Post, Vox, Breitbart or other lower-tier news sites with great ideological bias. Some of these sites are better than others. Vox, for example, has a good intro to the g factor. But I’ve seen them all slightly manipulate info so many times that I just avoid clicking a link if it leads to those sites. They get stories right too, of course, but why waste time figuring out the manipulation in each article? Besides the sites I mentioned, there are surely many other conservative sites that are very iffy too. I just don’t know about them, as I don’t read conservative news unless I plan to seek out a specific point of view, while the bad left-leaning sites I mentioned are sites I constantly see online and am constantly let down by.

I think this reveals my biggest “bias” by far. 60% of what I read is left-leaning news or left-leaning academia. Only 20% is fairly neutral. It’s not that I try to make it this way; it’s just that most material in my field is left-leaning, so it would be very hard for me to avoid reading texts with this bias. But I work hard at keeping my critical state high. It’s not like in university, where teachers told me some writer was evil and I’d just mostly avoid the writer and their whole field. I know how to spot ideological fights between groups now. I know that sometimes a researcher is called evil not because of bad science but because of ideological dislike of them.

I also mostly avoid news without a conclusion to the story, because it’s misleading at best. If there is no conclusion, the news sites guess a lot and just estimate what has happened and what will happen. How often are they wrong? No one knows, because they don’t apologize for guessing wrong. They only apologize for getting facts wrong, and only if someone notices it. So stories without a conclusion will be misleading and extra ideological, as they leave more room for guesswork, which is often guided by confirmation bias.

As stated earlier, being critical is second nature to me. I have even dismissed big rumors for lack of evidence that later turned out to be true. Some may pick another level to be critical at. Just be aware that if you lower your level of criticality, you will start to apply your ideological bias to things as you allow yourself to think more emotionally about them. Guesswork invites confirmation bias.

The 3 moral foundations I apply to news stories

I do have some ideological biases that initially make me perceive something as moral or immoral. I have gained these ideological foundations from reading modern moral philosophy. So I don’t develop this thinking to solve personal problems, but rather to understand how to effectively and fairly run a society in a way that can make most people happy. Obviously there is more detailed thinking to leadership than this, but the 3 checkboxes are great for spotting whether a story or value would feel fair to most people. For me it comes down to good leadership. If you don’t have any moral voice you can’t lead people while being fair.

So let’s get into: personal liberty, utilitarianism and the veil of ignorance.

Personal liberty

Personal liberty is the freedom to move about in society, free speech, the right to mingle with any group of people, the right to start a company, and the right to buy and sell the products you want. As long as you don’t hurt anyone you should more or less be allowed to do as you please.

I subscribe to this line of thinking because I want the government to keep the population free, and having this as a basic philosophy makes me see most of the cases where the state oversteps its boundaries. I just don’t trust the state, or any other monopoly, with too much power. Lots of small powers in a country create a fair power balance, as they make sure no one can dominate and break the moral laws of personal liberty. This moral right keeps groups from taking away your liberty, which is why most people subscribe to this moral doctrine to some degree.

How the assumption fails

Most of academia is left-leaning and often sees free speech and other personal liberties as less important than the satisfaction level of citizens assumed to be marginalized. So academics may not support free speech if a small, poor group says that some speech makes them sad or mad.

It also means that I may support factual evidence that is seen as an antithesis to a worldview in which social science is supposed to help specific people rather than just inform with facts. In this line of thinking, people want to morally control each other to achieve a given moral standpoint. They thereby decide with great authority what the lay population should know about the world. By that standard I am biased, as I don’t pick which facts to accept based on their immediate value to marginalized groups in society. It’s a long discussion about what science is and should be. But I do fail if you use those moral criteria.

Furthermore, personal liberty is a right only insofar as it stays personal. Owning an atomic bomb, for example, may not be advisable because it’s too dangerous for too many people, even if you don’t plan to use it.

Utilitarianism

Another one of my ideological foundations for fair politics is utilitarianism, known from John Stuart Mill’s writing.

Definition of the term utilitarianism:

Though there are many varieties of the view discussed, utilitarianism is generally held to be the view that the morally right action is the action that produces the most good.

https://www.welovephilosophy.com/2012/11/26/trolley-problems/

When governing top-down as a state or a manager, one should favor decisions that help the most people while hurting the fewest. So shutting down a shoe factory may make a noise-sensitive, complaining neighbor happy. But since it would cause 200 people to lose their jobs, the unhappiness it causes is greater than the happiness the neighbor would feel.

Hence my initial moral interest would be to keep the factory open and maybe look at other solutions for the angry neighbor problem.
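
To make the shoe factory comparison concrete, here is a toy sketch of the kind of sum utilitarianism asks for. It is only an illustration of the reasoning above; the welfare numbers are invented and not taken from any real case.

# Toy utilitarian comparison for the shoe factory example.
# All welfare numbers below are made up for illustration.

def total_utility(welfare_changes):
    # Sum the welfare change of everyone affected by one option.
    return sum(welfare_changes.values())

close_factory = {
    "neighbor": +5,            # a quieter street
    "200 workers": 200 * -10,  # lost jobs
}

keep_factory_open = {
    "neighbor": -1,            # still some noise
    "200 workers": 200 * 0,    # jobs kept
}

print(total_utility(close_factory))      # -1995
print(total_utility(keep_factory_open))  # -1
# Keeping the factory open causes far less total unhappiness.

The point is simply that the decision comes from comparing the sums, not from reacting to whoever complains the loudest.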

How the assumption fails

I’m not sure utilitarianism ever changed my opinion on a scientific finding but it does guide my interest to read some research over other research. So in that way it biases me academically.

Utilitarianism itself, as a philosophy, doesn’t always solve moral group problems, because there is no perfect way to measure or compare emotional problems. One person may think that keeping people employed is what matters most, while another may think that we should protect nature at all costs even if it required 40% of the workforce to lose their jobs. There is no factual right or wrong here, even though causing mass unemployment would cause other problems and maybe even cause greater destruction of nature long-term.

The doctrine also fails because you can’t cleanly compare the great suffering of one person to the slight happiness of the group overall. The classic moral example is a town locking a single girl into a cellar for the rest of her life. Her suffering magically makes the whole town healthy and happy. Is this still morally correct? Should the town keep her down there to sustain the maximum amount of happiness in the town overall? Utilitarianism by itself doesn’t seem to make sense without the moral foundation of personal liberty. It’s too inflexible, just as personal liberty may be extreme by itself. Should you be allowed to own an atomic bomb if you don’t plan to use it? Probably not, says utilitarianism.
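
The cellar example can be written in the same toy style, which shows exactly where the naive sum goes wrong. Again, every number here is invented.

# Where pure summation fails: the girl in the cellar.
# Invented numbers; the point is the shape of the calculation.

lock_her_in = {
    "the girl": -1_000,          # extreme suffering for one person
    "townspeople": 10_000 * +1,  # a small gain for each of 10,000 people
}

print(sum(lock_her_in.values()))  # 9000, so the naive sum says "do it"
# A personal liberty constraint is what rules this out,
# no matter how large the positive total gets.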

Veil of ignorance

John Rawls is the thinker most associated with this moral concept, though earlier thinkers such as John Harsanyi explored similar ideas.

The standard illustration of the concept is captioned like this:

Symbolic depiction of Rawls’s veil of ignorance. The citizens making the choices about their society make them from an “original position” of equality and ignorance, without knowing what gender, race, abilities, tastes, wealth, or position in society they will have. Rawls claims this ensures they will choose a just society.

A state is not supposed to support one specific group for any reason. Any individual, in any walk of life, needs to have the same liberties and laws applied to them from up top. So you can’t create an affirmative action system that only supports a specific sex or race, as that would break the veil of ignorance doctrine. These state laws usually apply to hiring rules too, though there is likely no country entirely without affirmative action, as it’s an easy way for a politician to gain points with some minority group.

The doctrine means that regular people coming in for a job interview need to be able to speak up and show who they are as individuals, so that they are not punished for what their group overall does. The doctrine implies that you shouldn’t think: “Hmm, the last 3 Vietnamese guys I met were all lazy. So I won’t hire this Vietnamese guy even though everyone says he is a hard worker.”

If I notice that a woman has written an article, I don’t judge it better or worse based on that fact alone. Some may see that as “unfair”, as I don’t take lived experience into account. But it’s often less biased to describe a group’s behavior as an outsider. So a man, or a woman with no female friends, would often be better able to write about women in a critical light. Though if you also want strong emotional anecdotes in the story, for the entertainment factor, you go for personal experience.

How the assumption fails

As a general rule it works very well for creating a fair government. But in private industry you need to pick winners and losers without perfect information to get by.

If you cannot study the individual, then the group will be all you have to go on. You may want to employ a random man, rather than a random woman, for a job requiring a strong individual. Or you may need a nurturing individual and would be better off with a random woman. In such extreme hypothetical cases the veil of ignorance is dropped completely.

So because we do need to make fast and profitable choices, we cannot personally apply a 100% veil of ignorance in all cases. You do need to use group stats to make decisions for an area of a city, for example. But try to avoid doing it for individuals you meet.
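
Here is a minimal sketch of that last point, with invented rates and scores: group statistics are only a fallback for when no individual information is available, and individual information overrides them the moment you have it.

# Group stats as a fallback, individual evidence as an override.
# The numbers below are invented for illustration.

def estimated_fit(group_base_rate, individual_evidence=None):
    # With no way to study the person, the base rate is all you have.
    if individual_evidence is None:
        return group_base_rate
    # In an interview you can judge the individual directly.
    return individual_evidence

# City-area decision, no individuals to study:
print(estimated_fit(group_base_rate=0.30))                            # 0.30

# Job interview, individual evidence available:
print(estimated_fit(group_base_rate=0.30, individual_evidence=0.85))  # 0.85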

Many progressives think that being race-blind because of the veil of ignorance is itself a moral fallacy, as you then cannot support or punish individuals from groups that deserve to be helped or punished for various historical reasons. This emotional argument proposes that race-blind people like me are racist.

Conclusion

I wanted to understand how I think because I often hear the argument that “all people have biases”. I wanted to explore my own thinking in depth to understand how the statement applies to a rational thinker. I also wanted to be open about it, as maybe one or two readers will read this blog post and feel that it’s a good idea to reveal their biases to the world too. Obviously, revealing my biases 10 years ago would have made this blog post 20 times as long, but as you become more rational the biases become less ideological and instead mostly consist of singular calculation errors.

I have developed a few arbitrary heuristics for spotting right from wrong, morally and factually, in systematic and fast ways. And clearly, as all logical tools tend to do, my own tools fail a lot. But overall I feel that they help me avoid nearly all big biases and ideological traps, and most fake news.

Sometimes my thinking makes me overlook glaring errors or see something as true that is actually false: false negatives and false positives. That is why it’s always important to keep improving how you think about the world. We want rational minds, and we never fully reach that goal as humans, but we always try.
