One basic human characteristic is tribalism. Humans have a kind of ingrained fear or distrust of the "out-group." It's a previously adaptive trait that binds small groups of individuals together and prevents them from wandering off or joining other groups. But it also leads to ethnocentrism and divisions between groups. For example, medical studies show that while oxytocin may increase feelings of trust between members of a group, it also increases fear and distrust of outsiders. This characteristic was obviously important back when we lived in family clans or tribal arrangements, but today it leads to all sorts of social problems, including racism, prejudice, and our inability to empathize with people we don't immediately know.
The human brain is capable of roughly 10^16 processes per second, which makes it far more powerful than any computer currently in existence. But that speed of thought comes with glitches that cause us to make questionable decisions and draw wrong conclusions. These mistakes are a consequence of our limited intelligence and hard-wired
tendencies. Examples include the confirmation bias (we love to agree
with people who agree with us), the gambler's fallacy (the tremendous
weight we tend to put on previous events that aren't causal factors),
our tendency to neglect or misjudge probability, and the status-quo bias
(we often make choices that guarantee that things remain the same).
Some of these are adaptive traits, but others are simply cognitive
deficiencies.
Confirmation bias is an easy one to understand. We love to agree with people who agree with us. It's why we only visit
websites that express our political opinions, and why we mostly hang
around people who hold similar views and tastes. We tend to be put off
by individuals, groups, and news sources that make us feel uncomfortable
or insecure about our views, a discomfort the social psychologist Leon Festinger called "cognitive dissonance."
It's this preferential mode of behavior that leads to the confirmation
bias — the often unconscious act of referencing only those perspectives
that fuel our pre-existing views, while at the same time ignoring or
dismissing opinions — no matter how valid — that threaten our world
view. And paradoxically, the internet has only made this tendency even
worse. As a result, people are less and less inclined to listen to other people's views or even to think about reaching compromises because somewhere on the net they can find support for their positions.
Somewhat similar to the confirmation bias is the in-group bias, a manifestation of our innate tribalistic tendencies. Part of this effect may have to do with oxytocin, the so-called "love molecule." This neurotransmitter helps forge bonds between people in our in-group, but it performs the exact opposite function for those outside the group, making us suspicious of, fearful of, and even disdainful toward other people. Ultimately, the in-group bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don't really know.
The Gambler's Fallacy is a glitch in our thinking. We tend
to put a tremendous amount of weight on previous events, believing that
they'll somehow influence future outcomes. The classic example is
coin-tossing. After flipping heads, say, five consecutive times, our
inclination is to predict an increase in likelihood that the next coin toss will be tails, that the odds must now surely favor tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent, and the probability of any outcome is still 50%. The fallacy gives a person the false notion that events aren't random, that one can somehow ride the dice rolls to victory. That feeling of positive expectation often fuels gambling addictions, along with the sense that our luck has to change eventually and that good fortune is on the way. It also contributes to the "hot hand" misconception. Similarly, it's the same feeling we get when we start a new relationship and convince ourselves it will be better than the last one.
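To make the independence point concrete, here is a minimal Python sketch (an illustration, not something from the original piece) that simulates a million coin flips and looks at the outcome immediately following every run of five straight heads. If past flips really influenced the next one, the share of heads after such a streak would drift away from 50%.

```python
import random

# Simulate many coin flips and check what happens right after a streak
# of five consecutive heads. If the gambler's fallacy were true, tails
# would be "due" and would show up more than half the time. It doesn't.
random.seed(42)

flips = [random.choice("HT") for _ in range(1_000_000)]

after_streak = [
    flips[i + 5]
    for i in range(len(flips) - 5)
    if flips[i:i + 5] == ["H"] * 5
]

heads_rate = after_streak.count("H") / len(after_streak)
print(f"Flips following five straight heads: {len(after_streak)}")
print(f"Share of those that came up heads:   {heads_rate:.3f}")  # ~0.500
```

Run it and the share hovers right around 0.5: the coin has no memory, no matter how strongly our intuition insists otherwise.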
Rationalization is the process of glossing over one's mistakes in order to justify a prior decision. For example, if you bought a totally unnecessary, faulty, or overly expensive item, you tend to rationalize the purchase to such an extent that you convince yourself it was a great idea all along. It's a mental mechanism that makes us feel better after we make crappy decisions, especially at
the cash register. Also known as "Buyer's Stockholm Syndrome," it's a way
of subconsciously justifying our purchases — especially expensive ones.
Social psychologists say it stems from the principle of commitment, our
psychological desire to stay consistent and avoid a state of cognitive dissonance.
Neglecting probability is another strange twist in the brain's analytical function. Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, feels like a wholly unnatural and hazardous activity. Yet virtually all of us know and acknowledge that the probability of dying in an auto accident is significantly greater (a 1 in 84 chance) than of getting killed in a plane crash (1 in 12,000), but our brains won't release us from this crystal-clear logic. It's the same phenomenon that makes us worry about getting killed in an act of terrorism rather than by something far more probable, like falling down the stairs or accidental poisoning. It's our inability to properly grasp peril and risk, which often leads us to overstate the risks of relatively harmless activities while understating the risks of genuinely dangerous ones.
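Taking the odds quoted above at face value (they are the figures used in the text, not independently verified), a quick back-of-the-envelope calculation in Python shows just how lopsided the two risks actually are:

```python
# Back-of-the-envelope comparison using the odds quoted in the text.
p_car = 1 / 84        # stated odds of dying in an auto accident
p_plane = 1 / 12_000  # stated odds of dying in a plane crash

print(f"Car:   {p_car:.5f}")
print(f"Plane: {p_plane:.5f}")
print(f"Driving is roughly {p_car / p_plane:.0f}x more likely to kill you.")
```

By these numbers, driving is roughly 140 times more likely to kill you than flying, yet the dread usually runs in the opposite direction.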
Observational selection bias is the effect of suddenly noticing things we didn't notice much before and then wrongly assuming that their frequency has increased. A
perfect example is what happens after we buy a new car and we
inexplicably start to see the same car virtually everywhere. A
similar effect happens to pregnant women who suddenly notice a lot of
other pregnant women around them. Or it could be a unique number or
song. It's not that these things are appearing more frequently, it's
that we've (for whatever reason) selected the item in our mind, and in
turn, are noticing it more often. Trouble is, most people don't
recognize this as a selection bias, and actually believe these items
or events are happening with increased frequency — which can be a very
disconcerting feeling. It's also a cognitive bias that contributes to
the feeling that the appearance of certain things or events couldn't
possibly be a coincidence (even though it is).
Status quo bias is an inherent attachment to the security of one's current situation. We humans tend to be apprehensive of change, which often leads us to make choices that guarantee that things remain the same, or change as little as possible. Needless to say, this has ramifications in everything from politics to economics. We like to stick to our routines, political parties, and our favorite meals at restaurants. Part of the perniciousness of this bias is the unwarranted assumption that another choice will be inferior or make things worse. The status quo bias can be summed up with the saying, "If it ain't broke, don't fix it," an adage that fuels our conservative tendencies.
Negativity bias is the tendency to fixate on negative things. People tend to pay more attention to bad news — and it's not just
because we're morbid. Social scientists theorize that it's on account of
our selective attention and that, given the choice, we perceive
negative news as being more important or profound. We also tend to give
more credibility to bad news, perhaps because we're suspicious (or
bored) of proclamations to the contrary. From an evolutionary standpoint, heeding
bad news may be more adaptive than ignoring good news (e.g. "saber tooth
tigers suck" vs. "this berry tastes good"). Today, we run the risk of
dwelling on negativity at the expense of genuinely good news. If presented with studies showing that crime, violence, war, and other injustices are steadily declining, most people would argue that things are getting worse, a perfect example of the negativity bias at work.
The bandwagon effect is the principle that we gain acceptance by going along with the consensus. Though we're often unconscious of it, we love to go with the flow of
the crowd. When the masses start to pick a winner or a favorite, that's
when our individualized brains start to shut down and enter into a kind
of "group think" or "hive mind" mentality. But it doesn't have to be a large
crowd or the whims of an entire nation; it can include small groups,
like a family or even a small group of office co-workers. The bandwagon
effect is what often causes behaviors, social norms, and memes to
propagate among groups of individuals — regardless of the evidence or
motives in support. This is why opinion polls are often maligned, as
they can steer the perspectives of individuals accordingly. Much of this
bias has to do with our built-in desire to fit in and conform.
Projection bias shapes the way individuals look at the world around them. As individuals trapped inside our own minds 24/7, it's often difficult for us to project outside the bounds of our own consciousness and preferences. We tend to assume that most people think just like us, though there may be no justification for it. This cognitive shortcoming often leads to a related effect known as the "false consensus bias," in which we tend to believe that people not only think like us, but that they also agree with us. It's a bias in which we overestimate how typical and normal we are, and assume that a consensus exists on matters where there may be none. It can also lead members of a radical or fringe group to assume that more people on the outside agree with them than is actually the case. Likewise, it can create exaggerated confidence in predicting the winner of a sports match or an election.
Current moment bias is the tendency to live in the present moment. We humans have a really hard time imagining ourselves in the future and
altering our behaviors and expectations accordingly. Most of us would
rather experience pleasure in the current moment, while leaving the pain
for later. This is a bias that is of particular concern to economists
(i.e. our unwillingness to curb our spending and save money) and health
practitioners. In a 1998 study, when making food choices for the coming week, 74% of participants
chose fruit. But when the food choice was for the current day, 70%
chose chocolate.
The anchoring effect, also known as the "relativity trap," is the tendency we have to compare and
contrast only a limited set of items. We tend to fixate on a value or number that in turn gets
compared to everything else. The classic example is an item at the store
that's on sale; we tend to see (and value) the difference in price, but
not the overall price itself. This is why some restaurant menus feature
very expensive entrees, while also including more (apparently)
reasonably priced ones. It's also why, when given a choice, we tend to pick the middle option: not too expensive, and not too cheap. But that line of thinking doesn't make the final choice a logically better one.
It is not perfectly clear why these biases and effects are so ingrained in human behavior. Some sociologists believe they may be leftover defense mechanisms from the hunter-gatherer period of human development. Others believe that many of these traits are behaviors learned from one's environment. A child raised by a single parent on welfare who does not work or try to better herself, for example, might internalize a live-for-the-moment outlook or accept her lot in life more quickly than a child in a middle-class family where goals and change are taught on a daily basis.
If you re-read the above list, your mind can flash to various characters who displayed those biases or traits during the series. It is like a mental popcorn maker. It also helps explain why so many of the main characters never could change their paths.