Search Results for “Cognitive Bias”

March 4, 2016

Affective Forecasting


Affective forecasting (also known as the ‘hedonic forecasting mechanism’) is the prediction of one’s affect (emotional state) in the future. As a process that influences preferences, decisions, and behavior, affective forecasting is studied by both psychologists and economists, with broad applications.

Psychologist Daniel Kahneman and business school professor Jackie Snell began research on hedonic forecasting in the early 1990s, examining its impact on decision making. The term ‘affective forecasting’ was later coined by psychologists Timothy Wilson and Daniel Gilbert. Early research focused solely on measuring emotional forecasts, while subsequent studies examined accuracy, revealing that people are surprisingly poor judges of their future emotional states. For example, in predicting how events like winning the lottery might affect their happiness, people are likely to overestimate future positive feelings, ignoring the numerous other factors that might contribute to their emotional state outside of the single lottery event.

read more »

October 1, 2015

Doctor Fox Effect

[image: WC Fields by Nick Reekie]

The Dr. Fox effect states that even experts will be fooled by a nonsensical lecture if it is delivered with warmth, liveliness, and humor. A 1980 study found that the perceived prestige of research is increased by a needlessly obscure writing style, with perceived research competence positively correlated with reading difficulty.

The original experiment was conducted at the USC School of Medicine in 1970. Two speakers gave lectures to audiences of psychiatrists and psychologists on a topic the attendees were unfamiliar with (‘Mathematical Game Theory as Applied to Physician Education’). The control group was lectured by an actual scientist; the other group heard an actor who had been given the identity ‘Dr. Myron L. Fox,’ supposedly a graduate of Albert Einstein College of Medicine.

read more »

August 3, 2015

Functional Fixedness

[image: candle problem]

[image: outside the box by Leo Cullum]

Functional fixedness is a cognitive bias that limits a person to using an object only in the way it is traditionally used. The concept originated in Gestalt psychology, which emphasizes holistic processing (e.g., ‘the whole is greater than the sum of its parts’). German-American psychologist Karl Duncker defined functional fixedness as a ‘mental block against using an object in a new way that is required to solve a problem.’ This ‘block’ limits the ability of an individual to use the components given to them to complete a task, as they cannot move past the original purpose of those components.

For example, if someone needs a paperweight but has only a hammer, they may not see how the hammer can be used as a paperweight. Functional fixedness is this inability to see a hammer’s use as anything other than pounding nails; the person cannot think to use the hammer in anything but its conventional function. When tested, five-year-old children show no signs of functional fixedness: at that age, any goal to be achieved with an object is equivalent to any other goal. By age seven, however, children have acquired the tendency to treat the originally intended purpose of an object as special.

read more »

November 14, 2014

Curse of Knowledge


[image: curse of knowledge by Igor Kopelnitsky]

The curse of knowledge is a cognitive bias that leads better-informed parties to find it extremely difficult to think about problems from the perspective of lesser-informed parties. It is related to psychologist Baruch Fischhoff’s work on the hindsight bias (the knew-it-all-along effect). In economics, the bias is studied to understand why the assumption that better-informed agents can accurately anticipate the judgments of lesser-informed agents does not hold, and to explain the finding that sales agents who are better informed about their products may in fact be at a disadvantage against other, less-informed agents. Better-informed agents, it is believed, fail to set aside the privileged knowledge they possess; thus ‘cursed,’ they are unable to sell their products at a value that more naïve agents would deem acceptable.

In one experiment, one group of subjects ‘tapped’ a well-known song on a table while another group listened and tried to identify the song. Some ‘tappers’ described a rich sensory experience in their minds as they tapped out the melody. Tappers on average estimated that 50% of listeners would identify the specific tune; in reality, only 2.5% of listeners could, a twentyfold overestimate. Related to this finding is the phenomenon experienced by players of charades: the actor may find it frustratingly hard to believe that his or her teammates keep failing to guess the secret phrase, known only to the actor, conveyed by pantomime.

September 17, 2014

Error Management Theory

[image: Johnny Bravo]

Error management theory (EMT) is an extensive theory of perception and cognitive biases created by psychologists David Buss and Martie Haselton. It describes a set of heuristics (mental shortcuts) that have survived evolutionary history because they confer slight reproductive benefits. The premise of the theory is the drive to reduce or manage costly reproductive errors. According to the theory, when the errors made under conditions of uncertainty differ in cost, selection favors ‘adaptive biases,’ which ensure that the less costly survival or reproductive error is the one committed.

When faced with uncertainty, a subject can make two possible errors: type I (a false positive, or playing it safe, e.g. treating a fire alarm as real when it later turns out to be a false alarm) and type II (a false negative, or siding with skepticism, e.g. ignoring an often-faulty fire alarm during an actual emergency). Error management theory asserts that evolved ‘mind-reading’ mechanisms will be biased toward producing more of the first type of error, which explains the ‘sexual overperception bias,’ the tendency for men to incorrectly assume that a platonic gesture from a woman is a sexual signal.
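The asymmetry can be made concrete with a toy expected-cost calculation. This is only a sketch in Python; the probability and cost numbers below are invented for illustration and are not values from the theory.

# Toy model of error management: under uncertainty, commit whichever
# error is cheaper in expectation. All numbers are illustrative.
def best_response(p_threat, cost_false_positive, cost_false_negative):
    # Expected cost of acting on the signal (risking a false positive)
    # versus ignoring it (risking a false negative).
    expected_cost_act = (1 - p_threat) * cost_false_positive
    expected_cost_ignore = p_threat * cost_false_negative
    return 'act' if expected_cost_act < expected_cost_ignore else 'ignore'

# Fire-alarm case: ignoring a real fire is far costlier than evacuating
# for nothing, so even a low-probability alarm is worth acting on.
print(best_response(p_threat=0.05, cost_false_positive=1, cost_false_negative=100))  # -> act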

read more »

September 16, 2014

Stereotypes of Blondes

[image: Gentlemen Prefer Blondes]

Blonde hair has several stereotypes associated with it. In women it has been considered attractive and desirable, but it is also associated with the negative stereotype of the woman ‘who relies on her looks rather than on intelligence.’ The latter stereotype, the ‘dumb blonde,’ is exploited in ‘blonde jokes.’ In cognitive linguistics, the stereotype uses the expressivity of words to evoke an emotional response that assigns a gender role of a certain kind. In feminist critique, stereotypes like the blonde bombshell or the ‘dumb blonde’ are seen as negative images that undermine the power of women.

Some blonde jokes rely on sexual humor to portray or stereotype their subjects as promiscuous. Many of these are rephrased ‘Valley girl’ or ‘Essex girl’ jokes. Others are based on long-running ethnic jokes, such as humor denigrating the intelligence of Polish people. Similar jokes about stereotyped minorities have circulated since the seventeenth century, with only the wording and targeted groups changed. In the 20th century, a class of meta-jokes about blondes (i.e., jokes about blonde jokes) emerged, in which a blonde complains about the unfairness of the stereotype propagated by blonde jokes, with a punch line that actually reinforces the stereotype.

read more »

August 4, 2014

Anchoring

[image: Zillion Dollar Frittata]

Anchoring or focalism is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered (the ‘anchor’) when making decisions. Once an anchor is set, other judgments are made by adjusting away from that anchor, and there is a bias toward interpreting other information around the anchor. For example, the initial price offered for a used car sets the standard for the rest of the negotiations, so that prices lower than the initial price seem more reasonable even if they are still higher than what the car is really worth.

Anchoring is also called the focusing effect (or focusing illusion) because it occurs when people place too much importance on one aspect of an event, causing an error in accurately predicting the utility of a future outcome. Individuals tend to focus on notable differences, excluding those that are less conspicuous, when making predictions about happiness or convenience.
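As a rough illustration of anchor-and-adjust, a toy Python model can treat a judgment as starting at the anchor and adjusting toward the true value but stopping short. The 0.6 adjustment rate and the prices are assumptions for illustration, not empirical estimates.

# Toy anchor-and-adjust model: estimates start at the anchor and move
# toward the true value, but the adjustment is insufficient.
def anchored_estimate(anchor, true_value, adjustment_rate=0.6):
    return anchor + adjustment_rate * (true_value - anchor)

true_worth = 8000  # what the used car is actually worth
for asking_price in (9000, 12000, 15000):
    print(asking_price, round(anchored_estimate(asking_price, true_worth)))
# 9000 -> 8400, 12000 -> 9600, 15000 -> 10800: the higher the anchor,
# the higher the final judgment, though the car's worth never changes.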

read more »

July 2, 2014

Illusion of Control

[image: placebo button]

The illusion of control is the tendency for people to overestimate their ability to control events, for instance to feel that they control outcomes over which they demonstrably have no influence. The effect was named by psychologist Ellen Langer and has been replicated in many different contexts.

It is thought to influence gambling behavior and belief in the paranormal. Along with illusory superiority (overestimating one’s positive abilities and underestimating one’s negative qualities) and optimism bias (unrealistic or comparative optimism), the illusion of control is one of the positive illusions: unrealistically favorable attitudes that people hold towards themselves or towards people close to them. Positive illusions are a form of self-deception or self-enhancement that feels good, maintains self-esteem, or staves off discomfort, at least in the short term.

read more »

April 9, 2014

Risk Perception


Risk perception is the subjective judgment that people make about the characteristics and severity of a risk. The phrase is most commonly used in reference to natural hazards and threats to the environment or health, such as nuclear power. Several theories have been proposed to explain why different people make different estimates of the dangerousness of risks. Three major families of theory have been developed: psychology approaches (heuristics and cognitive), anthropology/sociology approaches (cultural theory) and interdisciplinary approaches (social amplification of risk framework).

The study of risk perception arose out of the observation that experts and lay people often disagreed about how risky various technologies and natural hazards were. The mid-1960s saw the rapid rise of nuclear technologies and the promise of clean and safe energy. However, fears of both long-term dangers to the environment and immediate disasters creating radioactive wastelands turned the public against the new technology. The scientific and governmental communities asked why public perception ran against the use of nuclear energy when all of the scientific experts were declaring how safe it really was. The problem, from the perspective of the experts, was the difference between scientific fact and an exaggerated public perception of the dangers.

read more »

August 14, 2013

Bad Science


‘Bad Science’ is a 2008 book by British physician and science writer Ben Goldacre, criticizing mainstream media reporting on health and science issues. The book contains extended and revised versions of many of his ‘Guardian’ columns.

The book discusses topics such as detoxification products (Aqua Detox, ear candles, etc.) whose claims can easily be shown to be bogus by simple experiments, and examines the ‘detox phenomenon’ and purification rituals. Goldacre also addresses the claims made for Brain Gym, a program of specific physical exercises that its commercial promoters claim can create new pathways in the brain; the uncritical adoption of this program by sections of the British school system is derided.

read more »

June 14, 2013

Thinking, Fast and Slow

[image: Two Brains Running by David Plunkert]

Thinking, Fast and Slow is a 2011 book by Nobel Memorial Prize winner in Economics Daniel Kahneman which summarizes research that he conducted over decades, often in collaboration with cognitive scientist Amos Tversky. It covers all three phases of his career: his early work on cognitive biases (the systematic errors in judgment that people commit without realizing it), prospect theory (the tendency to base decisions on the potential value of losses and gains rather than on final outcomes), and his later work on happiness (e.g. positive psychology).

The book’s central thesis is a dichotomy between two modes of thought: System 1 is fast, instinctive, and emotional; System 2 is slower, more deliberative, and more logical. The book delineates cognitive biases associated with each type of thinking, starting with Kahneman’s own research on loss aversion (the tendency to prefer avoiding losses to acquiring gains). From framing (the tendency to avoid risk when a choice is presented in positive terms and to seek risk when it is presented in negative terms) to attribute substitution (using an educated guess to fill in missing information), the book draws on several decades of academic research to suggest that people place too much confidence in human judgment.
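Loss aversion is often formalized with the prospect theory value function; the Python sketch below uses the parameter estimates Tversky and Kahneman reported in 1992 (alpha = beta = 0.88, lambda = 2.25), under which a loss looms roughly twice as large as an equal gain.

# Prospect theory value function with the 1992 Tversky-Kahneman
# parameter estimates: concave for gains, convex and steeper for losses.
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha          # diminishing sensitivity to gains
    return -lam * ((-x) ** beta)   # losses weighted about 2.25x as heavily

print(value(100))   # ~57.5
print(value(-100))  # ~-129.5: losing $100 hurts more than winning $100 pleases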

read more »

Tags:
June 14, 2013

Attribute Substitution

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic (‘rule of thumb’) attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system.

Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.
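That last claim can be made concrete with a small Python simulation, assuming (as is standard) that an observed score mixes stable skill with transient noise. A judgment that simply ‘matches’ an extreme first score ignores the noise; the statistically correct prediction regresses toward the mean.

import random

# Each observed score = stable skill + transient noise.
random.seed(0)
skills = [random.gauss(100, 10) for _ in range(10000)]
first = [s + random.gauss(0, 10) for s in skills]
second = [s + random.gauss(0, 10) for s in skills]

# People with extreme first scores land closer to average the second time.
top = [i for i in range(10000) if first[i] > 120]
avg_first = sum(first[i] for i in top) / len(top)
avg_second = sum(second[i] for i in top) / len(top)
print(round(avg_first), round(avg_second))  # second average falls back toward 100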

read more »