Perception of probability of being right


The probability that people perceive may differ from the real one due to a number of factors, including the form in which the probabilities are presented, their context, and biases (due to misinformation or wishful thinking).

My question: is there any research showing the subjective probability judgement of being correct ($p_\mathrm{perceived}$) as a function of the actual probability of correctness ($p_\mathrm{real}$) for situations where people estimate the chance that they are right in a test/trial (e.g. in a two-alternative forced-choice task)?

I am also interested in mathematical models grounded in experimental data.

You'd probably want to take a look at signal detection theory. This is a set of tools used to analyse how well a person (or an animal, or a machine) can discern trials where a signal is present (e.g. a picture is shown that has been shown before, a tumour is present in an x-ray image, a dot on a radar is an enemy aircraft) from trials where no signal is present (e.g. a picture is shown that hasn't been shown before, there is no tumour in the x-ray image, the dot on the radar is not an enemy aircraft).

Whether a person perceives a signal or not is not always easy to discern. There is always a certain interplay between the hit rate and the false alarm rate (if you always answered "signal", for example, you would have a perfect hit rate but a terrible false alarm rate), and by manipulating a number of criteria (time pressure, instructions, reward structure, et cetera) one can shift an individual's response patterns.
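To make the interplay between hits and false alarms concrete, here is a minimal Python sketch (my own illustration, not taken from any of the cited work) of the standard equal-variance Gaussian SDT indices: sensitivity d′, which measures how well signal and noise trials are separated, and criterion c, which measures response bias.

```python
from statistics import NormalDist

def dprime(hits, misses, false_alarms, correct_rejections):
    """Type-1 sensitivity (d') and criterion (c) from raw trial counts.

    Applies a log-linear correction (add 0.5 to each cell) so that
    perfect hit or false-alarm rates don't produce infinite z-scores.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # standard normal quantile function
    d = z(hit_rate) - z(fa_rate)         # sensitivity: signal/noise separation
    c = -(z(hit_rate) + z(fa_rate)) / 2  # criterion: response bias
    return d, c

# An unbiased observer: 40/50 hits, 10/50 false alarms
d, c = dprime(40, 10, 10, 40)  # d is about 1.64, c is about 0
```

An always-say-"signal" observer would have a perfect hit rate but a matching false-alarm rate, yielding d′ near zero and a strongly negative (liberal) criterion, which is exactly the trade-off described above.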

There have been attempts to create models in which accuracy, speed, and certainty are all taken into consideration; see, for example, Pleskac and Busemeyer (2010).

Unless I'm mistaken, it sounds like you're actually interested in meta-cognition and type-2 signal detection theory (the form of SDT that Speldosa has pointed to is type-1 signal detection theory). It's used to study our ability to reflect on our own knowledge, or what we think about what we think.

This Wikipedia article might get you started on meta-cognition in general.

A type-2 SDT task might involve asking participants to detect a stimulus, as would be the case in a type-1 SDT task, giving hits and false alarms. However, the difference is that, in a type-2 SDT task, after reporting whether they thought the stimulus was present or absent, participants are then asked a second question. This second question asks them how confident they were that they were right in their decision.

With this additional information, type-2 SDT tasks can then probe the extent to which participants can tell the difference between when they have the correct answer and when they have the incorrect answer, which I believe is what you are interested in.
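As an illustrative sketch of how such data might be scored (the function, the 1–4 confidence scale, and the threshold are my assumptions, not from Higham's paper), type-2 hit and false-alarm rates can be computed by treating "high confidence" as the type-2 response:

```python
def type2_rates(trials, threshold=3):
    """Type-2 hit and false-alarm rates from (was_correct, confidence) pairs.

    A 'type-2 hit' is high confidence on a correct trial; a 'type-2 false
    alarm' is high confidence on an incorrect trial.  Assumes a discrete
    confidence scale and at least one correct and one incorrect trial.
    """
    correct = [conf for ok, conf in trials if ok]
    incorrect = [conf for ok, conf in trials if not ok]
    hit2 = sum(conf >= threshold for conf in correct) / len(correct)
    fa2 = sum(conf >= threshold for conf in incorrect) / len(incorrect)
    return hit2, fa2

# Eight trials on a 1-4 confidence scale
trials = [(True, 4), (True, 4), (True, 3), (True, 1),
          (False, 4), (False, 1), (False, 2), (False, 1)]
hit2, fa2 = type2_rates(trials)  # hit2 = 0.75, fa2 = 0.25
```

A participant whose confidence tracks their accuracy well will show a type-2 hit rate far above the type-2 false-alarm rate; if the two rates are equal, confidence carries no information about correctness.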

Here's a sample paper that explores this in detail, albeit related to a different task, but should be enough to get you started:

Higham, P.A. (2007) No Special K! A signal detection framework for the strategic regulation of memory accuracy. Journal of Experimental Psychology: General, 136(1): 1-22.

The news is accidentally warping our perception of reality – and not necessarily for the better.

The bias may also be responsible for the fact that the news is rarely a light-hearted affair. When one website – the City Reporter, based in Russia – decided to report exclusively good news for a day in 2014, they lost two-thirds of their readership. As the science fiction writer Arthur C Clarke put it, the newspapers of Utopia would be terribly dull.

Could this extra dose of negativity be shaping our beliefs?

Scientists have known for decades that the general public tends to have a consistently bleak outlook on their nation’s economic prospects. But in reality, things cannot always be getting worse: the existence of “economic cycles” – fluctuations in the economy between growth and hardship – is one of the cornerstones of modern economics, backed up by decades of research and experience.

People tend to worry about how a crisis will make them feel in the future – and this can lead them to consume more news (Credit: Getty Images)

The view that the future is always worse is plainly wrong. It’s also potentially damaging. If people think they won’t have a job or any money in five years, they aren’t going to invest, and this is harmful for the economy. Taken to the extreme, our collective pessimism could become a self-fulfilling prophecy – and there’s some evidence that the news might be partly responsible.

For example, a 2003 study found that economic news was more often negative than positive – and that this coverage was a significant predictor of people’s expectations. This fits with other research, including a study in the Netherlands which found that reporting about the economy was often out of step with actual economic events – painting a starker picture than the reality. This consistent negativity pulled public perception away from what the actual indicators of economic health would have suggested. More recently, the authors of one paper even went so far as to argue that media coverage amplifies periods of prolonged economic growth or contraction.

Another example is our perception of risk.

Take global tourism. As you might expect, people don’t usually fancy going on holiday where there is political instability, war or a high risk of terrorism. In some cases, the news is a source of direct advice on these matters – conveying government instructions to, say, come home amid a global pandemic. But even when there is no official line to stay away – or rational need to – it might be influencing us through subconscious biases and flaws in our thinking.

The news can shape our views about the safety of foreign countries (Credit: Getty Images)

One way this is thought to happen is through “framing effects”, in which the way something – such as a fact or choice – is presented affects the way you think about it. For example, a drug which is “95% effective” in treating a disease sounds more appealing than one which “fails 5% of the time”. The outcome is the same, but – as a pair of psychologists discovered in the 70s and 80s – we don’t always think rationally.

In one study, when scientists presented participants with news stories containing equivalent, but differently phrased, statements about political instability or terrorist incidents, they were able to manipulate their perception of how risky that country seemed. For example, saying a terrorist attack was caused by “al-Qaeda and associated radical Islamic groups” was considerably more concerning than saying “Domestic rebel separatist group” – though both have the same meaning.

Sometimes, these subtle influences might have life or death consequences.

A 2014 study found that the public generally view cancers which are overrepresented in the news – such as brain cancer – as far more common than they really are, while those which aren’t often discussed – such as male reproductive cancers – are seen as occurring much less frequently than they do. People who consume the most news generally have the most skewed perceptions.

The research, conducted by the health communication expert Jakob Jensen from the University of Utah, along with scientists from across the United States, raises some alarming possibilities. Are people underestimating their own risk of certain cancers, and therefore missing the early warning signs? Previous studies have shown that a person’s ideas about their own risk can influence their behaviour, so the team suggest that this is one possible side-effect.

Intriguingly, the public perception of a cancer’s prevalence is closely mirrored by federal funding for research into its causes and treatment. Jensen and his colleagues suggest that news coverage might be shaping public perception, which, in turn, could be influencing the allocation of government resources. (Although it’s also possible that the public and the media are both reinforcing each other).

The news can lead us to miscalculate risks, such as the probability of developing certain cancers (Credit: Getty Images)

Finally, there’s growing evidence that the news might even infiltrate our dreams.

Amid the current global lockdowns, a large number of people – anecdotally, at least – are reporting dreams which are unusually vivid and frightening. One explanation is that these “pandemic dreams” are the result of our imaginations going wild, as millions of people are largely shut off from the outside world. Another is that we’re remembering our dreams better than we usually would, because we’re anxiously waking up in the middle of REM sleep, the phase in which they occur.

But they could also be down to the way the outbreak is being portrayed by the news. Research has shown that the 9/11 attacks led to significantly more threatening dreams. There was a strong link between the dream changes and exposure to the events on television. “This was not the case for listening to them on the radio, or for talking to friends and relatives about them” says Ruth Propper, a psychologist at Montclair State University, New Jersey, who led the research. “I think what this really shows is that it’s caused by seeing images of death – they’re traumatic.”

News is bad for us

Indeed, it turns out that wallowing in the suffering of seven billion strangers – to paraphrase another science fiction author – isn’t particularly good for our mental health.

After months of nonstop headlines about Covid-19, there are hints of an impending crisis of coronavirus anxiety. Mental health charities across the world are reporting unprecedented levels of demand, while many people are taking “social media holidays”, as they strive to cut their exposure to the news.

When the news makes us stressed, there’s emerging evidence that it can affect our health years later (Credit: Getty Images)

While some of this stress might be down to the new reality we’re all finding ourselves in, psychologists have known for years that the news itself can add an extra dose of toxicity. This is particularly apparent following a crisis. After the 2014 Ebola crisis, the 9/11 attacks, the 2001 anthrax attacks, and the 2008 Sichuan Earthquake, for example, the more news coverage a person was exposed to, the more likely they were to develop symptoms such as stress, anxiety and PTSD.

The impact of news is something of a psychological mystery, because most of it doesn’t actually affect us directly, if at all. And when it does, several studies have found that – as with the Boston Marathon Bombings – the coverage can be worse for our mental health than the reality.

One possible explanation involves “affective forecasting”, which is the attempt to predict how we will feel about something in the future. According to Rebecca Thompson, a psychologist at the University of California, Irvine, most people feel fairly confident in their ability to do this. “Like if you were to imagine winning the lottery tomorrow, you would think you would feel great,” she says.

Oddly, when you ask people how they actually feel after these “life-changing” events, it turns out they often have far less of an impact on our emotions than we expect. A classic 1978 study compared the happiness of those who had recently had their lives transformed by winning the lottery or becoming paralysed. The lottery winners were no happier than the controls and only slightly happier than the accident victims. In short, we really don’t know our future selves as well as we think we do.

The same thing happens during a crisis. Thompson explains that right now many people are likely to be fixated on their future distress. In the meantime, this mistake is steering us towards unhealthy behaviours.

“If you have a really big threat in your life that you're really concerned about, it’s normal to gather as much information about it as possible so that you can understand what's going on,” says Thompson. This leads us into the trap of overloading on news.

The news can sneak into our subconscious and affect the content of our dreams (Credit: Getty Images)

For example, those who thought they were more likely to develop post-traumatic stress after Hurricane Irma made its way across Florida in September 2017, also tended to consume the most news in the run up to it. Ironically, these people did have the worst psychological outcomes in the end – but Thompson thinks this is partly because of the amount of stressful information they were exposed to. She points out that much of the media coverage was heavily sensationalised, with clips of television reporters being buffeted by high winds and rain while emphasising worst-case scenarios.

In fact, news coverage of crises can lead us to catastrophise not only about the crises themselves, but also about everything else in our lives – from our finances to our romantic relationships. A 2012 study found that women – but, mysteriously, not men – who had been primed by reading negative news stories tended to become more stressed by other challenges, leading to a spike in their levels of the stress hormone, cortisol.

“Men normally show quite high levels [of cortisol], so it might be that they just can’t go any higher,” says Marie-France Marin, a psychologist at the University of Quebec in Montreal, who authored the study. However, the women also had better memories for the negative news – suggesting that they really were more affected.

Negative news also has the power to raise a person’s heart rate – and there are worrying signs that it might have more serious implications for our long-term health.

When E. Alison Holman, a psychologist at the University of California, Irvine, and her colleagues looked into the legacy of stress about the 9/11 attacks, they found that those who had reported high levels at the time were 53% more likely to have cardiovascular problems in the three years afterwards – even when factors such as their previous health were taken into account.

In a more recent study, the team investigated if the news itself might be responsible for this – and found that exposure to four or more hours of early 9/11 coverage was linked to a greater likelihood of health problems years later.

“What's especially remarkable about that study is that the majority of people were only exposed to 9/11 through the media,” says Holman. “But they received these lasting effects. And that makes me suspect that there's something else going on and that we need to understand that.”

Just a few hours of news coverage each day can have an impact far beyond what you might expect (Credit: Getty Images)

Why do events that are happening to strangers, sometimes thousands of miles away, affect us so much?

Holman has a few ideas, one of which is that the vivid depictions found in televised media are to blame. She explains that sometimes the news is on in the background while she’s in the gym, and she’ll notice that for the whole time the reporter is telling a story, they’ll have the same images repeating over and over. “You've got this loop of images being brought into your brain, repeat, repeat, repeat, repeat. What we're looking at is not a horror movie that's fake. We're looking at real life things – and I suspect that somehow the repetitiveness is why they have such an impact.”

Holman points out that the news is not – and has never been – just about faithfully reporting one event after another. It’s a form of entertainment that media organisations use to compete for our precious time. Many of these organisations depend on advertising revenue, so they add a sense of drama to hook viewers and keep them watching; the prizes for being the most watched are great. In America, news anchors are major celebrities, sometimes earning tens of millions of dollars a year.

Even when they’re reporting on already-traumatic incidents, news channels often can’t resist adding an extra frisson of tension. After the Boston Marathon bombings, coverage often appeared alongside urgent, sensationalising text such as “new details” and “brand new images of marathon bombs”.

Holman is already looking into how the news coverage of the Covid-19 pandemic is affecting us, though her results haven’t been published yet. “I really wish that I could say ‘I think it will be OK, we’ve got it covered’, but I do think there are going to be some lasting effects for some people,” she says.

Part of the problem, Holman suggests, is that global dramas have never been so accessible to us – today it’s possible to partake in a collective trauma from anywhere in the world, as though it were happening next door. And this is a challenge for our mental health.

So the next time you find yourself checking the headlines for the hundredth time that day, or anxiously scrolling through your social media feed, just remember: the news might be influencing you more than you bargained for.

As an award-winning science site, BBC Future is committed to bringing you evidence-based analysis and myth-busting stories around the new coronavirus. You can read more of our Covid-19 coverage here.


Our behavior is a function not only of our personality, values, and preferences, but also of the situation. We interpret our environment, formulate responses, and act accordingly. Perception may be defined as the process by which individuals detect and interpret environmental stimuli. What makes human perception so interesting is that we do not respond solely to the stimuli in our environment. We go beyond the information present in our environment, pay selective attention to some aspects of it, and ignore other elements that may be immediately apparent to other people. Our perception of the environment is not entirely rational.

For example, have you ever noticed that while glancing at a newspaper or a news website, information that is interesting or important to you jumps out of the page and catches your eye? If you are a sports fan, while scrolling down the pages you may immediately see a news item describing the latest success of your team. If you are the parent of a picky eater, an advice column on toddler feeding may be the first thing you see when looking at the page. So what we see in the environment is a function of what we value, our needs, our fears, and our emotions (Higgins, E. T., & Bargh, J. A. (1987). Social cognition and social perception. Annual Review of Psychology, 38, 369–425; Keltner, D., Ellsworth, P. C., & Edwards, K. (1993). Beyond simple pessimism: Effects of sadness and anger on social perception. Journal of Personality and Social Psychology, 64, 740–752).

In fact, what we see in the environment may be objectively, flat-out wrong because of our personality, values, or emotions. For example, one experiment showed that when people who were afraid of spiders were shown spiders, they inaccurately thought that the spider was moving toward them (Riskind, J. H., Moore, R., & Bowley, L. (1995). The looming of spiders: The fearful perceptual distortion of movement and menace. Behaviour Research and Therapy, 33, 171).

In this section, we will describe some common tendencies we engage in when perceiving objects or other people, and the consequences of such perceptions. Our coverage of biases and tendencies in perception is not exhaustive – there are many other biases and tendencies in social perception.


The ability to anticipate is a hallmark of cognition. Inferences about what will occur in the future are critical to decision making, enabling us to prepare our actions so as to avoid harm and gain reward. Given the importance of these future projections, one might expect the brain to possess accurate, unbiased foresight. Humans, however, exhibit a pervasive and surprising bias: when it comes to predicting what will happen to us tomorrow, next week, or fifty years from now, we overestimate the likelihood of positive events, and underestimate the likelihood of negative events. For example, we underrate our chances of getting divorced, being in a car accident, or suffering from cancer. We also expect to live longer than objective measures would warrant, overestimate our success in the job market, and believe that our children will be especially talented. This phenomenon is known as the optimism bias, and it is one of the most consistent, prevalent, and robust biases documented in psychology and behavioral economics.

Many people who enter the lottery already recognize that their odds are slim. In fact, that may be the reason why they enter in the first place.

Mark Reinecke, the chief of psychology at Northwestern, told Business Insider in a statement that people tend to be more frustrated when their expectations are disrupted, like when there's an uncharacteristically long line at the 7-11.

Robinson agrees, saying that "the average person is OK with throwing away a couple dollars for the chance at something that matters. When put into the person's relative day, it feels trivial."

Critical Thinking Question

The central tenet of Gestalt psychology is that the whole is different from the sum of its parts. What does this mean in the context of perception?

This means that perception cannot be understood completely simply by combining the parts. Rather, the relationship that exists among those parts (which would be established according to the principles described in this chapter) is important in organizing and interpreting sensory information into a perceptual set.

Take a look at the following figure. How might you influence whether people see a duck or a rabbit?

Playing on their expectations could be used to influence what they were most likely to see. For instance, telling a story about Peter Rabbit and then presenting this image would bias perception along rabbit lines.

A new psych paper perfectly explains why being extremely online makes you cynical

By Keith A. Spencer
Published January 26, 2020 10:00AM (EST)



Those of us who spend too much time on social media are familiar with how profoundly our online interactions differ from real ones. Face-to-face with another human, I cannot say I have ever been issued a death threat, but through the online veil of pseudonymity, I have — like most people who work in journalism — received plenty.

In short, there is a selection bias at work online, which both skews our perception of who is on social media and drives our perception of our fellow humans online as disrespectful. Now, a new psychology paper on cynicism and disrespect reveals that cynicism is a self-perpetuating vortex that only makes people even more cynical about human behavior.

While the paper has implications for how people perceive the world after spending too much time among people who are disrespectful, it also perfectly describes the depressive, cynical vortex that plagues many people who spend too much time online.

The paper, titled "Victims, Perpetrators or Both? The Vicious Cycle of Disrespect and Cynical Beliefs about Human Nature," was published in the Journal of Experimental Psychology this month and written by Professors Olga Stavrova, Daniel Ehlebracht, and Kathleen D. Vohs. The researchers conducted six different studies on cynicism — how it arises in people, how it perpetuates, and how people who hold cynical views are far more likely to be "treated disrespectfully." They define cynicism as "the tendency to expect that others will engage in exploitation and deception, based on the perspective that people, at their core, are morally bankrupt and behave treacherously to maximize their self-interest."

"Cynical views are not only an unflattering portrayal of humanity, they are associated with undesirable consequences for those who hold them," the researchers write, noting that cynical views predict "worse physical and psychological health," undermined performance, financial strife, and increased odds of premature death. "We tested how cynicism emerges and what maintains it," the authors explain of their studies.

These studies included one on the relationship between cynicism and "the experience of disrespect," which were "positively and significantly related" in 28 of the 29 countries' populations studied, the authors write. "These findings suggest that disrespect may be a route to cynicism beyond and apart from other negative social experiences, such as feeling that one lacks support from others." They also note that disrespect was a strong predictor of cynicism.

Curiously, the paper noted that those who were already quite cynical about human nature would be apt to grow even more cynical over time. And if my experience is typical, nothing makes one cynical about humanity as much as spending a lot of time online.

Indeed, one of the most intriguing facets of all these studies within this one research paper is that, taken together, they describe everyday experiences of those of us who spend a lot of time on pseudonymous social media sites like Twitter, Facebook, or news comments sections. Virtually everyone who tries to participate in these kinds of forums has experienced disrespect, self included. And those who are extremely online probably feel the most cynical.

This contributes to a growing body of literature that seems to hint that spending a lot of time online might make us feel shitty. There is a named phenomenon known as the online disinhibition effect, which theorizes that the cloak of online anonymity and distance from one's peers is what leads to cruel and dehumanizing behavior online; in short, it's why many people may call you stupid online, but you (hopefully) rarely experience that in real life. One wonders if the cycle of cynicism and disrespect could be linked to the online disinhibition effect, too.

This paper also suggests a psychological basis for irony poisoning, an online disposition in which one loses the ability to discern when someone is being facetious or not (or loses the ability to discern if oneself is being facetious). Cynicism, increasingly embedded by the constant torment of online disrespect, could probably breed such a condition.

But the paper also, intriguingly, has implications for economics and its understanding of "human nature." Since the 1960s or so, economics has been dominated by the Austrian and Chicago school economists, who justified the transition to an undemocratic, free market capitalist economy by arguing that humans were innately selfish. That worldview is innately right-wing, as it precludes the idea that humans could possibly ever have good intentions towards one another, and thus precludes the idea that the social welfare state is a good idea at all. It is alarming to think that those who hold cynical views about human nature (like Chicago School economists) might end up stuck in this self-perpetuating cycle and doubly disappointing to think of the cruel fount of disrespect that one experiences online as potentially driving people towards right-wing worldviews.

The authors of the paper do note this political dimension to cynicism. "Cynicism has been blamed for the rise of far right political parties in Europe and as a driving force behind the U.K.'s decision to exit the European Union," they write.

As far as the online dimension to cynicism vis-a-vis being disrespected online, I do think this, at least, is something that is fixable — either through better forum moderation or through the cultivation of online social media platforms that aren't built around the attention economy, in which social media giants cruelly profit off of trolling and cyberbullying.


Keith A. Spencer is a senior editor for Salon. He manages Salon's science, tech, economy and health coverage. His book, "A People's History of Silicon Valley: How the Tech Industry Exploits Workers, Erodes Privacy and Undermines Democracy," was released in 2018. Follow him on Twitter at @keithspencer, or on Facebook here.



Sensation occurs when sensory receptors detect sensory stimuli. Perception involves the organization, interpretation, and conscious experience of those sensations. All sensory systems have both absolute and difference thresholds, which refer to the minimum amount of stimulus energy or the minimum amount of difference in stimulus energy required to be detected about 50% of the time, respectively. Sensory adaptation, selective attention, and signal detection theory can help explain what is perceived and what is not. In addition, our perceptions are affected by a number of factors, including beliefs, values, prejudices, culture, and life experiences.
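The "detected about 50% of the time" definition of a threshold can be made concrete with a psychometric function. A common choice is the logistic; in this sketch (the function name, units, and parameters are illustrative assumptions), the absolute threshold is, by construction, the intensity at which detection probability equals 0.5:

```python
import math

def detection_probability(intensity, threshold, slope=1.0):
    """Logistic psychometric function: P(detect) as a function of stimulus
    intensity.  By construction P = 0.5 when intensity == threshold,
    matching the conventional definition of the absolute threshold."""
    return 1.0 / (1.0 + math.exp(-slope * (intensity - threshold)))

# With an absolute threshold of 10 (arbitrary units):
p_at_threshold = detection_probability(10, 10)  # exactly 0.5
p_strong = detection_probability(14, 10)        # well above threshold
p_weak = detection_probability(6, 10)           # rarely detected
```

A difference threshold can be treated the same way, with `intensity` replaced by the difference between two stimuli; the `slope` parameter then captures how sharply detection improves around the threshold.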

Why Our Brains Constantly Create New Threats

Why do many problems in life seem to stubbornly stick around, no matter how hard people work to fix them? It turns out that a quirk in the way human brains process information means that when something becomes rare, we sometimes see it in more places than ever.


Think of a “neighborhood watch” made up of volunteers who call the police when they see anything suspicious. Imagine a new volunteer who joins the watch to help lower crime in the area. When they first start volunteering, they raise the alarm when they see signs of serious crimes, like assault or burglary.

Let’s assume these efforts help and, over time, assaults and burglaries become rarer in the neighborhood. What would the volunteer do next? One possibility is that they would relax and stop calling the police. After all, the serious crimes they used to worry about are a thing of the past.

But you may share the intuition my research group had – that many volunteers in this situation wouldn’t relax just because crime went down. Instead, they’d start calling things “suspicious” that they would never have cared about back when crime was high, like jaywalking or loitering at night.

You can probably think of many similar situations in which problems never seem to go away, because people keep changing how they define them. This is sometimes called “concept creep,” or “moving the goalposts,” and it can be a frustrating experience. How can you know if you’re making progress solving a problem, when you keep redefining what it means to solve it? My colleagues and I wanted to understand when this kind of behavior happens, why, and if it can be prevented.

To study how concepts change when they become less common, we brought volunteers into our laboratory and gave them a simple task – to look at a series of computer-generated faces and decide which ones seem “threatening.” The faces had been carefully designed by researchers to range from very intimidating to very harmless.

As we showed people fewer and fewer threatening faces over time, we found that they expanded their definition of “threatening” to include a wider range of faces. In other words, when they ran out of threatening faces to find, they started calling faces threatening that they used to call harmless. Rather than being a consistent category, what people considered “threats” depended on how many threats they had seen lately.

This kind of inconsistency isn’t limited to judgments about threat. In another experiment, we asked people to make an even simpler decision: whether colored dots on a screen were blue or purple.

As blue dots became rare, people started calling slightly purple dots blue. They even did this when we told them blue dots were going to become rare, or offered them cash prizes to stay consistent over time. These results suggest that this behavior isn’t entirely under conscious control – otherwise, people would have been able to be consistent to earn a cash prize.

After looking at the results of our experiments on facial threat and color judgments, our research group wondered if maybe this was just a funny property of the visual system. Would this kind of concept change also happen with non-visual judgments?

To test this, we ran a final experiment in which we asked volunteers to read about different scientific studies, and decide which were ethical and which were unethical. We were skeptical that we would find the same inconsistencies in these kinds of judgments that we did with colors and threat.

Why? Because moral judgments, we suspected, would be more consistent across time than other kinds of judgments. After all, if you think violence is wrong today, you should still think it is wrong tomorrow, regardless of how much or how little violence you see that day.

But surprisingly, we found the same pattern. As we showed people fewer and fewer unethical studies over time, they started calling a wider range of studies unethical. In other words, simply because they were reading about fewer unethical studies, they became harsher judges of what counted as ethical.

Why can’t people help but expand what they call threatening when threats become rare? Research from cognitive psychology and neuroscience suggests that this kind of behavior is a consequence of the basic way that our brains process information – we are constantly comparing what is in front of us to its recent context.

Instead of carefully deciding how threatening a face is compared to all other faces, the brain can just store how threatening it is compared to other faces it has seen recently, or compare it to some average of recently seen faces, or the most and least threatening faces it has seen. This kind of comparison could lead directly to the pattern my research group saw in our experiments, because when threatening faces are rare, new faces would be judged relative to mostly harmless faces. In a sea of mild faces, even slightly threatening faces might seem scary.
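The relative rule described above can be sketched in a few lines of code. This is a minimal illustration of the general idea, not the researchers’ actual model: a judge labels a stimulus “threatening” whenever it exceeds the average intensity of recently seen stimuli. The intensity ranges, window size, and prevalence values are all illustrative assumptions.

```python
import random

def run_block(threat_prevalence, n_trials=500, window=50, seed=0):
    """Simulate a judge whose 'threatening' threshold is the mean of
    recently seen stimuli (a relative rule, not an absolute one).
    Returns the mildest stimulus the judge still called threatening."""
    rng = random.Random(seed)
    recent = []                  # sliding window of recent stimulus intensities
    labeled_threatening = []
    for _ in range(n_trials):
        # Draw a stimulus: high-intensity with probability `threat_prevalence`.
        if rng.random() < threat_prevalence:
            stimulus = rng.uniform(0.6, 1.0)   # objectively threatening
        else:
            stimulus = rng.uniform(0.0, 0.4)   # objectively mild
        # The category boundary is just the recent average (0.5 before any data).
        threshold = sum(recent) / len(recent) if recent else 0.5
        if stimulus > threshold:
            labeled_threatening.append(stimulus)
        recent.append(stimulus)
        recent = recent[-window:]
    return min(labeled_threatening)

common = run_block(threat_prevalence=0.5)
rare = run_block(threat_prevalence=0.05)
# When threats become rare, the recent average falls, so the boundary
# drifts downward and objectively mild stimuli start getting labeled
# threatening: `rare` ends up well below `common`.
```

Nothing in the rule explicitly “decides” to move the goalposts; the drift falls out of anchoring the threshold to recent experience.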

It turns out that for your brain, relative comparisons often use less energy than absolute measurements. To get a sense for why this is, just think about how it’s easier to remember which of your cousins is the tallest than exactly how tall each cousin is. Human brains have likely evolved to use relative comparisons in many situations, because these comparisons often provide enough information to safely navigate our environments and make decisions, all while expending as little effort as possible.

Sometimes, relative judgments work just fine. If you are looking for a fancy restaurant, what you count as “fancy” in Paris, Texas, should be different than in Paris, France.

But a neighborhood watcher who makes relative judgments will keep expanding their concept of “crime” to include milder and milder transgressions, long after serious crimes have become rare. As a result, they may never fully appreciate their success in helping to reduce the problem they are worried about. From medical diagnoses to financial investments, modern humans have to make many complicated judgments where being consistent matters.

How can people make more consistent decisions when necessary? My research group is currently doing follow-up research in the lab to develop more effective interventions to help counter the strange consequences of relative judgment.

One potential strategy: When you’re making decisions where consistency is important, define your categories as clearly as you can. So if you do join a neighborhood watch, think about writing down a list of what kinds of transgressions to worry about when you start. Otherwise, before you know it, you may find yourself calling the cops on dogs being walked without leashes.

David Levari is a postdoctoral researcher in psychology at Harvard University.

This article was originally published on The Conversation.

Individual Level

Though there is relatively little direct exploration of the explicit use of probabilities in decision making for individual action choices, it is well known that performers make informal “calculations” and use probabilities based on the situation. In competition, athletes implicitly assign weightings, or probabilities, to the likelihood of an event occurring and use this information to guide their decisions. These implicit, subjective probabilities influence the ability to respond. For instance, a tennis player’s position on the court can influence the type of shot he or she is able to play. Knowing this, a defender can assign probabilities to each possible shot to anticipate the opponent’s actions. If a defender thinks the probability of a particular shot is high, the defender will show better anticipation of this shot. A shot the defender thinks is less probable will not be as well anticipated if the opponent makes this choice. The attacking player can exploit this information by playing a low-probability shot when another outcome is highly probable and thus catch the defender ill-prepared or off guard.

The effective use of probabilities can also increase performance based on availability of specific situational information. This has been explored by manipulating the situational information accompanying video clips when performers are asked to decide on their next action. For example, a baseball pitcher is more likely to try to throw a strike pitch rather than a pitch outside the strike zone when the pitch count contains three balls than when it contains none. This situational information allows a batter to make an informed decision regarding the pitch location and, by reducing uncertainty, make a quicker decision about whether or not to swing.

The use of probabilities also distinguishes between more and less skilled performers. The use of verbal reports, in which athletes report their thoughts during play, has shown that elite athletes, compared with less skilled athletes, are able to create more detailed and sophisticated event profiles using past and current information to “diagnose” what is currently taking place, as well as predict and anticipate future play. This probabilistic information can be used to effectively manage energy expenditure during the course of an event. In squash, for instance, the probability of winning a game is based on the probability of winning a rally and the relative importance of each rally. Athletes naturally use this information to regulate energy expended on a particular point and energy trade-offs in the context of competition as a whole and its likely outcome.
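The link between rally-win and game-win probability can be made concrete with a short Monte Carlo sketch. The scoring rule here (point-a-rally to 11, win by two) is an illustrative assumption, not taken from the text, and the rally-win probability is treated as constant across rallies:

```python
import random

def game_win_prob(p_rally, n_games=20000, seed=1):
    """Monte Carlo estimate of the probability of winning a game,
    given a fixed per-rally win probability, assuming point-a-rally
    scoring to 11, win by two (an illustrative assumption)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_games):
        a = b = 0
        # Play rallies until one side has at least 11 points and leads by 2.
        while not ((a >= 11 or b >= 11) and abs(a - b) >= 2):
            if rng.random() < p_rally:
                a += 1
            else:
                b += 1
        if a > b:
            wins += 1
    return wins / n_games

# A small per-rally edge compounds over a game: with p_rally = 0.55,
# the estimated game-win probability comes out well above 0.55.
```

This compounding is one reason a small, consistently applied advantage (or a well-timed burst of effort on important rallies) can matter far more at the game level than the per-rally numbers suggest.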

Relying on probabilities is not always beneficial, however. Athletes who believe in the “hot hand” in basketball, for instance, think there is a greater probability of a player making a successful shot after they have made one or two previous shots. Teammates are thus more likely to allocate or pass the ball to a “hot” player even though the evidence to support the hot-hand belief is equivocal. Sports officials are also prone to using situational probabilities when making decisions: they are more likely to penalize teams in an alternating fashion rather than penalizing the same team twice in a row. Cognitive biases and decision errors such as these illustrate the role of probabilities in sport at a more tactical level.