POLSCI 240 / PSY 225: Political Psychology
February 5, 2025
We have talked a lot about the nature of opinions and how they may be constructed
Why do we care?
Explore rational theories for how people should update their beliefs about politics
Explore (some) ways in which actual people deviate from these models
In subsequent weeks, we will continue to explore sources of such biases
Three components:
1. Gathering information
2. Updating beliefs based on that information
3. Choosing the option that maximizes expected utility
We have talked about 3. (max expected utility) - what about 1. and 2.?
At least two problems related to optimal info search:
Optimal selection of sources
Optimal stopping rule
Under what conditions can someone change your mind? (i.e., persuasion)
One person (Speaker) sends a Message in favor of policy \(X\) to another person (Receiver)
For persuasion to occur, the Receiver must believe the Speaker is both knowledgeable about \(X\) and trustworthy (i.e., willing to report what they know honestly)
Persuasion is harder than we think, because of speaker incentives to lie
Even if S is believed to be knowledgeable, how can we trust them?
Of course they will say we can trust them, but why should we believe that?
This is the problem of cheap talk
Can cheap talk ever be credible? Lupia and McCubbins say yes! If and when there are “institutions” or background conditions that create incentives for S to be trustworthy (incentive compatibility)
People need some way to determine whom to trust, but this is difficult and ripe for bias
Politicians are typically highly constrained: well-described by left vs right ideology
You might say: “just trust the experts!”
How do you decide when to stop gathering information?
A very hard problem because of the potential for infinite regress
Choosing a stopping point is itself a decision
Making an optimal decision requires maximizing expected utility - which itself requires information, so you face another information-search problem, and so on
It is inevitable people will need to rely on “heuristics” (rules of thumb)
A confidence threshold is the amount of information you need before you are willing to make a decision (or form an opinion, judgment, etc.)
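To make the idea concrete, here is a minimal Python sketch of a confidence-threshold stopping rule; the 95% threshold, the 70%-accurate signals, and the function names are illustrative assumptions, not from the lecture:

```python
import random

def update(prior, p_d_given_x, p_d_given_not_x, saw_d):
    """One Bayesian update of a binary belief about X from a binary datum D."""
    p_d = p_d_given_x if saw_d else 1 - p_d_given_x
    p_d_alt = p_d_given_not_x if saw_d else 1 - p_d_given_not_x
    return prior * p_d / (prior * p_d + (1 - prior) * p_d_alt)

def gather_until_confident(prior=0.5, threshold=0.95, true_x=True):
    """Sample noisy signals about X until belief crosses the confidence
    threshold in either direction, then stop and report."""
    belief = prior
    steps = 0
    while 1 - threshold < belief < threshold:
        steps += 1
        signal = (random.random() < 0.7) == true_x  # signal is right 70% of the time
        belief = update(belief, 0.7, 0.3, signal)
    return steps, belief

steps, belief = gather_until_confident()
print(f"stopped after {steps} signals with belief {belief:.3f}")
```

A higher threshold means gathering more information before forming a judgment; a lower one means faster but less reliable judgments.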
Optimal information gathering is a very hard problem!
Ultimately, we rely heavily on heuristics (rules of thumb), which often carry systematic biases, such as:
Belief rigidity
Mass conformity
Social group polarization
Overconfidence
To help us think it through, we will start with a very simple general situation: a person holds a prior belief about some claim \(X\) and then observes data \(D\)
Theories of rational belief state that people should update using Bayes’ Rule
\[ p(X|D) = p(X) \frac{p(D|X)}{p(D|X)p(X) + p(D|\neg X)p(\neg X)} \]
\(\text{updated belief} = \text{prior belief} \times \frac{\text{prob of data if X is true}}{\text{total prob of data}}\)
A disease occurs in the population at a rate of 1 in 10,000, so the prior probability that a given person has the disease is \(p(X) = 0.0001\)
There is a test for the disease such that:
If a person has the disease, the test is positive 99% of the time: \(p(+|X) = 0.99\)
If a person does not, the test is still (falsely) positive 5% of the time: \(p(+|\neg X) = 0.05\)
The person tests positive (\(D = +\)): what should their posterior belief be that they have the disease (\(p(X|D)\))?
\(p(X|+) = 0.0001 \frac{0.99}{(0.99)(0.0001) + (0.05)(0.9999)}\)
\(= 0.0001 \frac{0.99}{0.050094}\)
\(\approx 0.002 = 1/500\)
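The arithmetic is easy to check by coding the update directly; a minimal Python sketch of the same calculation (variable names are mine):

```python
# Bayes' rule for the disease example
prior = 0.0001            # p(X): base rate of 1 in 10,000
p_pos_given_x = 0.99      # p(+|X): true-positive rate
p_pos_given_not_x = 0.05  # p(+|not X): false-positive rate

p_pos = p_pos_given_x * prior + p_pos_given_not_x * (1 - prior)  # total p(+)
posterior = prior * p_pos_given_x / p_pos

print(f"p(X|+) = {posterior:.4f}, about 1 in {1 / posterior:.0f}")
# p(X|+) = 0.0020, about 1 in 506
```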
Use frequencies instead of probabilities:
1 in 10,000 people have the disease
If you test 10,000 people:
About 1 person has the disease, and they (almost certainly) test positive
About 500 of the 9,999 people without the disease also test positive (5% false-positive rate)
If you test positive, what is the probability you have the disease? About 1 in 501 - roughly 1 in 500
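A small simulation (my illustration, using the same assumed rates) makes the frequency logic concrete:

```python
import random

random.seed(1)
N = 1_000_000  # large simulated population so counts are stable

true_pos = false_pos = 0
for _ in range(N):
    sick = random.random() < 1 / 10_000                    # 1 in 10,000 base rate
    positive = random.random() < (0.99 if sick else 0.05)  # test result
    if positive and sick:
        true_pos += 1
    elif positive:
        false_pos += 1

print(f"positives: {true_pos + false_pos}, of whom actually sick: {true_pos}")
print(f"p(sick | positive) ~ {true_pos / (true_pos + false_pos):.4f}")
```

False positives from the huge healthy group swamp the handful of true positives, which is why the posterior stays near 1 in 500.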
We might be skeptical that people (can or do) perform these calculations, even implicitly
Even if they try, there may be systematic biases relative to optimality
Even if people have priors about \(X\), they usually need to construct the probability of the data - both conditional (\(p(D|X)\)) and unconditional (\(p(D)\))
People are less responsive to new information than Bayes’ rule dictates (“cautious Bayesians”, too conservative)
In Hill (2017) study, people updated in response to information, but only about 75% of what they should have
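One simple way to formalize this under-updating is to move only a fraction of the full Bayesian shift in log-odds space; this Python sketch is an illustration of that general idea under made-up numbers, not Hill's actual model or data:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def bayes_posterior(prior, p_d_given_x, p_d_given_not_x):
    """Full Bayesian posterior on X after observing datum D."""
    return prior * p_d_given_x / (prior * p_d_given_x + (1 - prior) * p_d_given_not_x)

def cautious_posterior(prior, p_d_given_x, p_d_given_not_x, responsiveness=0.75):
    """Move only `responsiveness` of the full Bayesian update in log-odds."""
    full = bayes_posterior(prior, p_d_given_x, p_d_given_not_x)
    shifted = logit(prior) + responsiveness * (logit(full) - logit(prior))
    return inv_logit(shifted)

prior = 0.5
print(bayes_posterior(prior, 0.8, 0.2))     # 0.80: what Bayes' rule dictates
print(cautious_posterior(prior, 0.8, 0.2))  # ~0.74: the cautious Bayesian lands short
```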
Galef suggests we have two modes of confronting new information:
When that information is consistent with our prior opinion: we ask “Can I believe this?” (a lenient standard)
When that information is inconsistent with our prior opinion: we ask “Must I believe this?” (a demanding standard)
People use more rigorous standards for evaluating information inconsistent with their preferred beliefs
Cognitive: problems of information processing and judgment
Motivational: people have preferences over beliefs
Can be personal (Galef -> “emotional”)
Can be social