By Gary Berg-Cross
2002 Nobel Prize-winning psychologist Daniel Kahneman recently delivered the 12th Annual Sackler Lecture (see my earlier blog on this). Some interested parties may have missed the talk, entitled "Thinking that We Know". You can see a review of the talk, as well as a video of the whole thing, at the NYT Review. If you don't have time for that, the following are some notes from the hour-long lecture, which touched on topics from his recent book ("Thinking, Fast and Slow").
Kahneman started off in a commonsense philosophical manner, talking about knowing. To know implies absence of doubt and true belief. But truth is a philosophical concept, and people disagree about what is true. There is scientific truth, which comes from a shared search for agreed and objective truth; this is the central mission of science. But in science not just any belief is part of that conception. It is possible for "true believers" not to accept science as the way to truth. They argue that since belief is central to science, it is therefore just another religion.
We need to recognize this gap in ideas of knowing.
As a psychologist, Kahneman's starting point was a discussion of what we have learned from the laboratory paradigms of reasoning that psychology has devised, especially as they apply to natural and social environments.
Here he discussed the 'dual-process' model of the brain and theories of reasoning: the distinction between two types of reasoning systems, 'System 1' and 'System 2' processes.
System 1 is an older and FAST form of universal cognition shared between animals and humans. It is probably not actually a single system but a set of subsystems that operate with some autonomy. System 1 includes what people call instinctive or intuitive knowing and behaviors. Kahneman and others like to describe System 1 processes as those formed by associative learning (associative memory is often called instinctive). They are probably the kind produced by neural networks. System 1 processes are characterized as rapid, parallel and automatic in nature, and usually only their resulting product becomes conscious in humans.
System 2, in contrast, is a more recent evolutionary development and is often called deliberate. It is slow and sequential in nature and takes effort (cognitive resources). K asks, "What is 13 x 27?" System 2 makes use of the central working memory system. This leads to two different ideas of rationality: we apprehend the world in two radically opposed ways, employing two fundamentally different modes of thought, fast and slow.
Kahneman's earliest study was mentioned in the NAS president's introduction, and Kahneman cites it as evidence of System 2 thinking: pupils dilate when we engage in deliberate thinking. Yes, the eyes are windows into the mind!
While we like to think of ourselves as deliberate thinkers, we are often associative thinkers. To be fast, networks of associations need to be activated. These are not necessarily logical; they provide quick but biased interpretations that are afforded by other associations. This makes for a Blink-style knowing, but often a deceptive one.
Take an association to the word "bank" in "approach the bank". A quick interpretation might select a meaning by frequency of use, so "approach the bank" means going toward a financial institution rather than a river bank. The interpretation can also be primed by a related word: if we hear "fish" along with "bank", then the river sense becomes the more likely association.
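To make the mechanism concrete, here is a minimal sketch of that idea (my toy illustration, not anything Kahneman showed): a default sense chosen by frequency, which priming from recently heard words can override. The sense frequencies, priming links and boost value are all assumed numbers for illustration.

# Toy model (mine, not from the talk): frequency-based sense selection plus priming.
SENSE_FREQUENCY = {"financial institution": 0.8, "river bank": 0.2}   # assumed base rates
PRIMING_LINKS = {"fish": "river bank", "river": "river bank", "money": "financial institution"}

def interpret_bank(recent_words, boost=0.5):
    """Score each sense of 'bank' by frequency, then add a boost for any primed sense."""
    scores = dict(SENSE_FREQUENCY)
    for word in recent_words:
        primed_sense = PRIMING_LINKS.get(word)
        if primed_sense is not None:
            scores[primed_sense] += boost
    return max(scores, key=scores.get)

print(interpret_bank([]))         # 'financial institution' -- frequency wins by default
print(interpret_bank(["fish"]))   # 'river bank' -- priming overrides the more frequent sense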
You'll have to see the video to watch how Kahneman woos the audience with the story about his wife's phrase "sexy man" and what he felt he knew she had said afterwards: "doesn't undress the maid himself."
As studies show, associative memory interprets the present in terms of the past. In effect we produce stories that make sense based on these past associations ("sexy" and "undress" are related). A good story makes associative sense. And this happens in the stock market; it's not deliberate reasoning but sloppy association. Statistical analysis of the performance of fund managers over the longer term shows that investors would do just as well basing financial decisions on a monkey throwing darts at a board. There is a tremendously powerful illusion of expertise that sustains managers in the belief that their results, when good, are the result of skill. Kahneman explains this as a bias, and thus "performance bonuses" are largely awarded for luck, or for stacking the deck, not for real skill at projecting the future.
At this point K started citing studies showing how our interpretations of the likelihood of things are often not logical. What is the overall probability of a flood in California? People say small. But if asked the probability of a flood caused by an earthquake in California, people give a higher probability, even though such an event should be part of, and so no more probable than, the first one. Why the illogic? It's just a better story.
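The illogic can be stated as a one-line inequality (my notation, not a slide from the talk): an earthquake-caused flood is a special case of a flood, so its probability cannot exceed the probability of a flood in general.

P(\text{flood in CA and caused by an earthquake}) \le P(\text{flood in CA})

Judging the more specific scenario as more likely is the conjunction fallacy; the richer story only feels more probable.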
At this point K moved to the topic of what makes a valid argument. Truth and validity get confused, as shown in the example below:
- All roses are flowers
- Some flowers fade quickly
- Therefore some roses fade quickly ... while that seems plausible, it does not follow logically from the premises.
This shows that we reason by association backwards from the conclusion. Correct order is important for valid inference, but not for associations.
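A concrete counterexample makes the invalidity obvious. Here is a minimal sketch (my toy sets, not Kahneman's) in which both premises are true and the conclusion is false:

# Toy sets chosen so that "all roses are flowers" and "some flowers fade quickly"
# both hold, yet no rose fades quickly -- so the conclusion does not follow.
roses = {"rose_a", "rose_b"}
flowers = {"rose_a", "rose_b", "daisy"}
fade_quickly = {"daisy"}

premise_1 = roses <= flowers                 # all roses are flowers: True
premise_2 = bool(flowers & fade_quickly)     # some flowers fade quickly: True
conclusion = bool(roses & fade_quickly)      # some roses fade quickly: False

print(premise_1, premise_2, conclusion)      # True True False -> the argument is invalid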
Kahneman provided some examples of the synergy of associations and how the environment influences what we think. If you hold a pencil between your teeth, forcing your mouth into the shape of a smile, you'll find a cartoon funnier than if you hold the pencil pointing forward by pursing your lips around it in a frown-inducing way, which makes you feel more disgust at the cartoon. K had fun with this story. See also Timothy D. Wilson's book Strangers to Ourselves.
Associative coherence and emotion work differently than logical coherence; emotions have to fit and cohere. This is suggested by Paul Rozin's poison experiment (Rozin et al. 1990). In the experiment, participants are shown two empty bottles that are subsequently filled with sugar. The experimenter then shows the participant two labels, one saying 'Sugar', the other saying 'Sodium Cyanide.' After reading the labels, participants are more hesitant to drink from the bottle with the 'Sodium Cyanide' label, even when it holds nothing but the sugar (in OJ). There is association-based discomfort with it.
And associations with particular people work strongly too. What we associate with a person has a great deal to do with what we believe and how we feel.
Most ideas come from people we like. That is, social belief comes from emotional trust.
K mentioned Amos Tversky's sociocultural theory of attitudes. Social leaders may hold attitudes on certain topics for arbitrary historical reasons, but as likeable leaders they can often influence many attitudes.
Interactions between System 1 and System 2 were a big part of K's talk.
System 2 is used for control and may follow a series of rules. It partially monitors System 1. How it works is shown by the classic math problem: a bat and a ball together cost $1.10. The bat costs a dollar more than the ball. How much does the ball cost?
Your intuitive, associative system may quickly tell you that the ball costs 10 cents. That would be an easy solution, but it would also be incorrect, yet it is the choice of many, even at MIT. Why? We are cognitively lazy. People who delay gratification, as shown by psychological tests, and who have more self-control do better on this type of problem. It reflects more System 2 control.
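For completeness, the worked arithmetic (my notation): let b be the price of the ball in dollars.

b + (b + 1.00) = 1.10 \quad\Rightarrow\quad 2b = 0.10 \quad\Rightarrow\quad b = 0.05

So the ball costs 5 cents and the bat $1.05. The intuitive answer of 10 cents would make the bat $1.10 and the total $1.20.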
Scientists should be big System 2 thinkers. After all, they have to pass hostile reviews to get published.
Kahneman says one of his favorite examples of System 1 thinking is what happens when you hear an upper-class British voice say, "I have large tattoos all down my back." People who speak with an upper-class British accent don't have large tattoos down their backs, so the statement violates our associative knowledge. The brain must be bringing vast world knowledge to bear to register that there is an incongruity here. It happens within three- or four-tenths of a second, and it's the same response you'd get if you heard a male voice say, "I believe I am pregnant."
So association makes us ready to respond, but it comes with rigid expectations. An example was hearing that "Julie reads in year 4."
Then you get asked, "What's her GPA?" Usually the answer given is high.
It's as if we have a distribution for each (GPA and early reading) and map one onto the other. But there are too many intervening events for early reading to accurately predict GPA.
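One way to see why the matched prediction overshoots is the gap between intensity matching and a regressed prediction. The following is a hedged sketch with assumed toy numbers; the correlation, Julie's reading score and the GPA distribution are all my illustrative assumptions, not figures from the talk.

# Intensity matching vs. a statistically regressed prediction (toy numbers assumed).
r = 0.3            # assumed correlation between precocious reading and college GPA
z_reading = 2.0    # Julie's reading precocity, in standard-deviation units

z_gpa_matched = z_reading          # associative matching: keep the same extremity
z_gpa_regressed = r * z_reading    # regression: shrink toward the mean by the correlation

gpa_mean, gpa_sd = 3.0, 0.5        # assumed GPA distribution
print(round(gpa_mean + gpa_sd * z_gpa_matched, 2))    # 4.0 -- the intuitive, too-extreme answer
print(round(gpa_mean + gpa_sd * z_gpa_regressed, 2))  # 3.3 -- a far less extreme prediction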
An example concerns airport insurance. During [the '90s], when there was terrorist activity in Thailand, people were asked how much they'd pay for a travel-insurance policy that pays $100,000 in case of death for any reason. Others were asked how much they'd pay for a policy that pays $100,000 for death in a terrorist act. It turns out that people will pay more for the second, even though it's less likely. Why? It's a policy against a vivid instance of terror versus dying in general.
We pay more for the terror policy since we fear terror more than we fear death in general.
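In probability terms (my note, not Kahneman's slide), death in a terrorist act is a subset of death from any cause, so coverage alone would justify paying no more for the narrower policy:

P(\text{death in a terrorist act}) \le P(\text{death from any cause})

Paying more for the narrower policy is paying for the vividness of the scenario, not for the coverage.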
This suggests that our associative, storytelling System 1 is usually in charge.
We like stories and how they sound. If we hear "woes unite foes", it is a more persuasive aphorism than "Woes unite enemies."
A lesson is to communicate to non-experts in a different way: speak to their story with associative coherence. This is a lesson for getting the climate change story understood. Also, the source of the message has to be liked and trusted.
Global warming is too distant and abstract, so it will take trusted leaders to make the case, and to do it in associative language. Anecdotes are concrete and specific, so they are preferred over facts.