Thinking, fast and slow

Thinking, fast and slow is a book[1][2] by the eminent psychologist Daniel Kahneman that presents his view of how the mind works. It draws on recent developments in cognitive and social psychology. Kahneman states his aims, in part, as:

I hope to enrich the vocabulary that people use when they talk about the judgments and choices of others, the company's new policies, or a colleague's investment decisions...Many of us spontaneously anticipate how friends and colleagues will evaluate our choices; the quality and content of these anticipated judgments therefore matters...To be a good diagnostician, a physician needs to acquire a large set of labels for diseases, each of which binds an idea of the illness and its symptoms, possible antecedents and causes, possible developments and consequences, and possible interventions to cure or mitigate the illness. Learning medicine consists in part of learning the language of medicine. A deeper understanding of judgments and choices also requires a richer vocabulary than is available in everyday language...So this is my aim...improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves, by providing a richer and more precise language to discuss them. In at least some cases, an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause.

The word 'fast' in the title refers to "system 1" thinking, which operates automatically with little or no effort, and no sense of voluntary control. The word 'slow' refers to "system 2" thinking: mental activities that require concentration, effort, and self-control. We discuss Kahneman's concepts of the two systems of thinking in more detail below.

In large part, the book examines the evidence concerning circumstances under which system 1 supplies false information to system 2. Its style is narrative rather than didactic, but it provides insights that go far beyond the telling of a story. The book includes as an appendix the article "Judgment under Uncertainty: Heuristics and Biases", written with Amos Tversky; for this line of work Kahneman was awarded the Nobel Prize in Economic Sciences in 2002.

Part I. Two systems

Part I presents the basic elements of Daniel Kahneman's two-systems approach to judgement and choice. Its purpose is to introduce a vocabulary for thinking about the mind. Kahneman writes of the two systems of thinking:

When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. I describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2. The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps. I also describe circumstances in which System 2 takes over, overruling the freewheeling impulses and associations of System 1. You will be invited to think of the two systems as agents with their individual abilities, limitations, and functions.

The cognitive effort and self-control of system 2 are shown to draw upon a limited resource of "mental energy" - and even to deplete the blood's glucose. The concept of "cognitive strain" is introduced as a response to effort and unmet demands - the absence of which is termed "cognitive ease". It is shown that cognitive ease is both a cause and a consequence of pleasant feelings.

System 1 is seen as conserving mental energy while maintaining and updating a model of its possessor's personal world by forming associations with regularly occurring events and outcomes. It operates on the assumption that "what you see is all there is" (WYSIATI), constructing the best story it can from the information that is available and making no allowance for the existence of information that it does not have. When information is scarce - which it often is - it acts as a "machine for jumping to conclusions", putting together a coherent story without reservations about the quality and quantity of the information on which it is based. Much of the time, the coherent story that it creates is close enough to reality to provide a reasonable basis for action, but its dependence upon WYSIATI can lead to a wide variety of errors of judgement and choice.

A simple example of system 1 thinking: grasping the meaning of a simple sentence in one's native language. A simple example of system 2 thinking: searching one's memory to identify a surprising sound.

Kahneman divides Part I into nine chapters:

  • The Characters of the Story
    • “This is your System 1 talking. Slow down and let your System 2 take control.”
  • Attention and Effort
    • “What came quickly to my mind was an intuition from System 1. I'll have to start over and search my memory deliberately.”
  • The Lazy Controller
    • “His ego was depleted after a long day of meetings. So he just turned to standard operating procedures instead of thinking through the problem.”
  • The Associative Machine
    • “The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”
  • Cognitive Ease
    • “I'm in a very good mood today, and my System 2 is weaker than usual. I should be extra careful.”
  • Norms, Surprises, and Causes
    • “She can't accept that she was just unlucky; she needs a causal story. She will end up thinking that someone intentionally sabotaged her work.”
  • A Machine for Jumping to Conclusions
    • “They didn't want more information that might spoil their story. WYSIATI.”
  • How Judgments Happen
    • “This was a clear instance of a mental shotgun. He was asked whether he thought the company was financially sound, but he couldn't forget that he likes their product.”
  • Answering an Easier Question
    • “He likes the project, so he thinks its costs are low and its benefits are high. Nice example of the affect heuristic.”

Part II. Heuristics and biases

Part II explores some of the ways in which judgements and choices can be distorted by interactions between system 1 and system 2. The distortions are attributed either to system 2's "laziness" in resorting to an uncritical dependence upon system 1, or to its "ignorance" in being unaware of the shortcomings of system 1.

In the first chapter of Part II, “The Law of Small Numbers” (chapter 10 of the book), Kahneman argues two points. The first is that “we pay more attention to the content of messages than to information about their reliability”, with the result that we often misguide ourselves, sometimes because system 1 thinking gives us an unjustified and exaggerated faith in small numbers. We conclude that the elderly support the president when 60% of 300 elderly people randomly polled say they do, without exerting the cognitive effort to wonder whether multiple similar polls of 300 might often give opposite results, let alone the greater effort of actually computing the risk of error in that conclusion using established statistical methods. The chapter presents a rich trove of examples of how our intuitive statistical thinking misleads us, and offers an evolutionary hypothesis for why our minds work that way. See also:[3]
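
The sampling arithmetic that system 1 skips is not hard. A minimal Python sketch (illustrative, not from the book) using the 60%-of-300 poll above:

  import math, random

  p_hat, n = 0.60, 300                     # observed support, sample size
  se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of a proportion
  print(f"95% CI: {p_hat - 1.96*se:.3f} to {p_hat + 1.96*se:.3f}")
  # -> roughly 0.545 to 0.655: comfortably above 50% when n = 300 ...

  # ... but a genuinely small sample of 30 is another matter:
  random.seed(1)
  trials = 10_000
  flips = sum(
      sum(random.random() < 0.60 for _ in range(30)) <= 15  # majority lost
      for _ in range(trials)
  )
  print(f"With n = 30, a true 60% majority polls at 50% or below "
        f"{flips / trials:.0%} of the time")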

The second is that we tend to apply causal explanations to observations because the facts seem to beg for them, whereas “[m]any facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.” The ‘hot hand’ in professional basketball, on analyses of thousands of sequences of shots, proves to be a cognitive illusion: system 1 jumps to the conclusion, and a lazy system 2 jumps with it.
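
The basketball point lends itself to simulation: a shooter whose makes are independent chance events still produces streaks that look "hot". A sketch (the 50% shooting figure is an assumption for illustration, not a figure from the book):

  import random
  random.seed(42)

  P_MAKE = 0.5                              # assumed constant make probability
  shots = [random.random() < P_MAKE for _ in range(10_000)]

  # Conditional make rate after three consecutive makes, vs. overall rate.
  after_streak = [shots[i] for i in range(3, len(shots))
                  if shots[i-3] and shots[i-2] and shots[i-1]]
  print(f"overall make rate:       {sum(shots)/len(shots):.3f}")
  print(f"make rate after 3 makes: {sum(after_streak)/len(after_streak):.3f}")
  # The two rates agree up to sampling noise: the streaks are real, but the
  # "hot hand" they suggest is a pattern supplied by the observer's system 1.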

The deliberate use of heuristics to get rough-and-ready answers to difficult questions is a well-known system 2 strategy. An example, suggested by the eminent mathematician George Pólya, is that an adequate answer to a difficult question can often be achieved by substituting an easier question [4] (for example, responding to the question "who will win next year's presidential election?" by giving the answer to the question "who has been doing best in this year's polls?"). As an example of the involuntary (system 1) use of a similar strategy, Kahneman cites Paul Slovic's "affect heuristic" [5], in which people let their likes and dislikes determine their beliefs about the world, as well as his own research (with Amos Tversky) on the anchoring and availability heuristics [6].

The concluding chapters of Part II are concerned with the problems of statistical inference. Fresh light is thrown on the finding that, although people can make useful intuitive judgements of such matters as time and distance, their judgements of probability are almost invariably wrong. Nevertheless, such judgements are often accepted with confidence, and system 1 is usually willing to predict rare events from the weakest of evidence.
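
The flavour of these probability errors can be conveyed by the well-known cab problem, one of the puzzles the book discusses (figures as usually stated: 85% of a city's cabs are Green, 15% Blue, and a witness who is 80% reliable reports that the cab in an accident was Blue). Intuition answers with the witness's reliability; Bayes' rule, sketched below, gives a much lower figure:

  # Cab problem: base rates versus witness evidence, via Bayes' rule.
  p_blue, p_green = 0.15, 0.85       # base rates of cab colours
  p_say_blue_if_blue = 0.80          # witness reliability
  p_say_blue_if_green = 0.20         # witness error rate

  posterior = (p_blue * p_say_blue_if_blue) / (
      p_blue * p_say_blue_if_blue + p_green * p_say_blue_if_green)
  print(f"P(cab was Blue | witness says Blue) = {posterior:.2f}")  # about 0.41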

Part III. Overconfidence

Part III is about the illusions that result from the WYSIATI limitation on system 1 thinking. An "illusion of understanding" can result from the acceptance of a plausible "story" about the world that is misleading because it ignores the existence of information that system 1 does not have. For example, the "halo effect"[7] created by survey-based management studies tends to credit managers and their methods with successes that are, in fact, the result of chance.

The illusion of understanding can generate an "illusion of validity" about forecasting ability. (Kahneman recalls an occasion when his confidence in his forecasts of individual cases was not diminished by his acceptance of evidence that his previous forecasts had, on average, been valueless.) Evidence of the overconfidence of experts is provided by Paul Meehl's study of the predictions of clinicians [8], Philip Tetlock's analysis of the forecasts of political experts [9], and the author's own findings concerning the performance of financial advisors.
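
Why annual performance rankings nevertheless feel meaningful can be shown by simulation. The sketch below assumes advisors with no skill at all (pure-luck returns; the distribution parameters are invented for illustration): each year someone tops the table, yet this year's ranks say nothing about next year's.

  import random
  random.seed(7)

  N_ADVISORS, N_YEARS = 25, 8
  # Pure luck: every "advisor" draws returns from the same distribution.
  returns = [[random.gauss(0.05, 0.10) for _ in range(N_YEARS)]
             for _ in range(N_ADVISORS)]

  def ranks(xs):                      # rank positions, 0 = worst
      order = sorted(range(len(xs)), key=lambda i: xs[i])
      r = [0] * len(xs)
      for rank, i in enumerate(order):
          r[i] = rank
      return r

  def corr(xs, ys):                   # Pearson correlation of two lists
      n = len(xs)
      mx, my = sum(xs) / n, sum(ys) / n
      cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
      sx = sum((x - mx) ** 2 for x in xs) ** 0.5
      sy = sum((y - my) ** 2 for y in ys) ** 0.5
      return cov / (sx * sy)

  for year in range(N_YEARS - 1):
      this_yr = ranks([returns[a][year] for a in range(N_ADVISORS)])
      next_yr = ranks([returns[a][year + 1] for a in range(N_ADVISORS)])
      print(f"rank correlation, year {year} -> {year + 1}: "
            f"{corr(this_yr, next_yr):+.2f}")
  # The correlations hover around zero - yet every single year has a "best"
  # performer, whom an observer is tempted to call skilled.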

Kahneman does not dismiss the possibility of accurate intuitive judgement by experts, but concludes that it is possible only when the environment is sufficiently regular to be predictable and the expert has been able to learn the regularities by prolonged practice - as exemplified by Gary Klein's Recognition-Primed Decision (RPD) Model [10]. By way of illustration he refers to the finding that the expertise of chess masters depends upon the ability to recognise thousands of chessboard configurations, acquired by at least 10,000 hours of dedicated practice [11].

Part IV. Choices

Part IV provides a current view, informed by the two-system model, of the key concepts of the "prospect theory" model of choice[6] that the author had developed in collaboration with Amos Tversky. It starts with a chapter headed "Bernoulli's Errors", which demonstrates the inadequacy, as a model of choice, of the axiom known to economists as the law of diminishing marginal utility, by showing that choices are also influenced by the resulting change of utility from a "reference point" (usually the status quo). Their realisation of the importance of the reference point led Kahneman and Tversky to the discovery of the psychological traits of "loss aversion" (the practice of giving greater weight to the risk of losing £100 than to the prospect of gaining £100) and the "endowment effect" (the finding that the possessor of an asset typically requires a higher payment to part with it than he would pay to acquire it). It also led them to expose the unrealism of the economists' axiom-based expected utility model of choice, in which outcomes are weighted according to their probabilities. They identified two circumstances in which the decision weights that people give to outcomes are not the same as their probabilities. The "possibility effect" leads people to give excessive weight to highly unlikely outcomes (people buy lottery tickets for more than their expected value, as calculated by multiplying the prize by the probability of winning). The "certainty effect" is a preference for a certain gain over the high probability of a larger gain, which leads people to undervalue highly likely outcomes.
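
These ideas have a simple quantitative shape. The sketch below uses the functional forms and median parameter estimates from Tversky and Kahneman's later (1992) cumulative prospect theory paper - an assumption here, since this part of the book presents the concepts qualitatively:

  # Prospect-theory value and decision-weight functions (illustrative).
  # ALPHA, LAMBDA, GAMMA are the 1992 Tversky-Kahneman median estimates,
  # used here as an assumption; the book states the ideas qualitatively.
  ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.61

  def value(x):
      # Gains and losses are valued relative to the reference point;
      # loss aversion makes losses loom larger by the factor LAMBDA.
      return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

  def weight(p):
      # Decision weight: overweights small probabilities (possibility
      # effect) and underweights large ones (certainty effect).
      return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

  print(value(100), value(-100))  # losing 100 "hurts" about 2.25x the gain
  print(weight(0.01))             # ~0.055: a 1% chance is felt as ~5.5%
  print(weight(0.99))             # ~0.91: a 99% chance is felt as ~91%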

Part V. Two selves

Part V describes recent research that has introduced a distinction between two selves: the "experiencing self" and the "remembering self". It has shown that system 1 gives more weight to recent experiences than to earlier ones - so that, for example, it remembers a recent pain as being worse than an earlier but more prolonged experience of the same pain. People are consequently apt to take memory-based decisions that are contrary to their own interests. A distinction has also been drawn between "experienced utility" and "decision utility", the former being concerned with the intensity of pain or pleasure that is actually experienced as the result of a choice, and the latter with the intensity with which a person wants the chosen outcome. In the model of economic man that is embodied in conventional economic theory, neither distinction exists and the two concepts coincide. The author uses the term "econs" to distinguish such hypothetical beings from the "humans" of the real world.
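
The memory error described above is often summarised as the "peak-end rule": the remembering self scores an episode by its worst moment and its final moment, largely ignoring duration. A minimal sketch (the rule is from the book's account of the cold-water experiments; the pain scores are invented for illustration):

  # Peak-end rule: remembered pain ~ average of peak and final intensity,
  # with duration largely neglected (invented numbers, for illustration).
  def experienced(episode):   # total pain actually lived through
      return sum(episode)

  def remembered(episode):    # what the remembering self records
      return (max(episode) + episode[-1]) / 2

  short = [8, 8, 8]           # short episode, ends at high intensity
  longer = [8, 8, 8, 5, 3]    # same start, plus a milder tail

  print(experienced(short), experienced(longer))  # 24 vs 32: more total pain
  print(remembered(short), remembered(longer))    # 8.0 vs 5.5: recalled as better
  # The remembering self prefers the longer episode despite the extra pain -
  # so memory-based choices can run against the experiencing self's interest.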

References

  1. Google Books preview. Kahneman D. (2011) Thinking, Fast and Slow. Macmillan. ISBN 9781429969352.
  2. Publisher's webpage for the book. Kahneman D. (2011) Thinking, Fast and Slow. Farrar, Straus and Giroux. ISBN 978-0-374-27563-1.
  3. Tversky A, Kahneman D. (1971) Belief in the law of small numbers. Psychological Bulletin 76(2):105-110.
    • "People have erroneous intuitions about the laws of chance. In particular, they regard a sample randomly drawn from a population as highly representative, that is, similar to the population in all essential characteristics. The prevalence of the belief and its unfortunate consequences for psychological research are illustrated by the responses of professional psychologists to a questionnaire concerning research decisions...The true believer in the law of small numbers commits his multitude of sins against the logic of statistical inference in good faith. The representation hypothesis [the sample represents the population] describes a cognitive or perceptual bias, which operates regardless of motivational factors. Thus, while the hasty rejection of the null hypothesis is gratifying, the rejection of a cherished hypothesis is aggravating, yet the true believer is subject to both. His intuitive expectations are governed by a consistent misperception of the world rather than by opportunistic wishful thinking. Given some editorial prodding, he may be willing to regard his statistical intuitions with proper suspicion and replace impression formation by computation whenever possible.“
  4. George Pólya: How to Solve It: A New Aspect of Mathematical Method, Princeton University Press, 2004
  5. Paul Slovic, Melissa Finucane, Ellen Peters, & Donald G. MacGregor: The Affect Heuristic, 2003
  6. Amos Tversky and Daniel Kahneman: Judgment under Uncertainty: Heuristics and Biases, Science, 1974 (JSTOR); reproduced as Appendix A of Thinking, Fast and Slow
  7. Phil Rosenzweig: The Halo Effect, Free Press, 2007
  8. Paul E. Meehl: Clinical versus Statistical Prediction, Leslie J. Yonce, 2003
  9. Philip E. Tetlock: Expert Political Judgment: How Good Is It? How Can We Know?, Princeton University Press, 2006 (review & contents)
  10. Gary Klein: A Recognition-Primed Decision (RPD) Model of Rapid Decision Making, 1993
  11. For example, Neil Charness et al.: The Role of Deliberate Practice in Chess Expertise, Applied Cognitive Psychology, 2005