Ehud Friedgut: Blissful ignorance and the Kahneman-Tversky paradox


Tversky, Kahneman, and Gili Bar-Hillel (Wikipedia). Taken by Maya Bar-Hillel at Stanford, summer 1979.


The following post was kindly contributed by Ehud Friedgut.

During the past week I’ve been reading, and greatly enjoying, Daniel Kahneman’s brilliant book “Thinking, Fast and Slow”.

One of the most intriguing passages in the book is the description of an experiment designed by Kahneman and Tversky that exemplifies a judgmental flaw exhibited by many people, one that supposedly indicates irrational or inconsistent behavior. I will describe their experiment shortly.
I still remember the first time I heard of this experiment: it was related to me over lunch in Princeton by Noga Alon. Returning to this problem 15 years later, I still made the “inconsistent” choice made by the vast majority of the subjects of the study, just as I had at my initial exposure. In this post I wish to argue that, in fact, there is nothing wrong with this choice.

Before relating their experiment, let me suggest one of my own. Imagine, if you will, that you suffer from gangrene in one of your toes. The doctor informs you that there is a 20% chance that it is “type A” gangrene, in which case you can expect spontaneous healing; a 75% chance that it is type B, in which case you will have to amputate the toe; and a 5% chance that it is type C. In the last case there is a shot you can be given that will save your toe, but it will cost you $2,000.
What would you do? I would probably not take the shot. My guiding principle here is that I hate feeling stupid, and that there’s a pretty good chance that if I take the shot I’ll walk around for the rest of my life not only minus one toe and $2,000, but also feeling foolish for making a desperate shot in the dark.
Now, say I declined the shot, and I return after a week, and the doctor sees that the condition has worsened and that he will have to amputate the toe. He asks whether I wish (at no cost, say) to have the amputated toe sent for a biopsy, to see whether it was type B or C. Here my gut reaction, and I’m sure yours too, is a resounding no. But even when thinking it over more carefully I still think I would prefer not to know. The question is which is better:
Option 1) I have a 75/80 probability of having a clean conscience, and a 5/80 chance of knowing clearly for the rest of my life that I’m lacking a toe because I’m what’s known in Yiddish as an uber-chuchem (smart aleck).
Option 2) Blissful ignorance: for the rest of my life I enjoy the benefit of the doubt, knowing that there’s only a 5/80 chance that the missing toe was caused by my stinginess.
I prefer option 2, and I’m guessing that most people would also choose this option. I’m also guessing that Kahneman and Tversky would not label this as an irrational or even an unreasonable choice. I’m almost positive they wouldn’t claim that both options are equivalent.
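For readers who want to see where the 75/80 and 5/80 come from, here is a quick sketch of the arithmetic (my own addition, not from the original post): we simply condition on the toe being amputated, i.e. on the gangrene not being the self-healing type A.

```python
# Conditioning on amputation (the toe is amputated exactly when the gangrene
# turned out to be type B or type C).
p_A, p_B, p_C = 0.20, 0.75, 0.05
p_amputate = p_B + p_C                 # 0.80
print(p_B / p_amputate)                # 0.9375 = 75/80: "clean conscience"
print(p_C / p_amputate)                # 0.0625 =  5/80: "it was my stinginess"
```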

Now, back to the KT experiment. You are invited to participate in a two-stage game. In the first stage 75% of the participants are eliminated at random. At the second stage, if you make it, you have two choices: a 100% chance of winning $30 or an 80% chance of winning $45. But you have to decide before stage one takes place.
What would you choose?
I’ll tell you what I, and the majority of the subjects of the study, do: we choose the $30. Here’s my reasoning: $30 is pretty nice, I can go for a nice lunch; $45 would upgrade it, sure, but I would feel really bad if I ended up with nothing because I was greedy. Let’s stick to the sure thing.

Now a different experiment: you have to choose between a 20% chance of gaining $45 and a 25% chance of gaining $30.
What do you choose?
Once again, I chose what the majority chose: I would now opt for the $45. My reasoning? 20% sounds pretty close to 25% to me, and the small difference is worthwhile for a 50% gain in the prize.

O.k., I’m sure you all see the paradox. The two games are identical: in both you choose between a 20% chance of $45 and a 25% chance of $30 (in the first game, 25% × 80% = 20% and 25% × 100% = 25%). My reference to “a sure thing” represented a miscomprehension, common to most subjects, who ignored the first stage in the first game. Right?

No, wrong. I think the two games are really different, just as the two options related to the gangrene biopsy were different.
It is perfectly reasonable that when imagining the first game you assume that you are told whether you proceed to the second stage or not, and only if you proceed are you then told, if you chose the 80% option, whether you were lucky.
In contrast, in the second game, it is reasonable to assume that no matter what your choice was, you are just told whether you won or not.
Of course, both games can be generated by the same random process, with the same outcome (choose a random integer between 1 and 100, and observe whether it falls in [1,75], [76,95], or [96,100]), but that doesn’t mean that when you choose the $45 option and lose you always go home with the same feeling. In game 1, if you chose the risky route, you have a 75% probability of losing while knowing that your loss has nothing to do with your choice, and a 5% chance of kicking yourself for being greedy. In game 2 you have an 80% chance of losing, but you enjoy the benefit of the doubt, knowing that there’s only a 5/80 chance that the loss is your fault.
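Here is a minimal simulation sketch of that construction (my own illustration, assuming Python; the interval labels follow the paragraph above). A single draw drives both games, so the payoff distribution for the risky choice is identical, yet the two games report different things to the player.

```python
import random

# One draw of an integer in 1..100 drives both games (for a player who chose
# the risky $45 option): [1,75] = eliminated at stage one, [76,95] = the $45
# gamble pays off, [96,100] = reached stage two but the gamble fails.

def game1_risky(draw):
    # Two-stage game: you first learn whether you survived stage one,
    # and only then how the 80% gamble turned out.
    if draw <= 75:
        return "eliminated at stage one (not my fault)"
    return "won $45" if draw <= 95 else "reached stage two, gambled, and lost"

def game2_risky(draw):
    # One-shot game: you are only told whether you won or lost.
    return "won $45" if 76 <= draw <= 95 else "lost (benefit of the doubt)"

random.seed(1)
for draw in (random.randint(1, 100) for _ in range(5)):
    print(f"{draw:3d} | game 1: {game1_risky(draw):40s} | game 2: {game2_risky(draw)}")
```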

Of course, my imagining of how the games are run is my responsibility; it’s not given explicitly by the original wording, but it certainly is implicit there.
I maintain that there is nothing irrational about trying to avoid feeling regret for your choices, and that I would really stick to the “paradoxical” combination of choices even in real life, after fully analyzing the probability space in question.
For those of you reading this blog who don’t know me, I’m a professor of mathematics, and much of my research has to do with discrete probability. That doesn’t mean that I’m not a fool, but at least it gives me the benefit of the doubt, right?

========================================================

O.k., now, here’s part two of my post – after finishing the book.


I didn’t encounter the notion of blissful ignorance in the book, but, of course, Kahneman is well aware of the notion of trying to avoid regret. However, he finds it, how shall we say? Regrettable.
When addressing a fictitious archetypical character, “Sam”, who is risk averse and therefore makes choices that are suboptimal from the point of view of expected gain, Kahneman offers Sam the following words of wisdom:

I sympathize with your aversion to losing any gamble, but it is costing you a lot of money. Consider the following question:
Are you on your deathbed? Is this the last offer of a small favorable gamble that you will ever consider?

He then goes on to urge Sam to view life as a long series of repeated games (or variants on a single game) rather than a collection of isolated decisions. Clearly, if Sam wholeheartedly adopts this advice he will both have a good chance of becoming wealthier and experience much less regret. (My gangrene example doesn’t fall into this category, because amputating a toe is slightly too dramatic to be classified as “a small gamble”, and also there’s a clear limit on the number of times you can repeat this experiment.)
Can I adopt this attitude when faced with a gamble similar to the original KT test described above?
I’m afraid I might have a hard time doing so. It seems my ego might get in the way – you see, I pride myself so much on being rational and good at analyzing risks that every such test is a memorable event to me, and therefore I find it hard to view it as just one more choice among the thousands I will have to make. Do you see the paradox? The “skilled decision maker” makes a bad decision because he prides himself on his skill.
Perhaps this is why Amos Oz once expressed the sentiment that he would like to see a prime minister who is like an old and experienced grocer (as opposed, I add in my mind, to a brilliant analytical genius with a degree from Stanford in operations research). I think old, experienced grocers might be better at not letting their egos get in the way of decision making.


11 Responses to Ehud Friedgut: Blissful ignorance and the Kahneman-Tversky paradox

  1. aravind says:

    Ehud, my Firefox browser is not parsing your dollar-signs correctly. Can you please change those signs to the word “dollars”? Thanks.

  2. Paul says:

    It isn’t the dollar sign; the original post is OK in Firefox. It is what happens to it when it gets picked up by the blog aggregator. (I think it doesn’t pick up the % properly.)

  3. Sam says:

    Suppose the game was that you can choose the 100% chance of $30 or the 80% chance of $45, but either way you are told what would have happened – in other words, if you choose the $30, you get the money and you are also told whether or not you would have won had you opted for the chance at $45. Then what do you do?

    • Ehud says:

      I’ll refrain from answering that because, to be totally frank, I feel that my current analysis will already be quite far removed from the thought experiment we’re trying to run, since I can’t pretend that I don’t know [private information] that there is a non-negligible probability that a Nobel Laureate will read and judge my answer…

      I will add the following, however. Kahneman tells in his book about a mind-blowing (to my mind, at least) experiment in which people’s evaluation of how happy they were was clearly correlated with their “accidentally” finding a dime (planted by the examiners) on the copy machine before they filled out the questionnaire. How, then, can we expect someone’s equanimity to be retained when
      a) it’s not a dime but $15, and
      b) it’s not blind fate (like finding the dime) but rather a consequence of their choice?
      That’s all fine for ultra rationalists, or Kiplingesque characters (see quote from “If”, below), but what about the rest of us?
      ================================================
      If you can make one heap of all your winnings
      And risk it on one turn of pitch-and-toss,
      And lose, and start again at your beginnings
      And never breathe a word about your loss;

  4. For some references on work related to examples of this kind, as well as additional interesting discussions of “paradoxes” in decision making, consult: http://psych.fullerton.edu/MBIRNbAUM/PSYCH466/articles/New_Paradoxes_PsyRev_2008.pdf

  5. Gil Kalai says:

    Another thing from Ehud Friedgut gmail profile:” Google it: 100-3/(sqrt(x^2+y^2))+sin(sqrt(x^2+y^2))+sqrt(200-(x^2+y^2)+10*sin(x)+10sin(y))/1000, x is from -15 to 15, y is from -15 to 15, z is from 90 to 101 “

  6. Ro'i says:

    Actually, Loomes and Sugden showed that regret is able to rationalize this pattern without resorting to blissful ignorance. See their Regret Theory paper in the Economic Journal from 1982.

    • Ehud says:

      I just went through it, very interesting, and quite elegant. Thanks for the reference.
      Of course, the notion of blissful ignorance is not a mathematical model to explain our behavior, but rather a psychological phenomenon that affects the values of the function M^k_{i,j} that they define.
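      For readers curious about the shape of that function, here is a minimal sketch (my own, not from the post or the paper), assuming the standard presentation of Loomes and Sugden (1982): the modified utility of choosing action i in state j, when the rejected alternative was k, is m^k_{ij} = c_{ij} + R(c_{ij} - c_{kj}), where c is the “choiceless” utility and R is an increasing regret–rejoice function with R(0) = 0. The particular R below is a hypothetical example.

      ```python
      # Sketch of the Loomes-Sugden (1982) pairwise evaluation; the regret-rejoice
      # function R here is a made-up example, not taken from their paper.

      def R(d, alpha=0.3):
          # hypothetical regret-rejoice term: zero at zero, increasing,
          # and amplifying large utility differences more than small ones
          return alpha * d * abs(d)

      def m(c, i, k, j):
          # modified utility of action i in state j, given that action k was forgone
          return c[i][j] + R(c[i][j] - c[k][j])

      def prefers(c, p, i, k):
          # action i is weakly preferred to action k iff
          # sum_j p_j * (m^k_{ij} - m^i_{kj}) >= 0
          return sum(pj * (m(c, i, k, j) - m(c, k, i, j)) for j, pj in enumerate(p)) >= 0
      ```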

  7. You may find the attached (unpublished) paper interesting:

    AMBIGUITY PREFERENCE
    Abstract
    Ambiguity refers to uncertain situations where the probability of the possible events is unknown. Ellsberg (1961) has shown that people, when faced with positive rewards, present aversion to ambiguity. Recent research, however, shows contradictory results. The purpose of this paper is to investigate under which conditions people prefer ambiguity over clarity.
    Similar to Prospect Theory, where people prefer uncertainty over certainty in the negative domain, we found that in the negative domain people prefer ambiguity over clarity. Perhaps ambiguity serves as an excuse or a justification against regret. When people have to make decisions concerning negative outcomes, they would prefer to be less responsible and more ignorant.

    Keywords: Ambiguity, Preferences, Risk

    Introduction
    Uncertainty is a major element in the theory of decision making. Research has shown that the way by which uncertainty is presented to the decision maker influences the decision process itself. Knight (1921) distinguishes between “risk” where the probabilities are measurable and “uncertainty” where they are not. Ellsberg (1961) added to this line the notion of ambiguity, which is a mix of “risk” and “uncertainty.” Eichberger and Kelsey (2007) defined ambiguity as situations where some or all of the relevant information about probabilities is missing. Ellsberg himself (1961, p. 657) defined ambiguity as “a quality depending on the amount, type, reliability, and ‘unanimity’ of information, and giving rise to one’s degree of ‘confidence’ in an estimate of relative likelihoods.”
    Ellsberg (1961, 2001) has shown that people, when faced with positive rewards, present aversion to ambiguity. In contrast to the Subjective Expected Utility (SEU) theory, he shows that people have a preference for objective over subjective bets. People make decisions differently if there is ambiguity about uncertainty compared with the instance when there is no ambiguity, even if the expected risk is the same (Kahn and Sarin, 1988). Similar results were shown by Ganzach (2000) and Zajonc (1968) where, based on the Exposure Effect, people preferred investing in known ventures such as stocks of the NYSE rather than stocks of less known markets.
    Most of the recent experimental studies report that individuals tend to avoid ambiguity (e.g., Sarin and Weber, 1993; Kunreuther et al., 1995; Gonzalez-Vallejo, et al., 1996; Kuhn and Budescu, 1996). On the other hand, recent research shows that, on some occasions, people prefer ignorance to knowledge (Ehrich and Irwin, 2005, Botti and Iyengar, 2006).
    Fox and Tversky (1995) addressed ambiguity aversion, the idea that people do not like ambiguous gambles or choices with ambiguity, with the comparative ignorance framework. Their idea was that people are only ambiguity averse when their attention is specifically brought to the ambiguity by comparing an ambiguous option with an unambiguous option. For instance, people are willing to bet more on choosing a correct colored ball from an urn containing equal proportions of black and red balls than an urn with unknown proportions of balls when evaluating both of these urns at the same time. However, when evaluating them separately, people are willing to bet approximately the same amount on either urn. Thus, when it is possible to compare the ambiguous gamble to an unambiguous gamble, people are averse, but not when one is ignorant of this comparison. Camerer and Weber (1992) reviewed the many studies that have examined how individuals react to probabilistic ambiguity.
    Similar to Prospect Theory where we observe risk aversion in the positive domain and risk seeking in the negative domain, it seems to us that there should be a difference in the attitude toward ambiguity in the positive and the negative domains.
    In the negative domain, however, the results are not consistent. Yassour (1984) replicated Ellsberg’s experiment but extended it to include the negative domain. Whereas in the positive domain significant ambiguity aversion was found, in the negative domain, most subjects preferred the ambiguous options over the non-ambiguous ones.
    On the other hand, ambiguity avoidance for both gains and losses was demonstrated by Keren and Gerritsen (1999) and by Inukai and Takahashi (2009). In the same vein, Camerer and Weber (1992), Cohen, Jaffray and Said (1985), and Einhorn and Hogarth (1985, 1986) report ambiguity aversion for losses, but of a smaller magnitude than for gains.
    Kahn and Sarin (1988) found that “in the gains domain, there is ambiguity seeking at low mean probabilities and ambiguity aversion at high mean probabilities. In the loss domain, a reflection effect occurs with ambiguity aversion at low mean probabilities and ambiguity seeking at high mean probabilities. These results parallel those observed for risk aversion. A possible explanation for the close resemblance between the findings on risk aversion and ambiguity aversion may be that the same psychological factors are responsible for both effects. Therefore, the presence of ambiguity may accentuate the attitude toward risk (aversion or seeking).”
    Karlsson et al. (2009) and Galai and Sade (2006) call ambiguity seeking in the negative domain “the ostrich effect”. They show that people prefer less information and “try to shield themselves from receiving definitive information when they suspect the news may be adverse”.
    In this paper, we intend to examine in an experimental setting the attitudes toward ambiguity in the positive and negative domains using a mixed design of within-subjects and between-subjects.
    In the next section, we describe the experimental setting, then we show the results for each experiment, and we conclude with a general discussion.

    Method
    Participants
    Participants were 329 undergraduate students of business administration and economics from the Ruppin Academic Center (152 female and 167 male). The average age was 33 with a standard deviation of 5.1 (most participants in the research were students in the executive program).
    The experiment was conducted in the classroom during class hours. The subjects agreed to participate in the study upon our request. The students had the option of not participating in the experiment and leaving the classroom at that point.

    Procedure
    Students’ attitude toward ambiguity was measured using a questionnaire with four scenarios (vacation, health, monetary, survival), all framed positively and negatively.
    Each student answered two questions in the positive framing and two questions in the negative framing.
    In each scenario, the subjects were asked to choose between a known/familiar state and an unknown/ambiguous state on a seven-point scale, where 1 represented aversion to ambiguity and 7 represented ambiguity seeking.
    There were two versions of the questionnaire. In the first version, scenarios 1 and 2 were framed in the positive domain and scenarios 3 and 4 in the negative domain, and vice versa in the second version.

    Scenarios 1 and 2 dealt with the difference between known and unknown probabilities, whereas scenarios 3 and 4 dealt with the difference between a familiar and an unfamiliar process.
    Known and unknown probability distribution
    Scenario 1 (vacation/military service)
    Positive framing: Out of 90 students, 30 students will win a prize of a 1-week stay in Europe, all expenses included. The lottery is done by drawing a ball from an urn. The urn contains red, black, and yellow balls. In total, there are 90 balls, out of which 30 are red and the rest (60) are black or yellow. You can choose between getting the prize by drawing a red ball or by drawing a yellow ball. What would you prefer?
    Negative framing: After one month of a difficult reserve military service in a remote, hot, and dusty area, it is necessary to choose, out of 90 soldiers, those who would have to stay for an extra week and those who could go back home. The lottery is done by drawing a ball from an urn. The urn contains red, black, and yellow balls. In total, there are 90 balls, out of which 30 are red and the rest (60) are black or yellow. You can choose between being selected to stay the extra week by drawing a red ball or by drawing a yellow ball. What would you prefer?
    Scenario 2 (health)
    Positive framing: Two young Israelis traveled in Africa. Their families received an alarming message that they are hospitalized in a local hospital. One suffered from Meningitis type A, for which the chance of recuperating is 70%, and the other from Meningitis type B, for which the chance of recuperating is between 50 and 90%. The telephone connection with that country is out of order. Which family feels better?
    Negative framing: Two young Israelis traveled in Africa. Their families received an alarming message that they are hospitalized in a local hospital. One suffered from Meningitis type A, for which the mortality rate is 30%, and the other from Meningitis type B, for which the mortality rate is between 10 and 50%. The telephone connection with that country is out of order. Which family feels better?
    In scenario 1, we extended the original Ellsberg experiment to the negative domain. Like Ellsberg, we expected our subjects to demonstrate aversion to ambiguity and to prefer the option with known probabilities over the one with vague probabilities. The explanation for such behavior can be found in the words of Ellsberg himself: “What is at issue (here) might be called the ambiguity of information, a quality depending on the amount, type, reliability, and ‘unanimity’ of information, and giving rise to one’s degree of ‘confidence’ in an estimate of relative likelihoods.” (Ellsberg, 1961, p. 657).

    Also, in scenario 2, the subjects had to choose between two options, one with known probabilities and the other with less accurate probabilities. Similar to scenario 1, we expected our subjects to demonstrate aversion to ambiguity and to prefer the option with known probabilities over the one with vague probabilities.
    In the negative domain, however, we expected just the opposite results in both scenarios. Kahn and Sarin (1988) found that, in a consumer choice context, subjects were ambiguity-averse in the gain domain and were ambiguity-prone in the loss domain. Similar to the Prospect Theory (Kahneman and Tversky, 1979), we assumed that, when people have to make decisions concerning negative outcomes, they would prefer to be less responsible, more ignorant, and would rely (mostly for an excuse) more on uncertainty.
    Results
    Ambiguity aversion was measured on a 7-point scale where 1 refers to “extreme ambiguity aversion” and 7 to “extreme ambiguity seeking.” As expected, ambiguity aversion was stronger in the vacation question (positive framing of scenario 1) (M = 3.04, SD = 1.61) than in the military reserve question (negative framing of scenario 1) (M = 3.86, SD = 1.65); t(327) = 4.52, p < 0.001; see Table I.
    As expected, ambiguity aversion was stronger in the health question framed positively using the chance of recuperation (M = 4.01, SD = 1.90) than in the negatively framed question using the chance of mortality (M = 4.52, SD = 1.69); t(229) = 2.09, p < 0.02; see Table I.
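    As a rough consistency check (my own sketch, not part of the paper; the per-group sizes are not stated, so a roughly even split of the 329 participants between framings and a Welch-style standard error are assumed), the reported t statistics can be approximated from the summary statistics above:

    ```python
    from math import sqrt

    def approx_t(m1, s1, n1, m2, s2, n2):
        # two-sample t with a Welch-style standard error, from summary statistics
        return (m2 - m1) / sqrt(s1**2 / n1 + s2**2 / n2)

    # Scenario 1 (vacation vs. military reserve), assuming ~165 subjects per framing:
    print(approx_t(3.04, 1.61, 165, 3.86, 1.65, 164))   # ~4.56, close to the reported 4.52
    ```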
    Familiar and an unfamiliar process
    Scenario 3 (monetary)
    Positive framing: An Israeli scholar was awarded a prize by the government of South Korea in the sum of ten million Won, which equals, more or less, the value of 6000 Euros. She will receive the award a year from now, but should decide now in which currency to get the prize, Won or Euro. What would you choose?
    Negative framing: An Israeli traveler was fined while in a trip in South Korea in the sum of ten million Won, which equals, more or less, the value of 6000 Euros. She has to pay the fine within a year but has to decide now in which currency to pay the fine when it is due, Won or Euro. What would you choose?
    Scenario 4 (survival)
    Positive framing: An Israeli traveler and an Italian traveler were captured by a cannibal tribe in Papua New Guinea. The tribe is known for killing 50% of the white people it captures. The procedure of choosing the one to be freed is either by flipping a coin or by blowing feathers to the wind by a local priest. In that "blowing in the wind" procedure, the chances are also 50-50 on the average. If you were the Israeli traveler, in what procedure would you prefer the free person to be determined?
    Negative framing: An Israeli traveler and an Italian traveler were captured by a cannibal tribe in Papua New Guinea. The tribe is known for killing 50% of the white people it captures. The procedure of choosing the one to be killed is either by flipping a coin or by blowing feathers to the wind by a local priest. In that "blowing in the wind" procedure, the chances are also 50-50 on the average. If you were the Israeli traveler, in what procedure would you prefer the victim to be determined?
    According to the Exposure Effect (Zajonc, 1968) people tend to prefer options that they are familiar with. Ganzach (2000) has shown that people preferred investing in known ventures such as stocks of the NYSE rather than stocks of less known markets. Attitude toward ambiguity was also explained by the perceived competence of the decision maker or the salience of limited knowledge (Frisch and Baron, 1988; Heath and Tversky, 1991). The idea is that missing information could consequently lead to greater blame or regret, if the decision proves to be a poor one (Kuhn and Budescu, 1996). In the negative domain, however, ambiguity can serve as an excuse or a justification against regret.
    In scenario 3, in both cases, reward or fine, the subjects had to choose between the won (unknown currency) and the euro (more familiar to Israeli students). We expected that, in the reward (positive) case, the subjects would reveal aversion to ambiguity and would prefer the euro option. On the other hand, we expected that, in the fine (negative) case, the subjects would reveal ambiguity seeking and would prefer the won option.
    In scenario 4, the subjects had to choose between an unknown probability-generating process and a known one. There is no ambiguity in tossing a coin. We expected that, in the "freeing" (positive) case, the subjects would reveal aversion to ambiguity and would prefer the coin option. On the other hand, we expected that, in the "being cooked" (negative) case, the subjects would reveal ambiguity seeking and would prefer the "feather" option.
    Results
    As mentioned above, ambiguity aversion was measured on a 7-point scale where 1 refers to “extreme ambiguity aversion” and 7 to “extreme ambiguity seeking”. As expected, ambiguity aversion was stronger in the reward version of the Korean question (M = 2.49, SD = 1.78) than in the fine version (M = 4.28, SD = 1.85); t(327) = 8.82, p < 0.001; see Table I.
    As expected, ambiguity aversion was stronger in the positively framed (freeing) version of the Papua New Guinea question (M = 4.02, SD = 1.58) than in the negatively framed (killing) version (M = 4.34, SD = 1.59); t(327) = 1.84, p < 0.03; see Table I.
    Summary of results and discussion
    Table I summarizes the differences between subjects with regard to the attitude toward ambiguity in the positive and negative domains.
    Table I: Ambiguity preference for positive and negative scenarios (1 = extreme ambiguity aversion, 7 = extreme ambiguity seeking).

    Scenario                                        | Positive domain (Mean ± SD) | Negative domain (Mean ± SD) | t       | SMD (Cohen's d)
    1) Ellsberg's jug                               | 3.04 ± 1.61                 | 3.86 ± 1.65                 | 4.52 ** | 0.50
    2) Health: 70% or 50-90% / Death: 30% or 10-50% | 4.01 ± 1.90                 | 4.52 ± 1.69                 | 2.09 *  | 0.28
    3) Won or euro                                  | 2.49 ± 1.78                 | 4.28 ± 1.85                 | 8.82 ** | 0.99
    4) Survival: coin or feathers                   | 4.02 ± 1.58                 | 4.34 ± 1.59                 | 1.84 *  | 0.20

    ** significant at the 0.01 level; * significant at the 0.05 level.

    As seen in Table I, across all scenarios, ambiguity preference was significantly higher in the negative domain than in the positive domain. Notice the very large effect size (SMD coefficient) for question 3 and the sizeable one for question 1.
    The results shown in Table I are illustrated in Figure I below.

    Figure I – Ambiguity preference between subjects
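    As a sanity check of the SMD column (again my own sketch, not part of the paper), the reported Cohen's d values can be reproduced from the means and standard deviations in Table I using the usual pooled-SD formula:

    ```python
    from math import sqrt

    def cohens_d(m_pos, sd_pos, m_neg, sd_neg):
        # standardized mean difference with a pooled standard deviation
        pooled_sd = sqrt((sd_pos**2 + sd_neg**2) / 2)
        return (m_neg - m_pos) / pooled_sd

    rows = {
        "1) Ellsberg's jug":             (3.04, 1.61, 3.86, 1.65),
        "2) Health":                     (4.01, 1.90, 4.52, 1.69),
        "3) Won or euro":                (2.49, 1.78, 4.28, 1.85),
        "4) Survival: coin or feathers": (4.02, 1.58, 4.34, 1.59),
    }
    for label, row in rows.items():
        print(label, round(cohens_d(*row), 2))   # ~0.50, 0.28, 0.99, 0.20
    ```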
    As mentioned above, each subject answered four questions concerning the four scenarios; two in the positive domain and two in the negative domain. Thus, there were two versions of the questionnaire. In the first version, scenarios 1 and 2 were framed in the positive domain and scenarios 3 and 4 in the negative domain, and vice versa in the second version.
    In addition to the between-subject analysis described above, we also analyzed the within-subject difference in attitude toward ambiguity between the positive and the negative domains. The within-subject design better controls for individual differences between participants. Just as in the between-subject design, where we expected, and found, differences between the two groups (one answering a positively framed question and the other a negatively framed one), in the within-subject design we expected to find a difference between the positively framed and the negatively framed questions for each subject.
    For each participant, we calculated the average score of the two positively framed questions and of the two negatively framed questions. We then compared these two figures (using a paired-samples t-test) for each version separately.
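    A minimal sketch of that paired comparison (my own illustration, assuming SciPy is available; the data below are made up, not the study's):

    ```python
    from scipy import stats

    # hypothetical per-subject averages over the two positively framed and the
    # two negatively framed questions
    pos_avg = [3.0, 2.5, 4.0, 3.5, 2.0, 3.0]
    neg_avg = [4.0, 3.5, 4.5, 4.0, 3.0, 3.5]

    t, p = stats.ttest_rel(neg_avg, pos_avg)   # paired-samples t-test
    print(t, p)
    ```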
    Notice that scenarios 1 and 2 deal with the difference between known and unknown probabilities and scenarios 3 and 4 deal with the difference between a familiar and an unfamiliar process. In version 1, the known/unknown probability stories (scenarios 1 and 2) were positively framed, and the familiar/unfamiliar stories (scenarios 3 and 4) were negatively framed, and vice versa in version 2.
    Figure II illustrates the attitude toward ambiguity of the subjects in the two versions.

    Figure II – Within-subject ambiguity preference
    As seen in Figure II, and as expected, the subjects exhibited a higher preference for ambiguity in the negatively framed questions than in the positively framed questions. The phenomenon was found in both versions, regardless of the nature of the scenarios, namely, in the known/unknown probability scenarios and in the familiar/unfamiliar scenarios.
    Conclusion

    This study examined people's attitude toward ambiguity. We found significant differences between the positive and the negative domains. The experimental results show ambiguity aversion in the positive domain. These results are consistent with the literature from Ellsberg (1961) onward. However, in the negative domain, our subjects showed ambiguity seeking in all experiments, in contrast with the equivocal findings reported in the research so far.
    In the between-subject analysis, we compared the same questions but in positive versus negative framing, across all subjects. In all scenarios, we found that ambiguity aversion was lower in the negative domain than in the positive domain.
    In the within-subject comparison, we analyzed the four answers of each subject. Again, we found that the subjects demonstrated higher aversion to ambiguity in the positively framed questions. Moreover, our subjects demonstrated ambiguity seeking in the negatively framed questions. These results were independent of the type of ambiguity embedded in the scenarios; i.e., in one version the known/unknown probability stories were positively framed and the familiar/unfamiliar stories were negatively framed, whereas in the other version the known/unknown probability stories were negatively framed and the familiar/unfamiliar stories were positively framed.
    This study shows similarities between the attitude toward risk and that toward ambiguity. Just as Prospect Theory shows a major difference between decisions in the positive and in the negative domain, we found that people show aversion to ambiguity in the positive domain but a preference for ambiguity in the negative domain. A possible explanation is that, in the negative domain, ambiguity can serve as an excuse or a justification against regret. When people have to make decisions concerning negative outcomes, they would prefer to be less responsible, more ignorant, and to rely more on uncertainty.
    Rational thinking assumes that more information is better than less. More and more current research shows that, in many cases, ignorance is bliss (Camerer et al., 1989; Birch, 2005; Ehrich and Irwin, 2005). For example, Yaniv et al. (2004) showed that only about half of young students wished to know whether they were carrying the gene for the lethal and incurable Huntington disease. In a study that we have just begun, participants were asked whether they would like to know the exact date of their death if such a possibility were available. Most subjects preferred to avoid such information, and even more strongly regarding their children and other family members.
    Based on our results, we dare to claim that, when dealing with decisions in a negative environment, people will often prefer knowing less than more.

    References
    Birch, S. A. J. (2005). When knowledge is a curse: Children's and adults' reasoning about mental states. Current Directions in Psychological Science, 14, 25-29.
    Botti, S., & Iyengar, S. S. (2006). The Dark Side of Choice: When Choice Impairs Social Welfare. Journal of Public Policy and Marketing, 25, 24-38.
    Camerer, C., Loewenstein, G., & Weber, M. (1989). The Curse of Knowledge in Economic Settings: An Experimental Analysis. The Journal of Political Economy, 97, 1232-54.
    Camerer, C. F., & Weber, M. (1992). Recent developments in modeling preferences: Uncertainty and ambiguity. Journal of Risk and Uncertainty, 5, 325-70.
    Cohen, M., Jaffray, J. Y., & Said, T. (1985). Individual behavior under risk and under uncertainty: An experimental study. Theory and Decision, 18, 203-228.
    Ehrich, K., & Irwin, J. R. (2005). Willful Ignorance in the request for product information. Journal of Marketing Research, 42, 266-277.
    Eichberger, J., & Kelsey D. (2007). Ambiguity. Discussion Paper Series No. 448, Department of Economics, University of Heidelberg.

    Einhorn, H. J., & Hogarth, R. M. (1985). Ambiguity and uncertainty in probabilistic inference. Psychological Review, 92, 433-461.
    Einhorn, H. J. & Hogarth, R. M. (1986). Decision making under ambiguity. Journal of Business, 59, S225-S250.
    Ellsberg, D. (1961). Risk, ambiguity and the Savage axioms. Quarterly Journal of Economics, 75, 643-699.
    Ellsberg, D. (2001). Risk, ambiguity and decision. Routledge.
    Fox, C. R., & Tversky, A. (1995). Ambiguity aversion and comparative ignorance. The Quarterly Journal of Economics, 110, 585-603.
    Frisch, D., & Baron, J. (1988). Ambiguity and rationality. Journal of Behavioral Decision Making, 1, 149-157.
    Galai, D., & Sade, O. (2006). The 'ostrich effect' and the relationship between the liquidity and the yields of financial assets, The Journal of Business, 79, 2741-2759.
    Ganzach, Y. (2000). Judging risk and return of financial assets. Organizational Behavior and Human Decision Processes, 83, 353-370.
    Gonzalez-Vallejo, C., Bonazzi, A., & Shapiro, A. J. (1996). Effects of vague probabilities and of vague payoffs on preference: A model comparison analysis. Journal of Mathematical Psychology 40, 130-140.
    Heath, C., & Tversky, A. (1991). Preference and belief: Ambiguity and competence in choice under uncertainty. Journal of Risk and Uncertainty, 4, 5-28.
    Inukai, K., & Takahashi, T. (2009). Decision under ambiguity: Effects of sign and magnitude. International Journal of Neuroscience, 119, 1170-1178.

    Kahn, B. E., & Sarin, R. K. (1988). Modeling ambiguity in decisions under uncertainty. The Journal of Consumer Research, 15, 265-272.
    Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263-290.
    Karlsson, N., Loewenstein, G., & Seppi D. (2009). The ostrich effect: Selective attention to information, Journal of Risk and Uncertainty, 38, 95-115.
    Keren, G., & Gerritsen, L. E. M. (1999). On the robustness and possible accounts of ambiguity aversion. Acta Psychologica, 103, 149-172.
    Knight, F. H. (1921). Risk, Uncertainty, and Profit. Houghton Mifflin.
    Kuhn, K. M., & Budescu, D. V. (1996). The relative importance of probabilities, outcomes, and vagueness in hazard risk decisions. Organizational Behavior and Human Decision Processes 68, 301-317.
    Kunreuther, H., Meszaros, J., Hogarth, R. M., & Spranca, M. (1995). Ambiguity and underwriter decision processes. Journal of Economic Behavior and Organization, 26, 337-352.
    Sarin, R. K. & Weber, M. (1993). Effects of ambiguity in market experiments. Management Science, 39, 602-615.
    Tversky, A., & Fox, C. R. (1995). Weighting risk and uncertainty. Psychological Review, 102, 269-283.
    Yaniv, I., Benador D., & Sagi, M. (2004). On not wanting to know and not wanting to inform others: Choices regarding predictive genetic testing. Risk Decision and Policy, 9, 317-336.
    Yassour, J. (1984). Risk, information, and the behavior of stockholders. The Economic Quarterly, 31, 97-99. (In Hebrew)
    Zajonc, R. B. (1968). Attitudinal effects of mere exposure, Journal of Personality and Social Psychology, Monograph Supplement, 9, 1-27.
