this post was submitted on 24 Jul 2023
4 points (100.0% liked)

So - Wikipedia says:

A person is given two indistinguishable envelopes, each of which contains a sum of money. One envelope contains twice as much as the other. The person may pick one envelope and keep whatever amount it contains. They pick one envelope at random but before they open it they are given the chance to take the other envelope instead.

Now suppose the person reasons as follows:

  1. Denote by A the amount in the player's selected envelope.
  2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.
  3. The other envelope may contain either 2A or A/2.
  4. If A is the smaller amount, then the other envelope contains 2A.
  5. If A is the larger amount, then the other envelope contains A/2.
  6. Thus the other envelope contains 2A with probability 1/2 and A/2 with probability 1/2.
  7. So the expected value of the money in the other envelope is (1/2)(2A) + (1/2)(A/2) = (5/4)A.
  8. This is greater than A so, on average, the person reasons that they stand to gain by swapping.
  9. After the switch, denote that content by B and reason in exactly the same manner as above.
  10. The person concludes that the most rational thing to do is to swap back again.
  11. The person will thus end up swapping envelopes indefinitely.
  12. As it is more rational to just open an envelope than to swap indefinitely, the player arrives at a contradiction.

To me it seems clear that the flaw in the reasoning is at step #2. Once you know how much money is in the envelope you've opened, it's no longer true that the probability that you got the smaller envelope is 50%.

Put it this way: If you knew that A was generated by sampling a particular probability distribution, then you could do the math and determine the probability that you got the smaller envelope, given the amount of money you found when you opened it. E.g. if the number of dollars in the small envelope was drawn from a Poisson distribution with μ=5, and you found $10 when you opened your envelope, then you could find the probability that you had the smaller amount of money using Bayes's theorem:

  • P(small amount is $5), prior to opening anything: e^-5 * 5^5 / 5! ≈ 0.175
  • P(small amount is $10), prior to opening anything: e^-5 * 5^10 / 10! ≈ 0.018
  • P(small amount is $5 | you found $10) ≈ 0.175 / (0.175 + 0.018) ≈ 91%

So, before you opened the envelope, it was 50/50 as stated. Once you open the envelope and find $10, you shouldn't switch, since it's much more likely that the small amount was $5 than $10, so you probably have the higher amount.
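
Here's a quick Python sketch of that Bayes calculation, just to check the arithmetic (the Poisson(5) prior and the $10 observation are the assumptions from the example above, nothing more):

  # Sketch of the Bayes update, assuming the smaller amount (in dollars)
  # is drawn from a Poisson distribution with mean 5, and you found $10.
  from math import exp, factorial

  def poisson_pmf(k, mu=5):
      # Probability that a Poisson(mu) draw equals k
      return exp(-mu) * mu**k / factorial(k)

  # Seeing $10 means either the smaller amount is $5 (you hold the larger
  # envelope) or the smaller amount is $10 (you hold the smaller one).
  # Each envelope was picked with probability 1/2, so that factor cancels.
  p5 = poisson_pmf(5)    # prior weight on "smaller amount is $5",  ~0.175
  p10 = poisson_pmf(10)  # prior weight on "smaller amount is $10", ~0.018

  print(p5 / (p5 + p10))  # ~0.91: you probably have the larger envelope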

In the actual paradox presentation, you don't actually know anything about the process that generated the amounts of money in the envelopes, except the amount of money you found -- but that specific amount of money is information you're gaining about the process. If you were to find $5, that gives you a very different model of the process than if you found $5 million. The assertion that underlies step #2 -- that after you've opened one envelope, you still have 0 knowledge about the process -- is clearly untrue from a common-sense perspective, and it's also mathematically unsound.

In formal terms, you can see that it's unsound by following the implications of the assertion: even after you've opened one envelope, the process that generated the money in the envelopes had an equal chance of generating 2A as A. If you'd found 2A dollars in your envelope, you would assert that it was equally likely to have generated 4A as 2A, and by induction on up through all powers of 2 -- so you're saying that the process had an equal chance of generating any one of an infinite number of discrete values. That's not possible: there is no uniform distribution over an infinite set of discrete values. So, by contradiction, the probabilities of A and 2A are no longer equal once you've opened an envelope and found a specific amount of money.
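
To make that concrete, here's a little simulation sketch. The uniform $1 to $100 range for the smaller amount is an arbitrary assumption (any proper distribution makes the same point): once a real distribution generates the amounts, blindly always switching gains nothing on average, so "the other envelope is worth 5A/4" can't hold for every observed A.

  # Simulation sketch: with a proper distribution for the smaller amount
  # (here, arbitrarily, uniform on $1..$100), always switching is no better
  # on average than always keeping.
  import random

  def average_payout(always_switch, trials=200_000):
      total = 0
      for _ in range(trials):
          small = random.randint(1, 100)      # smaller amount this game
          envelopes = [small, 2 * small]
          pick = random.randrange(2)          # which envelope you grabbed
          if always_switch:
              pick = 1 - pick                 # swap to the other envelope
          total += envelopes[pick]
      return total / trials

  print(average_payout(False))  # ~75.75 (always keep)
  print(average_payout(True))   # ~75.75 (always switch: no unconditional gain)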

No? Wikipedia has this to say about explanations for the paradox:

No proposed solution is widely accepted as definitive. Despite this, it is common for authors to claim that the solution to the problem is easy, even elementary.

I wouldn't call my line of reasoning totally elementary, but it doesn't seem earth-shattering to me, or like grounds for a big controversy. No? Am I missing something?

(I tried to read the citations on Wikipedia to see if someone else was arguing this same thing, or giving a counterargument to it, but the citations were more confusing than enlightening to me.)

top 11 comments
[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Doesn't the actual problem say that you don't open the envelope until you've made your definitive decision?

Edit: in case this is about finding out whether your envelope was the lucky one or not: you get zero information out of opening your envelope. The other one still has either 2A or A/2, each with probability 1/2.

[–] mo_ztt 1 points 1 year ago (1 children)

So, my assertion is specifically that you do get information out of opening your envelope. Flip it around -- if you're going to assert that you get zero information out of opening your envelope, what's your argument for asserting that? Does your argument also apply if you know that there's a specified probability distribution that the smaller amount is being drawn from in order to set up the scenario?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

But you know nothing about the distribution of different money amounts. By adding that, and by opening your envelope, you add additional conditions to the problem. With these additional conditions you know more about the probability that you have chosen the lucky envelope.

You can also add another condition: in the losing envelope there is a paper with "LOSER!" written on it. Now you know with 100% probability whether your envelope is the lucky one. But you altered the problem by adding a condition.

[–] mo_ztt 1 points 1 year ago (1 children)

There are two separate questions here:

  1. If you know the distribution from which the amounts of money were selected, does opening the envelope impact the probability that you have the smaller amount of money?
  2. If you don't know the distribution (or more generally the process) for choosing the amounts of money, does it impact the probability?

I think we're in agreement that the answer to #1 is yes. And, to me, it sounds like you're asserting that the answer to #2 is no. The core of my argument has nothing to do with adding conditions to #2; my question is, what's your argument for the answer to #2 being no? (And, also, by way of illumination I'm asking whether that argument would apply equally to #1.)

[–] [email protected] 1 points 1 year ago (1 children)

When you know the distribution, you have additional information. Information can change probability.

I won't say yes to #1 in general. But #1 does not matter, because it has nothing to do with the two-envelope problem.

Regarding #2: the question is ill-posed. Not knowing the distribution is always true, and there is nothing that can "impact the probability".

You're trying to add information that was not there in the original problem and then asking: "what happens if we leave this information out?"

[–] mo_ztt 1 points 1 year ago

Dude, you gotta work with me a little bit if you're interested in having this conversation. I'm not sure how to say it other than how I said it: The core of my question is about #2. Not anything about additional information.

If you're interested in having a conversation about scenario #2, let's rock and roll, but if you just want to pretend that I'm trying to add something to scenario #2 and then beat up the strawman, I think this exchange has pretty much run its course.

[–] TitanLaGrange 1 points 1 year ago* (last edited 1 year ago) (1 children)

I'm not sure I understand what all the reasoning is for. Your choices don't affect the predictability of the situation. Just pick one and be happy that you've got some money you didn't have before.

edit: I think I see what you're saying though. If you can make a reasonable guess about the upper bound of what could be in the envelopes, and if you know what is in the first envelope you choose, then you can try to guess which envelope you've picked. Like, if you know the prize budget for the event was $100 and you picked an envelope with $90 in it, most likely the other envelope has $45 in it and you should stay.

The problem statement says that the envelopes are indistinguishable though, and you don't get to know the amount until after you've settled on one or the other. In that case, switching makes no difference.

It seems to me #7 in the quoted section is wrong. The estimated value of the two envelopes is identical regardless of your choices, so there is no paradox.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

#7 is correct, because of how we calculate the expected value:

E(X)=\sum_{i \in I} x_i p_i

If you keep your envelope, you get A. If you change, the probabilities are p_1=p_2=1/2 and the outcomes are x_1 =A/2 and x_2 =2A (I is {1,2})

Let's plug it in:

E(X)=1/2*A/2+1/2*2A=5A/4
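
To see it with numbers (a quick sketch, with A = $10 picked arbitrarily just for illustration):

  # Sketch: plug an arbitrary amount A into the expectation above.
  A = 10.0
  expected_other = 0.5 * (A / 2) + 0.5 * (2 * A)
  print(expected_other)      # 12.5
  print(expected_other / A)  # 1.25, i.e. 5/4 of what you hold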

[–] TitanLaGrange 1 points 1 year ago* (last edited 1 year ago) (1 children)

To frame my response below: I know next to nothing about statistics and probability, so I'm interested in your take on the analysis. It seems like a very simple problem from an intuitive standpoint, and it's interesting to see how someone who knows more about this sort of analysis approaches it.

Since your choice has no effect on the value of either envelope, it seems to me that a calculation of the value of the envelopes should indicate that they have the same value and that value should not change regardless of whether you have chosen one or not.

[–] [email protected] 1 points 1 year ago (1 children)

I can see where you are coming from, but the concept "expected value" might be something different than you think. When you throw a die, the expected value is 3.5. It is a measure of how much you can expect, on average, when you do something random. The die has six outcomes, each with probability 1/6, but the values range from 1 to 6. So according to the definition above it is 1/6*1 + 1/6*2 + ... + 1/6*6 = 1/6*(1+2+3+4+5+6) = 3.5.

What is that expected value exactly? If you roll the die infinitely many times, add up all the points, and then divide by the number of rolls, you will get 3.5. This can be proven, but the proof is way out of scope here.
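
If you'd rather see it than prove it, here's a tiny simulation sketch:

  # Sketch: the average of many die rolls lands near the expected value 3.5.
  import random

  rolls = [random.randint(1, 6) for _ in range(1_000_000)]
  print(sum(rolls) / len(rolls))  # ~3.5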

In the example here, the expected value says something about how much money you can expect when switching. And that is 5/4 of what you have, because a good switch gains you A (going from A to 2A) while a bad switch only loses you A/2 (going from A to A/2), so you gain more when your switch is good than you lose when it is bad.

[–] TitanLaGrange 1 points 1 year ago* (last edited 1 year ago)

In the example here, the expected value says something about how much money you can expect when switching. And that is 5/4 of what you have,

I think there is something wrong with that analysis strategy. The expected value calculation can tell me that the value of the envelope is 1.25n, but it can't tell me anything about which envelope is more likely to contain 2n. Both envelopes have an expected value of 1.25n until both are open. If one is open I can put some constraints on what n actually is, but that doesn't put any constraints on which envelope has n and which has 2n.

(also, this is fun to think about!)
