(Thanks to Itai Benjamini and Ronen Eldan.) Test (quickly) your intuition: You have a box with n red balls and n blue balls. You take out balls one by one at random until you are left only with balls of the same colour. How many balls will be left (as a function of n)?
1) Roughly εn for some ε>0?
2) Roughly \sqrt n?
3) Roughly log n?
4) Roughly a constant?
no poll?
GK: OK
roughly 4?
I’d say a constant.
1) I’d understand “roughly” as pertaining to the Mean or Median. My guess is either sqrt or log, but I don’t know which.
2) I’m thinking of random walks.
Roughly constant. In particular, roughly 2. Play the process backwards.
Roughly constant. Similar to the coupon collector’s problem or a 1-d random walk?
This reminds me of a homework problem from one of Moni Naor’s classes: you play a game with a randomly shuffled deck of cards. In each step, you can either take out the next card and look at it or stop (and you automatically stop when there is only a single card left in the deck). If the color of the next card is red you win $1 and otherwise you lose $1. What is the highest expected reward you can get (using any strategy)? Of course you can stop immediately and then the expectation is 0, but can you do better?
It seems to have something to do with Catalan numbers. Suppose the deck has 2n cards.
Consider the following strategy: pull cards until you are holding more blacks than reds (meaning that the next card is more likely to be red). Then stop.
– if you stop before the last card, you win w.p. at least n/(2n-1)
– if you complete the deck you lose (since the last card is black).
– we can think of black and red cards as moving right and up on a grid, respectively.
– having at least as many reds as blacks corresponds to remaining below the diagonal.
– the probability of reaching the end (and losing) is C(n) / (2n choose n) = 1/(n+1)
Thus the expected value is
(-1) 1/(n+1) + (+1) n/(n+1) n/(2n-1) -> 1/2 as n tends to infinity (about 0.454 for n=26).
If the deck is large you can probably wait for a better opportunity before stopping.
In your expected value computation, you’re missing the case where you get a -1 payout by guessing (wrong) before the end of the deck is reached. When you include this, the expected payout should become zero.
Nice problem. Did not expect that answer (assuming I did not mess up). I used induction, but there must be a more intuitive way.
You can solve by induction, and the answer is 0 either way. But the very elegant way Moni proved it for us (which I find very related to the way this post should be answered) is the following:
Consider the following game: you play a game with a randomly shuffled deck of cards. In each step, you can either take out the next card and look at it or stop (and you automatically stop when there is only a single card left in the deck). Now one opens *the last card in the deck* (rather than the next card). If the color is red you win $1 and otherwise you lose $1. Note that in this case, the strategy of the player makes no difference – your reward is always determined by the color of the last card. So here the payoff is clearly 0. On the other hand, one can convince oneself rather easily that the payoff in both games (for each specific strategy) is the same. Simply, conditioned on all of the cards you already saw, the color of the next card and the color of the last card have the same probability distribution. And thus the payoff for the original game is 0 too.
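This is easy to check numerically. Below is a quick sketch (class name, seed, and trial counts are my own assumptions) simulating the “stop once you have seen more blacks than reds” strategy; once the losing stops are counted correctly, the average payoff comes out near 0, as the last-card argument predicts.

```java
import java.util.Random;

public class CardGame {
    // Payoff of the "stop once you've seen more blacks than reds" strategy
    // on one shuffled deck of n red (true) and n black (false) cards.
    static int play(boolean[] deck) {
        int reds = 0, blacks = 0;
        for (int i = 0; i < deck.length - 1; i++) {
            if (blacks > reds) {            // next card is more likely red: bet now
                return deck[i] ? 1 : -1;
            }
            if (deck[i]) reds++; else blacks++;
        }
        return deck[deck.length - 1] ? 1 : -1; // forced bet on the last card
    }

    // A deck of n red and n black cards in uniformly random order.
    static boolean[] shuffledDeck(int n, Random rand) {
        boolean[] deck = new boolean[2 * n];
        for (int i = 0; i < n; i++) deck[i] = true;  // n red cards
        for (int i = deck.length - 1; i > 0; i--) {  // Fisher-Yates shuffle
            int j = rand.nextInt(i + 1);
            boolean tmp = deck[i]; deck[i] = deck[j]; deck[j] = tmp;
        }
        return deck;
    }

    public static void main(String[] args) {
        Random rand = new Random(7); // fixed seed, for reproducibility
        int trials = 200000, n = 26;
        long total = 0;
        for (int t = 0; t < trials; t++) total += play(shuffledDeck(n, rand));
        // The average payoff should be close to 0 for large trial counts.
        System.out.println("average payoff = " + (double) total / trials);
    }
}
```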
Nice. 🙂
How about this: you are given a box that is one of two kinds, each with 50% odds:
A) n blue balls, n red balls;
B) the maker flipped an independent coin to determine the color of each of the 2n balls.
Let t be a fixed value. You draw t balls without replacement and try to guess which of the two types you were given. What does your success probability p look like as a function of t? E.g., roughly when do you hit p > .99?
2n / (1+n)
This problem has absolutely nothing to do with intuition, but rather with abstract reasoning.
Another vote for a constant (which is 2 in the limit). I find it easier to think about the question, “How many balls of a single color will you draw at the beginning before the first ball of the other color appears?” But this is symmetrically equivalent to the original question.
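For what it’s worth, the expectation of that initial single-colour run R can be computed exactly (a sketch, which one can check against small n by hand). Since the first k draws all share a colour with probability 2\binom{n}{k}/\binom{2n}{k}, we get

E[R] = \sum_{k \ge 1} P(R \ge k) = \sum_{k=1}^{n} 2\binom{n}{k}/\binom{2n}{k} = 2n/(n+1) -> 2,

which matches the 2n/(1+n) mentioned above (e.g. for n=2 the six arrangements give runs 2,1,1,1,1,2, averaging 4/3 = 2·2/3).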
Omer, I learned the same great game from Abhiram Ranade years ago. It is a nice way of skirting the martingale stopping theorem.
It is great, and I love how elementary both the problem description and the solution are!
I actually plotted the distribution from n=1 to 500… the average number of balls left fits EXACTLY the square root of n…
apparently a stupid mistake was made…
if my code runs correctly, it suggests that the number of balls left obeys a geometric distribution with the parameter p equal to one half (thus not depending on n).
Constant because as we draw balls we expect to have the same number of balls of each color. So we would have the same ending whether we start with n red and n blue or n/2 red and n/2 blue.
It is not a constant, it is not log n; roughly \sqrt n?
public static void main(String[] args) {
    int testCount = 100;
    int boxSize = 10000;
    Random rand = new Random();
    for (int i = 0; i < testCount; i++) {
        int size = rand.nextInt(boxSize) + 1; // random number in [1,..,boxSize]
        int red = size, blue = size, left;
        while (true) {
            if (rand.nextInt() % 2 == 0) {
                red--;
            } else {
                blue--;
            }
            if (red == 0) {
                left = blue;
                break;
            }
            if (blue == 0) {
                left = red;
                break;
            }
        }
        System.out.println(
            "test " + i + " N=" + size + " left=" + left +
            " SQRT(N)=" + (int) Math.sqrt(size) +
            " LOG(N)=" + (int) (Math.log(size) / Math.log(2)));
    }
}
Your code seems to assume that your random number mod 2 says whether the ball is red or blue, in which case your sqrt(n) is correct. But that would be for a problem like “flip coins until you have either n heads or n tails. What is the expected value of |heads – tails| at that time?”
If you change your code to pick a random integer from 1 through red+blue, and to decrement red if the random integer is <= red, you should see the result staying near 2 even as the number of balls increases. A way to see that is that if the number of, say, red balls gets down to 1 when the number of blue is still big, the chances of getting that last red ball is pretty low, so you tend to get rid of a bunch of blue ones before finally getting that last red one.
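To illustrate the fix, here is a sketch of that corrected sampling step (class name, seed, and sample sizes are my own choices): each draw picks a uniformly random remaining ball, so red is drawn with probability red/(red+blue). The average number of balls left then stays near 2 as n grows.

```java
import java.util.Random;

public class BallsLeft {
    // Draw without replacement from n red and n blue balls until one
    // colour is exhausted; return how many balls remain in the box.
    static int ballsLeft(int n, Random rand) {
        int red = n, blue = n;
        while (red > 0 && blue > 0) {
            // Each remaining ball is equally likely, so a red ball is
            // drawn with probability red / (red + blue).
            if (rand.nextInt(red + blue) < red) {
                red--;
            } else {
                blue--;
            }
        }
        return red + blue;
    }

    public static void main(String[] args) {
        Random rand = new Random(42); // fixed seed, for reproducibility
        int trials = 100000, n = 10000;
        long total = 0;
        for (int t = 0; t < trials; t++) {
            total += ballsLeft(n, rand);
        }
        // The average stays near 2 even though n is large.
        System.out.println("average balls left = " + (double) total / trials);
    }
}
```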
As an intuitive argument why it should be constant… Assume you have a huge number of balls, so that even log n is large. Then picking out the last ball of one color with log n balls remaining has very small probability.
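A sketch that quantifies this intuition: reading the sequence backwards, at least k balls are left exactly when the last k balls share a colour, which happens with probability 2\binom{n}{k}/\binom{2n}{k} \approx 2^{1-k} for large n. So P(exactly k left) \approx 2^{1-k} - 2^{-k} = 2^{-k}, a geometric law with parameter 1/2 independent of n, consistent with the geometric-distribution observation from the simulations above.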