[net.rec.bridge] simple

ark@rabbit.UUCP (08/16/83)

A friend gave me this puzzle and mentioned that many people
get it wrong.  I verified his claim -- of the five people I have
asked so far, no one got it right the first try.

I would appreciate solution attempts by mail.  When you send me
your solution, also tell me whether or not you play bridge:
the problem is similar in spirit to some problems routinely faced
by bridge players and I want to see if more of them get it right.

Here goes:  you are in a room with three cabinets, each of which has
two drawers.  One cabinet has a gold coin in each drawer.  Another
has a silver coin in each drawer.  The third has a gold coin in one
drawer and a silver coin in the other.

You pick a cabinet at random and open a random drawer.  It contains
a gold coin.  What is the probability that the other drawer of that
same cabinet contains a gold coin?

ark@rabbit.UUCP (08/18/83)

First, let me confess a small imprecision:  it isn't a
statistics problem but rather a probability problem.
Review:  there are three cabinets, each with two drawers.
One contains two gold coins, another contains two silver
coins, and the third contains one gold and one silver.
You walk up to a cabinet and open a drawer.  It contains
a gold coin.  What is the probability that the coin in the
other drawer will also be gold?

I have gotten about 80 responses so far.  Almost none were
from bridge players, and all but about five said that the
answer is 1/2.  They reason as follows:  I can rule out having
chosen the cabinet with two silver coins, and the other two
cabinets are equally likely, so I'll see another gold coin
half the time.

This answer, popular as it is, is wrong.  Here's the right one.

There are six drawers.  Three contain silver coins, and the other
three contain gold coins.  Before I have chosen anything, my probability
of choosing each drawer is 1/6.  Once I've seen a gold coin, I know
I didn't choose any of the three drawers with silver coins in them,
but my probability of having chosen each of the other drawers must
be equal, as I haven't learned anything beyond having ruled out the
three with silver coins.

Once I've seen the gold coin, then, the probability is 1/3 that I've
chosen any particular one of the drawers with a gold coin.

1/3 of the time, then, I've chosen the drawer in the cabinet with the
gold and silver coins.  In that case, the probability of seeing another gold is 0.
1/3 of the time, I've chosen one of the drawers in the cabinet with the
two golds.  In that case the probability of my seeing another gold is 1.
The same reasoning applies to the OTHER drawer in the cabinet with two
golds.  Thus, my overall probability of seeing another gold is

	(1/3)*0 + (1/3)*1 + (1/3)*1 or 2/3.

Another way to look at it is this.  Suppose I do the experiment six
times and choose a different (first) drawer each time.  Three times,
I'll get a silver on the first try and the initial conditions won't hold.
The remaining times I'll see a gold again 2/3 of the time.

The reason I asked about bridge players is that this is similar to
the following bridge problem.  You want to pick up this suit:

		Dummy
		x x x x

		You
		A K 10 x x

You cash the Ace and the Queen drops offside.  Do you assume the Queen
was singleton or that LHO was dealt QJ doubleton?

The rule that applies here is called the Principle of Restricted Choice.
It is normally correct to assume that LHO's choice was restricted --
that is, that LHO was playing a singleton Q -- rather than that LHO chose
to exercise a choice in a particular way (playing Q from QJ).
If LHO had QJ, after all, he might have played the J.

laura@utcsstat.UUCP (Laura Creighton) (08/18/83)

Combinatorics were the bane of my existence in high school. They did not
appear to work logically like the rest of mathematics; they seemed entirely
subjective to me at the time. QED -- you did not do very well unless you
had the same subjective view as the teacher. Or so I thought.

The problem with such puzzles is that you have to determine what is relevant,
and call that the 'sample set' or something like that. I picked the wrong
set. If someone could explain to me why what I did was wrong, perhaps
I could finally understand combinatorics...

To begin with, you have an equal chance of picking a 2-gold, a 2-silver or a
1-of-each cabinet. You eliminate one of these when you make your choice. I
toss this one out of the window and say that you have a 50% chance of either
cabinet. The answer says that I should not have tossed the other cabinet
out of the window. Why is that silly cabinet relevant to the question?

Laura Creighton
utzoo!utcsstat!laura

dash@fluke.UUCP (Mike Dash) (08/18/83)

rabbit!ark's solution to the gold-coin problem is correct, but is very
complex.  here's a simple way to explain it:

	there are three gold coins altogether, and i have just found one
	of them.  there is a 2-in-3 chance that it is one of the coins in
	the 2-gold-coins cabinet, since that's where 2 of the 3 are.

	thus there is a 2-in-3 chance that i will see gold in the 2d
	drawer.

...decvax!microsof!fluke!dash

laura@utcsstat.UUCP (Laura Creighton) (08/19/83)

I am getting answers to why I was wrong, but I am still confused. I
think I can express my confusion better now, though. (I still don't
know what is going on, but I misunderstand *much* better today).

If you had phrased the question "What is the chance that you picked
the drawer with the 2 gold coins", I would have come up with the official
answer to the problem. Your start condition is a room with three cabinets
in it.

But you *didn't*. The start condition is with one drawer open. You get
to do a rescan of the sample set, and, noticing that one cabinet has
been eliminated from consideration, you *eliminate it*. You are only
left with 2 cabinets, and they are (obviously) equal in proportion
to each other with regard to the property (having another gold coin)
that you are testing.

The official answer says that I do not get to do that rescan. I do
not understand why not. 

I also think the same way when I am playing bridge. You count up your
sure tricks, and then consider how to make the best out of the rest
of the deal. (you had better make sure you have leaders to both the
board and your hand as well). If you do not know the distribution of
cards in your opponents' hands from the bidding, you may have the situation
that you mentioned. In some cases, you have no choice. Either the
finesse fails, and your contract goes down, or it doesn't and you are
safe. Lay on, Macduff... (actually, you may have to think about when to time
your finesse, but if it is the only way that your contract is going to make
it, then it is the only way that your contract is going to make it.)

Frequently, though, there is another way out. You hold onto your high card.
You then apply a squeeze. For people not acquainted with the squeeze, the
idea is for you to run all your cards out, usually fairly rapidly, to
increase the intimidation factor, and squeeze your opponent, who does not know
what card you have, into dropping a card that would have won him a trick.

For instance, suppose you are south, and are holding the Ace of hearts,
and the 10 of hearts, and the Ace of spades. Your west opponent has
the King and Queen of hearts, and the Ace of clubs. Your East opponent
has the King of clubs and some losers. If you run the squeeze on West,
he is going to have to decide between keeping the Queen of Hearts and
keeping the Ace of Clubs. Remember, he has the possibility that you have
the King of Clubs and the Ace of Hearts, and that his partner has the 
10 of Hearts to consider. 

I would do this rather than trying to finesse east, unless from the bidding 
it was obvious that the missing King of Hearts was in East's hand.

You understand the squeeze? The reason that I use it is that I get to
see a lot of the cards fall and can do a lot of rescanning. It also
works more often for me than trying a finesse. You can see your odds get
better and better....

In fact, it is most important to keep track of the distribution of cards
as they fall. You must assume that unless your opponent West is a fool
he is doing the same. If he finds out that you are void in clubs early
on, the show is over, for he will know where the 10 lies.

Okay. I rescan constantly when I play bridge. You say that I cannot
rescan when I open cabinets. Why?  Why is your puzzle applicable to bridge?

Always interested in finding new ways to win more often at bridge,

Laura Creighton
utzoo!utcsstat!laura

larry@grkermit.UUCP (Larry Kolodney) (08/19/83)

From Laura@utcsstat:
To begin with, you have an equal chance of picking a 2-gold, a 2-silver or a
1-of-each cabinet. You eliminate one of these when you make your choice. I
toss this one out of the window and say that you have a 50% chance of either
cabinet. The answer says that I should not have tossed the other cabinet
out of the window. Why is that silly cabinet relevant to the question?

~~~~~~~~~~~~~~~~

here's why it works.  First, let's define exactly what the probability
is.  If we were to do this experiment 100 times, and every time we get
a gold coin, we check to see what cabinet we are in, the percentage of
the time that we choose the 2 gold cabinet is the probability that
that is the cabinet on any given trial.

So, there are three gold coins.  Since we have an equal chance of
choosing any one of them, if we do choose one it is equally likely that
it is the one in the silver-gold cabinet, or that it is one of the two
in the gold-gold cabinet.  Since the odds that it is any particular one,
given that we have already chosen a gold one, are 1/3, the odds that it
was in the gold-gold cabinet are the sum of the probabilities of all the
coins in that cabinet, 1/3 + 1/3 = 2/3.

Think of it like this: since there are twice as many gold coins in one
cabinet as in the other, you are going to choose gold coins out of that
cabinet twice as often.  Since there are 3 gold coins, 2 out of every three
will be chosen from the gold-gold cabinet, thus 2/3.

-- 
Larry Kolodney (The Devil's Advocate)
{linus decvax}!genrad!grkermit!larry
(ARPA)  rms.g.lkk@mit-ai

johnc@orca.UUCP (John Crown) (08/19/83)

Here's another note from an enlightened dummy who failed this quiz on first
try:

The cabinet that contains only silver coins really *is* a red herring (which
is what we all knew intuitively).  I.e., once you get to "round two" and you
have a gold coin in sight, you are in one of *three* (not two) equally likely
states.  How you got there is no longer of interest.  If you started with
only four drawers (by omitting the silver-only cabinet), or with six, or a
hundred (by adding more silver-only cabinets), the problem still works out
the same.

John Crown, Tektronix
...[decvax|ucbvax]!teklabs!tekecs!johnc

larry@grkermit.UUCP (Larry Kolodney) (08/19/83)

From richl@tektronix

Put another way, if you put me in the above described situation,
with one gold coin in hand and one drawer left to open, I can assure
you that I would be correct 50% of the time by just saying, "Yep,
the other one must be gold".

~~~~~~~~~~~~~
No you wouldn't.  If it were true that the coin was picked randomly,
then the odds that the coin came from the gold-gold cabinet are still
2/3.

Think about this.  Let's say I present you with the situation where I
hold one gold coin in my hand and there is one open drawer, but I have
fiendishly decided to always choose from the gold-gold cabinet.  In that
case, if you knew that, you would have to say the prob. was 1/1 for the
gold-gold cabinet.  If you know that the coin was chosen randomly, you
still have to say 2/3.  Only if it were chosen with a method that gave
each drawer an equal chance rather than each coin would you be able to
say 1/2.
-- 
Larry Kolodney (The Devil's Advocate)
{linus decvax}!genrad!grkermit!larry
(ARPA)  rms.g.lkk@mit-ai

halle1@houxz.UUCP (08/19/83)

You are making a major mistake.  You were told that ONE was gold, not that
the first was gold.  In essence, the question asks: given four coins, one
of which is silver, the rest gold, you pick a gold one.  What is the
chance that your next selection is also gold?  Obviously the answer
is 2/3.  Remember, everything was done at random.
Reread the solution carefully, and with an open mind.  You should see
that it is correct.
(I bet every bridge player worth his master points got this one right.)

thomson@utcsrgv.UUCP (Brian Thomson) (08/19/83)

No, the probability of a silver coin really IS 1/3.
Perhaps the following explanation will clear this up:

After finding one gold coin, you can indeed eliminate the
cabinet containing two silver coins.  Let's (figuratively)
toss that cabinet out the window.  We are left with two
cabinets and four drawers.  Three drawers contain gold coins,
one drawer contains a silver coin.

We have already opened one drawer and found a gold coin.  That
means there are three drawers remaining, two with gold coins and
one with a silver coin.  We have NO WAY of knowing which of those
three drawers is on the other side of our selected cabinet.
It could be any one of the three remaining drawers, so the probability
of it being the drawer with the silver coin is 1 in 3.

Does this help?
-- 
			Brian Thomson,	    CSRG Univ. of Toronto
			{linus,ihnp4,uw-beaver,floyd,utzoo}!utcsrgv!thomson

laura@utcsstat.UUCP (Laura Creighton) (08/19/83)

Re Brian Thomson's solution of the 2-cabinet problem (where you get
to throw the cabinet out of the window).

You counted drawers to get your sample size. I counted whole cabinets.
Why am I wrong? (because you get the wrong answer is not acceptable,
although it is pleasantly silly :-) :-) :-) )

laura creighton
utzoo!utcsstat!laura

israel@umcp-cs.UUCP (08/20/83)

	To begin with, you have an equal chance of picking a 2-gold, a
	2-silver or a 1-of-each cabinet.  You eliminate one of these
	when you make your choice.  I toss this one out of the window
	and say that you have a 50% chance of either cabinet.  The
	answer says that I should not have tossed the other cabinet out
	of the window.  Why is that silly cabinet relevant to the
	question?

	Laura Creighton		utzoo!utcsstat!laura

You CAN throw the all-silver cabinet out the window.  The problem that
arises is that the probability is not 50% just because you have two
cabinets.  To use an analogy:

I have three BIG cardboard boxes.  I fill the first with one dollar
bills to the brim.  In the second I put a single dollar bill and fill
the rest of the box with shredded "National Enquirer".  The third
box I fill with dimes.  You reach into a box and pull out the first
object you touch.  It is a one dollar bill.  Now it obviously was
not the third box (unless someone who needed some change came by).
It could be the first box (which was filled with one dollar bills).
It could even be the second box since that one had a single one
dollar bill in it.  Since there are two boxes left, would you
say that there was an equal chance of it being either box?  Of course
not, since it should be pretty obvious that the one-dollar bill
was much more likely to have come from the first box than the
second.

In the same fashion, when you find a gold coin, it is more likely
that it came from the cabinet with two gold coins, since it has
twice as many gold coins to find as the second cabinet does.  Since
every gold coin in that cabinet has a gold coin in the opposite
drawer and no other gold coin outside that cabinet has a gold
coin in an opposite drawer, the solution is equivalent to the
probability of the chosen coin being in the first cabinet.
-- 

~~~ Bruce
Computer Science Dept., University of Maryland
{rlgvax,seismo}!umcp-cs!israel (Usenet)    israel.umcp-cs@Udel-Relay (Arpanet)

mason@utcsrgv.UUCP (Dave Mason) (08/20/83)

Also ref 1996@umcp-cs.UUCP & 567@ihuxr.UUCP

Lew suggested looking at 3 cabinets holding: 10 silver; 9 silver and 1 gold;
and 10 gold coins.  By changing the numbers so that the count of gold coins
does not match the number of cabinets, the problem becomes easier (and less
interesting).

Think of labelling all the coins with 1, 2 or 3 for the cabinet in which they
are found, then dump all 30 coins in a bowl.  Take a coin out of the bowl.
You immediately see that it is gold.  What is the chance that it has a 1
marked on it?  What is the chance it has a 2?  A 3?

I submit that the probabilities are 0,1/11 and 10/11: there are 11 gold coins,
and only one of them has a 2.  We still haven't looked to see what it was,
but we're going to predict what is most likely to be on the next gold coin we
get out. In fact the probability is 10/11 that again it will be a 3.
(There is a 1/11 chance that the first one was a 2.  If it was, then the next
one will be a 3 (there's only one 2), i.e. with a prob of 1, so the overall
probability so far is 1/11 x 1 = 1/11.  But there was a 10/11 chance that the first
was a 3.  If it was then there is a 9/10 chance that the next will be a 3 so
this half of the possibilities contributes an overall prob of 10/11x9/10=9/11.
The total probability of the second gold being a 3 is the sum of these two:
1/11+9/11=10/11.  The math gets a bit messy, but the probability stays the
same for all 11 gold coins we draw.)

The result of this is that if we get a gold coin when we first walk into the
room we should keep drawing from that cabinet as long as we get gold, and
switch as soon as we get silver. (maybe this was the original question)

 -- Gandalf's flunky Hobbit --   Dave Mason, U. Toronto CSRG,
        {cornell,watmath,ihnp4,floyd,allegra,utzoo,uw-beaver}!utcsrgv!mason
     or {cwruecmp,duke,linus,lsuc,research}!utzoo!utcsrgv!mason   (UUCP)

mark@umcp-cs.UUCP (08/21/83)

I think the difference between the 50/50 and the 1/3-2/3 views
is exactly the rescan issue (as Laura says).  

The crucial difference is whether or not it is assumed that the
open drawer was opened at random.  If it was, then it could have
been any one of the three gold drawers, and the probability that
the other drawer in the same cabinet is gold is 2/3 (by everyone's
arguments which need not be repeated.)

However, if all you know is that a drawer stands open and
the other drawer could be either gold or silver, then the
50/50 result follows.  BUT -- this conclusion ignores some
of the relevant information and so is a less accurate conclusion
than the one using all the given prior information.

An even better guess could be made if you knew something about
the "random" process used to pick the drawer, such as that it
was really a psychology professor doing an experiment on
perceived probabilities, or someone with X-ray vision who
hated (or liked) you.  Assuming a random event is a convenient
approximation to having no knowledge, but it is never exactly right.

-- 
spoken:	mark weiser
UUCP:	{seismo,allegra,brl-bmd}!umcp-cs!mark
CSNet:	mark@umcp-cs
ARPA:	mark.umcp-cs@UDel-Relay

levy@princeton.UUCP (08/21/83)

Laura -
It is not true that the "initial state" of the problem is one drawer open.
It should be "one drawer open at random, but such that it contains a gold
coin".  Then it is not true that the two relevant cabinets are equivalent,
since the one with two golds is more likely to have been picked up in the
first place.  
It's a bit like saying "One out of every five people born every day is
Chinese, except in China -- where every one of them is".  If you pick a
human being at random, and you verify that s/he is Chinese, what are the
chances that you picked your human in China?  Very high indeed.  (In particular
the chance that this person's father is Chinese is close to one, while the
chance that an arbitrary person's father is Chinese is only close to 1/5.)
     Why should you choose drawers rather than cabinets for your sample
space?  The answer to that one is simple.  Just a choice of cabinets does
not determine the outcome of the event.  You need more information, i.e.
you need to subdivide your sample space further. 
     Finally, as many people pointed out, the SS cabinet is totally 
irrelevant.
     I suggest that you actually do the experiment (you may substitute dimes
and nickels for gold and silver coins respectively, with approximately the
same effect).  *This is the experiment implied by the statement of the
problem:*
           i = 0;  j = 0;
           while  (i < BIG) {
                     choose drawer at random;
                     if (coin == gold) {
                              i = i + 1;
                              look at other drawer;
                              if (other coin == gold) j = j + 1;  }  }
           probability = j/BIG;

                        -- Silvio Levy

jerry@eagle.UUCP (Jerry Schwarz) (08/23/83)

This discussion has taken a typical course.  Each side 
repeats arguments in favor of one answer without addressing 
where the other side has gone wrong.  In this item I will
try to expose the error in Laura's reasoning.  

But first, it is possible to demonstrate that she
is wrong without finding a flaw in her reasoning by
conducting an experiment.  A program to simulate the 
conditions of the problem is attached below.
On our machine it gives the result:
	    n_gold=654, n_silver=346

So where is the flaw in her reasoning?  It is that her
idea of "rescanning" is not well formulated.  In order to
be correct the "rescan" must use ALL the knowledge acquired
by opening a drawer.  But, besides the certainty that
one cabinet is eliminated we have gained a certain amount
of information from having carried out a random trial.  
The experimental evidence shows that this information
must be taken into account.

The idea of gathering knowledge from random trials is a 
commonplace and intuitive one. Perhaps it is slightly obscured 
when, as in this problem, statistical information is combined 
with definite information.

Jerry Schwarz
eagle!jerry

/**************************************************************/
#include <stdio.h>
#include <stdlib.h>

typedef enum { gold, silver } COIN ;

COIN drawers[6] = 
	{ silver, silver, silver, gold, gold, gold } ;

int main() {
    int n_gold = 0 ;
    int n_silver = 0 ;
    int choice ;

    while ( n_gold+n_silver < 1000 ) {
	choice = (rand()>>5) % 6 ;	/* pick one of six drawers at random */
	switch  ( drawers[choice] ) {
	    case silver: /* ignore these trials */ break ;
	    case gold:
		switch( drawers[ choice^1 ] ) {	/* choice^1 is the other
						   drawer of the cabinet */
		    case gold:   ++n_gold   ; break ;
		    case silver: ++n_silver ; break ;
		    }
		break ;
	    }
	}
    printf("n_gold=%d, n_silver=%d\n",n_gold,n_silver) ;
    return 0 ;
    }

ecn-ec:ecn-pc:ecn-ed:vu@pur-ee.UUCP (08/24/83)

	I agree with Laura Creighton on the point that the all-silver cabinet
is irrelevant to the problem: 'toss it out the window', or put a dozen
others in there; it won't make any difference. But the result IS 2/3. The
answer I mailed to rabbit!ark was this:
	Let's call the all-gold cabinet cabinet 1,
	and the gold-silver cabinet cabinet 2 (gold in drawer 1, silver
	in drawer 2).
Then, on the first pick, it may have been:
1)	drawer 1, cabinet 1
2)	drawer 2, cabinet 1
3)	drawer 1, cabinet 2.

Of these possibilities, 2 give you a second gold coin: cases 1 & 2.
Thus, the probability is 2/3.

Hao-Nhien Vu (pur-ee!vu )
PS: If you want to mail anything to me, please send to pur-ee!norris because
pur-ee!vu  will be terminated early this semester, that is within a week from
today.

ecn-ec:ecn-pc:ecn-ed:vu@pur-ee.UUCP (08/24/83)

	You counted drawers to get your sample size. I counted  whole cabinets.
	Why am I wrong? (because you get the wrong answer is not acceptable,...)
	laura creighton
	utzoo!utcsstat!laura

You CAN count whole cabinets. BUT (there always is a but) you do NOT have a
50-50 chance of getting cabinet 1 & 2. It STILL is 1/3 for silver-gold cabinet
& 2/3 for gold-gold cabinet. Now, since the result with 2 cabinets is not
really clear, let's make an analogy. Go back to the time of Marco Polo. Then
suppose you are dropped on the Earth. The first man you see is a white man.
Is there a 50-50 chance that you are either in China or Europe? No. Europe
has a better chance. Likewise, "Mrs. Harris Goes to Paris" doesn't make you
feel there is a 50-50 chance you are either in France or in the U.S. just
because you see an American woman. Or if you see a nurse, there is more
chance that you are in a hospital or nursing school or the like, than that
you are, say, in the Computer Center. Back to the problem, and let's count the whole
cabinets.

	There is a probability of 2/3 that you got the gold-gold cabinet.
There is a probability of 1/3 that you got the gold-silver cabinet. Therefore,
the probability is 2/3 (I think I am repeating myself !!! )

Hao-Nhien Vu (pur-ee!vu (This account will be terminated sometime this week.
			 Please send correspondence to pur-ee!norris.Thnx) )

richl@tektronix.UUCP (Rick Lindsley) (08/25/83)

Hold it! 2/3 is the probability that, given the constraint that
you have ordered pairs of coins, you will choose one that has at
least one gold coin. Your argument is only valid if you are not
allowed to update your information to apply to the situation.

The key to the argument is whether you calculate the probability
BEFORE you see the gold coin or after. If before, then your reasoning
may hold, but if you calculate the probability with the knowledge that
I hold in my hand a gold coin and I have just chosen the first of
ordered pairs (S,S) (G,S) or (G,G), then I can eliminate the first
ordered pair from my calculations. It is not in the set of possibilities.
I then have two possibilities remaining, and one of them is desirable.

Put another way, if you put me in the above described situation,
with one gold coin in hand and one drawer left to open, I can assure
you that I would be correct 50% of the time by just saying, "Yep,
the other one must be gold".

Rick Lindsley
richl@tektronix
...tektronix!richl

rcj@burl.UUCP (08/25/83)

Finally!!  I was seething with frustration on this one, cheering Laura
on, and had sent my version of "What the hell is wrong with my solution?"
to the original poster.  Then, along came Bruce (israel) with his
amazingly clear "shredded National Enquirer" analogy and it finally
became crystal clear.

Thank you, Bruce.

I *LOVE* teaching via analogy and, in case you are wondering, this letter
is totally above-board and is *NOT* one of my usual sarcastic submissions.
-- 

The MAD Programmer -- 919-228-3814 (Cornet 291)
alias: Curtis Jackson	...![ floyd sb1 mhuxv ]!burl!rcj