
1. 9^9
2. You light the match first

Nice triple post


Why would you not count ^ as a mathematical symbol?

And if "the match" is really the answer to that question, I'd only hire people who got it wrong. I wouldn't want someone that pedantic working with me.

"It is a fine thing to be out on the hills alone. A man can hardly be a beast or a fool alone on a great mountain." - Francis Kilvert (1840-1879)

"It would be so much more beautiful if I could tell it to someone." - Samivel


Agree with the stupidity of the match question. Bear in mind for the first one, you don't need a caret when writing an exponent (except on a computer).

Got it (re: the caret)

One more point... if you're going to be that pedantic about the match question, you'd have to define the word "number" a lot more precisely than that.


That's ridiculous. If you don't count exponents as mathematical symbols, I would just write two sideways eights. Infinity times infinity = infinity.

the answer for 1 is infinity;
the first number is
99999999999...9
plus the second number
99999......9

Two zeros intersecting each other in a tangent-like fashion to form infinity. Of course, infinity times infinity is not valid mathematically, just as infinity divided by infinity isn't.

You have to remember that infinity is not a number; it's a mathematical concept.

bevo51:

You have to remember that infinity is not a number; it's a mathematical concept.

It depends on what set of numbers you're working in. Infinity is a number in the extended real number system. Of course, this set does not form a field, but it is still widely used.

1. My explanation: It's the two largest single digits put together and I didn't feel like trying to be a smart ass.

I first thought of 99 as 2 numbers, and then "100" is two numbers (a one and a zero), so by that logic it could be 999...998, I guess.

I think we all agree it's a pretty stupid question.

For the interviewers out there, what do you look for when you ask a brainteaser question (even if it's better than this one)? Is it the right answer, the candidate's way of thinking, or just seeing that the person doesn't panic?


Match question is a pretty bad measure of thought process in my opinion. I got this one once... it's an interesting interview question.

There is a lily pad in a pond. (It is the only thing in the pond) It doubles in size every minute and completely fills the pond at 60 minutes. At what minute is the pond 1/4 full?

It should be 1/4 full at the 58th minute.

The pond will then be half full at minute 59, and completely full at minute 60
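The backwards-doubling reasoning above can be sanity-checked with a tiny sketch (the minute numbers and the 60-minute horizon are the ones from the question):

```python
# Work backwards from a full pond at minute 60: the pad doubles every
# minute, so going back one minute halves the covered fraction.
def fraction_full(minute, full_at=60):
    """Fraction of the pond covered at a given minute."""
    return 1 / 2 ** (full_at - minute)

print(fraction_full(60))  # 1.0  (full)
print(fraction_full(59))  # 0.5  (half full)
print(fraction_full(58))  # 0.25 (quarter full)
```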


4) 1.. but if you ask this in an interview, it's hard to tell if you're talking about birthdays or birth days.

largest number: 1/0 = infinity


Really, it's the limit of 1/x as x->0+ that equals infinity; 1/0 itself is undefined. Hey, as long as your interviewer doesn't know that, I guess it's an okay answer...

Question three has many answers, depending on what you classify as yours. Name is probably the best answer to go with, unless you speak in the third person all the time. I would also think your phone number, given that you do not call yourself, but then someone would probably say, "that belongs to your phone, not you." Same goes for your address. Your Social Security number belongs to you as well, but it's usually used by others as a way to identify you. The list could probably go on for a while.

5) When they meet they are at the same point - same distance away from Paris.

6) Uhh 8 hours? Must be some trick..

We're Italian, "WACC" means something else to us.

6 - I'd imagine the boat continues to rise as it rains more and the ladder will never be half submerged.

Questions 1 - 4 are terrible questions.

CompBanker

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

gomes3pc:

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

A rubiks cube is 3x3...... but if you insist on a 9x9x9.... 9x9x9 has 6 sides of 81... which is 486 pieces with red paint on them.

elan:
gomes3pc:

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

A rubiks cube is 3x3...... but if you insist on a 9x9x9.... 9x9x9 has 6 sides of 81... which is 486 pieces with red paint on them.

I didn't "insist" anything. That was the question I was asked from a MM analyst during an interview. I know a Rubik's is 3x3...Bear Stearns gave me one during final round interviews (ha! nice memorabilia).

gomes3pc:

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

all of the side pieces would have paint whether it be painted on 1,2,or 3 sides.
so you're left with a 7x7x7 cube of unpainted little cubes. 343 cubes.

a little bit of a twist on the question:
What is the expected number of painted sides if you picked one cube at random?

Now what if you picked 3 cubes?
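A quick sketch to check both answers (the 9-per-edge count comes from the question; testing coordinates against the surface is just one way to count painted faces):

```python
from itertools import product

n = 9
# A small cube at (x, y, z) has one painted face for each coordinate
# that lies on the surface of the big cube (index 0 or n-1).
def painted_faces(x, y, z):
    return sum(c in (0, n - 1) for c in (x, y, z))

counts = [painted_faces(x, y, z) for x, y, z in product(range(n), repeat=3)]
unpainted = counts.count(0)
expected = sum(counts) / len(counts)
print(unpainted)   # 343: the interior 7x7x7 block
print(expected)    # 486 painted faces over 729 cubes = 2/3
```

For 3 cubes, linearity of expectation gives 3 x 2/3 = 2 expected painted faces in total, with or without replacement.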

Got this one once.

1)You have three 5's and a 1. Using the mathematical operations and all four digits, get to a total of 24.

2)If you have a 5yr bond with a coupon of 5%, and all rates drop to 0%, what would you pay for the bond?

3)This was asked to me in an office: "I make a stack of quarters which stretches to the height of the empire state building. If I took those coins and put them in this office, make me a market in what percentage of the room would be taken up".

4)There is a mile of railway track. It is bolted down in such a way that it cannot displace to the side, only up or down. If, on a hot day, the track expands by one foot, how high into the air will the rail lift?

Jimbo

1, ((5/5)*5)-(1/1)

1)You have three 5's and a 1. Using the mathematical operations and all four digits, get to a total of 24.
- (5!)/(5/(1^5)) = 120/5 = 24

3)This was asked to me in an office: "I make a stack of quarters which stretches to the height of the empire state building. If I took those coins and put them in this office, make me a market in what percentage of the room would be taken up".
- Empire State Building is about 1,000 feet tall; say the room is 10x10x10 feet, and quarters have a diameter of 1 inch. You could use either the area or the width of a quarter to estimate how much of the room would be occupied. Using area: the area of a quarter is about .75 sq in, so the total volume of the stack is 1000 x 12 x .75 = 9,000 cubic inches. The total volume of the room is 120 x 120 x 120, about 1.7 million cubic inches. 9,000 / 1,700,000 is roughly 1/190, so approximately .5% of the room. Using width: the stack is 12,000 inches tall, so we could divide it into stacks of height 120 inches; we'd need 100 of them. The room's floor fits 14,400 one-inch stacks (120 x 120), so 100/14,400 is about 1/144, or .7% of the room.
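The area-based estimate can be reproduced in a few lines (all figures, the ~1,000 ft building, 10 ft room, and 0.75 sq in quarter face, are the rough assumptions from that post):

```python
# Back-of-envelope with the assumed figures from the post above.
stack_height_in = 1000 * 12            # 12,000 inches of quarters
quarter_area = 0.75                    # sq in, rough face area of a 1-inch coin
room_edge_in = 10 * 12                 # 120 inches

stack_volume = stack_height_in * quarter_area   # 9,000 cubic inches
room_volume = room_edge_in ** 3                 # 1,728,000 cubic inches
print(f"{stack_volume / room_volume:.2%}")      # 0.52%
```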

have an exam in a bit, will try the others after.

(5 x 5) - 1^5


(5x5)^0.5x5-1=24
:)

the bond question, anyone?

the Empire State Building question... I had more or less the same answer. Quite impressed that the actual % is so low...

For the bond question wouldn't the market price be \$1,250? If rates are at 0% you would just pay out all the future cash flows at the start of the bond's life.

Of course that answer would not take into account inflation, time value of money, alternative investments, etc. etc.

David Van Patten:

Of course that answer would not take into account inflation, time value of money, alternative investments, etc. etc.

sure it would. 125 price is correct.
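For anyone who wants to see the 125 fall out mechanically, here's a minimal pricing sketch (annual coupons assumed; at a 0% discount rate every cash flow is worth its face amount today):

```python
# Price a bond by discounting each cash flow; with rate = 0 this is just
# the sum of the coupons plus the principal: 5 x 5 + 100 = 125.
def bond_price(face=100, coupon_rate=0.05, years=5, rate=0.0):
    coupons = sum(face * coupon_rate / (1 + rate) ** t for t in range(1, years + 1))
    return coupons + face / (1 + rate) ** years

print(bond_price())  # 125.0
```

As a sanity check, discounting at the coupon rate itself (5%) prices the bond back at par.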

Clarification on #1, only the four primary operations are open to you, exponents are not allowed.

for #3 there is a much faster way of approaching it. the office was on a single floor, one story high. keep that in mind.

any more brainteasers?

You have a revolver into which 2 bullets are loaded in adjacent chambers. You point the gun at your head, spin the cylinder, and shoot: it's a blank. You have to shoot once more, but you have the option either to spin once more before shooting or to pull the trigger right away. Does it matter which option you choose, and why?
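No answer was posted for this one. The usual line of reasoning, which the simulation below sketches under an assumed layout (bullets in chambers 0 and 1, firing advancing one chamber), is that given the first pull was a blank, only 1 of the 4 empty chambers is followed by a bullet, so pulling again fires 1/4 of the time, versus 2/6 = 1/3 after a fresh spin:

```python
import random

# Six chambers; two bullets sit in adjacent chambers, hypothetically 0 and 1.
BULLETS = {0, 1}

def second_pull_fires():
    """One trial: spin, survive the first pull, then pull again without spinning."""
    first = random.randrange(6)
    if first in BULLETS:
        return None                      # first pull wasn't a blank; discard trial
    return (first + 1) % 6 in BULLETS    # does the next chamber hold a bullet?

random.seed(0)
results = [r for r in (second_pull_fires() for _ in range(100_000)) if r is not None]
p = sum(results) / len(results)
print(round(p, 3))  # ~0.25, so pulling again beats re-spinning (1/3)
```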

(5 - 1/5) x 5


well done

1) You are on a gameshow; behind one of the doors is \$1000. Behind the other 2 is nothing. You pick a door. The host opens one of the other doors, revealing that it's empty. What would you pay to switch?

2) Well, this one is tougher and maybe too involved for this forum but...assuming you have a swap curve built through the 30 yr point, and nothing beyond that, make me a market in a 40yr/50yr rate switch.

3) I have two envelopes. One contains twice the amount of the other. I give you one of the envelopes. Should you switch?

Jimbo.

1. I would pay nothing because the probability is in my favor now. So why pay money to switch?

Think again, it's definitely advantageous to switch. Let me phrase the question again.

There are three doors on a gameshow. One has \$1000 behind it, the other two have nothing. You pick one. The host opens one of the two remaining doors, revealing its contents to be empty. He now gives you a chance to switch to the remaining closed door, or keep your original choice. What would you pay to switch?

Jimbo:

Think again, it's definitely advantageous to switch. Let me phrase the question again.

There are three doors on a gameshow. One has \$1000 behind it, the other two have nothing. You pick one. The host opens one of the two remaining doors, revealing its contents to be empty. He now gives you a chance to switch to the remaining closed door, or keep your original choice. What would you pay to switch?

i could be way off, but i'd be willing to pay any amount up to \$666.67 to switch doors. my reasoning is this: the chance of my initial choice being a door with nothing behind it is 2/3, whereas my chance of choosing the \$1000 door right off the bat is only 1/3. logic would dictate that more often than not, i'll choose a losing door on my first try. the fact that the gameshow host always reveals an empty door is the key to this question. 2/3 of the time (when you pick a loser right off the bat) he'll reveal the second loser, leaving you the option to switch to the winning door, compared to the 1/3 chance of picking a winner the first time around and the host revealing only one of the losers (leaving you the option of switching to another loser). so, by paying at most \$666.67 i "guarantee" that i leave the show with no less than what i started with, or in the best case (assuming i pay between \$0 and \$666.67) anywhere between \$333.33 and \$1000.

also, is the answer to the expanding one-mile railroad question 51.38 feet, jimbo?
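Treating the track as two rigid halves hinged at the midpoint (a standard simplification for this teaser, not the only possible model), the ~51.38 ft figure checks out:

```python
import math

# Each half: a (2640 + 0.5) ft hypotenuse over a 2640 ft base, since the
# one-foot expansion is split evenly between the two halves.
half_ft = 5280.0 / 2
height = math.sqrt((half_ft + 0.5) ** 2 - half_ft ** 2)
print(round(height, 2))  # 51.38
```

The surprisingly large lift comes from the difference of squares: (a + d)^2 - a^2 = d(2a + d), which grows with the track length even for a tiny expansion d.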

alabinjo:

1. I would pay nothing becuase the probability is in my favor now. So why pay money to switch.

This guy is correct, I researched this question with a friend a few weeks ago and I'm 99.9% sure that answer is "do not switch" i.e. pay \$0.00

It only makes sense to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?

alabinjo:

It only makes sense to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?

if so, then you should also switch after the host opens one of those two. nothing has changed. at least one of those doors is always empty.

Jimbo:

1) You are on a gameshow, behind one of the doors is \$1000. Behind the other 2 is nothing. You pick a door. The host, opens one of the other doors, revealing that it's empy. What would you pay to switch?

3) I have two envelopes. One contains twice the amount of the other. I give you one of the envelopes. Should you switch?

Jimbo.

1) EV of the envelope that you have in your hand is \$333.33. EV of the other envelope is \$500, so I'd be willing to pay \$166.67 or less.

I heard this from a friend:

you play a game in which a deck of cards (52 cards) is revealed to you one after the other. before each card is revealed you can bet on whether the card is red or black. you start off with x amount of money, and for each bet you can use any fraction of your money (assume x is infinitely divisible). if you win, the money you bet is doubled; if you lose, you lose all the money you put into the bet. you play this game until all 52 cards are revealed. what strategy should you use to maximize a riskfree(!) outcome?

Jimbo, if you asked me #3 in an interview I would reply the following:

If you are on the second floor of a building, and you have a stack of quarters the height of the Empire State Building, and you begin to place all the quarters into the single room, by the time you finished there would be no quarters in that room. Reason being, the floor would collapse beneath the quarters due to the weight. Therefore, the answer is zero.

"Reason being, the floor would collapse beneath the quarters due to the weight. Therefore, the answer is zero."

And you would be very wrong.

Count and wait until you know that only one color remains in the deck, and then bet. I think that's the only way you hit the risk-free criterion. Or at the least, wait until two cards are left and bet half.

but apparently it's wrong...
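For what it's worth, the wait-until-certain strategy is genuinely risk-free; it just isn't maximal. A sketch showing it always at least doubles the stake (at worst, only the final card's color is forced):

```python
import random

def play(deck, bankroll=1.0):
    """Bet everything only once the remaining cards are all one color."""
    reds = deck.count('R')
    blacks = len(deck) - reds
    for card in deck:
        if reds == 0 or blacks == 0:   # outcome of every remaining card is certain
            bankroll *= 2              # so an all-in bet always doubles
        if card == 'R':
            reds -= 1
        else:
            blacks -= 1
    return bankroll

random.seed(1)
deck = ['R'] * 26 + ['B'] * 26
outcomes = set()
for _ in range(1000):
    random.shuffle(deck)
    outcomes.add(play(deck) >= 2.0)
print(outcomes)  # {True}: every shuffle at least doubles the stake, risk-free
```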

Ah, it was worth a shot. It was the first thing that came to my head when you emphasized that the room was on the second floor.

Ok Tyler, and now that you've switched, i give you the chance to switch again. Would you do it?

Jimbo:

Ok Tyler, and now that you've switched, i give you the chance to switch again. Would you do it?

Got my probabilities wrong in the first one.. EV of second choice should be 2/3 not 1/2.

I don't think I'd switch again because after a switch, you have a 2/3 chance of winning, but if you swap, then the probabilities drop to 1/2 each which leaves you worse off.

yes to the second.

for the first, you already have something with an expected value of 333, so to switch to something worth 667 or so, i'd think you just pay the difference.

What are your thoughts on my switching envelopes. I have two envelopes. You pick one, and open it, revealing the contents. I tell you the one envelope contains twice the amount of money as the other. Do you switch?

for the two envelopes problem, yes i would switch. right now i have an opened envelope with X dollars in it. I know the other envelope has either X/2 or 2X. So my EV from switching is (1/2)(X/2) + (1/2)*2X = 1.25X > X.

Jimbo:

What are your thoughts on my switching envelopes. I have two envelopes. You pick one, and open it, revealing the contents. I tell you the one envelope contains twice the amount of money as the other. Do you switch?

You would switch.

If you open it up and see you have \$100, then your EV of the other envelope is \$125.

So I would have to pony up cash for a 67% chance of making \$667? I certainly wouldn't be willing to pay \$667, b/c you wouldn't pay the expected value to receive a like amount in return. I have no clue...

Charming Chimp:

So I would have to pony up cash for a 67% chance of making \$667? I certainly wouldn't be willing to pay \$667, b/c you wouldn't pay the expected value to receive a like amount in return. I have no clue...

it's a 67% chance to make \$1000. the expected value is (2/3)*(1000) = \$666.67.

Without changing you have a 33.3% chance of making \$1000, so the EV is (1/3)*(1000) = \$333.33.

You're willing to pay 1 cent less than the expected difference assuming risk neutral.

How much you're willing to pay also depends on your risk aversion level and utility function of wealth.


to be honest, i'm not sure if i'd switch. i mean, you've got a 50/50 chance of choosing either sum, so there is no strategy involved in the first choice. when it comes to switching, the second envelope contains either 0.5x or 2x, so it's just a matter of figuring out the expected value of the switch. in this case it would be (0.5)(0.5x)+(0.5)(2x)-x, or a gain of 25% of the initial sum. on the other hand, you could just forego the whole switch and pretend you swapped the second one for the one you're currently holding; the expected payout is the same.

i say if you're feeling ballsy, make the switch, if not, take the money and run :P

4.One, he may have many Birthdays, but only one birth day!

6.The ladder will always be 1 foot under the water, because as the water rises...so does the boat!!

alabinjo:

4.One, he may have many Birthdays, but only one birth day!

6.The ladder will always be 1 foot under the water, because as the water rises...so does the boat!!

but if its raining 1 foot per hour won't the boat be more submerged, lowering the ladder? Or even sink haha

For the envelope question i would say not to switch. The EV of switching is either 1/4x or 1x. Right now i have X. If i switch the average EV would be about 5/8X which is less than X.

2.The match of course!

1. They will both be the same distance from Paris when they meet!!

^ That is if you're assuming that to "meet" the two planes are at the same point.

I visualize meet as more of a head-to-head meet, meaning that the plane flying from Paris will actually be closer, right?

But ehh, I'm not a fan of many of the brainteasers in this thread so far. Are these really IB-interview type questions?

"It only makes sense to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?"

it's not 50/50

Jimbo:

"It only makes sense to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?"

it's not 50/50

If you pick door 1 with nothing and you switch, you win \$.
If you pick door 2 with nothing and you switch, you win \$.
If you pick door 3 with the money and you switch, you get nothing.
2/3 of the time you win money if you switch.

If you don't switch you only win 1/3 of the time.

Is that right?
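That case-by-case reasoning matches simulation. A quick sketch of the knowing-host game (trial count and seed are arbitrary):

```python
import random

def monty(switch, trials=100_000, seed=0):
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)
        pick = rng.randrange(3)
        # The host knowingly opens an empty door that isn't the player's pick.
        opened = rng.choice([d for d in range(3) if d != pick and d != prize])
        if switch:
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == prize)
    return wins / trials

print(monty(switch=True))   # ~0.667
print(monty(switch=False))  # ~0.333
```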

"for the two envelopes problem, yes i would switch. right now i have an opened envelope with X dollars in it. I know the other envelope has either X/2 or 2X. So my EV from switching is (1/2)(X/2) + (1/2)*2X = 1.25X > X."

your EV calculation is not actually valid....

How could the boat sink? How would the boat be more submerged? I don't think you know how a boat works.

well, won't the rain "flood" the boat, adding extra weight onto it and causing it to sit lower in the water?

If that were the case, every fisherman fishing in the rain would have drowned, and everybody would be too scared to go boating in the rain. LOL

For the two envelopes. Say one has x, the other has 2x.

You pick one, and then switch. You're equally likely to have the x as you are to have the 2x in your initial pick. In the former you gain x by switching, and in the latter you lose x.

Your gain from switching is 0.5x + 0.5(-x) = 0.

Therefore, you will be indifferent to switching.

Also, I'm shocked there are still people out there who haven't heard of the Monty Hall problem - so overused.

This one is good.

You have two doors: one leads to the job offer (the door you want) and one leads to the exit (the one you don't want). In front of each door stands a guard, one of whom tells only lies, and one who is truthful. You may ask one guard only one question in hopes of opening the correct door. What do you ask? (Keep in mind you do not know which door the 'liar' or the 'truthful' guard is in front of.)

You can ask guard 1, "if i ask guard 2 which door to choose, which one would he tell me?". And you choose the opposite of what he says.

If guard 1 is the liar, he lies about what the truthful guard would say, so he names the wrong door. You pick the other one and win.
If guard 1 is not the liar, he truthfully reports the liar's answer, which is again the wrong door. You pick the other one and win.

Chiggity check this out for the envelopes Q:

http://www.ibankingfaq.com/category/interviewing-brainteasers/

"You would switch.

If you open it up and see you have \$100, then your EV of the other envelope is \$125.
"

nope.

"You would switch.

If you open it up and see you have \$100, then your EV of the other envelope is \$125.
"

this is definitely a common answer, probably the most common. but the question is not nearly as simple as it seems.

because the ev is 112.5?

the ev is unknowable.

Two Envelopes: Assuming the contents are unknown to both people, you have envelopes with values X and 2X. Your pick has an EV of 1.5X; the EV of the switch is likewise 1.5X. There is no incentive to switch, unlike in the three-door question, where new information is revealed.

I didn't read all the solutions, but for the door problem.

you make the assumption that the opened door is known to be empty (asymmetric knowledge: the host knows which door has nothing behind it).

Initially you have an EV of 1/3(1000) = 333.33

By switching your p(success) goes to 2/3, so EV is 2/3(1000) = 666.67

you'd be willing to pay up to 666.67 - 333.33 = 333.34, less a penny, so 333.33 (these numbers are a little off because of rounding; exactly, it's anything strictly below the EV difference of 1000/3, i.e. the limit of 1000/3 - 1/x as x -> infinity),

i believe.

You're willing to pay so long as your value increases. If it stays the same, you're neutral to switching. If it goes up you pay (<333.33). If it goes down you don't.

It's the monty hall problem. You have to make the assumption that the host knew which door had a prize, otherwise the odds don't change. If it helps, think of 100 doors. you pick a door. the host opens 98 doors, all of which are empty. Do you switch to the other unopened door or keep your door?

"You have to make the assumption that the host knew which door had a prize, otherwise the odds don't change. "

Actually I don't think it does. It just matters that he opens a door that's empty.

Jimbo:

"You have to make the assumption that the host knew which door had a prize, otherwise the odds don't change. "

Actually I don't think it does. It just matters that he opens a door that's empty.

It does. Otherwise it's a 50/50 instead of 1/3 to 2/3.

The host does not know what lies behind the doors, and opens one at random without revealing the car (vos Savant 1996, p. 181).
-->Switching wins the car half of the time.

http://en.wikipedia.org/wiki/Monty_Hall_problem#Ot...
We did the monty hall in our stats 101. The only addition is the EV and willingness to pay. I guess the amount you pay could depend on a utility function and risk aversion (e.g. there's a chance you win nothing and pay a certain amount to switch), but adding that could make the problem quite obscene. If he's risk averse he'd be willing to pay less; if he's risk loving he'd be willing to pay more. If he's risk neutral (which most Q's are assumed), he'd be willing to pay 1 cent less than the difference in the EV.
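The random-host variant quoted from the Wikipedia article can also be checked by simulation: conditioning on the trials where the host happens not to reveal the prize, switching wins only half the time:

```python
import random

def random_host(switch, trials=100_000, seed=0):
    """Host opens one of the two other doors at random; trials where he
    accidentally reveals the prize don't match the story and are discarded."""
    rng = random.Random(seed)
    wins = kept = 0
    while kept < trials:
        prize, pick = rng.randrange(3), rng.randrange(3)
        opened = rng.choice([d for d in range(3) if d != pick])
        if opened == prize:
            continue                     # prize revealed by accident; skip trial
        kept += 1
        final = next(d for d in range(3) if d not in (pick, opened)) if switch else pick
        wins += (final == prize)
    return wins / kept

print(random_host(switch=True))   # ~0.5
print(random_host(switch=False))  # ~0.5
```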

This probably isn't the answer you're looking for, but by switching envelopes you aren't actually improving your gains from where you were at the start of the scenario. Sure, if someone gives me \$100 and then says "I'll either double your money or take half," I'm better off taking that gamble. However, my odds of choosing the larger sum of money do not improve once I know the contents of a single envelope. Switching envelopes is still going to net me the lower sum 50% of the time.

CompBanker

Oh and, as to the math logic behind it, when you switch to the wrong envelope, you are actually losing 50% of 2X, not 1X, because you originally chose the larger amount. If the envelopes contain \$100 and \$200, the options are either gain \$100 or lose \$100.

CompBanker

"The host does not know what lies behind the doors, and opens one at random without revealing the car (vos Savant 1996, p. 181).
-->Switching wins the car half of the time."

I'm not convinced. I understand the argument at some level... but let's say there are 100 doors. You pick one. You have a 1% chance of being right on your initial choice; the rest of the set has the other 99%. The host randomly opens 98 of the other doors, and by pure luck doesn't reveal the prize. You still think it's 50/50 to switch?

if it's by pure chance, then yeah.

If it's by pure luck, the other door had a 1% chance as well, the same as your door. With only two doors left and the rest have been failures, each door has a 50% chance of success.

statistically there's a 33% chance of each door being the right one. so once one of the two remaining doors is ruled out, the other door becomes 67%, and it pays, statistically, to change. it's got to do with variable change.

this was actually in the movie 21... neat movie, by the way

good movie, but not worth the \$12 or whatever it cost me to go see it. Catch it on video. It started out good but falters towards the end. He notes the assumption in the movie as well, though. If not, you can't group them together and each door is treated independently.

There's a similar problem related to death row inmates where 1 gets to go free.

agree on the video comment.

what's the one about death row inmates?

Omfg I can't believe you guys are STILL debating this wtf

Get a room!

It's almost identical in structure.

3 inmates, A,B, & C. The warden says that 2 of the 3 are getting the electric chair next week, and the third goes free. He won't tell which one is which. Prisoner A asks the warden to tell him which 2, but the Warden refuses. He then asks to know one of the two. The warden agrees, believing that this gives no additional information for prisoner A. (e.g. a,b: a,c: b,c get the chair) Since the warden knows which prisoner is which, he tells prisoner A that prisoner C is for sure going to get the chair. Since there is still another left, the warden leaves for the day and believes that the prisoner has no new information. Prisoner A asks the next day to switch fates with prisoner B. Since the warden thinks that prisoner A has a 50/50 chance, he doesn't see the problem with that, and agrees. Prisoner A thinks this improves his chances. Who is right?

After the previous problem the answer is pretty obvious, but I think the point is being able to pick out relevant vs. nonrelevant information and come to a conclusion. This is effectively identical to the Monty Hall problem, but dealing with prisoners and life/death instead of money. From the outset we can safely assume the information asymmetry whereby the warden knows the fates of all 3 from the beginning.

"if all rates fall to 0%, what would you pay for the bond"

Whoops, that's right.

so far, my intuition is this: looking at the dollar amount shouldn't change your decision, because it should generalize to any amount x. therefore, it can't pay to switch, because if it did, then you would choose to switch without looking, and after switching, you would face the same blind circumstances and choose to switch again, ad infinitum.

i'll get back to you in a few minutes if i come up with more...

let's say i put 100 dollars in one envelope and 200 in the other envelope and put them in my back pocket, it is no longer identifiable even to me which is which.

i'm walking merrily along when i realize that one of the envelopes has fallen out of my pocket! so, in expectation, i realize that i've just lost 150 dollars!

but along comes you, kind sir, and you say "whoa, good thing i caught up to you - i found this envelope of yours, is it important?"

now i tell him, "you are quite an honorable young man - i tell you what, here's the second envelope from my pocket - this morning i put \$100 in one of these envelopes and \$200 in the other. would you like to switch?"

at this point, he and i have the same information, and to both of us, the expected value of each envelope is \$150, so a trade is immaterial, we would both be indifferent.

...

because this scenario generalizes to any envelopes of x and 2x dollars, you should not have to look inside the envelope to make your decision, and even looking in at it should not change your generalized decision.

the trick here is that the math appears to show that you are either doubling or halving your money, but in fact, you are either doubling x or halving 2x, which is the same potential gain/loss

note: looks like mj explained it pretty well, but anyway, if you're bored...

You get 1.25 if you do:

Envelope (A) = 0.5x (i.e. you lose half of what you had)
Envelope (B) = 2x (i.e. you double what you had)

where x is the value in my envelope. If you were to switch you'd get either 1/2 the value in your envelope (x/2) or twice the value in the envelope (2x).

p(A) = p(B) = 0.5

0.5(x/2) + 0.5(2x) = (5x / 4) = 1.25x for switching. 1.25 > 1 -> switch

The problem with this solution is that the x-value isn't constant here.

If you were to make X constant, then you either get x or 2x.

So:
Envelope (A) = x
Envelope (B) = 2x

0.5x + 0.5(2x) = 1.5x for either envelope. With this solution you're indifferent to switching. This makes more logical sense in that your EV is the average value of the two envelopes. With a 50/50 chance for A or B, it really doesn't matter. This assumes there's no knowledge of the amounts in either envelope / both are unopened.

Now with all that said, when the envelope is opened and you have an opportunity to switch, you do. Essentially you stand to gain more than you lose (gain 100% versus lose 50%). Going back to the first problem, you gain x or lose 0.5x, where x is the value you have.

So it doesn't matter with an unknown value, but does matter when you open it & have a value. You always switch once you have a known value regardless. That's why it's a paradox.
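A simulation makes the unopened-envelope case concrete: averaging over many trials (with a hypothetical smaller amount drawn uniformly from [1, 100]), keeping and switching come out the same:

```python
import random

rng = random.Random(0)

def envelope(switch):
    """One trial: envelopes hold x and 2x; keep the first or swap to the second."""
    x = rng.uniform(1, 100)          # hypothetical smaller amount
    pair = [x, 2 * x]
    rng.shuffle(pair)
    return pair[1] if switch else pair[0]

n = 200_000
keep = sum(envelope(False) for _ in range(n)) / n
swap = sum(envelope(True) for _ in range(n)) / n
print(round(keep, 2), round(swap, 2))  # both ~1.5x the average x; switching changes nothing
```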

It seems like every two months we have a long winded discussion of

1) The Monty Hall Problem
&
2) The two envelope problem

Judging by past trends this thread will die in two days and start up again June 5, 2008

agreed with gimmearedbull. good call.

and you still dont switch even if you see the contents of the envelope.

split the 9 into three sets of 3.

1st move: measure two of the sets against each other.

if one of the sets is lighter than the other, it contains the light ball. if the sets are equally heavy, the unmeasured set contains the light ball. so we know which set contains the light ball.

2nd move: take the set which contains the light ball and measure two of its balls against each other. if one ball is lighter than the other then it is the light ball. or, if these two balls are equally heavy, the final unmeasured ball is the light ball.

and here's the kicker: if you have 3^n balls, you can find the light ball in n moves. (proof by induction).
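The induction can be exercised directly; here's a sketch of the ternary search (the ball count and the light ball's position are arbitrary test inputs):

```python
def find_light(weights):
    """Locate the single light ball among 3**n otherwise equal balls
    using n three-way weighings."""
    balls = list(range(len(weights)))
    moves = 0
    while len(balls) > 1:
        third = len(balls) // 3
        a, b, rest = balls[:third], balls[third:2 * third], balls[2 * third:]
        wa, wb = sum(weights[i] for i in a), sum(weights[i] for i in b)
        # Keep whichever group must contain the light ball.
        balls = a if wa < wb else b if wb < wa else rest
        moves += 1
    return balls[0], moves

weights = [3] * 27
weights[17] = 2                 # the light ball, hypothetically at index 17
print(find_light(weights))      # (17, 3): log3(27) = 3 weighings
```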

I liked this one when I was younger:
I have two coins and they add up to 30 cents but one of them is not a nickel. What are the two coins?

quarter and a nickel. had to stare at that one for a minute

I like this one a lot (taken from http://excelerade.com/blog)
You are a painter and have a 6 day project coming up. You have to get paid equitably (same amount every day on the job), but the people hiring you only have one brick of gold to pay you with. You are allowed to make two cuts in the brick of gold (straight cuts, nothing fancy). How do you make two cuts and get paid equitably?

Cut it in 1/2 and cut one of the remaining halves into a 2/3, 1/3 split.

The first day they pay you 1/3 of the 1/2 which is 1/6.

The 2nd day you give back the 1/6 and take the 2/6 piece (the 2/3 of the cut half).

The third day you give back the 2/6 and take the 1/2 which is 3/6.

The fourth day you take the 1/6 piece again and now have 4/6.

The 5th day you give back the 1/6, keep the 1/2 and take the 2/6 piece for a total of 5/6.

The 6th day you take the last piece and have 6/6.
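The schedule above is easy to verify mechanically; the check below just confirms the painter holds exactly d/6 of the brick after each day d:

```python
# The two cuts give pieces worth 1, 2, and 3 sixths of the brick.
pieces = {1, 2, 3}

# holdings[d] = the pieces (in sixths) the painter holds at the end of day d,
# following the give-back schedule described above.
holdings = {1: {1}, 2: {2}, 3: {3}, 4: {3, 1}, 5: {3, 2}, 6: {3, 2, 1}}

for day, held in holdings.items():
    assert held <= pieces and sum(held) == day   # paid exactly day/6 so far
print("the painter holds exactly d/6 of the brick after each day d")
```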


for what it is worth, I had to do about 15 brainteasers for my HF job.....

they don't go away, and they get a LOT harder.

so if you don't get it right, but show a logical train of thought, does that work against you?

they don't go away, and they get a LOT harder.

True that.

How far can a dog run into the woods?

Half way..

I recently got the following question on an interview, follow the link:
http://www.mycoted.com/4_men_in_hats

nice one

wannabebanker:

Half way..

I recently got the following question on an interview, follow the link:
http://www.mycoted.com/4_men_in_hats

A child is born in Boston Mass. to parents who are also born in Boston Mass. How is it possible that the child is not an American?

Indian territory? early 1700's?

coscarsm:

A child is born in Boston Mass. to parents who are also born in Boston Mass. How is it possible that the child is not an American?

Ummm... the child was born before US independence so it was a British citizen.

Boston Mass is a hospital...

Can only run halfway in, after that it would be running out of the woods.

WizardofOz:

Can only run halfway in, after that it would be running out of the woods.

ah.