1. 9^9
2. You light the match first

Nice triple post


Why would you not count ^ as a mathematical symbol?

And if "the match" is really the answer to that question, I'd only hire people who got it wrong. I wouldn't want someone that pedantic working with me.

"It is a fine thing to be out on the hills alone. A man can hardly be a beast or a fool alone on a great mountain." - Francis Kilvert (1840-1879)

"It would be far more beautiful if I could tell someone about it." ("Ce serait bien plus beau si je pouvais le dire a quelqu'un.") - Samivel


Agree with the stupidity of the match question. Bear in mind for the first one, you don't need a caret when writing an exponent (except on a computer).

Got it (re: the caret)

One more point... if you're going to be that pedantic about the match question, you'd have to define the word "number" a lot more precisely than that.


That's ridiculous. If you don't count the exponent as a mathematical symbol, I would just write two sideways eights: infinity times infinity equals infinity.

the answer for 1 is infinity;
the first number is
99999999999...9
plus the second number
99999......9

Two zeros intersecting each other in a tangent-like fashion form the infinity symbol. Of course, infinity times infinity is not valid mathematically, just as infinity divided by infinity isn't.

You have to remember that infinity is not a number; it's a mathematical concept.

bevo51 wrote:

You have to remember that infinity is not a number; it's a mathematical concept.

It depends on what set of numbers you're working in. Infinity is a number in the extended real number system. Of course, this set does not form a field, but it is still widely used.

1. My explanation: It's the two largest single digits put together and I didn't feel like trying to be a smart ass.

I first read 99 as two numbers, and then "100" is two numbers (a one and a zero). So using that logic it could be 999...998, I guess.

I think we all agree it's a pretty stupid question.

For the interviewers out there, what do you look for when you ask a brain teaser (even one better than this)? Is it the right answer, the candidate's way of thinking, or just seeing that the person doesn't panic?


Match question is a pretty bad measure of thought process in my opinion. I got this one once... it's an interesting interview question.

There is a lily pad in a pond. (It is the only thing in the pond) It doubles in size every minute and completely fills the pond at 60 minutes. At what minute is the pond 1/4 full?

It should be 1/4 full at the 58th minute.

The pond will then be half full at minute 59, and completely full at minute 60
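The doubling logic can be checked directly: if the pad fills the pond at minute 60 and doubles every minute, the covered fraction at minute m is 2^(m-60). A quick sketch:

```python
from fractions import Fraction

def pond_fraction(minute, full_at=60):
    """Fraction of the pond covered at a given minute, assuming the
    lily pad doubles every minute and fills the pond at `full_at`."""
    return Fraction(1, 2 ** (full_at - minute))

print(pond_fraction(58))  # 1/4
print(pond_fraction(59))  # 1/2
print(pond_fraction(60))  # 1
```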

4) 1.. but if you ask this in an interview, it's hard to tell if you're talking about birthdays or birth days.

largest number: 1/0 = infinity

Really, it's the limit of 1/x as x->0+ that equals infinity; 1/0 itself is undefined. Hey, as long as your interviewer doesn't know that, I guess it's an okay answer...

Question three has many answers, depending on what you classify as yours. Name is probably the best answer to go with, unless you speak in the third person all the time. I would also think your phone number, given that you do not call yourself. But then someone would probably say, "that belongs to your phone, not you." Same goes for your address. Your Social Security number belongs to you as well, but it's usually used by others as a way to identify you. The list could probably go on for a while.

5) When they meet they are at the same point - same distance away from Paris.

6) Uhh 8 hours? Must be some trick..

We're Italian, "WACC" means something else to us.

6 - I'd imagine the boat continues to rise as it rains more and the ladder will never be half submerged.

Questions 1 - 4 are terrible questions.

CompBanker

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

gomes3pc wrote:

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

A rubiks cube is 3x3...... but if you insist on a 9x9x9.... 9x9x9 has 6 sides of 81... which is 486 pieces with red paint on them.

elan wrote:
gomes3pc wrote:

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

A rubiks cube is 3x3...... but if you insist on a 9x9x9.... 9x9x9 has 6 sides of 81... which is 486 pieces with red paint on them.

I didn't "insist" anything. That was the question I was asked from a MM analyst during an interview. I know a Rubik's is 3x3...Bear Stearns gave me one during final round interviews (ha! nice memorabilia).

gomes3pc wrote:

Not sure if this is a teaser, but I got this:

You have a 9x9x9 Rubik's cube that you decide to dip in red paint so that when dried all the outside is covered in red. You then throw the all-red cube against the wall and break the 729 cubes that comprise the 9x9x9 completely apart. How many of the pieces will have no red on them whatsoever?

All of the outside pieces would have paint on them, whether on 1, 2, or 3 sides,
so you're left with a 7x7x7 cube of unpainted little cubes: 343 cubes.

a little bit of a twist on the question:
What is the expected value of the number of painted sides you'd expect if you picked one cube at random?

Now what if you picked 3 cubes?
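As a sanity check on the 343 answer and the expected-value twist, here's a brute-force count (a sketch: the cube is treated as n^3 unit cubes, and a face is painted exactly when its coordinate sits on the outer layer):

```python
from fractions import Fraction

def painted_face_counts(n):
    """For an n x n x n cube dipped in paint, count how many unit
    cubes end up with 0, 1, 2, or 3 painted faces (brute force)."""
    counts = {0: 0, 1: 0, 2: 0, 3: 0}
    for x in range(n):
        for y in range(n):
            for z in range(n):
                # A coordinate on the boundary contributes one painted face.
                faces = sum(c in (0, n - 1) for c in (x, y, z))
                counts[faces] += 1
    return counts

counts = painted_face_counts(9)
print(counts)  # {0: 343, 1: 294, 2: 84, 3: 8}

# Expected number of painted faces on one cubie drawn at random:
total = 9 ** 3
ev = sum(Fraction(k * v, total) for k, v in counts.items())
print(ev)  # 2/3
```

For the follow-up: by linearity of expectation, picking 3 cubies (with or without replacement) gives 3 x 2/3 = 2 painted faces on average.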

Got this one once.

1)You have three 5's and a 1. Using the mathematical operations and all four digits, get to a total of 24.

2)If you have a 5yr bond with a coupon of 5%, and all rates drop to 0%, what would you pay for the bond?

3)This was asked to me in an office: "I make a stack of quarters which stretches to the height of the empire state building. If I took those coins and put them in this office, make me a market in what percentage of the room would be taken up".

4) There is a mile of railway track, bolted down in such a way that it cannot displace to the side, only up or down. If, on a hot day, the track expands by one foot, how high into the air does the rail lift?

Jimbo

1, ((5/5)*5)-(1/1)

1)You have three 5's and a 1. Using the mathematical operations and all four digits, get to a total of 24.
- (5!)/(5/(1^5)) = 120/5 = 24

3)This was asked to me in an office: "I make a stack of quarters which stretches to the height of the empire state building. If I took those coins and put them in this office, make me a market in what percentage of the room would be taken up".
- The Empire State Building is about 1,000 feet tall; say the room is 10 x 10 x 10 feet, and a quarter has a diameter of about 1 inch. You could use either the area or the length of the stack. Using area: a quarter's face is about .75 sq in, so the stack's volume is 12,000 x .75 = 9,000 cubic inches. The room's volume is 120 x 120 x 120, about 1.7 million cubic inches; 1,700,000 / 9,000 is about 190, so the stack takes up roughly 0.5% of the room. Using length: the stack is 12,000 inches tall, so it could be split into 100 stacks of 120 inches each. The floor is 120 x 120 = 14,400 square inches, which fits 14,400 one-inch stacks, so 100 / 14,400 is about 0.7% of the room.
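A quick back-of-envelope check of the estimate above. The numbers are assumptions, not exact figures: the building is taken as 1,000 ft, the office as a 10 ft cube, and a quarter as roughly 1 inch in diameter.

```python
import math

# Assumed inputs (approximate, matching the estimate in the thread):
stack_height_in = 1000 * 12          # ~12,000 inches of quarters
quarter_area = math.pi * 0.5 ** 2    # ~0.785 sq in cross-section
stack_volume = stack_height_in * quarter_area

room_volume = (10 * 12) ** 3         # 10 ft cube -> 120 in per side

print(f"{stack_volume / room_volume:.2%}")  # roughly 0.55%
```

So the stack occupies only about half a percent of the room, consistent with the thread's answer.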

have an exam in a bit, will try the others after.

(5 x 5) - 1^5


(5x5)^0.5x5-1=24
:)

The bond question, anyone?

The Empire State Building question... I had more or less the same answer. Quite impressed that the actual percentage is so low.

For the bond question, wouldn't the market price be \$1,250? If rates are at 0%, you would just pay the sum of all the future cash flows at the start of the bond's life.

Of course that answer would not take into account inflation, time value of money, alternative investments, etc. etc.

David Van Patten wrote:

Of course that answer would not take into account inflation, time value of money, alternative investments, etc. etc.

sure it would. 125 price is correct.
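The 125 price (per 100 of face, i.e. \$1,250 per \$1,000) falls straight out of discounting at r = 0, since every cash flow is worth its face amount. A minimal sketch, assuming annual coupons:

```python
def bond_price(face, coupon_rate, years, r):
    """Price a bond as the sum of discounted cash flows
    (annual coupon payments plus face value at maturity)."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + r) ** t for t in range(1, years + 1))
    pv_face = face / (1 + r) ** years
    return pv_coupons + pv_face

# 5-year, 5% coupon, all rates at 0%:
print(bond_price(100, 0.05, 5, 0.0))  # 125.0
```

At r = 0 the discount factors are all 1, so the price is just 5 coupons of 5 plus 100 of face.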

Clarification on #1, only the four primary operations are open to you, exponents are not allowed.

for #3 there is a much faster way of approaching it. the office was on a single floor, one story high. keep that in mind.

You have a revolver into which 2 bullets are loaded right next to each other. You point the gun to your head, spin the revolver and shoot. It is a blank. You have to shoot once again, but have the option to either spin one more time before shooting, or you can pull the trigger right away. Does it matter which option you choose, why ?
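The thread leaves this one unanswered; here is an exact enumeration under the standard assumptions (6 chambers, the two bullets in adjacent chambers, and the cylinder advancing one chamber per trigger pull):

```python
from fractions import Fraction

# Chambers 0..5; the two bullets sit in adjacent chambers 0 and 1.
bullets = {0, 1}
blanks = [c for c in range(6) if c not in bullets]  # chambers 2..5

# Given the first pull was a blank, the cylinder was equally likely
# at any of the four blank chambers.
# Option 1: pull again without spinning -> next chamber is c+1 mod 6.
survive_no_spin = Fraction(
    sum((c + 1) % 6 not in bullets for c in blanks), len(blanks))

# Option 2: re-spin first -> uniform over all 6 chambers.
survive_spin = Fraction(6 - len(bullets), 6)

print(survive_no_spin)  # 3/4
print(survive_spin)     # 2/3
```

So with adjacent bullets it does matter: pulling again without spinning survives 3/4 of the time versus 2/3 after a spin, because only one of the four blank chambers is followed by a bullet.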

(5-1/5)5

well done
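For anyone who wants to check answers like (5 - 1/5) x 5 without hunting by hand, here's a small brute-force searcher restricted to the four primary operations, per the clarification above (a sketch; exact arithmetic via fractions avoids float noise):

```python
from fractions import Fraction

def solve_24(nums, target=24):
    """Search all ways to combine the numbers with +, -, *, /
    (brute force over ordered pairings); return one expression
    string reaching the target, or None."""
    def search(items):
        # items: list of (value, expression-string) pairs
        if len(items) == 1:
            return items[0][1] if items[0][0] == target else None
        for i in range(len(items)):
            for j in range(len(items)):
                if i == j:
                    continue
                (a, ea), (b, eb) = items[i], items[j]
                rest = [items[k] for k in range(len(items))
                        if k not in (i, j)]
                candidates = [(a + b, f"({ea}+{eb})"),
                              (a - b, f"({ea}-{eb})"),
                              (a * b, f"({ea}*{eb})")]
                if b != 0:
                    candidates.append((a / b, f"({ea}/{eb})"))
                for val, expr in candidates:
                    found = search(rest + [(val, expr)])
                    if found:
                        return found
        return None
    return search([(Fraction(n), str(n)) for n in nums])

print(solve_24([5, 5, 5, 1]))  # prints one valid expression
```

It confirms a solution exists for {5, 5, 5, 1} using only + - * /, and returns None for unsolvable inputs like {1, 1, 1, 1}.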

1) You are on a game show; behind one of the doors is \$1000. Behind the other 2 is nothing. You pick a door. The host opens one of the other doors, revealing that it's empty. What would you pay to switch?

2) Well, this one is tougher and maybe too involved for this forum but...assuming you have a swap curve built through the 30 yr point, and nothing beyond that, make me a market in a 40yr/50yr rate switch.

3) I have two envelopes. One contains twice the amount of the other. I give you one of the envelopes. Should you switch?

Jimbo.

1. I would pay nothing, because the probability is in my favor now. So why pay money to switch?

Think again, it's definitely advantageous to switch. Let me phrase the question again.

There are three doors on a gameshow. One has \$1000 behind it, the other two have nothing. You pick one. The host opens one of the two remaining doors, revealing its contents to be empty. He now gives you a chance to switch to the remaining closed door, or keep your original choice. What would you pay to switch?

Jimbo wrote:

Think again, it's definitely advantageous to switch. Let me phrase the question again.

There are three doors on a gameshow. One has \$1000 behind it, the other two have nothing. You pick one. The host opens one of the two remaining doors, revealing its contents to be empty. He now gives you a chance to switch to the remaining closed door, or keep your original choice. What would you pay to switch?

I could be way off, but I'd be willing to pay any amount up to \$666.67 to switch doors. My reasoning is this: the chance of my initial choice being a door with nothing behind it is 2/3, whereas my chance of choosing the \$1000 door right off the bat is only 1/3. Logic dictates that more often than not, I'll choose a losing door on my first try.

The fact that the game show host always reveals an empty door is the key to this question. 2/3 of the time (when I pick a loser right off the bat), he'll reveal the second loser, leaving me the option to switch to the winning door; only 1/3 of the time do I pick the winner first, in which case switching lands me on a loser. So, by paying no more than \$666.67, I "guarantee" that I leave the show with no less than what I started with, or in the best case (paying between \$0 and \$666.67) anywhere between \$333.33 and \$1,000.

also, is the answer to the expanding one-mile railroad question 51.38 feet, jimbo?
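The 51.38 ft figure checks out under the usual triangle approximation, where the expanded track bows upward as two straight half-mile segments:

```python
import math

# Triangle approximation: the mile of track (5,280 ft) bows upward
# as two straight segments. Each half grows from 2,640 ft to
# 2,640.5 ft (half of the one-foot expansion); the rise is the
# remaining leg of the right triangle.
base = 5280 / 2            # 2,640 ft along the ground
expanded = (5280 + 1) / 2  # 2,640.5 ft of rail per side
height = math.sqrt(expanded ** 2 - base ** 2)
print(f"{height:.2f} ft")  # 51.38 ft
```

The surprising size of the answer comes from sqrt(expanded^2 - base^2) being dominated by the large sum (expanded + base), even though the difference is only half a foot.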

alabinjo wrote:

1. I would pay nothing, because the probability is in my favor now. So why pay money to switch?

This guy is correct, I researched this question with a friend a few weeks ago and I'm 99.9% sure that answer is "do not switch" i.e. pay \$0.00

It only makes sense just to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host has already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?

alabinjo wrote:

It only makes sense just to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host has already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?

if so, then you should also switch after the host opens one of those two. nothing has changed. at least one of those doors is always empty.

Jimbo wrote:

1) You are on a game show; behind one of the doors is \$1000. Behind the other 2 is nothing. You pick a door. The host opens one of the other doors, revealing that it's empty. What would you pay to switch?

3) I have two envelopes. One contains twice the amount of the other. I give you one of the envelopes. Should you switch?

Jimbo.

1) EV of the envelope that you have in your hand is \$333.33. EV of the other envelope is \$500 so I'd be willing to pay \$176.66 or less.

I heard this from a friend:

You play a game in which the 52 cards of a deck are revealed to you one after another. Before each card is revealed, you can bet on whether it is red or black. You start off with x amount of money (assume x is infinitely divisible). For each bet you can stake any fraction of your money: if you win, your stake is doubled; if you lose, you lose the stake. You play until all 52 cards are revealed. What strategy should you use to maximize a risk-free(!) outcome?

Jimbo, if you asked me #3 in an interview I would reply the following:

If you are on the second floor of a building, and you have a stack of quarters the height of the Empire State Building, and you begin to place all the quarters into the single room, by the time you finished there would be no quarters in that room. Reason being, the floor would collapse beneath the quarters due to the weight. Therefore, the answer is zero.

"Reason being, the floor would collapse beneath the quarters due to the weight. Therefore, the answer is zero."

And you would be very wrong.

Count, and wait until you know that only one color remains in the deck, and then bet. I think that's the only way you hit the risk-free criterion. Or at the least, wait until two cards are left and bet half.

but apparently it's wrong...

Ah, it was worth a shot. It was the first thing that came to my head when you emphasized that the room was on the second floor.

Ok Tyler, and now that you've switched, i give you the chance to switch again. Would you do it?

Jimbo wrote:

Ok Tyler, and now that you've switched, i give you the chance to switch again. Would you do it?

Got my probabilities wrong in the first one.. EV of second choice should be 2/3 not 1/2.

I don't think I'd switch again because after a switch, you have a 2/3 chance of winning, but if you swap, then the probabilities drop to 1/2 each which leaves you worse off.

yes to the second.

for the first, you already have something with an expected value of 333, so to switch to something worth 667 or so, i'd think you just pay the difference.

What are your thoughts on my switching envelopes. I have two envelopes. You pick one, and open it, revealing the contents. I tell you the one envelope contains twice the amount of money as the other. Do you switch?

for the two envelopes problem, yes i would switch. right now i have an opened envelope with X dollars in it. I know the other envelope has either X/2 or 2X. So my EV from switching is (1/2)(X/2) + (1/2)*2X = 1.25X > X.

Jimbo wrote:

What are your thoughts on my switching envelopes. I have two envelopes. You pick one, and open it, revealing the contents. I tell you the one envelope contains twice the amount of money as the other. Do you switch?

You would switch.

If you open it up and see you have \$100, then your EV of the other envelope is \$125.

So I would have to pony up cash for a 67% chance of making \$667? I certainly wouldn't be willing to pay \$667, because you wouldn't pay the expected value to receive a like amount in return. I have no clue...

Charming Chimp wrote:

So I would have to pony up cash for a 67% chance of making \$667? I certainly wouldn't be willing to pay \$667, because you wouldn't pay the expected value to receive a like amount in return. I have no clue...

it's a 67% chance to make \$1000. the expected value is (2/3)*(1000) = \$666.67.

Without changing you have a 33.3% chance of making \$1000, so the EV is (1/3)*(1000) = \$333.33.

You're willing to pay 1 cent less than the expected difference assuming risk neutral.

How much you're willing to pay also depends on your risk aversion level and utility function of wealth.


To be honest, I'm not sure if I'd switch. I mean, you've got a 50/50 chance of choosing either sum, so there is no strategy involved in the first choice. When it comes to switching, the second envelope contains either 0.5x or 2x, so it's just a matter of figuring out the expected value of the switch: in this case (0.5)(0.5x) + (0.5)(2x) - x, or 25% of the initial sum. On the other hand, you could just forego the whole switch and pretend you swapped the second one for the one you're currently holding; the expected payout is the same.

i say if you're feeling ballsy, make the switch, if not, take the money and run :P

4. One; he may have many birthdays, but only one birth day!

6. The ladder will always be 1 foot under the water, because as the water rises, so does the boat!

alabinjo wrote:

4. One; he may have many birthdays, but only one birth day!

6. The ladder will always be 1 foot under the water, because as the water rises, so does the boat!

But if it's raining 1 foot per hour, won't the boat be more submerged, lowering the ladder? Or even sink, haha.

For the envelope question i would say not to switch. The EV of switching is either 1/4x or 1x. Right now i have X. If i switch the average EV would be about 5/8X which is less than X.

2. The match, of course!

1. They will both be the same distance from Paris when they meet!!

^ That is if you're assuming that to "meet" the two planes are at the same point.

I visualize meet as more of a head-to-head meet, meaning that the plane flying from Paris will actually be closer, right?

But ehh, I'm not a fan of many of the brainteasers in this thread so far. Are these really IB-interview type questions?

"It only makes sense just to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host has already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?"

it's not 50/50

Jimbo wrote:

"It only makes sense just to stay put. There are 3 doors: two with nothing and the other with \$1000. If the host has already revealed one door with nothing behind it, then your odds are now 50/50. So why on earth would you pay money when the probability would still be the same?"

it's not 50/50

If you pick door 1 with nothing and you switch, you win \$.
If you pick door 2 with nothing and you switch, you win \$.
If you pick door 3 with the money and you switch, you get nothing.
2/3 of the time you win money if you switch.

If you don't switch you only win 1/3 of the time.

Is that right?
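Yes, and the case listing above can be confirmed by exhaustive enumeration (a sketch assuming the standard setup, where the host knows the prize door and always opens an empty, unpicked one):

```python
from fractions import Fraction
from itertools import product

# Enumerate every (prize door, initial pick) pair. Since the host
# always opens an empty unpicked door, switching wins exactly when
# the initial pick was wrong: his reveal leaves only the prize door.
switch_wins = stay_wins = total = 0
for prize, pick in product(range(3), repeat=2):
    total += 1
    stay_wins += (pick == prize)
    switch_wins += (pick != prize)

print(Fraction(stay_wins, total))    # 1/3
print(Fraction(switch_wins, total))  # 2/3
```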

"for the two envelopes problem, yes i would switch. right now i have an opened envelope with X dollars in it. I know the other envelope has either X/2 or 2X. So my EV from switching is (1/2)(X/2) + (1/2)*2X = 1.25X > X."

Your EV calculation actually isn't valid...

How could the boat sink? How would the boat be more submerged? I don't think you know how a boat works.

Well, won't the rain "flood" the boat, adding extra weight and causing it to sit lower in the water?

If that were the case, every fisherman fishing in the rain would have drowned, and everybody would be too scared to go boating in the rain. LOL

For the two envelopes. Say one has x, the other has 2x.

You pick one, and then switch. You're equally likely to have the x as you are to have the 2x in your initial pick. In the former you gain x by switching, and in the latter you lose x.

Your gain from switching is 0.5x + 0.5(-x) = 0.

Therefore, you will be indifferent to switching.

Also, I'm shocked there are still people out there that haven't heard the Monty Hall problem - so overused.

This one is good.

You have two doors: one leads to the job offer (the door you want) and one leads to the exit (the one you don't want). In front of each door stands a guard, one of whom tells only lies and one of whom is truthful. You may ask one guard only one question in hopes of opening the correct door. What do you ask? (Keep in mind you do not know which door the liar or the truthful guard is in front of.)

You can ask guard 1: "If I asked guard 2 which door to choose, which one would he tell me?" Then choose the opposite of what he says.

If guard 1 is the liar, he will misreport the truthful guard's correct answer, pointing you to the wrong door. So you pick the other one and win.
If guard 1 is the truthful one, he will faithfully report the liar's wrong answer, again pointing you to the wrong door. So you pick the other one and win.
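The double negation can be checked mechanically by modeling each guard as a function on door labels (a sketch; door 0 stands for the job offer, door 1 for the exit):

```python
# Door 0 is the job offer, door 1 the exit. One guard always tells
# the truth, the other always lies; we don't know which is which.

def truthful(door):
    return door          # reports any door answer as-is

def liar(door):
    return 1 - door      # flips any door answer

for asked, other in [(truthful, liar), (liar, truthful)]:
    # "If I asked the other guard which door leads to the offer,
    #  which door would he point to?"
    answer = asked(other(0))
    chosen = 1 - answer  # take the opposite of the reply
    assert chosen == 0   # always lands on the job offer

print("picking the opposite door always works")
```

Either way exactly one lie is applied to the true answer, so the reply is always the wrong door, and its opposite is always right.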

Chiggity check this out for the envelopes Q:

http://www.ibankingfaq.com/category/interviewing-

"You would switch. If you open it up and see you have \$100, then your EV of the other envelope is \$125."

nope.

"You would switch. If you open it up and see you have \$100, then your EV of the other envelope is \$125."

this is definitely a common answer, probably the most common. but the question is not nearly as simple as it seems.

because the ev is 112.5?

the ev is unknowable.

Two Envelopes: Assuming the contents are unknown to both people, you have envelopes with values X and 2X. You receive an EV of 1.5X, and the EV of the switch is likewise 1.5X. There is no incentive to switch, unlike in the three-door question, where new information is revealed.

I didn't read all the solutions, but for the door problem.

You make the assumption that the opened door is known to be wrong (asymmetric knowledge: the host knows which door has nothing behind it).

Initially you have an EV of 1/3(1000) = 333.33

By switching your p(success) goes to 2/3, so EV is 2/3(1000) = 666.67

you'd be willing to pay 666.67 - 333.33 = 333.34 - 0.01 = 333.33 (these numbers are a little off because of rounding; it's actually

lim (x -> inf) of [(1/3)*1000 - 1/x]

I believe).

You're willing to pay so long as your value increases. If it stays the same, you're neutral to switching. If it goes up you pay (<333.33). If it goes down you don't.

It's the monty hall problem. You have to make the assumption that the host knew which door had a prize, otherwise the odds don't change. If it helps, think of 100 doors. you pick a door. the host opens 98 doors, all of which are empty. Do you switch to the other unopened door or keep your door?

"You have to make the assumption that the host knew which door had a prize, otherwise the odds don't change. "

Actually I don't think it does. It just matters that he opens a door that's empty.

Jimbo wrote:

"You have to make the assumption that the host knew which door had a prize, otherwise the odds don't change. "

Actually I don't think it does. It just matters that he opens a door that's empty.

It does. Otherwise it's a 50/50 instead of 1/3 to 2/3.

The host does not know what lies behind the doors, and opens one at random without revealing the car (vos Savant 1996, p. 181).
-->Switching wins the car half of the time.

http://en.wikipedia.org/wiki/Monty_Hall_problem#Ot...
We did the monty hall in our stats 101. The only addition is the EV and willingness to pay. I guess the amount you pay could depend on a utility function and risk aversion (e.g. there's a chance you win nothing and pay a certain amount to switch), but adding that could make the problem quite obscene. If he's risk averse he'd be willing to pay less; if he's risk loving he'd be willing to pay more. If he's risk neutral (which most Q's are assumed), he'd be willing to pay 1 cent less than the difference in the EV.

This probably isn't the answer you're looking for, but by switching envelopes you aren't actually improving your gains from where you were at the start of the scenario. Sure, if someone gives me \$100 and then says "I'll either double your money or take half," I'm better off taking that gamble. However, my odds of choosing the larger sum of money do not improve once I know the contents of a single envelope. Switching envelopes is still going to net me the lower sum 50% of the time.

CompBanker

Oh and, as to the math logic behind it, when you switch to the wrong envelope, you are actually losing 50% of 2X, not 1X, because you originally chose the larger amount. If the envelopes contain \$100 and \$200, the options are either gain \$100 or lose \$100.

CompBanker

"The host does not know what lies behind the doors, and opens one at random without revealing the car (vos Savant 1996, p. 181).
-->Switching wins the car half of the time."

I'm not convinced. I understand the argument at some level... but let's say there are 100 doors. You pick one. You have a 1% chance of being right on your initial choice; the rest of the set has the other 99%. The host randomly opens 98 of the other doors and by pure luck doesn't reveal the prize. You still think it's 50/50 to switch?

if it's by pure chance, then yeah.

If it's by pure luck, the other door had a 1% chance as well, the same as your door. With only two doors left and the rest have been failures, each door has a 50% chance of success.
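This is where the two camps actually disagree, and it comes down to conditioning. Here's an enumeration of the random-host variant (a sketch: the host doesn't know the prize door, opens an unpicked door at random, and we keep only the games where his door happens to be empty):

```python
from fractions import Fraction
from itertools import product

# Variant where the host does NOT know where the prize is and opens
# one of the two unpicked doors at random. Each surviving triple is
# equally likely, so counting triples gives the conditional odds.
valid = switch_wins = 0
for prize, pick, host in product(range(3), repeat=3):
    if host == pick:
        continue          # host never opens your door
    if host == prize:
        continue          # discard games where he reveals the prize
    valid += 1
    remaining = ({0, 1, 2} - {pick, host}).pop()
    switch_wins += (remaining == prize)

print(Fraction(switch_wins, valid))  # 1/2
```

Conditioned on a lucky empty reveal, switching wins exactly half the time, matching the vos Savant citation above; the 2/3 advantage only appears when the host knowingly avoids the prize.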

Statistically there's a 33% chance of each door being the right one, so once one of the remaining two is ruled out, the other door becomes 67%; statistically, it pays to change. It has to do with the change of variables.

this was actually in the movie 21..neat movie by the way

Good movie, but not worth the \$12 or whatever it cost me to go see it. Catch it on video. It starts out well but falters towards the end. He notes the assumption in the movie as well, though. Without it, you can't group the doors together, and each door is treated independently.

There's a similar problem related to death row inmates where 1 gets to go free.

agree on the video comment.

what's the one about death row inmates?

Omfg I can't believe you guys are STILL debating this wtf

Get a room!

It's almost identical in structure.

3 inmates, A,B, & C. The warden says that 2 of the 3 are getting the electric chair next week, and the third goes free. He won't tell which one is which. Prisoner A asks the warden to tell him which 2, but the Warden refuses. He then asks to know one of the two. The warden agrees, believing that this gives no additional information for prisoner A. (e.g. a,b: a,c: b,c get the chair) Since the warden knows which prisoner is which, he tells prisoner A that prisoner C is for sure going to get the chair. Since there is still another left, the warden leaves for the day and believes that the prisoner has no new information. Prisoner A asks the next day to switch fates with prisoner B. Since the warden thinks that prisoner A has a 50/50 chance, he doesn't see the problem with that, and agrees. Prisoner A thinks this improves his chances. Who is right?

After the previous problem the answer is pretty obvious, but I think the point is being able to pick out relevant vs. nonrelevant information and coming to a conclusion. This is effectively identical the Monty Hall problem, but dealing with prisoners and life/death instead of money. From the outset we can safely assume the information asymmetry where the warden knows the fates of all 3 from the beginning.
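The prisoner version can be settled the same way, by enumerating the warden's behavior with weights (a sketch; the warden flips a fair coin between B and C only when A is the one going free):

```python
from fractions import Fraction

# The free prisoner is equally likely to be A, B, or C. When A is
# free, the warden names B or C with equal probability; otherwise he
# must name whichever of B/C is condemned (never A, never the free man).
weights = []  # (probability, free_prisoner, warden_names)
for free in "ABC":
    if free == "A":
        weights.append((Fraction(1, 6), free, "B"))
        weights.append((Fraction(1, 6), free, "C"))
    else:
        named = "B" if free == "C" else "C"
        weights.append((Fraction(1, 3), free, named))

# Condition on the warden naming C.
cond = [(p, free) for p, free, named in weights if named == "C"]
total = sum(p for p, _ in cond)
p_a = sum(p for p, free in cond if free == "A") / total
p_b = sum(p for p, free in cond if free == "B") / total
print(p_a, p_b)  # 1/3 2/3
```

So prisoner A is right to ask for the swap: his own chance stays at 1/3, while B's rises to 2/3, exactly as in Monty Hall.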

"if all rates fall to 0%, what would you pay for the bond"

Whoops, that's right.

so far, my intuition is this: looking at the dollar amount shouldn't change your decision, because it should generalize to any amount x. therefore, it can't pay to switch, because if it did, then you would choose to switch without looking, and after switching, you would face the same blind circumstances and choose to switch again, ad infinitum.

i'll get back to you in a few minutes if i come up with more...

let's say i put 100 dollars in one envelope and 200 in the other envelope and put them in my back pocket, it is no longer identifiable even to me which is which.

i'm walking merrily along when i realize that one of the envelopes has fallen out of my pocket! so, in expectation, i realize that i've just lost 150 dollars!

but along comes you, kind sir, and you say "whoa, good thing i caught up to you - i found this envelope of yours, is it important?"

now i tell him, "you are quite an honorable young man - i tell you what, here's the second envelope from my pocket - this morning i put \$100 in one of these envelopes and \$200 in the other. would you like to switch?"

at this point, he and i have the same information, and to both of us, the expected value of each envelope is \$150, so a trade is immaterial, we would both be indifferent.

...

because this scenario generalizes to any envelopes of x and 2x dollars, you should not have to look inside the envelope to make your decision, and even looking in at it should not change your generalized decision.

the trick here is that the math appears to show that you are either doubling or halving your money, but in fact, you are either doubling x or halving 2x, which is the same potential gain/loss

note: looks like mj explained it pretty well, but anyway, if you're bored...

You get 1.25 if you do:

Envelope (A) = 0.5x (i.e. you lose half of what you had)
Envelope (B) = 2x (i.e. you double what you had)

where x is the value in my envelope. If you were to switch you'd get either 1/2 the value in your envelope (x/2) or twice the value in the envelope (2x).

p(A) = p(B) = 0.5

0.5(x/2) + 0.5(2x) = (5x / 4) = 1.25x for switching. 1.25 > 1 -> switch

The problem with this solution is that the x-value isn't constant here.

If you were to make X constant, then you either get x or 2x.

So:
Envelope (A) = x
Envelope (B) = 2x

0.5(x) + 0.5(2x) = 1.5x for either envelope. With this solution you're indifferent to switching. This makes more logical sense in that your EV is the average value of the two envelopes. With a 50/50 chance for A or B, it really doesn't matter. This assumes there's no knowledge of the amounts in either envelope / both are unopened.

Now with all that said, when the envelope is opened and you have an opportunity to switch, you do. Essentially you stand to gain more than you lose (gain 100% versus lose 50%). Going back to the first problem, you gain 2x or lose .5x where x is the value you have.

So it doesn't matter with an unknown value, but does matter when you open it & have a value. You always switch once you have a known value regardless. That's why it's a paradox.
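The fixed-x framing above can be checked directly: pin down the actual contents as x and 2x, and the two expected values coincide, which is where the "1.25x" argument goes wrong.

```python
from fractions import Fraction

# Fix the actual contents: one envelope holds x, the other 2x.
x = Fraction(100)
envelopes = [x, 2 * x]

# You are equally likely to have picked either one.
ev_stay = Fraction(1, 2) * envelopes[0] + Fraction(1, 2) * envelopes[1]
# Switching always hands you the other envelope.
ev_switch = Fraction(1, 2) * envelopes[1] + Fraction(1, 2) * envelopes[0]

print(ev_stay, ev_switch)  # 150 150
```

Both equal 1.5x: when you switch away from 2x you lose x, and when you switch away from x you gain x, so the gamble is symmetric.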

It seems like every two months we have a long winded discussion of

1) The Monty Hall Problem
&
2) The two envelope problem

Judging by past trends, this thread will die in two days and start up again June 5, 2008.

agreed with gimmearedbull. good call.

and you still don't switch even if you see the contents of the envelope.

split the 9 into three sets of 3.

1st move: measure two of the sets against each other.

if one of the sets is lighter than the other, it contains the light ball. if the sets are equally heavy, the unmeasured set contains the light ball. so we know which set contains the light ball.

2nd move: take the set which contains the light ball and measure two of its balls against each other. if one ball is lighter than the other then it is the light ball. or, if these two balls are equally heavy, the final unmeasured ball is the light ball.

and here's the kicker: if you have 3^n balls, you can find the light ball in n moves. (proof by induction).
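The induction can be sketched as a ternary search; this assumes an idealised balance that just compares the summed weights of two groups:

```python
def find_light(weights):
    # locate the single lighter ball among 3**n balls in n weighings
    lo, n, moves = 0, len(weights), 0
    while n > 1:
        third = n // 3
        left = sum(weights[lo:lo + third])
        right = sum(weights[lo + third:lo + 2 * third])
        moves += 1
        if left < right:
            n = third                      # light ball is in the left group
        elif right < left:
            lo, n = lo + third, third      # light ball is in the right group
        else:
            lo, n = lo + 2 * third, third  # balanced: it's in the unweighed group
    return lo, moves

balls = [1.0] * 9
balls[5] = 0.9  # the light ball
print(find_light(balls))  # (5, 2): found in 2 weighings
```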

I liked this one when I was younger:
I have two coins and they add up to 30 cents but one of them is not a nickel. What are the two coins?

quarter and a nickel. had to stare at that one for a minute

I like this one a lot (taken from http://excelerade.com/blog)
You are a painter and have a 6 day project coming up. You have to get paid equitably (same amount every day on the job), but the people hiring you only have one brick of gold to pay you with. You are allowed to make two cuts in the brick of gold (straight cuts, nothing fancy). How do you make two cuts and get paid equitably?

Cut it in 1/2 and cut one of the remaining halves into a 2/3, 1/3 split.

The first day they pay you 1/3 of the 1/2 which is 1/6.

The 2nd day you give back the 1/6 and take the 2/6 piece (the 2/3 cut of the other half).

The third day you give back the 2/6 and take the 1/2 which is 3/6.

The fourth day you take the 1/6 piece again and now have 4/6.

The 5th day you give back the 1/6, keep the 1/2 and take the 2/6 piece for a total of 5/6.

The 6th day you take the last piece and have 6/6.
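The schedule above checks out; measuring the pieces in sixths of the bar:

```python
# the two cuts give pieces worth 1/6, 2/6 and 3/6 of the bar
pieces = {1, 2, 3}
# pieces the painter holds at the end of each day, in sixths
holdings = [{1}, {2}, {3}, {3, 1}, {3, 2}, {3, 2, 1}]
for day, held in enumerate(holdings, start=1):
    assert held <= pieces and sum(held) == day  # cumulative pay = day/6
print("all six days check out")
```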

for what it is worth, I had to do about 15 brainteasers for my HF job.....

they don't go away, and they get a LOT harder.

so if you don't get it right, but show a logical train of thought, does that work against you?

Quote:

they don't go away, and they get a LOT harder.

True that.

How far can a dog run into the woods?

Half way..

I recently got the following question on an interview, follow the link:
http://www.mycoted.com/4_men_in_hats

A child is born in Boston Mass. to parents who are also born in Boston Mass. How is it possible that the child is not an American?

Indian territory? early 1700's?

coscarsm wrote:

A child is born in Boston Mass. to parents who are also born in Boston Mass. How is it possible that the child is not an American?

Ummm... the child was born before US independence so it was a British citizen.

Boston Mass is a hospital...

Can only run halfway in, after that it would be running out of the woods.

WizardofOz wrote:

Can only run halfway in, after that it would be running out of the woods.

ah.

I think it's because of the word "into", denoting that the maximized distance is the point that is equidistant from both edges of the forest.

You are blindfolded with a large number of coins in front of you. you also are wearing gloves so you can't tell by feel if the coins are heads up or down. You are told that 6 of the coins are tails, and all the rest are heads.

You need to divide the pile into 2 piles, each with an equal number of tails, how do you do it?

• put a random 6 coins aside
• flip them all once
You then have between 0 and 12 tails altogether, but the same number in each pile.
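Why it works: if the 6 coins you set aside contain k tails, the rest of the pile contains 6 - k tails; flipping the set-aside pile turns its k tails into heads and its 6 - k heads into tails, so both piles end with 6 - k. A quick check (True = tails):

```python
import random

def split_evenly(coins):
    set_aside = [not c for c in coins[:6]]  # take any 6 coins and flip them all
    rest = coins[6:]
    return set_aside, rest

random.seed(1)
coins = [True] * 6 + [False] * 44  # 6 tails in a pile of 50
random.shuffle(coins)
a, b = split_evenly(coins)
print(sum(a), sum(b))  # equal tail counts in both piles
```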

There are 3 lightbulbs inside a room which you cannot see into unless you physically open the door and walk in. Far outside that room there are 3 switches that operate the 3 lightbulbs. Opening the door only once to see which lightbulbs are lit, how can you determine which switch turns on which lightbulb? and no...you can't leave the door open and turn on the switches to see which switch connects to which bulb.
For example, if I hit switch B and opened the door, I'd see that bulb 3 was lit...but would still not know which bulb for switch A and C.
Friend of mine was asked this question during an interview...not for banking, but for engineering.

• activate switch 1, wait 5 minutes
• turn off switch 1, activate switch 2, go into the room

There will be 1 light on, 1 light off but the lightbulb will be hot, and 1 cold light that's off, so you can figure out which switch goes with which lightbulb

good job euro monkey

You probably won't come across this one in an interview, but it's the only one I can think of:

A guy commits suicide. He's found dead, hanging from the ceiling of an empty room with a puddle of water underneath him. He is naked and no one was in the room while he was. How did he do it?

stood on a block of ice?

(these conf calls are draggin on...)

you are given 1,000 \$1 coins; place these in ten envelopes such that when i ask for any sum of money between \$1 and \$1,000, you can give me a combination of envelopes that equals that amount

1, 2, 4, 8, 16, 32, 64, 128, 256, 512

Could an engineer out there explain what's so special about these particular numbers (and those that follow, i.e. 1024 etc.)? You see them around (e.g. computer RAM and Nintendos) quite a bit.

Don't mean to take this thread on a tangent, I'm just a generally curious guy who's not mad enough to sign up on a maths or programming board...

definitely the right idea. just need to watch what your total sum is: those add up to 1,023, so it's all right bar the final envelope, which should hold \$489 rather than \$512.

Stupid mistake.

Binary counting is how computers work with numbers using only 0s and 1s. It is a base-two system, which gives you the link to all those numbers. It works on the fact that any number is the sum of a combination of powers of two, e.g. 11 = 2^3 (8) + 2^1 (2) + 2^0 (1). It would be displayed as 1011 in binary, with a 0 in the third column (right to left) as there is no 2^2 involved.
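Tying this back to the envelope puzzle: assuming the last envelope is topped up to \$489 so the ten sum to exactly \$1,000 (the fix hinted at above), a greedy pick covers every amount:

```python
envelopes = [1, 2, 4, 8, 16, 32, 64, 128, 256, 489]  # ten envelopes, $1,000 total

def pick(amount):
    # greedy from the largest envelope down: the doubling prefix 1..256
    # covers 0-511, so any remainder after taking 489 is still coverable
    chosen = []
    for env in sorted(envelopes, reverse=True):
        if env <= amount:
            chosen.append(env)
            amount -= env
    return chosen if amount == 0 else None

assert all(pick(k) is not None for k in range(1, 1001))
print(pick(700))  # [489, 128, 64, 16, 2, 1]
```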

Computers all communicate through the use of binary. Binary is simply a system of counting with a base of two. For example:
0 = 0
1 = 1
10 = 2
11 = 3
100 = 4
101 = 5
110 = 6
111 = 7

Therefore, when communicating using binary, you are always limited to 2^x numbers, where x is the number of digits in the binary number.

2^1 = 2
2^2 = 4
2^3 = 8
2^4 = 16
2^5 = 32

This is why everything falls on this pattern of numbers. Make sense?

CompBanker

Interesting stuff, cheers rgilly and comp

My uni years were such a waste of time from an academic point of view...

CompBanker, I've always kind of known that without actually understanding it. Kudos for such a concise explanation.

A guy gets out of jail, hops into a silver Cadillac, stops by a red hotel, runs inside, drops off a ton of cash and drives off. What just happened?

any one of a million things happened hobbes, there is no one right answer

Jimbo wrote:

any one of a million things happened hobbes, there is no one right answer

Answer: the guy was playing Monopoly.

@ monopoly

There's a fine line between problem solving and imagination.

Curious to hear some answers although I personally won't put my sense of humour to the test just right now...

Just to follow up on my binary example, we can look at hexadecimal the same way. The difference between Hex and Binary is that Hex can have 16 values for each digit.

0 = 0
1 = 1
2 = 2
3 = 3
4 = 4
5 = 5
6 = 6
7 = 7
8 = 8
9 = 9
A = 10
B = 11
C = 12
D = 13
E = 14
F = 15

So a single hex digit covers the first 16 numbers. Hex is just as easy, but appears more complicated than Binary. Examples:

4D = 4*16 + 13 = 77
28 = 2*16 + 8 = 40
AEB = (16^2 * 10) + (16^1 * 14) + (16^0 * 11) = 2560 + 224 + 11 = 2795

Same thing, only each digit goes from 0 to 15 instead of 0 to 1 like in binary.

Why does Hex exist? Well, 16 = 2^4, so it is easy to convert between the two. For every 4 digits of binary, you can have 1 digit in hex.

Example:

hex: 3 = binary: 0011 (the leading zeroes mean nothing, just like if i were to charge you \$001,000)
hex: F = binary: 1111
hex: 9 = binary: 1001

Okay, so what happens if we have a huge number in hex and want to convert to binary? Easy:

Hex: F38BA = Binary: 1111 (the F), 0011 (the 3), 1000 (the 8), 1011 (the B), 1010 (the A). Our final answer is 11110011100010111010 in binary. Through this system we can quickly convert back and forth between the two and never need a calculator.
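Python's built-in conversions make these checks one-liners:

```python
value = 0xF38BA
print(format(value, "b"))
# every hex digit expands to exactly four binary digits
assert format(value, "b") == "11110011100010111010"
assert int("11110011100010111010", 2) == 0xF38BA
```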

Let me know if anyone needs further explanation or has more questions.

CompBanker

There is a row of five houses, each having a different colour. In these houses live five people of various nationalities. Each of them nurtures a different beast, likes different drinks and smokes different brand of cigars.

The Brit lives in the Red house.
The Swede keeps dogs as pets.
The Dane drinks tea.
The Green house is on the left of the White house.
The owner of the Green house drinks coffee.
The person who smokes Pall Mall rears birds.
The owner of the Yellow house smokes Dunhill.
The man living in the centre house drinks milk.
The Norwegian lives in the first house.
The man who smokes Blends lives next to the one who keeps cats.
The man who keeps horses lives next to the man who smokes Dunhill.
The man who smokes Blue Master drinks beer.
The German smokes Prince.
The Norwegian lives next to the Blue house.
The man who smokes Blends has a neighbour who drinks water.
Who has fish at home?

coscarsm wrote:

There is a row of five houses... Who has fish at home?

Entertaining but fairly basic. It's easy to do once you have paper. It's called Einstein's riddle; i did it a few years ago.

http://www.jamsarts.com/illusion-einstein.htm

This is more of an LSAT question than a brain teaser. You definitely need paper for that

How do we know which is the first house?

I got Yellow, Blue, Red, Green, White in that order.

that's what I got, but the rest didn't fit. I may have made a mistake though.
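For what it's worth, the riddle is small enough to brute-force with nested permutations and early pruning. A sketch, taking house 0 as the leftmost "first" house and reading "on the left of" as immediately left (the reading that gives a unique answer):

```python
from itertools import permutations

def solve():
    owners, pos = [], range(5)  # house positions 0..4, left to right
    next_to = lambda a, b: abs(a - b) == 1
    for blue, green, red, white, yellow in permutations(pos):
        if green != white - 1: continue            # green immediately left of white
        for brit, dane, german, norwegian, swede in permutations(pos):
            if brit != red: continue
            if norwegian != 0: continue            # norwegian in the first house
            if not next_to(norwegian, blue): continue
            for beer, coffee, milk, tea, water in permutations(pos):
                if dane != tea: continue
                if coffee != green: continue
                if milk != 2: continue             # centre house drinks milk
                for blends, bluemaster, dunhill, pallmall, prince in permutations(pos):
                    if dunhill != yellow: continue
                    if bluemaster != beer: continue
                    if prince != german: continue
                    if not next_to(blends, water): continue
                    for birds, cats, dogs, fish, horses in permutations(pos):
                        if swede != dogs: continue
                        if pallmall != birds: continue
                        if not next_to(blends, cats): continue
                        if not next_to(horses, dunhill): continue
                        names = {brit: "Brit", dane: "Dane", german: "German",
                                 norwegian: "Norwegian", swede: "Swede"}
                        owners.append(names[fish])
    return owners

print(solve())  # ['German']
```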

You come to a Y in the road and ahead are two identical looking cities, the city of truth, and the city of lies. There are 2 hermits at this junction that look identical, except that one always tells the truth and one always lies. You're allowed 1 question to direct you to the City of truth. What's that question?

Just to clarify, they will both answer you on the same question, so if you asked, which way to the city of truth, one will say take the left road, the other will say take the right road.

ask either hermit " if i ask the other hermit which way leads to the city of truth, what would he say?"

streetbuck wrote:
devilindisguise wrote:

ask either hermit " if i ask the other hermit which way leads to the city of truth, what would he say?"

Assume the city of truth is L and the other city is R.

If you ask that question, the truthful hermit will say R. The lying hermit will also say R. The key is to get them to say the same direction.

Therefore you know its L.

WizardofOz wrote:

Assume the city of truth is L and the other city is R.

If you ask that question, the truthful hermit will say R. The lying hermit will also say R. The key is to get them to say the same direction.

Therefore you know its L.

Ah, got it. I missed the part where the OP said both hermits will answer your question. I was thinking that you can only ask one question directed at one of the hermits.

no, they don't have to answer both, just one is enough

A lot of the ones I got further on were more probability and binomial option pricing driven...

You all should look into those.

Got any examples?

What number comes next in the sequence:
61, 691, 163, 487, 4201, ?

The sequence consists of the prime numbers which, when their digits are reversed, are perfect squares.

that is just asinine, what a stupid question.

Of course! Gotta know those primes and perfect squares if you're going to spread comps.
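Asinine or not, the stated rule is easy to brute-force. A sketch that regenerates the sequence (ordered by the underlying square) and produces the next term:

```python
def is_prime(n):
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

# primes whose digit reversal is a perfect square, ordered by that square
terms, k = [], 1
while len(terms) < 6:
    k += 1
    candidate = int(str(k * k)[::-1])  # reverse the digits of k^2
    if is_prime(candidate):
        terms.append(candidate)
print(terms)  # [61, 691, 163, 487, 4201, 9631]
```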

If I remember correctly, I think the most difficult question I was asked that was actually within the realm of possibility was to give the angle formed by the hands of a clock when it is 1:47.

Didn't get it exactly right. I was close, however, and still got the offer.
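For anyone who wants to check the clock question: the minute hand moves 6 degrees per minute and the hour hand 0.5 degrees per minute, so

```python
def clock_angle(hour, minute):
    minute_hand = minute * 6.0                     # 360 / 60 degrees per minute
    hour_hand = (hour % 12) * 30.0 + minute * 0.5  # 360 / 12 per hour, plus drift
    diff = abs(hour_hand - minute_hand)
    return min(diff, 360.0 - diff)  # report the smaller of the two angles

print(clock_angle(1, 47))  # 131.5
```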

I generally don't ask candidates heavily quantitative brainteasers. If I'm going to hammer somebody it's usually on accounting or financial topics, particularly if they start to get all cocky.

<><><><><><><><><><><><><><>

Once more into the breach, dear friends.

I won't claim this is difficult, but here is one I got:

You have stacks of quarters, dimes, nickels and pennies. The number of coins in each stack is irrelevant.

You can take coins from a stack in any amount, any order, and place them in your hand.

The question is: what is the greatest dollar value in coins you can have in your hands without being able to make change for a dollar [with the coins in your hand]?

what's the answer? 3 quarters, 4 dimes, 4 pennies?

What is the next number?

1, 11, 21,1211,...

yup, that's it
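The 1, 11, 21, 1211 sequence is the look-and-say sequence: each term reads the previous one aloud ("1211" is one 1, one 2, two 1s). A sketch:

```python
from itertools import groupby

def look_and_say(term):
    # read off each run of identical digits: "1211" -> "111221"
    return "".join(str(len(list(group))) + digit for digit, group in groupby(term))

seq = ["1"]
for _ in range(4):
    seq.append(look_and_say(seq[-1]))
print(seq)  # ['1', '11', '21', '1211', '111221']
```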

Trading brain teasers (understandably) are much more difficult compared to IBD ones.

I got a pretty easy one: the interviewer rolls a die up to 3 times, and each time I have the option of taking as many dollars as the dots on the face, or rolling again (up to a max of 3 rolls). I was to find the optimal strategy as well as the expected payoff of the game.

yikes. im dumb.

5 or 6 you keep the money, anything less you roll again right?

expected payoff: working backwards, the last roll is worth 3.5, so the second stage is worth 0.5(5) + 0.5(3.5) = 4.25, and the whole game is worth (1/3)(5.5) + (2/3)(4.25) = 14/3, or about 4.67

optimal strat = after 1st roll accept only a 5 or 6 or else reroll, after 2nd roll keep a 4 or above or reroll
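The backward induction is easy to verify with exact arithmetic:

```python
from fractions import Fraction

def game_value(rolls_left):
    # value of the dice game when the final roll must be kept
    if rolls_left == 1:
        return Fraction(7, 2)  # E[one die] = 3.5
    cont = game_value(rolls_left - 1)
    # keep the current face only when it beats the value of rolling again
    return sum(max(Fraction(face), cont) for face in range(1, 7)) / 6

print(game_value(2), game_value(3))  # 17/4 14/3
```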

Someone explain the pennies and nickels and dimes

What's the max amount of money you can hold without being able to make change for \$1? I think stanford is right: 3 quarters, 4 dimes, 4 pennies. That's \$1.19. Add 1 more dime and you can take 5 dimes and 2 quarters to make \$1. Add one more penny and you can take 5 pennies, 3 quarters, and 2 dimes to make a dollar. Of course adding a quarter will allow you to make \$1. Same problem adding a nickel as with adding a penny.

ohhhhhh gotcha

3.9+ in finance man? nice g. where do you go to school. or what are some peer schools of it

You could also do 9 dimes, a quarter, and 4 pennies, for what it's worth (\$1.19).

sleepyguyb wrote:

3.9+ in finance man? nice g. where do you go to school. or what are some peer schools of it

Small private school, very non-target. Top 50 undergrad b-school, but not top 20.

Classic: 7-game series. How much do you bet on the first game to make sure you have \$100 if your team wins the series?

Classic: I will give you \$5,000,000 if you are able to beat the market this year. If you don't, I'll give you nothing. Would you be willing to do this? What strategy would you employ? (Think about maximizing variance, so you either lose a lot with low probability or make a bit on each bet.)
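For the 7-game series question, a short dynamic program gives the bankroll and the even-money bet at every state. This assumes you must end with exactly \$100 after a series win and \$0 after a loss; under the symmetric +/-\$100 convention the first bet doubles to \$31.25:

```python
from functools import lru_cache

WIN, LOSE = 100.0, 0.0  # assumed terminal bankrolls

@lru_cache(maxsize=None)
def value(wins, losses):
    # bankroll you must hold when your team has this many wins and losses
    if wins == 4:
        return WIN
    if losses == 4:
        return LOSE
    return (value(wins + 1, losses) + value(wins, losses + 1)) / 2

def bet(wins, losses):
    # stake so that winning or losing this game lands you on the right state
    return (value(wins + 1, losses) - value(wins, losses + 1)) / 2

print(value(0, 0), bet(0, 0))  # 50.0 15.625
```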

Hardest brainteaser I've ever had was at an interview a year back. Was unsure if the HR lady interviewing me wanted me to pin her down or bend her over.

Guess I will never know.

= 1.69. Does this not work?

no it doesn't, because 2 quarters and 5 dimes make a dollar

You have a large cube made up of small cubes.
Small cubes measure 1x1x1. The large cube is 10x10x10.

How many small cubes are left when you take away the little cubes on the surface?

Some people got it, but I'll just throw up the thought process:

Quarters first: take 3 (a 4th would make \$1 on its own).

Dimes are next; can't have 5 (5 dimes + 2 quarters = \$1), so take 4.

Nickels: Can't have any (3 quarters plus two dimes plus a nickel would be \$1)

Pennies: 5 pennies would act like the nickel above (5 pennies + 3 quarters + 2 dimes = \$1), so you take 4.

3 Quarters + 4 Dimes + 4 Pennies = \$1.19

I'll also note that I completely fumbled this question and it helped me blow the interview.
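The search space is small enough to brute-force and confirm \$1.19. For fixed quarter/dime/nickel counts, the best you can do is stop one penny short of the cheapest way to complete a dollar:

```python
def max_no_change():
    """Largest coin total that cannot make exact change for $1 (100 cents)."""
    best, best_combo = 0, None
    for q in range(4):           # 4 quarters alone are a dollar
        for d in range(10):      # 10 dimes alone are a dollar
            for n in range(20):  # 20 nickels alone are a dollar
                # fewest pennies that complete some sub-collection to 100c
                need = min(100 - (25 * q2 + 10 * d2 + 5 * n2)
                           for q2 in range(q + 1)
                           for d2 in range(d + 1)
                           for n2 in range(n + 1)
                           if 25 * q2 + 10 * d2 + 5 * n2 <= 100)
                if need == 0:
                    continue  # change is possible without any pennies
                p = min(need - 1, 99)  # stay one penny short of making change
                total = 25 * q + 10 * d + 5 * n + p
                if total > best:
                    best, best_combo = total, (q, d, n, p)
    return best, best_combo

print(max_no_change())  # best total is 119 cents, i.e. $1.19
```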

is it 999 cubes?

it's 512 = 8x8x8
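Same count in one line of arithmetic: stripping one layer from each face leaves an 8x8x8 core.

```python
n = 10
interior = (n - 2) ** 3   # remove one layer of small cubes from every face
removed = n ** 3 - interior
print(interior, removed)  # 512 488
```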

Problem:
You are standing outside a 100-story building holding two identical glass spheres. You are told that either sphere, if dropped from the roof, would shatter upon hitting the earth, but that it would not necessarily break if dropped from the 1st story. Your task is to identify the lowest possible floor that you can drop a ball from and break it. QUESTION: In the general case, what is the smallest number of drops required to guarantee that you have identified the lowest story?

Notes:
- Both balls have the same minimum breakage story.
- You only have two balls to use. If one breaks, it cannot be used for the rest of the experiment.

Regarding my post above:

The solution does not require any advanced math. Rumor has it that a math olympian solved this in under a minute.

Legend has it that this question showed up in a BB trading interview. The applicant was a math major.

18 drops

There's a pretty intuitive explanation that goes along with the correct answer.

posted too soon. it's 14?

I need a drink

I think I am way off (and I just suck at brainteasers), but shouldn't it be 49 or 50? You do every even floor, and when the first one breaks, you go down one floor and drop the second one. Technically this can happen on the 99th floor if you start from the bottom, so 50 would be the smallest number where you can know for sure.

here's how to approach the problem: you have to find the lowest number of drops so that every story is covered. use the first ball to ballpark the range, then step through that range one floor at a time with the second ball.

to get the lowest number of drops, pick the starting floor so that wherever the first ball breaks, the total number of drops stays the same. the first ball should progress through the sequence 14, 27, 39, 50, 60, 69, 77, 84, 90, 95, 99. suppose it breaks at 14: then you test 1-13 with the second ball, 14 drops in all. suppose it breaks at 99 (the 11th drop): then you test 96-98, again 14 in all. no matter where the first ball breaks, you never need more than 14 drops.

if you start any lower, say at 13, you won't make it all the way to 100; if you start higher, you're dropping too many times.
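aachimp's 14 can be confirmed with a small memoized recursion over (balls, floors):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def drops(balls, floors):
    # fewest drops that guarantee finding the threshold among `floors` floors
    if floors == 0:
        return 0
    if balls == 1:
        return floors  # one ball left: must scan upward one floor at a time
    return 1 + min(
        max(drops(balls - 1, f - 1),   # ball breaks: search the f-1 floors below
            drops(balls, floors - f))  # ball survives: search the floors above
        for f in range(1, floors + 1)
    )

print(drops(2, 100))  # 14
```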

51, I suppose.
Start at 2, then work your way up to 100 (am assuming roof, with guaranteed breakage, isn't considered 100th story). Find break, do a second drop test at F - 1 to guarantee lowest floor found.

Then go get a drink because you've wasted an afternoon dropping balls from windows.

See aachimp's post above.