Lets hear some of them you guys have gotten in investment banking interviews...

Want to have some fun? Try these two.

1. A keeps tossing a fair coin until he gets 2 consecutive heads; define X to be the number of tosses for this process. B keeps tossing another fair coin until he gets 3 consecutive heads; define Y to be the number of tosses for this process.

Calculate P{X > Y}.

2. On an island there are snakes of 3 different colors: blue, red, and yellow. Every time 2 snakes of different colors meet, they both change their color to the 3rd color. E.g., if a red snake meets a yellow snake, they both change to blue.

Now suppose at a certain time there are 13 blue, 15 red, and 17 yellow snakes. Question: could the snakes meet so that eventually all snakes become the same color?
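Before arguing about problem 1, it's worth sanity-checking it with a rough Monte Carlo sketch (my own throwaway code, not a derivation; `tosses_until_streak` is just a helper name I made up):

```python
import random

def tosses_until_streak(streak_len, rng):
    """Toss a fair coin until we see `streak_len` heads in a row; return the toss count."""
    run = tosses = 0
    while run < streak_len:
        tosses += 1
        run = run + 1 if rng.random() < 0.5 else 0
    return tosses

rng = random.Random(0)
trials = 200_000
wins = sum(tosses_until_streak(2, rng) > tosses_until_streak(3, rng)
           for _ in range(trials))
p_estimate = wins / trials
print(p_estimate)  # comfortably above zero
```

Whatever the exact closed form turns out to be, the estimate is clearly positive, so any "P{X > Y} = 0" answer can't be right.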

The answer to the snake problem is YES! They all can become one color. To prove this it is sufficient to show that there exists one sequence of meetings where all snakes end up one color. E.g., one red and one yellow snake meet; this will lead to 14 blue, 14 red, and 16 yellow snakes. Now the 14 blue and 14 red snakes meet, leading to all snakes being yellow. QED

ralph240574:

The answer to the snake problem is YES! They all can become one color. To prove this it is sufficient to show that there exists one sequence of meetings where all snakes end up one color. E.g., one red and one yellow snake meet; this will lead to 14 blue, 14 red, and 16 yellow snakes. Now the 14 blue and 14 red snakes meet, leading to all snakes being yellow. QED

Nope. One red and one yellow snake meeting will lead to 15 blue, not 14 blue. Compbanker was right earlier. You need a difference of 3 between 2 colours to make this possible.

14+14+16 is not equal to 13+15+17

Solution to problem 1 can be found here: http://pratikpoddarcse.blogspot.com/2009/10/lets-s...

sonyyy:

Want to have some fun? Try these two.

1. A keeps tossing a fair coin until he gets 2 consecutive heads; define X to be the number of tosses for this process. B keeps tossing another fair coin until he gets 3 consecutive heads; define Y to be the number of tosses for this process.

Calculate P{X > Y}.

2. On an island there are snakes of 3 different colors: blue, red, and yellow. Every time 2 snakes of different colors meet, they both change their color to the 3rd color. E.g., if a red snake meets a yellow snake, they both change to blue.

Now suppose at a certain time there are 13 blue, 15 red, and 17 yellow snakes. Question: could the snakes meet so that eventually all snakes become the same color?

Anyone figure out number 1 yet? I don't get how to do it and it's driving me crazy.

sonyyy:

Want to have some fun? Try these two.

1. A keeps tossing a fair coin until he gets 2 consecutive heads; define X to be the number of tosses for this process. B keeps tossing another fair coin until he gets 3 consecutive heads; define Y to be the number of tosses for this process.

Calculate P{X > Y}.

2. On an island there are snakes of 3 different colors: blue, red, and yellow. Every time 2 snakes of different colors meet, they both change their color to the 3rd color. E.g., if a red snake meets a yellow snake, they both change to blue.

Now suppose at a certain time there are 13 blue, 15 red, and 17 yellow snakes. Question: could the snakes meet so that eventually all snakes become the same color?

Solution to Problem 1:

Anal Analyst:

Anyone figure out number 1 yet? I don't get how to do it and it's driving me crazy.

sonyyy:

Want to have some fun? Try these two.

1. A keeps tossing a fair coin until he gets 2 consecutive heads; define X to be the number of tosses for this process. B keeps tossing another fair coin until he gets 3 consecutive heads; define Y to be the number of tosses for this process.

Calculate P{X > Y}.

2. On an island there are snakes of 3 different colors: blue, red, and yellow. Every time 2 snakes of different colors meet, they both change their color to the 3rd color. E.g., if a red snake meets a yellow snake, they both change to blue.

Now suppose at a certain time there are 13 blue, 15 red, and 17 yellow snakes. Question: could the snakes meet so that eventually all snakes become the same color?

1. P(X > Y) = 0, since Y > X a.s.
2. No, it is not possible, since you need the total number to be even, at least, to have that happen.

Isn't the answer to number 2 no because there is an odd number of snakes? Unless there is a way to get it so there are 15 of each, which I can't figure out.

Get some probability questions from the GMAT. www.gmatclub.com has a forum, and on it you can find a bunch of proba questions.

Remember, you will always be a salesman, no matter how fancy your title is.
- My ex girlfriend

Go over expectations. I had one of the hardest interview probability questions and it was just expectation, though it was near impossible to structure the answer. Got it, though, after about 5 minutes of hard work, and got the job. Richard Durrett has a text called Probability: Theory and Examples that is amazing; pick it up.

Got this two parter for Bear F.A.S.T. summer analyst spot:

a) Easy: How much would you pay to roll a single die, if you got \$1 if you roll a 1, \$2 for a 2, etc...?
b) Harder: Okay, now how much would you pay if you were allowed to roll twice and take the higher of the two?

Ok, obviously for (a) the answer is \$3.50 because you just multiply each amount paid by 1/6 and add them up. But for (b), if each roll is valued at \$3.50, one's instinct is to say \$7.00 because you're essentially being given two rolls for the price of one, but I think that is the teaser answer. What did you say, and do you know the process to take for the correct answer?

i've heard a variation of (b) from a friend who heard it from a friend at Merrill. it was the same thing except you don't get the higher of the 2: if you roll the 2nd time you get the payout of the 2nd roll, not the greater of the two. but i'll give yours a shot.

this may be the long way, but i still believe it's correct: draw out a decision tree.

the first node is the 1st roll. it consists of 6 branches, each corresponding to what you roll, each with a probability of 1/6. at the end of each branch is another node for the 2nd roll, each with 6 branches of its own, same as above.

you're gonna have 36 unique combinations. at the end of each combo, you take the max of the 1st and 2nd roll, then multiply that by 1/6 twice, i.e. 1/36.

sum all of those and there's your total expected value, and therefore how much you'd be willing to pay.
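If you don't feel like drawing the tree by hand, the same 36-outcome enumeration is a couple of lines of Python (exact arithmetic via fractions, so no rounding):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely (first roll, second roll) pairs; you get paid the max.
ev = sum(Fraction(max(a, b), 36) for a, b in product(range(1, 7), repeat=2))
print(ev, float(ev))  # 161/36, about 4.4722
```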

------

"its the running joke now, we now have fair trade with china so they send us poisoned sea food and we send them fraudulent securities."


Well, I don't think for (b) it is meant that you keep the higher of the two times two; you simply get to keep the higher of the two. So I would think the value of the 2nd roll comes from the chance of rolling higher than the expected return from the first roll, i.e. > 3.50 (a 4, 5, or 6), and if it comes up 1, 2, or 3 the added value is 0. So given you have a 50% chance to roll higher, in which case your return is \$1.50 more on average (5 = avg of 4, 5, 6; 5 - 3.5 = 1.50), and a 50% chance to add 0, you would pay an extra \$0.75 for the roll, or \$4.25 in total.

That is how I would break it down. Never got this or anything like it, so I am not sure at all.

Using my method, which I am sure of, the answer is 4.4722. Because I am bored and lame, I did it in Excel; PM me if you want the Excel file, hah...


Alex Kap has the right answer. Having never taken probability, I didn't get it.

NICE HAHAHA... But I just used logic. I had probability questions like that that I didn't get, but at the same time they like the overall thought process; I just made a mistake in one step.

I had an interview with a top BB and not only did they ask me what I would pay for 1 roll, then 2 rolls, then 3 rolls; they also asked what I would pay for 2 rolls should I be allowed to keep the maximum of the two rolls.

There are 6 possible outcomes for the 1st roll (1, 2, 3, 4, 5, 6), each with a probability of 1/6. For each possible outcome, there is a corresponding expected value for the "better-of-two" rolls.

First roll outcome: 6
Conditional expected value: 6

First roll outcome: 5
Conditional expected value: (5/6)5 + (1/6)6 = 5.1667 (this is expected, i.e. >5)

First roll outcome: 4
Conditional expected value: (4/6)4 + (1/6)(5+6) = 4.5

First roll outcome: 3
Conditional expected value: (3/6)3 + (1/6)(4+5+6) = 4

First roll outcome: 2
Conditional expected value: (2/6)2 + (1/6)(3+4+5+6) = 3.6667

First roll outcome: 1
Conditional expected value: (1/6)1 + (1/6)(2+3+4+5+6) = (1/6)(1+2+3+4+5+6) = 3.5 (in other words, if you roll a 1 the 1st throw, we scrap the score and roll again as if we were only given 1 throw)

Since each 1st roll outcome has a probability of 1/6, we can get the total expected value by adding up the weighted expected values for each conditional outcome.

That is,

(1/6)6 + (1/6)5.1667 + (1/6)4.5 + (1/6)4 + (1/6)3.6667 + (1/6)3.5 =

(1/6)*(6 + 5.1667 + 4.5 + 4 + 3.6667 + 3.5) =

(161/36) = 4.4722

You are presented with a fair, 6-sided die. What is the expected number of rolls needed to roll two 6's in a row? (i.e. if you roll two 6's right off the bat, that counts as 2 rolls)

it's actually 1/17 if you thought about it...

the probability of rolling any single target number with 2 dice is 2/34 because, for example, you could roll a 2 and a 5 or a 5 and a 2 (2 ways to roll a 7). this is harder to demonstrate for pairs, but if you thought of it as if you had 1 die with circles and 1 with squares and you were aiming to roll a total of 12 shapes, then you could roll 6 squares and then 6 circles or vice versa, giving you two possibilities to roll the 12

so your probability of rolling 2 sixes is 2/34 or 1/17 (why not 2/36? because it's impossible to roll a 1 since there are 2 dice)

"You are presented with a fair, 6-sided die. What is the expected number of rolls needed to roll two 6's in a row? (i.e. if you roll two 6's right off the bat, that counts as 2 rolls)"

Well the probability of rolling 2 sixes in a row is 1/36.....so the number of expected rolls would be 36?

Man, this shit reminds me of my industrial org class. If you struggle with the Bayes approach (probability tree), you're going to hate game theory, where you then have to use backward induction.

"Oh the ladies ever tell you that you look like a fucking optical illusion" - Frank Slaughtery 25th Hour.

heh, the best-of-2-rolls one tripped me up too; I came up with 4.25, but EE and Oranhoutan are def right.
Oranhoutan, what is the answer to your question? I have no idea. I tried coming up with the prob of rolling two 6s in a row, but then I have no clue what the expected # of rolls would be. It's def not 36; that's just the avg # of rolls required to roll two 6s in general.

I had these two questions in one of my interviews: a) you have a 10x10x10 inch cube of ice suspended in the air, made up of 1x1x1 inch smaller cubes; when the ice starts to melt, the outer layer of cubes falls away. How many small cubes are left still together? And b) you have a fair 6-sided die; if you roll a 6 you win, and if you roll 1, 2, 3, 4, or 5 you keep rolling. What is your probability of winning?

mykyta:

Oranhoutan what is the answer to your question? I have no idea, I tried coming up with prob for rolling two 6s in a row but then I have no clue what the expected # of would be. It's def not 36, that's just avg # of rolls required to roll 2 6s in general.

Why is it not 36? Wouldn't the expected # of rolls for two 6s in general be 12? E(1/6*12)=2.

Let's make the two-6's-in-a-row question a little simpler, what is the expected number of rolls to get one 6?

Or maybe,

Instead of having a die, suppose you had a fair coin. What is the expected number of flips to get two heads in a row? How about a head followed by a tail?

As for the 10x10x10 cube question, if the outer layer melts away, you end up with an 8x8x8 cube = 512 cubes.

For the second question, you'll eventually win, i.e. you'll eventually roll a 6 if you are allowed to keep rolling indefinitely. Am I misreading the question?

EE and Oranhotan are right. Normally the question is, what's the expected value of two rolls assuming you can roll again after the first roll, and only your last roll counts?

The answer to that one would be 4.25, but if it's max of both rolls, then its 4.47.

I may have forgotten how the question was phrased, and yes, that does make a big difference. In any case, it must have been just the second roll, because 4.25 was the answer.

You can answer the probability that out of X rolls you will get at least one 6 using a geometric random variable. The probability that you will get at least one 6 out of 6 rolls is about 0.665.

Broken down:
P(X=1) = 1/6=.167
P(X=2) = (5/6)(1/6)=.139
P(X=3) = (5/6)^2(1/6)=.116
P(X=4) = (5/6)^3(1/6)=.096
P(X=5) = (5/6)^4(1/6)=.080
P(X=6) = (5/6)^5(1/6)=.067
sums to .665

If you wanted to, you could multiply the # of rolls by the respective probability (1 × .167 + 2 × .139 + ... etc.) up to an infinite number of rolls and find that the expectation converges to 1/P, where P is 1/6 in our case, so the expected number of rolls to get a six (or any other particular number on the die) is 6.

The easiest/quickest idea to pull out of this is that the expectation of a geometric random variable is just 1/p, where p is the probability that your event will happen in a given roll.

Finding the E of 2 sixes in a row would take longer, but it seems that it would be less than 36. If you treat each event as a pair of rolls the expectation would be 36 rolls, but you have to count rolls 2 and 3, 4 and 5, etc. as well.
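Both of the numbers above are easy to check in a few lines of Python (my own sketch, just confirming the geometric-variable facts):

```python
# Chance of at least one 6 in six rolls: complement of six straight misses.
p_hit = 1 - (5/6) ** 6
print(round(p_hit, 3))  # 0.665

# E[rolls to first 6]: sum n * P(first 6 on roll n). Truncating far out is
# fine because the geometric tail is vanishingly small by then.
expected_rolls = sum(n * (5/6) ** (n - 1) * (1/6) for n in range(1, 2000))
print(round(expected_rolls, 3))  # 6.0
```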

Let's denote the event of hitting a six as "6".
Let's also denote the event of hitting two 6's in a row as "6-6".

Notice that, in order to get to "6-6", you have to first get to "6".

That is, the expected hitting time to "6-6" = expected hitting time to "6" + expected hitting time to "6-6" from "6".

Hope that helps.

See if you can figure out the expected hitting time for flipping two heads in a row with a coin...

i.e., suppose you found a coin that comes up heads with probability 1/6 and tails with probability 5/6.

Alright, so let a = expected # of flips to get heads, with a 50% chance of heads or tails on each flip: a = 1/2(1) + 1/2(1+a). The first part of the right side is hitting heads on the first try; the second part is starting back at a, but now with 1 flip having taken place.
Solving for a we get 1/2 a = 1 --> a = 2.

Going further, for 2 heads in a row we get: a = 1/2(1+a) + 1/4(2+a) + 1/4(2).
Here the first part denotes hitting tails right away, so back to a; the second part is hitting heads and then tails, so again back to square one; the last part is hitting 2 heads in a row on the first two flips. Solving for a again we get
a = 3/4 a + 3/2 --> 1/4 a = 3/2 --> a = 6.

Extending this to the 6-sided die, with your great hint of looking at it as a coin with unequal weights, we get: a = 5/6(1+a) + 5/36(2+a) + 1/36(2).
Solving for a we get 1/36 a = 7/6 --> a = 42.

This thing bothered me all day haha.
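The 42 also falls out if you set up the two-state recurrence explicitly and solve it in exact fractions (same equations as above, just machine-checked):

```python
from fractions import Fraction

# e0 = expected remaining rolls with no progress, e1 = with one 6 already:
#   e0 = 1 + (5/6) e0 + (1/6) e1
#   e1 = 1 + (5/6) e0
# Substituting e1 into the first equation gives e0 = 7/6 + (35/36) e0.
e0 = Fraction(7, 6) / (1 - Fraction(35, 36))
e1 = 1 + Fraction(5, 6) * e0
print(e0, e1)  # 42 36
```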

You're given a loop of copper wire. You're instructed to insert the wire loop into a machine that will make three random cuts (independently, uniformly distributed) along the wire.

What is the probability that you'll end up with a piece of wire (of the 3 pieces) with at least half the length of the original loop?

What is the probability that a straight wire cut in two places (independent, uniformly random) will have its three pieces make a triangle?

The above 2 questions are essentially the same.
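No answer was posted for these, but a rough Monte Carlo (my own sketch) suggests the two probabilities are complements of each other, which is presumably the sense in which the questions are "essentially the same":

```python
import random

rng = random.Random(1)
trials = 400_000
big_piece = triangle = 0

for _ in range(trials):
    # Loop of unit circumference, three uniform cuts -> three arcs.
    cuts = sorted(rng.random() for _ in range(3))
    arcs = (cuts[1] - cuts[0], cuts[2] - cuts[1], 1 - cuts[2] + cuts[0])
    big_piece += max(arcs) >= 0.5

    # Unit stick, two uniform cuts -> triangle iff every piece is under 1/2.
    u, v = sorted((rng.random(), rng.random()))
    triangle += max(u, v - u, 1 - v) < 0.5

print(big_piece / trials, triangle / trials)  # hovers around 0.75 and 0.25
```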

How applicable is all this to trading?

Are you expected to do this all in your head, thinking out loud and explaining HOW you would arrive at the answer? Or are you given paper and a pen and a couple of minutes?

What concepts should one know to answer these questions? Some i saw mentioned were expected value and game theory... any others?

^^^yea how the hell do you guys know the answers??? what are they teaching you in school that i never saw...

Is there a book that would cover similar questions and answers? To get my head back in the game?

Roll a die and you get paid what the die shows. If you want (but you don't have to), you can roll the die again and get paid what the second roll shows instead of the first. What is the expected value?

I don't know why this is called "Hardest Probability Questions"...
I assume it is a kind of joke...

Let's solve rolling once first.
The expected value of a random variable X is denoted by E(X). For a discrete random variable, E(X) is the sum of each value times its probability.

Rolling a "1" has a probability of 1/6.
Rolling a "2" has a probability of 1/6.
Rolling a "3" has a probability of 1/6.
Rolling a "4" has a probability of 1/6.
Rolling a "5" has a probability of 1/6.
Rolling a "6" has a probability of 1/6.

Multiplying the values by their respective probabilities gives:
1 * 1/6 = 1/6
2 * 1/6 = 2/6
3 * 1/6 = 3/6
4 * 1/6 = 4/6
5 * 1/6 = 5/6
6 * 1/6 = 6/6

1/6 + 2/6 + 3/6 + 4/6 + 5/6 + 6/6 = 3.5

Now solving rolling the die twice:

Ex. The random variable X has the following probability distribution:

x     p(x)
2     1/36
3     2/36
4     3/36
5     4/36
6     5/36
7     6/36
8     5/36
9     4/36
10    3/36
11    2/36
12    1/36

The random variable X assumes a value equal to the sum of two dice rolls. Its expected value is calculated as

= 2(1/36) + 3(2/36) + 4(3/36) + 5(4/36) + 6(5/36) + 7(6/36) + 8(5/36) + 9(4/36) + 10(3/36) + 11(2/36) + 12(1/36)
= (1/36)(2 + 6 + 12 + 20 + 30 + 42 + 40 + 36 + 30 + 22 + 12)
= 252/36 = 7

In the long run, the average value of two dice rolls using regular dice is 7.

IdrisAlMalki:

I don't know why this called "Hardest Probability Questions" ..
I assume it is a kind of Joke ..

The random variable X assumes a value equal to the sum of two dice rolls. Its expected value is calculated as

= 2(1/36) + 3(2/36) + 4(3/36) + 5(4/36) + 6(5/36) + 7(6/36) + 8(5/36) + 9(4/36) + 10(3/36) + 11(2/36) + 12(1/36)
= (1/36) (2 + 6 + 12 + 20 + 30 + 42 + 40 + 36 + 30 + 22 + 12)
= (252/36) = 7

In the long run, the average value of two dice rolls using regular dice is 7.

dude, just add the expected values of two independent variables and you get 7. no need for all those calculations.

are you actually retarded?

IdrisAlMalki:

I don't know why this is called "Hardest Probability Questions"...
I assume it is a kind of joke...

..........Are you the kind of person who thinks the "7 8 9" joke is funny?

Assuming a strategy where you only choose to roll the die again if the outcome of the first roll was below the expected value of rolling a fair die (3.5):

you get 4.25, because you still have (1/6)(4 + 5 + 6) but you also add (1/12)(1 + 2 + 3 + 4 + 5 + 6).

Basically you have to relate the confidence interval (involving the standard deviation of the sample mean, which is a sum of Bernoullis, i.e. binomial) to the size of the test, and produce the minimum n that will guarantee that your confidence interval is correct.

Certainly challenging (especially within the 15-minute time frame).

This is incorrect. The only possible results of the toss with your strategy (which is correct) are 1, 2, 3, 4, 5, 6, 4, 5, 6. Hence the expectation value (mean payout) is 4, not 4.25.


Jesus christ. This was at an interview? I'm guessing given the difficulty it was either Jane Street or DE Shaw.

Jesus christ. This was at an interview? I'm guessing given the difficulty it was either Jane Street or DE Shaw.

I'm guessing he did a stats heavy course at university/grad level, and the interviewer had too. Neither of those companies will ask you about concepts you haven't seen before. It's a core part of their ethos.

having had that kind of training, this question gets easier. I'm not saying it isn't hard, but this is a lot more approachable if you've done similar stuff for the past 3 years

what is O? I assume it's not zero...

bigblue3908:

what is O? I assume its not zero...

http://en.wikipedia.org/wiki/Big_O_notation

No no, I have the hardest problem: given that f(x) is differentiable at x = 0 and lim_{x->0} (f(x) - f(kx))/x = a, where a and k are constants, show that f'(0) = a/(1 - k).

"...all truth passes through three stages. First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as being self-evident."

• Schopenhauer
seabird:

No no, I have the hardest problem: given that f(x) is differentiable at x = 0 and lim_{x->0} (f(x) - f(kx))/x = a, where a and k are constants, show that f'(0) = a/(1 - k).

lim_{x->0} f(x)/x - lim_{x->0} f(kx)/x = a
take the derivative of top and bottom (L'Hopital's rule) and get
lim_{x->0} f'(x) - lim_{x->0} k f'(kx) = a
plug in 0
f'(0) - k f'(0) = a
factor
f'(0)(1 - k) = a
f'(0) = a/(1 - k)

no fuckin clue about the probability problem tho
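You can also sanity-check the formula numerically with a concrete f (my pick: f = sin, k = 1/2, so the limit a should be f'(0)(1 - k) = 1/2 and the formula should recover f'(0) = 1):

```python
import math

k = 0.5
x = 1e-6
# The given limit evaluated at a small x: (f(x) - f(kx)) / x -> a = f'(0)(1 - k).
a_numeric = (math.sin(x) - math.sin(k * x)) / x
# The claimed formula should then recover f'(0) = 1 for f = sin.
f_prime_0 = a_numeric / (1 - k)
print(round(f_prime_0, 6))  # 1.0
```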

Hmm, I disagree, but I could be wrong. Just did this on scrap paper and got the same thing, just with (d - 1/2)^2 in the denominator. Are you sure that's the correct answer?

**Oops, read the question wrong. Disregard this.

Looks like the Strong Law of Large Numbers to me.
Given the hint, P( max |x - x_n| > d ) <= the sum over n of P( |x - x_n| > d ) (Boole's inequality).
Since each x_i is bounded by 0 and 1, the latter probability is going to be a finite sum of terms of the form 2exp(-2d^2 n) (Hoeffding's inequality).
Such a sum converges, since it is a geometric series; that should give you the result.

^ yes

^ Almost surely

Actually the hardest question for this forum would be prove pi is irrational!

Have done proof for e and pi...

lik omg infinite descent ~~!!!

look at my skill!!!!!!!!!!!!!!!!!

i think proofs are overrated bc at the end of the day it has to be judged to actually be a proof :/

and proofs != end all to maths

TBH

/DEBATE

gogogo

4.25

Another one: given a circle, make n random cuts; what is the expected length of the longest piece?

For each draw {A, B, C, D}, isn't there a 1/4! chance that you draw it in ascending order?
Therefore, isn't the EV of this game (1/4!)*10 - ((4! - 1)/4!)*1?

for your first 2 cards, the probability of the second being > first is even, yes. After that, though, the probability of each subsequent card being > than the one before decreases.

in fact, my guess would be it goes something like this - P(2>1) = 0.5

P(3>2) = 48.5/95 (we know card 1 is one of the cards less than card 2, so the probability of card 3 being greater than card 2 is always skewed by 1 card out of 95?)

P(4>3) = 49/94 (repeat logic from above but skewed by 2 cards out of 94)

So you get (1/2)(48.5/95)(49/94) = ~0.133, slightly greater than (1/2)^3, which is intuitive because these are non-independent in a way where each previous step positively impacts the probability of the subsequent step?

Caveat: I'm not 100% confident in the answer, I'm also more confident in my reasoning than my math (embarrassingly enough)

so the 97 cards are just labeled 1-97? i'm confused

The probability that a random draw of 4 cards will be in ascending order should be 1/4!, irrespective of the sample size. Why?
Consider the case where you draw {1 , 2 , 3 , 4}. Given that you have drawn {1, 2, 3, 4}, the probability that you have drawn them in exactly that order is 1/4! or 1/24. Since the situation is symmetric with respect to each such quadruplet, the probability that your random draw is in ascending order is 1/4!.

From here we can calculate the EV of the game 1/24*10 - 23/24 = -13/24 < -0.5 .

GS is absolutely correct. Whether you are drawing from 97, 500, or 1,000,000 cards, the probability of drawing 4 cards in ascending order is the same. Any 4 drawn cards can come out in 4! or 24 orders, and only 1 of those 24 orders is ascending.

GS is right. I wrote a quick python script and got about -.57 over a million tries

http://www.codesend.com/view/738cb0bbd40e9642bb720...

Cruncharoo:

GS is right. I wrote a quick python script and got about -.57 over a million tries

http://www.codesend.com/view/738cb0bbd40e9642bb720...

Isn't your code drawing the cards with replacement? Looks like it to me (I don't know how the random class works but if it's a generic random num generator, then you are drawing them with replacement)

Yeah you're right. Fixing it now

http://www.codesend.com/view/435936f17ff0e09a91f16...
There. That gets me a lot closer to the -13/24 value. Good catch.
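Since the linked scripts may rot, here's a version that skips the simulation entirely and confirms the 1/24 by brute force on a toy deck (10 cards, purely my choice; the deck size genuinely doesn't matter):

```python
from fractions import Fraction
from itertools import permutations

deck = range(10)                      # any deck of distinct cards works
draws = list(permutations(deck, 4))   # all ordered 4-card draws, equally likely
ascending = sum(1 for d in draws if list(d) == sorted(d))
p = Fraction(ascending, len(draws))
ev = p * 10 - (1 - p) * 1             # win $10 if ascending, lose $1 otherwise
print(p, ev)  # 1/24 -13/24
```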

yea you are right the red herring threw me off

I believe the answer to the snake one is no. My explanation may not make a lot of sense, though. Basically, in order to get all the snakes to a single color you need to get the other 2 sets of snakes to be equal. Okay, so you have 13, 15, and 17 snakes. Every time you make a change, 2 numbers go down by 1 and one number goes up by 2. This means that you need 1 type of snake to be exactly 3 less than another in order to even them out.

Right now the snake counts are all 2 (or 4) apart. There is no change or set of changes you can make that puts 2 colors exactly 3 apart, because each meeting either moves the gap between two colors by 3 or keeps it the same. Hence, impossible.

Yes, one or two of the previous explanations of the snake problem are correct. The snakes cannot all become the same colour because of the relative differences between the snake counts. By inspection, only a difference between two snake counts that is divisible by 3 will work. Hence the answer is no!
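The divisible-by-3 observation can be phrased as an invariant and machine-checked in a few lines (helper names are mine):

```python
def meet(counts, i, j):
    """Snakes of colors i and j meet; both turn into the third color."""
    k = 3 - i - j
    new = list(counts)
    new[i] -= 1
    new[j] -= 1
    new[k] += 2
    return tuple(new)

def diffs_mod3(c):
    """Pairwise count differences mod 3 -- unchanged by any meeting."""
    return tuple((c[a] - c[b]) % 3 for a in range(3) for b in range(a + 1, 3))

start = (13, 15, 17)
for i in range(3):
    for j in range(3):
        if i != j:
            assert diffs_mod3(meet(start, i, j)) == diffs_mod3(start)

# A monochrome state like (45, 0, 0) has two equal counts, i.e. some pairwise
# difference of 0 mod 3. The start position has none, so monochrome is unreachable.
print(diffs_mod3(start))  # (1, 2, 1) -- no zero anywhere
```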

There are some threads lying around on the net that focus on quantitative questions.
Also, you've probably heard about "Heard on the Street". The new book by Mark Joshi is much better.

do they let you use paper or a calculator for these?

Seriously?

My first thought:

(4 C 1) (15 C 3) / (16 C 4)

This is wrong, though, since it's 1. I'll double-check and get back to you.

Second thought: Yep, it's 1/4.

Are the cards being replaced after each draw or not? If yes, then it is 1/4 since on the 4th draw you will have 4 clubs out of 16. But if the cards are not being replaced, then you have to sum up the 4 distinct possibilities. By the last draw, there will be 13 cards left. If you had picked clubs in each of the previous 3 draws, then your probability of drawing the last club will be 1/13. If you had drawn 2 clubs in the previous 3 draws, your probability would be 2/13 and so forth. Hence, you will need to add up 1/13+2/13+3/13+4/13 to get 10/13, which should be the final answer.

It doesn't matter. It's 1/4. The problem is perfectly symmetrical.

This is actually a very clever interviewing question. Among other things, it tests how sure of yourself you are and if you're the type to just jump into math without considering the situation first.

I had two other clever questions pop up in an interview recently, both of which made you go through long-winded calculations whose answer was most of the work of the problem, but not the brief final step from there that they wanted. Luckily I noticed the second time it happened, but watch out for that.

brainteaser:

What is the probability that the fourth card drawn is a club. (you aren't told what you drew in cards 1 2 or 3).

let me rewrite this question in a simpler way for you.

"What is the probability of choosing a Club when choosing a card at random?".

It doesn't matter if it's the fourth or fourteenth card, if you don't know anything about the other cards being drawn.

Here's another one for you:

You have a friend who has two children. At least one of the children is a boy, what are the odds that your friend has a girl?

And the relevant:

You have a friend who has two children. The eldest is a boy, what are the odds that your friend has a girl?

[It's important to know that the answers are not the same]

Here's another one for you:

You have a friend who has two children. At least one of the children is a boy, what are the odds that your friend has a girl?

And the relevant:

You have a friend who has two children. The eldest is a boy, what are the odds that your friend has a girl?

[It's important to know that the answers are not the same]

What are the answers to these? Why are they different?

Think of it as a permutation of the 16 cards. The probability of any one card out of the 16 being the 4th card is 15!/16! = 1/16. And since there are 4 clubs, the probability of the 4th card being a club is 4 x 1/16 = 1/4.

And also, as one of the above posters mentioned, since the condition for each of the 4 suits is identical, the probability of the 4th card being each suit has to be equal, which is a much quicker way to get the answer.

After you have drawn 3 cards, the expected number of cards of each suit left is 4 - 3/4 = 3.25 out of 13. Thus, the prob of the 4th being from any given suit is 3.25/13 = 1/4.
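The symmetry argument is easy to verify by brute force on a scaled-down deck (8 cards, 2 per suit — a toy setup of mine, small enough to enumerate every ordering):

```python
from fractions import Fraction
from itertools import permutations

deck = [suit for suit in range(4) for _ in range(2)]  # suit 0 plays "clubs"
total = clubs_fourth = 0
for order in permutations(deck):      # all 8! orderings of the deck
    total += 1
    clubs_fourth += order[3] == 0     # is the 4th card a club?
print(Fraction(clubs_fourth, total))  # 1/4
```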

Correct

GS:

Correct

There are four possible outcomes, each equally likely:

Boy-Boy, Boy-Girl, Girl-Boy and Girl-Girl

Probability that couple has "a girl" is 3/4. The probability that the couple has "a boy" is also 3/4. They're not mutually exclusive of course. The probability that a couple has two boys is 1/4, two girls is also 1/4 etc.

"You have a friend who has two children. At least one of the children is a boy, what are the odds that your friend has a girl?"

You can exclude the Girl-Girl outcome. That leaves Boy-Boy, Boy-Girl, Girl-Boy. In two out of these three possible outcomes the friend has a girl, so the probability for that is 2/3.

"You have a friend who has two children. The eldest is a boy, what are the odds that your friend has a girl?"

In this case you can exclude two outcomes Girl-Boy and Girl-Girl because the older kid is a boy. That leaves Boy-Boy, Boy-Girl and in one of these the kid is a girl, so the probability is 1/2.

I think the previous poster forgot one fact that changes things a bit.

The key here is that the problem says "you can roll the die again and get paid what the second roll shows INSTEAD of the first" (caps added). This means that you if you decide to re-roll, you should disregard your first roll and just keep the second one.

The first time you roll the die, you have two options:

• You keep your score
• You re-roll the die

Since the expected value (EV) of one dice roll is 3.5, it makes sense to keep your score if it is higher than 3.5, and re-roll if you get a score less than 3.5.

If you roll over 3.5, then the score you got must have been either 4, 5, or 6. Of these three numbers, the EV is 5.

The other possibility (which is equally likely) is that you get 1, 2, or 3. Then, it makes sense to re-roll. Again, on your second roll, your EV is 3.5.

Since you have two options, which are equally likely, the EV is:
1/2(the chance you don't reroll) + 1/2(the chance you re-roll and ignore your first score)
= 1/2(5) + 1/2(3.5)
= 1/2(8.5)
= 4.25

Note: If you had to keep the first roll and add it to your second score, it would always make sense to re-roll. Then, as the last poster mentioned, the EV would be 3.5+3.5 = 7.

• Claudia

The problem was missing information, as the expected value depends on the strategy the player uses. By inspection, the optimal strategy has a threshold form: if the first roll is n or less, roll again; otherwise keep it. The kept rolls n+1, ..., 6 average (7 + n)/2, so the expected value is

(7 + n)/2 * (6 - n)/6 + 3.5 * n/6 = (42 + 6n - n^2)/12

Treating n as continuous gives a maximum at n = 3, where the expectation is 51/12 = 4.25. For comparison, n = 2 and n = 4 both give 50/12 ≈ 4.17, so the threshold n = 3 gives the max expectation.
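A small sketch that enumerates every threshold strategy confirms the n = 3 threshold and the 4.25 figure:

```python
# EV of "roll once, optionally re-roll" under threshold strategies:
# re-roll whenever the first roll is <= n, keep it otherwise.
def ev(n):
    keep = sum(range(n + 1, 7)) / 6    # kept first rolls, each w.p. 1/6
    reroll = (n / 6) * 3.5             # re-roll discards the first roll
    return keep + reroll

best = max(range(7), key=ev)
print(best, ev(best))  # 3 4.25
```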

c'mon what're the odds you get probability questions?

according to Bayes' Rule (what is the probability of getting a probability question, given that you have an interview for an entry level position in structured finance): about 13%

P(interview for entry level position in structured finance | probability of getting a probability question) = 0.13

Or did I get the order wrong? Game theory was so long ago....

Well according to Murphy's law, there is a 100% chance that you will get a probability question, given how worried you seem.

thx guys

Hang in there!

Hahaha, good response.

I think that P(X>Y) is 1/4. Here's why:

the odds that you get two straight heads on your 2nd roll is 1/4
the odds that you get two straight heads on your 3rd roll is (1/4)(1/2)... 1/2 is the odds that first roll is tails
the odds that you get two straight heads on your 4th roll is (1/4)*(1/2)*(1/2)

Getting the two straight heads by roll 3 or roll 4 has the same likelihood as getting 3 straight (1/8 and (1/8)*(1/2)). The only difference between X and Y is that Y can't happen in 2 rolls, whereas X has a 1/4 chance

Downtown:

the odds that you get two straight heads on your 2nd roll is 1/4
the odds that you get two straight heads on your 3rd roll is (1/4)(1/2)... 1/2 is the odds that first roll is tails
the odds that you get two straight heads on your 4th roll is (1/4)*(1/2)*(1/2)

Downtown, you're overlooking the fact that there are two ways to create the event X=4

You can have the following for X=4 (where T is tails and H is heads)

T T H H
H T H H

The actual probability function of X=x is:

f(x) = F_(x-1) * (1/2)^x, where F_(x-1) represents the Fibonacci sequence and x=2,3,4,....

Derivation of f(x)
If you write it out, you will discover that there's

1 way to make X=3
2 ways to make X=4
3 ways to make X=5
5 ways to make X=6
8 ways to make X=7

The probability of two heads is (1/4) or equivalently, (1/2)^2
The probability of anything before that is (1/2)^(x-2).

Therefore f(x) = F_(x-1)(1/2)^(x-2)(1/2)^2
= F_(x-1)*(1/2)^x

If you check out the Wikipedia article on Fibonacci power series, you can show that the sum of f(x) from x=2 to x=infinity does equal 1, and thus f(x) is a proper probability mass function.

Regarding event Y...

Y is a little bit more difficult. Since X and Y are independent, P(X>Y | Y=y) = P(X>y) = sum of f(x) from x=y+1 to x=infinity or, equivalently, 1 - f(2) - f(3) - ... - f(y).

To get P(X>Y) over all Y, we must calculate the infinite sum:
P(Y=3)*P(X>3) + P(Y=4)*P(X>4) + P(Y=5)*P(X>5) + ...

The PMF of Y=y is:

g(y) = G_(y-2) * (1/2)^y for y=3,4,5,6,...

where G_k = G_(k-1) + G_(k-2) + G_(k-3) (the tribonacci recurrence) with the seeds G_0=0, G_1=1, and G_2=1

If you write it out, there's

1 way to make Y=4
2 ways to make Y=5
4 ways to make Y=6
7 ways to make Y=7
13 ways to make Y=8

.... and I don't know how to handle that sequence...

So.... that one problem probably takes the cake for hardest probability problem... perhaps there's some elegant solution to the problem
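With both PMFs in hand (Fibonacci for X, and the Y recurrence above, which is the tribonacci sequence), the infinite sum is easy to evaluate numerically. A sketch, truncating where both tails are geometrically negligible; it comes out to about 0.2125, so neither the 1/4 nor the 3/32 claimed elsewhere in the thread:

```python
# P(X > Y): X = tosses until 2 consecutive heads, Y = tosses until 3.
# f(x) = F_(x-1) / 2^x for x >= 2; g(y) = G_(y-2) / 2^y for y >= 3,
# where F is Fibonacci and G is tribonacci: 1, 1, 2, 4, 7, 13, ...
N = 400  # truncation point; both tails vanish geometrically

fib = [0, 1]      # fib[k] = F_k
trib = [0, 1, 1]  # trib[k] = G_k
for _ in range(2, N + 2):
    fib.append(fib[-1] + fib[-2])
for _ in range(3, N + 2):
    trib.append(trib[-1] + trib[-2] + trib[-3])

f = {x: fib[x - 1] / 2**x for x in range(2, N)}
g = {y: trib[y - 2] / 2**y for y in range(3, N)}

# Accumulate sum over y of P(Y=y) * P(X>y), where
# P(X>y) = 1 - f(2) - f(3) - ... - f(y).
p_x_gt = 1.0
answer = 0.0
for y in range(2, N):
    p_x_gt -= f.get(y, 0.0)
    answer += g.get(y, 0.0) * p_x_gt

print(round(answer, 4))  # ~0.2125
```

A first-step (Markov chain) analysis on the pair of streak states gives the same value, so the two approaches cross-check.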

Not a textbook, but covers just about every quant question you'll face in interviews:

http://www.amazon.com/Heard-Street-Quantitative-Qu...
If you want a textbook, here's an old, but well written probability text:

http://www.amazon.com/Probability-Stochastic-Proce...

Many thanks Monkey!

Also, anyone have a pdf of the solomon book?

Very true. The expectation operator is a linear one.

1-((1-.3)^3) = 65.7%

i was thinking more like this:

5 / 3! = 5 / 6 = 83%
6 combinations of scenarios:
FFF
FFW
FWF
WWW
WFW
WFF
the only scenario in which he fails to score in any game is FFF, so that leaves 5 successes

First, there are 8 possible combinations not 6 (2^3). Second, you're ignoring the fact that the chances of F and W are not equal. All you have to do to answer the question is calculate the probability of FFF which equals .7*.7*.7 which equals 34.3%. That's the probability that no goals are scored in any game. All of the other scenarios meet your requirement of at least 1 goal scored. So the answer is 1 minus .343.

Since saying "the odds he will score at least one" is the same as saying "1 minus the odds that he won't score any", we use the "no goals" scenario as our basis.

30% chance of scoring means 70% chance of not scoring.

Since each game is an independent event, the odds of him not scoring at any of the three is simply their product:
70% x 70% x 70% = (.70)^3 = 0.343

So since we have a 34.3% chance that he will not score anything, we know that there is a 100% - 34.3% = 65.7% chance he will score at least one.

this is a simple binomial question

1-0.7^3
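The complement trick above in two lines of Python, in case anyone wants to check:

```python
# "At least one game with a goal" is the complement of "no goals in
# all three independent games".
p_score = 0.3
p_at_least_one = 1 - (1 - p_score) ** 3
print(round(p_at_least_one, 3))  # 0.657
```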

as someone said, simple binomial problem

Prove that no three positive integers a, b, and c can satisfy the equation (a^n)+(b^n) = (c^n) for any integer value of n greater than two.

balbasur:

Prove that no three positive integers a, b, and c can satisfy the equations (a^n)+(b^n) = (c^n) for any integer value of n greater than two.

sorry, i don't think andrew wiles posts here anymore.

Art Vandelay:
balbasur:

Prove that no three positive integers a, b, and c can satisfy the equations (a^n)+(b^n) = (c^n) for any integer value of n greater than two.

sorry, i don't think andrew wiles posts here anymore.

never know <3

balbasur:

Prove that no three positive integers a, b, and c can satisfy the equations (a^n)+(b^n) = (c^n) for any integer value of n greater than two.

Haha, we did that in IB HL math (Grade 12)

• 1

you need to explain more specifically what you are trying to do. are you trying to get a qualitative understanding of the greeks?

are you looking for a textbook for an intro college probability course (with multivar + continuous distributions)? that will get you started in options, but to derive even a relatively simple model yourself (like black-scholes) you need to go down the rabbit hole into the world of stochastic calculus...

HopefulIBGuy:
balbasur:

Prove that no three positive integers a, b, and c can satisfy the equations (a^n)+(b^n) = (c^n) for any integer value of n greater than two.

Haha, we did that in IB HL math (Grade 12)

Sure, you solved this in grade 12! This is why it took more than 3 centuries to find a proof to this simple problem. It was only solved in 1994 by Andrew Wiles and Richard Taylor. (@ Art Vandelay: I'm not sure everyone got your post ;)

Read "Fermat's Last Theorem" by Simon Singh. Highly recommended!
Or have a quick look into wikipedia: http://en.wikipedia.org/wiki/Fermat%27s_Last_Theorem

If you are looking for something about the history of risk and probability, this was an interesting read.

Like structure sort of said, you will need to be very comfortable mathematically (relative to normal finance math).

That said, Shreve's Stochastic Calculus for Finance I and II (2 books) are great from what I've heard (although I've never read them past the introductions, the second book has 2 chapters introducing/reviewing probability theory through information & conditioning). Maybe see if your library has a copy before buying, though.

shreve's is the best stuff out there - actually explains most of the math instead of merely laying out the underlying assumptions.

in general, it's not probability that gets most people hung up on options pricing - it's the differential equations. for example, a lot of pricing books (e.g., Hull) will simply say 'and here we solve the Black-Scholes PDE' - but they don't always explain how they derive or solve it, or really what it means.

hrmph well i guess i'll look to see if my library has shreve's text book. thanks

yesman has it spot on. PDEs are very important for trading options.

Grimmett and Stirzaker if you are serious about it

and if he did, no one here would understand the proof anyway.

the probability in the interviews is nothing that is technically/theoretically advanced; it's really the foundational concepts of probability, like Bayes' theorem, but under a great amount of time pressure and in a brainteaser format.

Are you from IU?

interview probability will be simple EV type stuff that just gets marginally harder to make sure you can think analytically/prove you can do some sort of math. nothing mind boggling. know dice/coinflip EV etc..

Thanks. Guess I'll just look over some of my old notes, good to know it's not going to be too bad.

the answer to the snake question is NO, agree with ralph! Coin question is 4.25, and dice question is 3.5
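For what it's worth, the standard invariant argument (the point ralph and Compbanker made earlier) gives NO for 13/15/17: every meeting shifts each count by 2 mod 3, so the pairwise differences mod 3 never change. A quick sketch of the necessary-condition check:

```python
# Each meeting turns (x, y, z) into e.g. (x-1, y-1, z+2): every count
# changes by -1 or +2, i.e. by 2 mod 3, so pairwise differences are
# invariant mod 3. Ending with one color needs two counts at 0, which
# requires some pair of counts to differ by a multiple of 3.
def can_become_one_color(counts):
    x, y, z = counts
    return any((a - b) % 3 == 0 for a, b in [(x, y), (y, z), (x, z)])

print(can_become_one_color((13, 15, 17)))  # False: differences 2, 2, 4
print(can_become_one_color((13, 14, 17)))  # True: 17 - 14 = 3
```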

It depends if the cards are replaced or not...if not you need to make a path dependent tree.

blackthorne:

It depends if the cards are replaced or not...if not you need to make a path dependent tree.

doesn't matter. if u don't know what was picked then it's 25% whether you replace or not. Let me say it this way:

if there are 16 cards, same way as in the OP and if I picked 15 randomly and didn't replace, what are the odds that the 16th card is a club? 25%. Same with the first, second, etc.

if the card is replaced, it is 25%

if not, you need to take into account the first 3 cards

Born in hell, forged from suffering, hardened by pain.

Think of it as a permutation of the 16 cards. The probability of any one card out of the 16 being the 4th card is 15!/16! = 1/16. And since there're 4 clubs, the probability of the 4th card being a club is 4 x 1/16 = 1/4.

Miscer, the probabilities might match so you could be right; the tree will spit out 1/4 as well

Or you could do it like this.

16! combinations of cards; lock in the fourth as a club (4 ways to do this) and you end up with 15! ways of arranging the rest. Thus you get 4*15!/16! = 1/4. Yep you guys are right
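A quick simulation backs up the symmetry argument: shuffle the 16 cards, deal 3 face down with no replacement anywhere, and look at the 4th:

```python
import random

# 16 cards, 4 of each suit; deal 3 face down, then flip the 4th.
# By symmetry the 4th card should be a club 1/4 of the time.
random.seed(0)
deck = ["C", "D", "H", "S"] * 4
trials = 200_000
clubs = 0
for _ in range(trials):
    random.shuffle(deck)
    if deck[3] == "C":  # the 4th card dealt
        clubs += 1
print(round(clubs / trials, 3))  # ~0.25
```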

1. let's say A keep tossing a fair coin, until he get 2 consecutive heads, define X to be the number of tosses for this process; B keep tossing another fair coin, until he get 3 consecutive heads, define Y to be the number of the tosses for this process.

Calculate P{X>Y}

Solution: 3/32
Problem can be thought of as "what is the probability that B gets 3 consecutive heads and A doesn't get 2 consecutive heads". Doesn't matter when A gets his heads as long as B gets it first.

This covers all cases because this defines when X>Y, and it becomes an easy problem.

P{A doesn't get 2 consecutive heads} = 1 - P{A gets 2 consecutive heads} = 1 - (1/2)(1/2) = 3/4.
P{B gets 3 consecutive heads} = (1/2)*(1/2)*(1/2) = (1/8).

P{both occuring} = (3/4) * (1/8) = 3/32.

edit nvm i think this is wrong

Do past events affect future probabilities? I am going to assume no, and say that the team still has a 60% chance of winning their next game.

I agree with the past results (losing streak) not affecting the next game. Not sure about the probability, because you could make an argument about the .600 might not be a good predictor if it is not many games, and it probably depends on what team they are playing. Maybe I'm reading too much into it though haha.

Dude, you can't figure this out? That's very basic probability.

Either you consider the probability of losing 4 games then winning 1, P(LLLLW)=.4^4*.6^1 assuming independence, or you consider the probability of winning one game, 60%.

PS : obviously, P(LLLLL) < P(LLLLW)... but P(W|LLLL) = P(W).

of course it's basic, but you gave two answers, depending on how you interpret it. i prefer the former, but others can make the case for 60%.

"as the team continues losing (and assuming that the .600 success rate does not change)"

just a question, if the .600 is the win-loss RATIO (which doesn't seem the same as the probability of winning a single game), then how can that ratio NOT change if the team continues losing???

You're over analyzing it.

Assuming the question is exactly how you describe, using the memoryless property, past performances do not affect future performances. So the answer is yes. It will win with a probability of 60%.

I mean, I can see how it could be done with a weighted coin - you can theoretically adjust the weight 60-40, so it comes out heads 60% of the time and tails 40%. In this case we can say both the long term percentage of heads AND the probability for a single event will approach 60%.

So if we make the analogy that we have a 60-40 weighted coin, and it just came up tails the last 4 times, then we'd still expect it to come up heads the 5th time because of the weighting.

But I'm not sure you can say the same with a baseball game, so this is the crux:

Is the HISTORICAL winning percentage necessarily equal to the probability they will win a certain game?

I think i'm understanding this better. P(LLLLW) is really only the probability of a four game losing streak with one win, which is different than the probability of one win.

60% -If you flip a coin and get tails 4 times in a row, the probability of heads on the 5th flip is still .5.
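The weighted-coin analogy is easy to simulate: assuming independent 60-40 trials, conditioning on four straight losses leaves the win probability at 60%.

```python
import random

# A 60-40 "weighted coin": estimate P(win) right after 4 straight losses.
random.seed(1)
results = [random.random() < 0.6 for _ in range(500_000)]

wins = total = 0
for i in range(4, len(results)):
    if not any(results[i - 4:i]):  # previous 4 were all losses
        total += 1
        wins += results[i]
print(round(wins / total, 2))  # ~0.6
```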

what does a 747 weigh

but that was a recent consulting interview

I completely read the problem wrong, I believe drexel is correct.

^ Disagree. If I remember probability correctly, that would be the odds of four Queens in a row. Also, you should be multiplying, not adding. He's drawn 13 cards, and already knows the first is a Queen, so the question is what are the odds there are another three Queens in that. Intuitively, I think it would be:

((3/51)*12) * ((2/50)*11) * ((1/49)*10)

leaving out the first queen, you now have 51 cards to choose from to form the remaining 12 cards in your hand. There are (51 choose 12) possible hands.

Of these hands, there are (3 choose 3) * (48 choose 9) hands that have all 3 remaining queens in them

So your probability is (3 choose 3) * (48 choose 9) / (51 choose 12), which is roughly 1%
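jackpot's count, and the sequential version posted further down the thread, can both be evaluated with `math.comb`; they agree at about 1.06%:

```python
from math import comb

# Jackpot's count: the 12 face-down cards are drawn from the 51 unseen
# cards; favorable hands contain the 3 remaining queens and 9 of the
# other 48 cards.
p_comb = comb(3, 3) * comb(48, 9) / comb(51, 12)

# Sequential version: place the 3 queens in 3 of the 12 positions.
p_seq = (3 / 51) * (2 / 50) * (1 / 49) * comb(12, 3)

print(round(p_comb, 4))  # 0.0106
```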

i'm no math major, but wouldn't it be

P(four queens)/P(at least one queen) ? (edit)

we know the denominator is true

this would be

[(4 c 4)(48 c 9)/(52 c 13)] / [1 - (48 c 13)/(52 c 13)]

if my plugging-into-calc skills are good, this equals ~0.38%

I have taken a probability class and I agree with drex as well

Not a math major either but I agree with drexel

People like Coldplay and voted for the Nazis, you can't trust people Jeremy

Toronto:

I have taken a probability class and I agree with drex as well

Oh this is rich.

Drexel is actually wrong, jackpot779 is correct.

Toronto:

I have taken a probability class and I agree with drex as well

Oh this is rich.

Drexel is actually wrong, jackpot779 is correct.

Yes, I'm pretty sure jackpot is right.

Notice he said that you flip the first one over and it turns out to be a queen/king/ace/whatever. So the question really is what's the probability that there will be 3 more of the first card in the remaining 12 face down.

You assume that the other 3 of the same cards are somewhere within the 12 remaining cards. That leaves you 9 spots to fill, and 48 choices to fill them with (since the 4 of the same card are already there). Then count how many ways that can happen, which is (48 C 9).

Agree with jackpot's solution.

chewingum:

So the question really is what's the probability that there will be 3 more of the first card in the remaining 12 face down

Missing this speaks volumes of my attention to detail lol

Sorry OP, hope it wasn't that important...


I used a different method but also arrived at 1%.

oops I messed up

Wouldn't it just be (3/51)(2/50)(1/49) [The probability of doing it with the 3 cards] * 12 C 3 [# of ways to arrange that in 12 cards]

so (3/51)(2/50)(1/49) * 12!/(3!9!), which comes out to be about 1.06%

lol half the commenters didn't even read/understand the question.

jackpot is correct. Didn't read rest though.

jackpot is correct

John

thanks guys for posting your solutions

Shouldn't it be then (3 choose 3) * (48 choose 8) / (51 choose 12), since we have 8 (instead of 9) more cards to choose out of 48? There are four queens so it seems to me that we have to subtract 4 from 12, not 3.

jackpot is right i think

it is 9. out of the other 12 cards you have, you want to figure out how many combinations of hands with
- 3 queens and
- 9 other cards
there are

After you draw 4 queens, you have 9, not 8, more cards to choose because 4 + 9 = 13 and the problem says we drew 13 cards.

Competition is a sin.
-John D. Rockefeller

• 1