Hacker News

Isn't the pdf of a uniform distribution just f(x) = 1? What he gave was the pdf of an exponential distribution with rate parameter lambda. The rest of the math seems correct, but the dependence on e is not surprising, since e already appears in the pdf he started from. Am I missing something?


> Am I missing something?

No, I was confused too. The maths doesn't make sense to me and the whole blog seems to exist just to push amazon affiliate links.

If you are actually interested in probability theory, I highly recommend this (non-affiliate-link) book:

http://www.amazon.co.uk/Probability-Computing-Randomized-Alg...

It does a good job of motivating the subject by applying the new theory in each chapter to the study of useful randomized algorithms.


It's very easy to verify experimentally that e is likely the correct answer.


Doubly confirmed. 2.71832 over 100M trials.

  import random

  trials = 100000000  # 100M trials
  totalsteps = 0
  for i in range(trials):
      total = 0.0
      steps = 0
      while total < 1.0:
          total += random.random()  # uniform on [0, 1)
          steps += 1
      totalsteps += steps

  print('AVG STEPS: ' + str(totalsteps / trials))


This is almost exactly what I wrote. I've done more than 4 billion trials and I have 2.7182718267 (PyPy speeds it up a lot).
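Another way to get throughput without PyPy is to vectorize the simulation. Here's a NumPy sketch (mine, not from the thread): draw a fixed-width batch of uniforms per trial, take cumulative sums, and count how many draws each trial needed before crossing 1.

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 250_000
max_draws = 20  # P(N > 20) = 1/20! is astronomically small, so 20 columns suffice

u = rng.random((trials, max_draws))
# steps per trial = (number of partial sums still below 1) + (the draw that crosses 1)
steps = (np.cumsum(u, axis=1) < 1.0).sum(axis=1) + 1
print(steps.mean())  # close to e = 2.71828...
```

The width cap is the only approximation: a trial that somehow needed more than 20 draws would be undercounted, but that happens with probability about 4e-19.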


It's funny. I read the problem, estimated that it was "probably 2 or 3" and then wasn't all that surprised to find that the solution was e. The derivation of it is pretty cool, too.


:) I was too fast for my own good and DoS-ed my own box with this loop; it took me 5 minutes to stop it and replace range with xrange.


That's because you're using an obsolete Python :)


Confirmed again:

    perl -e 'my $testcount = 10_000_000; my $total = 0; foreach my $i (1 .. $testcount) { my $sum = 0; while ($sum < 1) { $sum += rand(); ++$total } } print $total / $testcount'

    2.7186292


I confirmed it for fun. Got 2.746 over 10000 trials. Close enough.

  (average (map-n (fn ()
		    (1+ (position-if (let1 x 0
				       (fni (< 1.0 (incf x (random 1.0)))))
				     (range 100))))
		  10000))


Yep: http://codepad.org/gIaWdVY9

EDIT: Beaten to it :)


You're right. I think the article's reasoning is fallacious. At least, I don't see what the Poisson distribution has to do with this at all.

One way to arrive at the answer correctly is to write N for the number of samples needed before their sum first exceeds 1, and observe that P[N > k] = 1/k!. This is the probability that k uniforms sum to less than 1, which is the volume of a k-dimensional simplex; you can get it from a k-dimensional integral. By the general formula E[N] = Sum_{k=0}^infty P[N > k], we get E[N] = Sum_{k=0}^infty 1/k! = e.
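The identity P[N > k] = 1/k! is easy to sanity-check numerically. Here's a small sketch (mine, not from the thread) comparing empirical tail probabilities against 1/k!:

```python
import math
import random

random.seed(1)
trials = 200_000
max_k = 6

# tail_counts[j] counts trials where N > j, i.e. the first j draws summed to < 1
tail_counts = [0] * max_k
for _ in range(trials):
    s, n = 0.0, 0
    while s < 1.0:
        s += random.random()
        n += 1
    # this trial had N = n, so the event N > j occurred for every j < n
    for j in range(min(n, max_k)):
        tail_counts[j] += 1

for j in range(max_k):
    print(j, tail_counts[j] / trials, 1 / math.factorial(j))
```

Summing the tail probabilities column recovers E[N] = 1 + 1 + 1/2 + 1/6 + ... = e.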


Yes, you're right; the Poisson distribution is completely superfluous to the argument.


You are correct.



