
Probability notions for Star-Breaker Poster!

Let's start by saying that the probability of an event, let's call it p, is a number between 0 and 1.
So 0 <= p <= 1.
For example, the probability of rolling a 4 with a balanced six-faced die is 1/6.
The sum of the probabilities of all the possible outcomes must be 1.
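Just to make these basic facts concrete, here's a tiny Python sketch (the names below are mine, purely for illustration, not part of the original notes):

```python
from fractions import Fraction

# A balanced six-sided die: each face has probability 1/6
faces = {face: Fraction(1, 6) for face in range(1, 7)}

# Every probability lies between 0 and 1...
assert all(0 <= p <= 1 for p in faces.values())
# ...and the probabilities of all possible outcomes sum to 1
assert sum(faces.values()) == 1

print(faces[4])  # 1/6, the chance of rolling a 4
```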

The models used to describe the probability of an event happening are called distributions. I'll roughly explain a few of those that operate on the natural numbers.

Bernoulli distribution: let's imagine an event with only two possible outcomes, like a coin toss. Let's assign the value 1 to heads and 0 to tails.
Considering a generic coin (not necessarily balanced), if we call p the chance of getting heads, then 1 - p must be the chance of getting tails.
The Bernoulli distribution, denoted as X = B(1, p), tells us that the chance of obtaining a certain value in this scenario is:
P(X = i) = (p^i)*((1-p)^(1-i)) with i = 0, 1
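Here's a small Python sketch of that formula (bernoulli_pmf is just a name I made up for illustration):

```python
def bernoulli_pmf(i: int, p: float) -> float:
    """P(X = i) for X = B(1, p), with i = 0 (tails) or i = 1 (heads)."""
    assert i in (0, 1)
    return (p ** i) * ((1 - p) ** (1 - i))

# Balanced coin: heads and tails are equally likely
print(bernoulli_pmf(1, 0.5))  # 0.5, chance of heads
print(bernoulli_pmf(0, 0.5))  # 0.5, chance of tails

# Unbalanced coin with a 70% chance of heads
print(bernoulli_pmf(0, 0.7))  # about 0.3, chance of tails
```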

This works for a single coin toss. Summing n independent copies of the same distribution B(1, p) we obtain the binomial distribution.

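As a rough illustration of that statement (again a sketch with made-up names), summing batches of Bernoulli outcomes gives counts that behave like the binomial distribution described next:

```python
import random
from collections import Counter

def bernoulli_sample(p: float) -> int:
    """One B(1, p) sample: 1 for success (heads), 0 for failure (tails)."""
    return 1 if random.random() < p else 0

# Sum 5 fair-coin Bernoulli samples, repeated over many batches
counts = Counter(sum(bernoulli_sample(0.5) for _ in range(5)) for _ in range(100_000))

# The fraction of batches with exactly 3 heads should land near 10/32 = 0.3125,
# the binomial probability computed in the example below
print(counts[3] / 100_000)
```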
Binomial distribution: the binomial distribution, denoted as X = B(n, p), models a scenario where we have n independent trials, each with probability p of success.
n must be a natural number >= 1.
Given a natural number k with k <= n, the chance of obtaining k successes in n trials is P(X = k) = (n k)*(p^k)*((1-p)^(n-k)), where (n k) is the binomial coefficient "n choose k" = n!/(k!*(n-k)!).
Example: let's suppose we want to know the chance of getting heads three times with a balanced coin. p = 1/2 is the chance of getting heads, and 1 - p = 1/2 is the chance of getting tails; expectedly they're equal.
Let's suppose we'll execute 5 throws. This means n = 5, so our distribution is X = B(5, 1/2).
The chance of getting three heads is then P(X = 3) = (5 3)*((1/2)^3)*((1-1/2)^(5-3)) = 10*((1/2)^3)*((1/2)^2) = 10/32.
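A minimal Python sketch of this (binomial_pmf is an illustrative name; math.comb computes the binomial coefficient):

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) for X = B(n, p): k successes in n independent trials."""
    return comb(n, k) * (p ** k) * ((1 - p) ** (n - k))

# The example above: three heads in five tosses of a balanced coin
print(binomial_pmf(3, 5, 0.5))  # 0.3125 = 10/32
```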

The binomial distribution can be used in any scenario where the chances of success and failure do not change from one trial to the next.
For example, it can be used in the case of a lottery where each number is reinserted after each extraction.
If we have a scenario where the odds change with each trial, we need another distribution.

Hypergeometric distribution: let's suppose we have a box with r+b balls, r of which are red and the other b blue.
r and b must be natural numbers.
We'll be extracting n balls without reinserting them in the box, with n <= r+b.
Our model is now a hypergeometric distribution, denoted as X = Hyp(n, r, b).
Given a natural number k with k <= r and k <= n, the chance of extracting k red balls in n draws is P(X = k) = ((r k)*(b n-k))/(r+b n).
Red and blue balls are only an example: the hypergeometric distribution can be used in any scenario where there are two distinct sets of results and the draws are made without reinsertion.
Example: let's model Italy's biggest lottery. 6 numbers are extracted without reinsertion from the set {1,...,90}. If you guess all of them, you win the jackpot.
So we have n = 6 and r+b = 90. Our red balls will be the numbers we have chosen to play, so r = 6 and b = 90 - 6 = 84.
Since we have to guess all of them, k = 6.
So our model is X = Hyp(6, 6, 84) and the chance of winning the jackpot is P(X = 6) = ((6 6)*(84 0))/(90 6) = 1/622,614,630, around 1.6*10^(-9).
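Here's a quick Python check of this (hypergeom_pmf is my own illustrative name, following the (n, r, b) parameters above):

```python
from math import comb

def hypergeom_pmf(k: int, n: int, r: int, b: int) -> float:
    """P(X = k) for X = Hyp(n, r, b): k red balls in n draws without reinsertion."""
    return comb(r, k) * comb(b, n - k) / comb(r + b, n)

# Italian lottery: guess all 6 winning numbers out of 90
print(comb(90, 6))                 # 622614630 possible draws
print(hypergeom_pmf(6, 6, 6, 84))  # about 1.6e-09, i.e. 1 in 622,614,630
```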
This is why lotteries are a scam, but there's still more to add later on.

Last but not least, we may want to know how likely it is to get a success after a certain number of failed trials.
This is what we need the modified geometric distribution for.

Modified geometric distribution: let p be the probability of success of a certain event on each attempt.
The modified geometric distribution, denoted as X = Geom_mod(p), tells us the chance that the first success arrives after exactly k failed attempts, with k a natural number (k = 0, 1, 2, ...).
The chance is P(X = k) = p*(1-p)^k.
There's one more interesting thing about the modified geometric distribution.
The average of such a distribution is (1-p)/p, the average number of failures before the first success. Counting the successful attempt as well, the average number of attempts it takes to have a success is (1-p)/p + 1 = 1/p, and since 0 < p <= 1 we have 1/p >= 1.
Going back to the example of the Italian lottery, it means that it takes on average around 623 million attempts to win the jackpot. Is it really worth it?
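To close, a small Python sketch of the pmf plus a simulation of the "average number of attempts" claim (names are illustrative; I use the die example with p = 1/6 so the simulation stays fast):

```python
import random

def mod_geometric_pmf(k: int, p: float) -> float:
    """P(X = k): probability that the first success comes after exactly k failures."""
    return p * (1 - p) ** k

def attempts_until_success(p: float) -> int:
    """Simulate attempts with success probability p, counting up to and including the first success."""
    attempts = 1
    while random.random() >= p:  # failure: try again
        attempts += 1
    return attempts

p = 1 / 6  # e.g. rolling a 4 with a balanced die
print(mod_geometric_pmf(0, p))  # about 0.167: success on the very first attempt

runs = [attempts_until_success(p) for _ in range(100_000)]
print(sum(runs) / len(runs))    # close to 1/p = 6 attempts on average
```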