\documentclass{article}
\usepackage{amsmath, amsthm, amssymb, amsfonts}
\usepackage{mathrsfs}
\usepackage{algorithmic}
\usepackage{listings}
\usepackage{graphicx}
\usepackage{float}
\usepackage{enumerate}
\usepackage{fancyhdr}
\usepackage[left=0.75in, top=1in, right=0.75in, bottom=1in]{geometry}
\pagestyle{plain}
\newtheorem*{thm}{Theorem}
\newtheorem*{cor}{Corollary}
\newtheorem*{claim}{Claim}
\floatname{algorithm}{Procedure}
\renewcommand{\algorithmicrequire}{\textbf{Input:}}
\renewcommand{\algorithmicensure}{\textbf{Output:}}
\begin{document}
\rhead{Maxwell Anselm\\MATH4335 HW1\\10/11/11\\1.7,10,24, Supplementary problems}
\thispagestyle{fancy}
$\quad$ %for vertical space
\section*{1.7}
A Poisson distribution is a discrete distribution used to count how many events
occur over a period of time when
\begin{itemize}
\item the events occur independently of each other and
\item the expected number of occurrences over a fixed amount of time is
constant.
\end{itemize}
In such situations, the probability of $k$ events occurring is
\[P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}\]
where $\lambda$ is the expected number of events in that time period.
The textbook application of the Poisson distribution is the radioactive decay
of atoms, where the average rate of emission is known but the individual
emissions occur independently. Poisson distributions need not be over time,
however. One example is the mass production of chocolate chip cookies. We do
not know the exact number of chocolate chips being used, but we can measure the
expected number of chips per cookie, and it is safe to assume that chips land
in cookies independently. We can thus use a Poisson distribution to calculate
the probability of a cookie having a specific number of chips.
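As a quick numerical illustration of the pmf above (the figure of 6 chips per
cookie on average is made up for the example):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with mean lam."""
    return lam ** k * exp(-lam) / factorial(k)

# Hypothetical cookie line averaging lam = 6 chips per cookie.
lam = 6
probs = [poisson_pmf(k, lam) for k in range(30)]
print(poisson_pmf(4, lam))  # probability of exactly 4 chips, about 0.134
print(sum(probs))           # the pmf sums (essentially) to 1
```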
\section*{1.10}
\begin{itemize}
\item[(a)] Let $n$ be the number of days in the year. Let our sample space $S$
be the set of all sequences of birthdays for $n+1$ people. Let $X$ be the
number of people who have entered the room when we get the first repeat and let
$X_i$ be the indicator variable defined by
\[X_i(s) = \begin{cases}
1 & \text{if the $i^\text{th}$ person is the first repeat}\\
0 & \text{otherwise}
\end{cases}\]
for all $s \in S$. Then the expected value is
\[E(X) = \sum_{s \in S} P(s) X(s) = \sum_{s \in S} P(s) \sum_{i = 1}^{n + 1} i
X_i(s) = \sum_{i = 1}^{n + 1}i\sum_{s \in S}X_i(s)P(s).\]
But the inner sum is simply the probability that the first repeat occurs when
the $i^\text{th}$ person enters the room. Thus
\[\sum_{s \in S}X_i(s)P(s) = \left(\frac{n!}{(n - i + 1)!} (i - 1) n^{n + 1 - i
}\right) \frac{1}{n^{n + 1}}\]
because the first $i - 1$ people must have distinct birthdays, person $i$'s
birthday must collide with one of those $i - 1$, and the remaining $n + 1 - i$
people may have whatever birthdays we want. Thus
\[E(X) = \sum_{i = 1}^{n + 1} \frac{i (i - 1) n!}{n^i (n - i + 1)!}\]
For $n = 365$ we get $E(X) \approx 24.61658$.
\item[(b)]
\[P(X \leq N) = 1 - P(X > N) = 1 - \frac{n}{n} \cdot \frac{n - 1}{n} \cdots
\frac{n - N + 1}{n} = 1 - \frac{n!}{n^N (n - N)!}\]
Using a computer program to iterate over $N$, we find that the first value for
which this probability exceeds $\frac{1}{2}$ is $N = 23$.
\item[(c)] As in part (b), we get that $N = 58$.
\item[(d)] If we have 365 or fewer people, it is possible for each person
to have a unique birthday. If, however, we have 366 people then by the
pigeonhole principle there must be two people who share a birthday.
\item[(e)] $\sqrt{365} \approx 19.1$
\end{itemize}
\section*{1.24}
Let $A$ and $B$ be the events that 1 and 2, respectively, are fixed points of a
permutation of $\{1, \ldots, n\}$. Then
\[P(A \cap B) = \frac{(n-2)!}{n!}\]
\[P(A^C \cap B) = \frac{(n - 2)(n - 2)!}{n!}\]
For the second probability, a permutation fixing 2 but not 1 must send 1 to one
of the $n - 2$ values other than 1 and 2, after which the remaining $n - 2$
elements may be arranged arbitrarily, giving $(n - 2)(n - 2)!$ such
permutations.
We can perform a sanity check with these probabilities, since
\[P(A \cap B) + P(A^C \cap B) = \frac{(n - 2)! + (n - 2)(n - 2)!}{n!} = \frac{(n
- 2)!(1 + n - 2)}{n!} = \frac{(n - 1)!}{n!} = \frac{1}{n} = P(B)\]
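Circling back to 1.10, the figures quoted in parts (a) and (b) came from "a
computer program"; the sketch below (function names are my own, not necessarily
the program actually used) reproduces them from the formulas derived there:

```python
def expected_first_repeat(n):
    """E(X) from 1.10(a): sum over i of i * P(first repeat is person i)."""
    total = 0.0
    p_distinct = 1.0  # P(the first i - 1 people all have distinct birthdays)
    for i in range(1, n + 2):
        # i(i-1) n! / (n^i (n-i+1)!) rewritten as i(i-1)/n times p_distinct
        total += i * (i - 1) / n * p_distinct
        p_distinct *= (n - i + 1) / n
    return total

def smallest_majority_N(n):
    """Smallest N with P(X <= N) > 1/2, iterating as in 1.10(b)."""
    p_no_repeat = 1.0
    N = 0
    while True:
        N += 1
        p_no_repeat *= (n - N + 1) / n
        if 1 - p_no_repeat > 0.5:
            return N

print(expected_first_repeat(365))  # matches the value in part (a)
print(smallest_majority_N(365))    # 23
```

Running the expectation iteratively, rather than with `factorial`, avoids
overflow from $365!$.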
We can also calculate
\[P(B|A) = \frac{n (n - 2)!}{n!} = \frac{(n - 2)!}{(n - 1)!} = \frac{1}{n - 1}\]
\[P(B|A^C) = \frac{(n - 2)(n - 2)!}{(n - 1)(n - 1)!} = \frac{n - 2}{(n - 1)^2}.\]
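These fixed-point probabilities are small enough to verify by brute force over
all permutations of a small set (the choice $n = 6$ is arbitrary):

```python
from itertools import permutations
from math import factorial

def fixed_point_probs(n):
    """Brute-force P(A and B) and P(not-A and B) over all permutations of
    {1, ..., n}, where A = {1 is fixed} and B = {2 is fixed}."""
    both = b_not_a = 0
    for perm in permutations(range(1, n + 1)):
        a, b = perm[0] == 1, perm[1] == 2
        both += a and b
        b_not_a += (not a) and b
    total = factorial(n)
    return both / total, b_not_a / total

n = 6
p_ab, p_acb = fixed_point_probs(n)
print(p_ab, factorial(n - 2) / factorial(n))              # P(A and B)
print(p_acb, (n - 2) * factorial(n - 2) / factorial(n))   # P(not-A and B)
print(p_ab / (1 / n))       # P(B|A), should be 1/(n-1)
print(p_acb / (1 - 1 / n))  # P(B|not-A), should be (n-2)/(n-1)^2
```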
Let $X$ and $Y$ be random variables that can take on values $x_j$ and $y_k$ for
$j \in J$, $k \in K$, respectively. Then
\begin{align*}E(X + Y) &= \sum_j\sum_k(x_j + y_k)P(X = x_j, Y = y_k)\\
&= \sum_j\sum_k x_j P(X = x_j, Y = y_k) + \sum_k\sum_j y_k P(X = x_j, Y = y_k)\\
&= \sum_j x_j P(X = x_j) + \sum_k y_k P(Y = y_k)\\
&= E(X) + E(Y)
\end{align*}
Since nothing was assumed about $X$ and $Y$ (in particular, we did not assume
independence), this relationship holds for any two random variables, and by
induction for any finite sum. Thus for
\[X = \sum_{i = 1}^nX_i,\]\[E(X) = \sum_{i = 1}^nE(X_i).\]
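The derivation above can be spot-checked exactly on a small joint distribution
(the numbers below are arbitrary and deliberately chosen so that $X$ and $Y$
are not independent):

```python
# Exact check of E(X + Y) = E(X) + E(Y) on a made-up joint pmf.
joint = {  # (x, y): P(X = x, Y = y); probabilities sum to 1
    (0, 0): 0.1, (0, 1): 0.3,
    (1, 0): 0.4, (1, 1): 0.2,
}
e_sum = sum((x + y) * p for (x, y), p in joint.items())
e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
# Equal even though P(X=0, Y=0) = 0.1 != P(X=0) * P(Y=0) = 0.4 * 0.5.
print(e_sum, e_x + e_y)
```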
\section*{Supplement}
Note Stirling's approximation:
\[n! \sim \sqrt{2\pi n}\left(\frac{n}{e}\right)^n\]
Then we can approximate
\[\prod_{j = 0}^{k - 1}\frac{n - j}{n} = \frac{n!}{n^k (n-k)!} \approx
\frac{\sqrt{2\pi n}}{n^k \sqrt{2\pi (n - k)}} \left(\frac{n}{e}\right)^n
\left(\frac{e}{n-k}\right)^{n-k} = \left(\frac{n}{n-k}\right)^{n - k +
\frac{1}{2}} e^{-k}\]
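To see how sharp this approximation is, we can compare the exact product with
the closed form at birthday-problem values (the choice $n = 365$, $k = 23$ is
mine, matching 1.10(b)):

```python
from math import exp, prod

def exact_product(n, k):
    """The left-hand side: product over j = 0..k-1 of (n - j) / n."""
    return prod((n - j) / n for j in range(k))

def stirling_approx(n, k):
    """The right-hand side: (n / (n - k))^(n - k + 1/2) * e^(-k)."""
    return (n / (n - k)) ** (n - k + 0.5) * exp(-k)

n, k = 365, 23
print(exact_product(n, k), stirling_approx(n, k))  # agree to several decimals
```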
\end{document}