\documentclass[10pt, a4paper]{scrartcl}
\usepackage{amsmath}
\usepackage{amssymb}
\usepackage{graphicx}
\usepackage{blindtext}
\usepackage[utf8]{inputenc}
\usepackage{tikz}
\usepackage{dsfont}
\usepackage[english]{babel}
\usepackage{fullpage}
\usepackage{wasysym}
\usepackage{algpseudocode}
\title{Randomized Algorithms and Probabilistic Methods: Graded Homework 2}
\author{Stefan Lochau - 12-913-752}
\newcommand\abs[1]{\left|#1\right|}
\newcommand\pr[1]{\text{Pr}\left[#1\right]}
\begin{document}
\maketitle
\section*{Exercise 1}
We first define the probability space $\Omega=\prod\limits_{e\in E} \Omega_e$ as the product of the spaces $\Omega_e=([k],\text{uniform distribution})$. We note that $\abs{E}\leq\frac{n \cdot d}{2}$. Moreover, the influence $c_e$ of each coordinate $e\in E$ on $X$ is at most $k$. Applying Azuma's inequality therefore yields:
$$ \pr{ \abs{X-\mathbb{E}[X]} \geq \alpha \sqrt{n} } \leq 2 e^{-\frac{\alpha^2 n}{2\sum\limits_{e\in E} c_e^2}}\leq 2 e^{-\frac{\alpha^2 n}{nd k^2}}= 2 e^{-\frac{\alpha^2}{d k^2}} = 2 e^{-\omega(1)} = o(1)$$
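As a quick sanity check on the denominator (not part of the exercise statement), the bounded differences satisfy
$$2\sum_{e\in E} c_e^2 \leq 2\abs{E}\, k^2 \leq 2\cdot\frac{nd}{2}\cdot k^2 = ndk^2,$$
which is exactly the bound plugged into the exponent above.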
\section*{Exercise 2}
\subsection*{a)}
We define $X_{v_i}$ as the indicator variable of the event that $v_0$ informs $v_i$ within the first $64 \log(n)$ rounds, and set $X=\sum_{i=1}^n X_{v_i}$. Furthermore
$$\mu:=\mathbb{E}\left[X\right]=\sum_{i=1}^n \pr{X_{v_i}=1} = \sum_{i=1}^n \left(1-\pr{X_{v_i}=0}\right) = n - n\left(1-\frac{1}{n}\right)^{64\log(n)}.$$
We are interested in $\pr{ X \leq 16 \log(n) }$. As this can be interpreted as a ``balls into bins'' problem, the $X_{v_i}$ are negatively associated, so the Chernoff bound applies. With $\delta := 1-\frac{16\log(n)}{\mu}$ we get
$$ \pr{ X \leq 16 \log(n) } = \pr{ X \leq (1- \delta)\mu } \leq e^{-\frac{\mu \delta^2}{2}}.$$
To bound the exponent we estimate $\mu$. Using $1-x \leq e^{-x}$ and $1-e^{-y}\geq y-\frac{y^2}{2}$ for $y\geq 0$, we get
$$\mu = n\left(1-\left(1-\frac{1}{n}\right)^{64\log(n)}\right) \geq n\left(1-e^{-\frac{64\log(n)}{n}}\right) \geq 64\log(n)-\frac{(64\log(n))^2}{2n} = (1-o(1))\,64\log(n).$$
Hence for $n$ large enough $\mu \geq 63\log(n)$ and $\delta \geq 1-\frac{16}{63}=\frac{47}{63}$, so
$$ \frac{\mu\delta^2}{2} \geq \frac{63\log(n)}{2}\cdot\left(\frac{47}{63}\right)^2 = \frac{47^2}{126}\log(n) > 17\log(n),$$
and therefore $\pr{X \leq 16 \log(n)} \leq n^{-17} \leq c \cdot n^{-16}$ for some $c$ and $n$ large enough.
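As a sanity check on the order of $\mu$ (not needed for the proof): by Bernoulli's inequality $\left(1-\frac{1}{n}\right)^{64\log(n)} \geq 1-\frac{64\log(n)}{n}$, hence
$$\mu = n - n\left(1-\frac{1}{n}\right)^{64\log(n)} \leq n\cdot\frac{64\log(n)}{n} = 64\log(n),$$
matching the intuition that $v_0$ makes only $64\log(n)$ calls and can therefore inform at most $64\log(n)$ nodes in expectation.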
\subsection*{b)}
We are interested in $p:=\text{Pr[at least half of the informed nodes send the information to uninformed nodes]}$. Suppose $k$ nodes are already informed. Each informed node independently calls an already informed node with probability $\frac{k}{n}$, so by a union bound over the $\binom{k}{k/2}$ possible sets of size $k/2$, the probability that at least $k/2$ of the informed nodes call already informed nodes is at most $\binom{k}{k/2} \left(\frac{k}{n}\right)^{k/2}$. Using $\binom{k}{k/2}\leq\left(\frac{ek}{k/2}\right)^{k/2}=(2e)^{k/2}$ we get $$p \geq 1- \binom{k}{k/2} \left(\frac{k}{n}\right)^{k/2} \geq 1- (2e)^{k/2} \left(\frac{k}{n}\right)^{k/2}=1-\left(\frac{2ek}{n}\right)^{k/2}$$
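To illustrate the bound (with a hypothetical threshold chosen only for this example): if $k \leq \frac{n}{8e}$, then $\frac{2ek}{n}\leq\frac{1}{4}$ and hence
$$p \geq 1-\left(\frac{2ek}{n}\right)^{k/2} \geq 1-4^{-k/2}=1-2^{-k},$$
so the failure probability decays exponentially in the number of informed nodes.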
\section*{Exercise 3}
\subsection*{a)}
We first note that the algorithm returns a valid vertex cover. Let $C= \text{VertexCover}(G, \prec)$ and assume it is not a vertex cover, i.e.\ there is an edge $e$ with $e \cap C = \emptyset$. This is impossible: the for-loop visited $e$, and in that iteration it would have added both endpoints of $e$ to $C$. Since vertices that are added to $C$ are never removed, this is a contradiction. In particular, $vc(G)\leq \abs{C}$.
We note that for all sets of vertices $C'$, we have that $\abs{C'\setminus C}+\abs{C'\cap}$
%%%%%%%%%% NEEDS STUFF
\subsection*{b)}
We need to show that ``$e$ is used to construct $C$'' $\Leftrightarrow$ ``no edge $e'$ incident to $e$ with $e' \prec e$ was used''.
The left side is equivalent to ``$e \cap C = \emptyset$ at the start of the iteration of $e$''.
\subsubsection*{``$\Leftarrow$''}
As no incident edge $e' \prec e$ was used to construct $C$, neither endpoint of $e$ can have been added to $C$ before the iteration of $e$, so $e$ is used.
\subsubsection*{``$\Rightarrow$''}
By contraposition: if some incident edge $e' \prec e$ was used to construct $C$, then the vertex incident to both $e$ and $e'$ was added in the iteration of $e'$. Hence in the later iteration of $e$ we have $e\cap C \neq \emptyset$, as desired.
$\hfill \square$
\subsection*{c)}
Starting from a fixed vertex, there are at most $d^t$ paths of length $t$. Under a uniformly random ordering of the edges, a fixed path of length $t$ is decreasing with probability $1/t!$. Taking a union bound over the at most $d^t$ paths starting in each of the two endpoints of our fixed edge $e$, the probability that some decreasing path of length $t$ starts in $e$ is at most $2d^t/t!$.
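As a consequence (a sketch using Stirling's bound $t!\geq(t/e)^t$, not required by the exercise): for $t\geq 2ed$ we have $\frac{ed}{t}\leq\frac{1}{2}$, so
$$\frac{2d^t}{t!}\leq 2\left(\frac{ed}{t}\right)^t\leq 2^{1-t},$$
and a union bound over all lengths $t\geq T$ (for any $T\geq 2ed$) bounds the probability of a decreasing path of length at least $T$ starting in $e$ by $\sum_{t\geq T}2^{1-t}=2^{2-T}$.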
\subsection*{d)}
\section*{Exercise 4}
\subsection*{a)}
We abbreviate $\mathcal{H}(n \times k,p)$ by $\mathcal{H}$.
We note that $\pr{ e \text{ isolated} \land e \in \mathcal{H} }=\pr{e \text{ isolated}\mid e \in \mathcal{H}} \pr{e \in \mathcal{H}}$. The second factor equals $p$, so we need to estimate the first one. There are $\sum\limits_{j=1}^{k-1} \binom{k}{j} (n-1)^j$ possible edges that share vertices with our fixed edge $e$. Each of them is absent from the graph with probability $1-p$, independently, so using $1-x \geq e^{-x-x^2}$ we get $$\pr{e \text{ isolated}\mid e \in \mathcal{H}}= (1-p)^{\sum\limits_{j=1}^{k-1} \binom{k}{j} (n-1)^j}\geq (1-p)^{kn^{k-1}+O(n^{k-2})}\geq e^{(-p-p^2)(kn^{k-1}+O(n^{k-2}))} =$$
$$e^{\left(-\frac{\delta}{n^{k-1}}-\frac{\delta^2}{n^{2(k-1)}}\right)(kn^{k-1}+O(n^{k-2}))}\geq e^{-\delta k (1+O(n^{-1}))} \stackrel{n \text{ large enough}}{\geq} e^{-\delta k}-\beta\geq 1-2\beta$$
Going back to the initial problem $\pr{ e \text{ isolated} \land e \in \mathcal{H} }$, we obtain the lower bound $p (1-2\beta)=\delta(1-2\beta)n^{-(k-1)}$, as desired.
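For completeness, the leading term of the exponent can be checked directly (a short verification using only the sum above):
$$\sum_{j=1}^{k-1}\binom{k}{j}(n-1)^j = \binom{k}{k-1}(n-1)^{k-1}+O(n^{k-2}) = kn^{k-1}+O(n^{k-2}),$$
since every summand with $j\leq k-2$ is $O(n^{k-2})$.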
\subsection*{b)}
With $\Omega_i$ a $\text{Bernoulli}(p)$ variable for each possible edge $e\in V_1\times \dots \times V_k$, and with a maximal matching $J$ as certificate, $\psi(r)=r$ satisfies the condition of Talagrand's inequality. Furthermore $c_i=1$ as required. Thus we can apply Lemma 8.2 and, together with $\mathbb{E}[X]+f(n) \sqrt{n} \in O(n)$, get:
$$ \pr{\abs{X-\mathbb{E}[X]}\geq f(n) \sqrt{n}} \leq 4 e^{-\Omega\left( \frac{f(n)^2 n}{\mathbb{E}[X]+f(n) \sqrt{n}}\right)}= 4 e^{-\Omega\left( f(n)^2 \right)}$$
$\hfill \square$
\end{document}