\documentclass[10pt]{article}
\usepackage[margin=0.8in]{geometry}
\usepackage[font=footnotesize]{caption}
\usepackage{subfig}
\usepackage{setspace}
\bibliographystyle{abbrv}
\renewcommand{\labelenumiii}{(\alph{enumiii})}
\begin{document}
\title{\bfseries\Large Multilinear Functions}
\author{Vivek Bhattacharya}
\date{January 28, 2010}
\begin{singlespace}
\maketitle
\end{singlespace}
\begin{enumerate}
\item Call the vector spaces $U$ and $V$, with ordered bases $\{u_i\}$ and $\{v_i\}$.
Suppose $A$ has rank 0. Then choose $\omega_1$ and $\omega_2$ to
be the zero functionals. Now suppose $A$ has rank 1. Then the linear
transformation $L$ corresponding to $A$ takes any vector $q \in \mathbb{R}^{n_2}$
to a vector of the form $c_q p$, where $c_q \in \mathbb{R}$ depends on $q$ and $p$ is a fixed
vector in $\mathbb{R}^{n_1}$. Let the components of $p$ be denoted $p_i$ in some
ordered basis of $\mathbb{R}^{n_1}$. If $q = \sum_{j=1}^{n_2} q_j v_j$, where $q_j$ are the
components of $q$ in some ordered basis, then
$(Lq)_i = \sum_{j=1}^{n_2} q_j \mu(u_i, v_j) = \mu(u_i, \sum_{j=1}^{n_2} q_jv_j) = c_q p_i$.
Define $\omega_1 \in U^*$ such that $\omega_1(u_i) = p_i$, and define $\omega_2 \in V^*$
such that $\omega_2(\sum_{j=1}^{n_2} q_jv_j) = c_q$; this is linear in $q$ because $L$ is. Then $\mu = \omega_1\omega_2$.
Suppose conversely that $\mu = \omega_1\omega_2$. We wish to show that there exist
$c_q \in \mathbb{R}$ and $p \in \mathbb{R}^{n_1}$ such that $Lq = c_q p$ for all $q \in \mathbb{R}^{n_2}$.
Note that we can write $(Lq)_i = \sum_{j=1}^{n_2} q_j \mu(u_i, v_j) = \sum_{j=1}^{n_2} q_j \omega_1(u_i) \omega_2(v_j)$.
We can write this as $\omega_1(u_i) \left(\sum_{j=1}^{n_2} q_j \omega_2(v_j)\right)$. Choose $c_q = \sum_{j=1}^{n_2} q_j \omega_2(v_j)$
and $p$ such that $p_i = \omega_1(u_i)$. We have shown that the rank of $L$ (and thus
of $A$) is at most 1. (The rank will be 0 if $c_q$ is 0 for all $q$, or if $p = 0$.)
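As a concrete illustration (an example added here, not part of the original solution), take $n_1 = n_2 = 2$ and write a rank-1 matrix as an outer product:
```latex
% Hypothetical example: A = pq^T with p = (1, 2)^T and q = (3, 1)^T.
\[
A = p q^T = \begin{pmatrix} 3 & 1 \\ 6 & 2 \end{pmatrix},
\qquad \mu(u_i, v_j) = A_{ij} = p_i q_j,
\]
% so setting \omega_1(u_i) = p_i and \omega_2(v_j) = q_j gives
% \mu(a, b) = \omega_1(a)\,\omega_2(b) for all a \in U, b \in V.
```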
\item (First show that...)
We have thus shown that $\{v_1^*\cdots v_m^*w\}$ spans the space. To show these elements
are linearly independent, consider the linear combination
\[\sum_\alpha c_\alpha v_1^*\cdots v_m^* w,\]
where $\alpha$ is an index that covers the entire set $B_1 \times \cdots \times B_m \times C$.
For the above combination to equal 0, it must give 0 when applied to
an arbitrary tuple $(a_1, \cdots, a_m)$, where $a_i \in V_i$. Then,
\[\sum_\alpha c_\alpha v_1^*(a_1)\cdots v_m^*(a_m) w = 0.\]
But note that if $b \in B_i$, then $b^*(a_i)$ is the component
of $a_i$ along $b$. If we choose $a_i \in B_i$ for all $i$, it becomes
clear that the only way to ensure the above sum is 0 is to let $c_\alpha = 0$ for all $\alpha$.
This shows linear independence.
The dimension of the space is therefore $\left(\prod_{i=1}^m \dim V_i\right)\dim W$, obtained by counting the basis elements $v_1^*\cdots v_m^* w$.
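As a quick sanity check on the count (a worked example not in the original), consider two small factor spaces:
```latex
% Example: V_1 = V_2 = \mathbb{R}^2 and W = \mathbb{R}^3.
% The basis elements v_1^* v_2^* w are indexed by B_1 \times B_2 \times C, so
\[
\dim = (\dim V_1)(\dim V_2)(\dim W) = 2 \cdot 2 \cdot 3 = 12.
\]
```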
\item Suppose $|\mu(v_1, \cdots, v_m)| \leq M |v_1| \cdots |v_m|$. Then
$|\mu(v_1, \cdots, v_m)|$ is bounded above by $M$ whenever the norms of the $v_i$
are at most 1. Thus $||\mu|| \leq M$, since the supremum of the relevant
set cannot exceed $M$. Now suppose $||\mu|| \leq M$. If every $v_i$ is nonzero, then
\[\left|\mu\left(\frac{v_1}{|v_1|}, \cdots, \frac{v_m}{|v_m|}\right)\right| = \left|\frac{1}{|v_1|\cdots |v_m|}\mu(v_1, \cdots, v_m) \right| \leq M,\]
since the first expression has $\mu$ evaluated at vectors with unit norm. From
properties of the norm on $W$, we then see
\[|\mu(v_1, \cdots, v_m)| \leq M |v_1| \cdots |v_m|.\]
(If some $v_i = 0$, both sides vanish and the inequality holds trivially.)
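To see the equivalence in a familiar case (an illustration added here, assuming the standard inner product on $\mathbb{R}^n$), take the bilinear function $\mu(v, w) = v \bullet w$:
```latex
% Cauchy--Schwarz gives |v \bullet w| \leq |v|\,|w|, so ||\mu|| \leq 1,
% and v = w = e_1 attains the bound:
\[
|\mu(v, w)| \leq 1 \cdot |v|\,|w|, \qquad \mu(e_1, e_1) = 1,
\quad\text{so}\quad ||\mu|| = 1.
\]
```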
We have
\begin{align*}
||\mu + \nu|| & = \sup\{|\mu(v_1,\cdots, v_m) + \nu(v_1, \cdots, v_m)| : v_i \in V_i \text{ and } |v_i| \leq 1 \} \\
& \leq \sup\{|\mu(v_1,\cdots, v_m)| + |\nu(v_1, \cdots, v_m)| : v_i \in V_i \text{ and } |v_i| \leq 1\} \\
& \leq \sup\{|\mu(v_1,\cdots, v_m)| : v_i \in V_i \text{ and } |v_i| \leq 1\} + \sup\{|\nu(v_1, \cdots, v_m)| : v_i \in V_i \text{ and } |v_i| \leq 1\} = ||\mu|| + ||\nu||.
\end{align*}
We have
\begin{align*}
||c\mu|| & = \sup\{|c \mu(v_1,\cdots, v_m) | : v_i \in V_i \text{ and } |v_i| \leq 1 \} \\
& = \sup\{|c|\, | \mu(v_1,\cdots , v_m) | : v_i \in V_i \text{ and } |v_i| \leq 1 \} \\
& = |c| \sup\{ | \mu(v_1,\cdots, v_m) | : v_i \in V_i \text{ and } |v_i| \leq 1 \} = |c|\,||\mu||.
\end{align*}
\item
\item For convenience, let $\{v_i\}$ be an ordered orthonormal basis of $V$.
We first show that $\beta$ is onto. Choose $\omega \in V^*$ and write $\omega = \sum_i c_i v_i^*$.
Then, for an arbitrary vector $V \ni \tilde v = \sum_i \tilde c_i v_i$, we have $\omega(\tilde v)
= \sum_i c_i v_i^*(\tilde v) = \sum_i c_i \tilde c_i$. We claim that $V \ni v = \sum_i c_i v_i$
satisfies $\beta(v) = \omega$. Note that
\[\left(\sum_i \tilde c_i v_i\right) \bullet \left( \sum_j c_j v_j\right) = \sum_i\sum_j \tilde c_i c_j (v_i \bullet v_j).\]
Using orthonormality, we have $v_i \bullet v_j = \delta_{ij}$. This lets the sum collapse to
$\sum_i \tilde c_i c_i$, as required.
We now must show that $\beta$ is univalent. Choose $w, x \in V$ with $\beta(w) = \beta(x)$, where $w = \sum_i a_i v_i$
and $x = \sum_i b_i v_i$. Then, for an arbitrary $V \ni \tilde v = \sum_i \tilde c_i v_i$, we have
$\tilde v \bullet w = \tilde v \bullet x$. From our discussion above, $\sum_i \tilde c_i a_i = \sum_i \tilde c_i b_i$
for all $\tilde c_i$. If we choose $\tilde v$ to be $v_j$, so that $\tilde c_i = \delta_{ij}$, we see that
$a_j = b_j$ for all $j$. Thus, $w = x$, meaning $\beta$ is univalent.
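For instance (a small example not in the original, using the orthonormal basis above with $\dim V = 2$):
```latex
% For \omega = 3v_1^* - v_2^*, the vector v = 3v_1 - v_2 satisfies \beta(v) = \omega:
\[
(\tilde c_1 v_1 + \tilde c_2 v_2) \bullet (3 v_1 - v_2)
= 3\tilde c_1 - \tilde c_2
= \omega(\tilde c_1 v_1 + \tilde c_2 v_2).
\]
```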
\item
\item
\item
\item
\item
\item
\item Therefore, we have
\begin{align*}
||L|| & = \sup\{|L(v) \bullet w| : v\in V, |v| \leq 1, w\in W, |w| \leq 1\} \\
& = \sup\{|v\bullet L^*(w)| : v\in V, |v| \leq 1, w\in W, |w| \leq 1\} = ||L^*||.
\end{align*}
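The equality can also be seen in matrix terms (a sketch added here, assuming $L$ is represented by a matrix $A$ in orthonormal bases, so that $L^*$ is represented by $A^T$):
```latex
% With v \bullet w the standard inner product,
\[
L(v) \bullet w = (Av)^T w = v^T A^T w = v \bullet (A^T w) = v \bullet L^*(w),
\]
% so the two suprema range over the same set of values.
```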
\end{enumerate}
\end{document}