MATH 403 Homework

Homework 8 due Apr 4

Problem 1

Let \(\{X_{n,k}\}_{1\leq k\leq n}\) be a triangular array of random variables such that \(X_{n,1},\dots,X_{n,n}\) are independent for every \(n\). Let \(b_n>0\) with \(\lim_{n\to\infty}b_n=\infty\), and define the truncated variables \(\bar{X}_{n,k}=X_{n,k}1_{\vert X_{n,k}\vert \leq b_n}\). Suppose

  1. \(\sum_{k=1}^nP(\vert X_{n,k}\vert >b_n)\to 0\) as \(n\to \infty\),

  2. \(b_n^{-2}\sum_{k=1}^nE\bar{X}_{n,k}^2\to 0\) as \(n\to\infty\).

Let \(S_n=X_{n,1}+\dots+X_{n,n}\) and \(a_n=\sum_{k=1}^nE\bar{X}_{n,k}\). Show that \(\frac{S_n-a_n}{b_n}\to 0\) in probability as \(n\to\infty\).
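
A classical application of this truncation scheme is the St. Petersburg problem: for iid payoffs with \(P(X=2^k)=2^{-k}\), \(k\geq 1\), taking \(b_n=n\log_2 n\) yields \(S_n/(n\log_2 n)\to 1\) in probability even though \(E[X]=\infty\). The sketch below (function names are mine, not part of the assignment) probes this numerically; convergence is slow because of a \(\log_2\log_2 n/\log_2 n\) correction, so typical values sit somewhat above 1.

```python
import math
import random
import statistics

def st_petersburg_ratio(n, seed):
    """S_n / (n log2 n) for iid payoffs with P(X = 2^k) = 2^{-k}, k >= 1.
    (Illustrative helper; the theorem gives convergence to 1 in probability.)"""
    rng = random.Random(seed)
    s = 0
    for _ in range(n):
        k = 1
        while rng.random() < 0.5:  # k = j with probability 2^{-j}
            k += 1
        s += 2 ** k
    return s / (n * math.log2(n))

# The median over independent runs is a robust summary of the typical ratio;
# because of the log log / log correction, values near 1.3 are normal here.
ratios = [st_petersburg_ratio(20_000, seed) for seed in range(15)]
print(statistics.median(ratios))
```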

Problem 2

If \(n\) is an integer, a permutation of \(\{1,2,\dots,n\}\) is a reordering of these integers. E.g. \(\pi=(7,5,8,3,2,1,6,4)\) is a permutation if \(n=8\). There are, of course, \(n!\) permutations for a given \(n\). There are many ways to represent permutations. For example, the same \(\pi\) as above can be written as

\(\pi=\left(\begin{array}{cccccccc} 1&2&3&4&5&6&7&8\\ 7&5&8&3&2&1&6&4 \end{array}\right),\)

representing the fact that \(7\) is in the first position, \(5\) in the second, etc. A different way to think about \(\pi\) is as a function from \(\{1,2,\dots,8\}\) to \(\{1,2,\dots,8\}\), sending \(1\to 7\), \(2\to 5\), etc. Lastly, when thinking of \(\pi\) as a function, we can write it in cycle form: we start with \(1\), and see where it maps to, in this case \(7\), then see where \(7\) maps to, in this case \(6\), and \(6\) maps to \(1\), completing our first cycle \((176)\). We start with the smallest integer we haven’t encountered, \(2\) here, and repeat: \(2\) maps to \(5\), \(5\) maps back to \(2\), so we get another cycle \((25)\). Lastly we get the cycle \((384)\). Thus, the cycle structure of \(\pi\) is \(\pi=(176)(25)(384)\), so it has three cycles.

Let’s ask the following question: if we pick a permutation uniformly at random out of the \(n!\) different ones (i.e. all \(n!\) are equally likely), how does the number of cycles behave as \(n\to\infty\)?

In this problem you will answer this question. In the cycle-structure construction above let \(X_{n,k}\) be \(1\) if the \(k\)‘th number encountered finishes a cycle and \(0\) otherwise. In our example \(X_{8,3}=X_{8,5}=X_{8,8}=1\) and all the others are \(0\).

  1. Prove that \(X_{n,1},\dots,X_{n,n}\) are in fact independent, and \(P(X_{n,j}=1)=\frac{1}{n-j+1}\) for all \(j=1,\dots,n\).

  2. Let \(S_n=X_{n,1}+\dots+X_{n,n}\), \(\mu_n=ES_n\) and \(\sigma_n^2=\Var(S_n)\). What does \(S_n\) represent?

  3. Show that $\forall\varepsilon>0$ we have \(\frac{S_n-\sum_{m=1}^n\frac 1m}{(\ln n)^{0.5+\varepsilon}}\to 0\) in probability, as $n\to\infty$.

  4. Conclude that \(\frac{S_n}{\ln n}\to 1\) in probability as \(n\to\infty\).
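
The cycle-count construction above is easy to test numerically. The sketch below (helper names are my own, not part of the assignment) counts the cycles of a permutation in one-line notation and checks that the average number of cycles over uniform random permutations is close to \(\sum_{m=1}^n \frac 1m \approx \ln n\):

```python
import random

def count_cycles(perm):
    """Number of cycles of a permutation given in one-line notation:
    perm = (pi(1), ..., pi(n)), so perm[i] is the image of i + 1."""
    n = len(perm)
    seen = [False] * n
    cycles = 0
    for start in range(n):
        if not seen[start]:
            cycles += 1
            j = start
            while not seen[j]:
                seen[j] = True
                j = perm[j] - 1  # follow pi
    return cycles

# The worked example from the text: pi = (7,5,8,3,2,1,6,4) has three cycles.
print(count_cycles([7, 5, 8, 3, 2, 1, 6, 4]))  # 3

def mean_cycles(n, trials=2000, seed=0):
    """Monte Carlo estimate of E[S_n] over uniform random permutations."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perm = list(range(1, n + 1))
        rng.shuffle(perm)
        total += count_cycles(perm)
    return total / trials

n = 200
harmonic = sum(1.0 / m for m in range(1, n + 1))  # E[S_n] = sum_{m=1}^n 1/m
print(abs(mean_cycles(n) - harmonic) < 0.3)
```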

Problem 3

Suppose \((X_i)_{i=1}^\infty\) is a sequence of iid random variables on \((\Omega,\Sigma,P)\) taking values in some topological space \(E\) with Borel \(\sigma\)-algebra \(\mathcal{B}(E)\). Show that if \(A \in \mathcal{B}(E)\) is any set such that \(P(X_1 \in A) > 0\), then \(P(X_i \in A \text{ i.o.}) = 1\). This is the probabilistic iid special case of the Poincaré recurrence theorem, which you will revisit in ergodic theory.

Suppose we have a monkey typing out letters on a typewriter. Let \(X_i \in \mathcal{A}\) be the \(i\)th letter typed out by the monkey, assumed iid and uniform on \(\mathcal{A}\), where \(\mathcal{A}\) is the English alphabet (including spaces and punctuation marks). Let \(n\) be the length of Shakespeare’s Hamlet as a string over \(\mathcal{A}\). Show, using the previous result, that if the monkey types infinitely many letters, then it will almost surely type out Hamlet at some point. (Please prove this statement in an appropriate mathematical model.)
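
At a toy scale, the recurrence argument can be watched in action: over a two-letter alphabet, any fixed pattern appears after an almost surely finite waiting time. A minimal sketch (names are mine; a two-letter alphabet stands in for the full \(\mathcal{A}\)):

```python
import random

def waiting_time(target, alphabet="ab", seed=1, max_steps=10**6):
    """Type iid uniform letters and return the (1-based) step at which
    `target` first appears as a contiguous block, or None if it is not
    seen within max_steps. (Toy alphabet standing in for the full one.)"""
    rng = random.Random(seed)
    buf = ""
    for step in range(1, max_steps + 1):
        buf = (buf + rng.choice(alphabet))[-len(target):]
        if buf == target:
            return step
    return None

# The pattern appears after an almost surely finite waiting time.
print(waiting_time("abba"))
```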

Problem 4

Prove that every set of six people contains at least three mutual acquaintances or three mutual strangers. Compute the Ramsey number \(R_3\).
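
Since \(K_6\) has only \(\binom{6}{2}=15\) edges, the claim can also be verified exhaustively: enumerate all \(2^{15}\) two-colorings of the edges and check each for a monochromatic triangle. A brute-force sketch (not a substitute for the proof; function names are mine):

```python
from itertools import combinations

def every_coloring_has_mono_triangle(n):
    """Brute-force every 2-coloring of the edges of K_n and report whether
    each one contains a monochromatic triangle."""
    edges = list(combinations(range(n), 2))
    index = {e: i for i, e in enumerate(edges)}
    triangles = [
        (index[(a, b)], index[(a, c)], index[(b, c)])
        for a, b, c in combinations(range(n), 3)
    ]
    for mask in range(2 ** len(edges)):  # bit i of mask colors edge i
        if not any(
            ((mask >> i) & 1) == ((mask >> j) & 1) == ((mask >> k) & 1)
            for i, j, k in triangles
        ):
            return False  # found a coloring with no monochromatic triangle
    return True

ok5 = every_coloring_has_mono_triangle(5)  # pentagon coloring avoids both
ok6 = every_coloring_has_mono_triangle(6)
print(ok5, ok6)  # False True
```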

Problem 5

Let \(\pi\) denote a collection of edges in the nearest-neighbor graph on \(\Z^d\). We say that \(\pi\) is a path of length \(n\) if it consists of \(n\) edges that successively connect neighboring points of \(\Z^d\), starting at the origin. A path is self-avoiding if no three of its edges share a vertex.

  1. Prove that there are at most $\lambda_{n,d} := 2d(2d - 1)^{n-1}$ self-avoiding paths of length $n$.
  2. Recall that in edge-percolation you keep an edge of \(\Z^d\) with probability \(p\) and discard it with probability \(1-p\), independently across edges. Prove that the expected number of self-avoiding paths of length \(n\) that remain after keeping and discarding edges is at most \(p^n \lambda_{n,d}\). Use this to prove that \(p_c(\Z^d) \geq (2d - 1)^{-1}\).

This is called Peierls’ argument in percolation theory.
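
For small \(n\) one can enumerate self-avoiding walks directly and compare with the bound \(\lambda_{n,d} = 2d(2d-1)^{n-1}\). A sketch for \(d=2\) (names are mine; exponential-time enumeration, small \(n\) only; it uses the stricter convention that no vertex is revisited, which only lowers the count):

```python
def count_saw(n, d=2):
    """Count self-avoiding walks of length n from the origin in Z^d by
    depth-first search (exponential time; small n only). Here a walk is
    self-avoiding if it visits no vertex twice."""
    steps = []
    for i in range(d):
        for s in (1, -1):
            e = [0] * d
            e[i] = s
            steps.append(tuple(e))

    def dfs(pos, visited, remaining):
        if remaining == 0:
            return 1
        total = 0
        for e in steps:
            nxt = tuple(p + q for p, q in zip(pos, e))
            if nxt not in visited:
                total += dfs(nxt, visited | {nxt}, remaining - 1)
        return total

    origin = (0,) * d
    return dfs(origin, {origin}, n)

for n in range(1, 6):
    bound = 2 * 2 * (2 * 2 - 1) ** (n - 1)  # 2d(2d-1)^{n-1} with d = 2
    print(n, count_saw(n), bound)  # counts 4, 12, 36, 100, 284 vs 4, 12, 36, 108, 324
```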

Homework 7 due Mar 28

  1. Suppose \(\W = \{1,\ldots,n\}\) and let \(H(\Prob) = - \sum_{x=1}^n \Prob(\{x\}) \log \Prob(\{x\})\) be the entropy of a probability measure \(\Prob\) on \(\W\). Show that \(H\) is maximized by the uniform measure.

  2. Let \((X_n)_{n\in \N}\) be a sequence of real random variables that are uniformly bounded in $L^2$ and pairwise uncorrelated, and set \(S_n = X_1 + \cdots + X_n\). Show that

    1. \(\frac{S_{n^2} - \E[S_{n^2}]}{n^2} \to 0\) as \(n \to \infty\) a.s.
    2. Deduce that \(\frac{S_n - \E[S_n]}{n} \to 0, \almostsurely.\)
  3. Suppose \((X_n)_{n=1}^\infty\) is an iid sequence such that \(X_n \in L^4(\Omega,\Sigma,P)\). Compute \(\operatorname{Var}(S_n/n)\), then bound \(\E[(S_n - \E[S_n])^4]\) and use the Borel-Cantelli lemma to conclude that

    \(\frac{S_{n} - \mathbb{E}[S_{n}]}{n} \to 0 \text{ a.s.}\)

  4. Let \(\{X_n:n\geq 1\}\) be independent and exponentially distributed with parameter \(1\).

    1. Show

      \(\mathbb{P}\left(\limsup_{n\rightarrow\infty}\frac{X_n}{\ln n}=1\right) = 1.\)

    2. Let \(Z_n = \max\{ X_1, \ldots, X_n \}\). Verify that

      \(\liminf_{n \to \infty} \frac{Z_n}{\log n } \geq 1, \almostsurely\)

    3. For an appropriately chosen sequence \(n_k \to \infty\), show that

      \(\limsup_{k \to \infty} \frac{Z_{n_k}}{\log n_k } \leq 1, \almostsurely\)

      Conclude that \(\lim_{n \to \infty} (\log n)^{-1} Z_n = 1, \almostsurely\).

  5. Suppose \((X_n)_{n=1}^\infty\) is an iid sequence such that \(\E[X_n^+] = +\infty\) and \(\E[X_n^-] < \infty\). Show that \(S_n/n \to +\infty ~\almostsurely\), where \(S_n = X_1 + \cdots + X_n\).
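
Problem 4 predicts that the running maximum of iid \(\Exponential(1)\) variables grows like \(\ln n\). A quick simulation sketch (function name is mine, not part of the assignment):

```python
import math
import random

def max_over_log(n, seed=0):
    """Z_n / log n for Z_n = max of n iid Exponential(1) variables."""
    rng = random.Random(seed)
    z = max(rng.expovariate(1.0) for _ in range(n))
    return z / math.log(n)

print(max_over_log(10**5))  # typically close to 1, with O(1/log n) fluctuations
```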

Homework 6 due Mar 7

  1. Let \(\{X_i\}_{i=1}^\infty\) be iid random variables, all taking values \(0\) and \(1\) with probability \(1/2\) each, and define \(Y := \sum_{i=1}^\infty 4^{-i} X_i\). The distribution \(\mu_Y\) of \(Y\) is a probability measure on \(([0,1],\mathcal{B}([0,1]))\). Show that

    1. \(\mu_Y(\{x\}) = 0\) for all \(x \in [0,1]\).
    2. Let \(A\) be the range of \(Y\), \(A \subset [0,1]\). Show that \(\mu_Y(A) = 1\) but \(\lambda(A) = 0\), where \(\lambda\) is Lebesgue measure.
    3. Find the Hausdorff dimension of \(A\).
  2. Consider a population of \(n\) individuals and let \(r \in \{1, \ldots, n-1\}\).

    1. Give a probability space for the random experiment consisting of choosing a sample of \(r\) individuals from the population uniformly at random.
    2. Suppose that the population is composed of individuals of two types, with \(n_i\) individuals of type \(i\) for \(i=1,2\) and \(n_1 + n_2 = n\). Let \(X\) be the number of individuals of type 1 in the sample. Prove that the law of \(X\) is given by

      \(P(X = k) = \binom{n_1}{k} \binom{n_2}{r-k} / \binom{n}{r}\)

      for every \(k \in \{0, 1, \ldots, r\}\) (with the convention that \(\binom{m}{j}=0\) if \(j > m\)). This is the so-called hypergeometric distribution.

    3. Show that when \(n\), \(n_1\), \(n_2 \to \infty\) in such a way that \(n_1/n\) tends to \(p \in (0, 1)\), and \(r\) remains fixed, the law of \(X\) becomes close to the binomial \(\Binomial(r, p)\) distribution. Interpret this result.
  3. Let \(X=(X_1,\ldots,X_n)\) be an absolutely continuous random vector with density \(p\). Show that \(X_1,\ldots,X_n\) are independent iff the density factors as a product, \(p(x_1,\ldots,x_n) = \prod_{i=1}^n p_i(x_i)\), for some nonnegative measurable functions \(p_i\), in which case each marginal \(P_{X_i}\) has density \(p_i(x)/\int p_i(y)\,dy\).

  4. Let \(Z \geq 0\) be a random variable with \(\E[Z^2] < \infty\), and let \(\theta \in [0,1)\). Show that \(\Prob(Z > \theta \E[Z]) \geq (1-\theta)^2 \E[Z]^2/\E[Z^2]\). This is the Paley-Zygmund inequality.

  5. Let \(\{A_i\}\) be a set of independent events, let \(Z_n = \sum_{i=1}^n 1_{A_i}\), and suppose \(\E[Z_n] \to \infty\).
    1. Show $\E[Z_n^2] \leq 2 \E[Z_n]^2$ for all large enough $n$.
    2. Prove the second Borel-Cantelli lemma using the Paley-Zygmund inequality.
  6. Prove the Kolmogorov one-series theorem: if \(\{X_i\}\) are independent mean-zero random variables and \(\sum_j \E[X_j^2] < \infty\), then \(\sum_i X_i\) converges almost surely. Use this to prove the following: if \(\{X_i\}_{i=1}^\infty\) are iid \(L^2\) mean-zero random variables, then \(\sum_{i=1}^\infty X_i/i\) converges almost surely; i.e., the random harmonic series converges.
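
The random harmonic series of Problem 6 can be probed numerically: with a fixed sign sequence, the partial sums become Cauchy, with tail fluctuations of order \(\sqrt{\sum_{i>n} i^{-2}}\). A sketch (names are mine, not part of the assignment):

```python
import random

def random_harmonic_partial(n, seed=0):
    """Partial sum of sum_i eps_i / i with iid uniform signs eps_i = +/-1.
    Re-seeding reproduces the same sign sequence, so two calls with the
    same seed share their first min(n1, n2) terms."""
    rng = random.Random(seed)
    return sum(rng.choice((-1, 1)) / i for i in range(1, n + 1))

# Cauchy behavior: the tail from 10^5 to 2*10^5 has standard deviation
# sqrt(sum 1/i^2) ~ 0.002, so the partial sums have essentially stabilized.
gap = abs(random_harmonic_partial(10**5, 3) - random_harmonic_partial(2 * 10**5, 3))
print(gap < 0.1)
```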

Homework 5 due Feb 25

  1. Let \(C_c(\R^d)\) be the set of continuous functions with compact support. Show that they are dense in \(L^p(\R^d, \mathcal{B}(\R^d), \lambda)\) for \(p \in [1,\infty)\).

  2. Let \(p \in [1,\infty)\) and \(q = p^*\) the conjugate Hölder exponent. Let \(f \in L^p(\R^d, \mathcal{B}(\R^d), \lambda)\) and \(g \in L^q (\R^d, \mathcal{B}(\R^d), \lambda)\), where \(\lambda\) is Lebesgue measure. Show

    1. \(\vert f \ast g(x)\vert \leq \Norm{f}{p} \Norm{g}{q}\), and
    2. \(f \ast g(x)\) is uniformly continuous on \(\R^d\). For this, first show it for \(f\) and \(g\) continuous with compact support.
  3. Let \(X_n \overset{d}{=} N(0,1/n)\) be a sequence of Gaussian random variables. For any bounded continuous function \(f\), show that

    \(\E[ f(X_n) ] \to f(0)\)

  4. Let \(X\) be, in turn, a \(\operatorname{Gamma}(\alpha)\), an \(\Exponential(\lambda)\), and an \(N(\mu,\sigma^2)\) random variable. Compute in each case

    1. \(\E[X]\)
    2. \(\Var(X)\)
    3. \(\phi_X(t) = \E[ \exp(i X t) ]\)
  5. Let $X_i$, $i=1,\ldots,N$ be random variables in $L^2$. Show that

    \(\Var( \sum_{i=1}^N a_i X_i ) = \sum_{i=1}^N a_i^2 \Var(X_i) + \sum_{i \neq j} a_i a_j \Cov(X_i, X_j)\)

    How does the formula simplify if you assume that the variables are independent?

  6. Suppose \(\mu\) is a finite, inner regular measure on \((\R^d,\mathcal{B}(\R^d))\). Show that \(\mu\) is also outer regular.
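
Problem 3’s weak convergence \(\E[f(X_n)] \to f(0)\) can be checked by Monte Carlo for a concrete bounded continuous \(f\); here \(f=\cos\), for which the exact value \(\E[\cos X_n] = e^{-1/(2n)}\) is known. A sketch (names are mine, not part of the assignment):

```python
import math
import random

def expect_f(n, trials=20000, seed=0):
    """Monte Carlo estimate of E[f(X_n)] for X_n ~ N(0, 1/n), with f = cos."""
    rng = random.Random(seed)
    sd = 1.0 / math.sqrt(n)
    return sum(math.cos(rng.gauss(0.0, sd)) for _ in range(trials)) / trials

for n in (1, 10, 100, 1000):
    print(n, expect_f(n))  # approaches f(0) = 1; exact value is exp(-1/(2n))
```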

Homework 4 due Feb 18

  1. Find an example where

    1. \(X_n \to X\) in \(L^p\) for some \(1 \leq p < \infty\) but \(X_n \not\to X\) almost surely.
    2. \(X_n \to X\) almost surely but \(X_n \not\to X\) in \(L^p\) for any \(1 \leq p \leq \infty\).
  2. Suppose $X \sim \Binomial(n,p)$. Compute $\E[X]$, $\Var(X)$.

  3. Suppose \(p \in [1, \infty)\).

    1. If \((E, \mathcal{A}, \mu)\) is a measure space, show that the set of all integrable simple functions is dense in \(L^p (E, \mathcal{A}, \mu)\).
    2. If \((E, d)\) is a metric space, and \(\mu\) is an outer regular measure on \((E, \mathcal{B}(E))\), show that the set of all bounded Lipschitz functions that belong to \(L^p (E, \mathcal{B}(E), \mu)\) is dense in $L^p(E, \mathcal{B}(E), \mu)$.
    3. Consider \(L^\infty( \N, 2^\N, \mu)\) where \(\mu\) is counting measure. This is the sequence space \(\ell^\infty\). Show that this space is not separable.
  4. Let \(X \sim \Geometric(p)\) have pmf \(\Prob(X = k) = (1-p)^{k-1} p, \quad k=1,2,\ldots\). Suppose you keep flipping a coin until you flip heads, where the probability of heads is \(p\). \(X\) represents the toss at which the first heads occurs.

    1. Show that \(\Prob(X > k) = (1-p)^k\).
    2. Let \(X_n \sim \Geometric(\lambda/n)\). Show that \(X_n/n \overset{d}{\to} X\), and identify the distribution of \(X\).
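
Both parts of Problem 4 are easy to sanity-check numerically: the tail identity \(\Prob(X>k)=(1-p)^k\) by summing the pmf, and the rescaled limit by comparing \(\Prob(X_n/n > x) = (1-\lambda/n)^{\lfloor nx\rfloor}\) with an exponential tail. A sketch (names are mine, not part of the assignment):

```python
import math

def tail_prob(p, k):
    """P(X > k) for X ~ Geometric(p), by summing the pmf (1-p)^{j-1} p."""
    s, j = 0.0, k + 1
    while True:
        term = (1 - p) ** (j - 1) * p
        if term < 1e-18:  # truncate once terms are negligible
            break
        s += term
        j += 1
    return s

print(abs(tail_prob(0.3, 5) - 0.7 ** 5))  # essentially 0: matches (1-p)^k

# Rescaled tails: P(X_n / n > x) = (1 - lam/n)^{floor(n x)} approaches an
# exponential tail exp(-lam * x) as n grows.
lam, x, n = 2.0, 1.0, 10**6
print(abs((1 - lam / n) ** math.floor(n * x) - math.exp(-lam * x)))  # small
```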

Homework 3 due Feb 11

  1. Suppose we have measurable random variables \(X_n \colon \W \to \R\), \(n=1,2,\ldots\), all defined on the same probability space.

    1. Give an example of \(X_n \to X\) in distribution, but \(X_n \not\to X\) in probability.
    2. Show that if \(X_n \to X \quad a.s.\), then $X_n \to X$ in probability (in measure).
    3. Give an example of \(X_n \to X\) in probability, but \(X_n \not\to X \quad a.s.\).
  2. Choose and fix $p_1,\ldots,p_n > 0$ such that $\sum_i p_i = 1$. Prove that if $x_1,\ldots,x_n$ are positive, then

    \(\prod_i x_i^{p_i} \leq \sum_i p_i x_i\)

  3. (Hausdorff Dimension) If \(A\) is a subset of \(\R^d\), we denote the diameter of \(A\) by \(\Delta(A) = \sup\{\vert x - y\vert : x, y \in A\} \in [0, \infty]\). For every \(\alpha > 0\) and every \(\varepsilon > 0\), we set, for every subset \(A\) of \(\R^d\),

    \(\mu_{\alpha,\varepsilon} (A) = \inf_{(U_k )_{k\in\N} \in \mathcal{R}_\varepsilon (A)} \sum_{k\in\N} \Delta(U_k)^\alpha\)

    where \(\mathcal{R}_\varepsilon (A)\) is the set of all countable coverings of \(A\) by open sets of diameter smaller than \(\varepsilon\). Observe that \(\mu_{\alpha,\varepsilon} (A) \geq \mu_{\alpha,\varepsilon'}(A)\) if \(\varepsilon < \varepsilon'\), and thus one can define a mapping \(A \mapsto \mu_\alpha (A)\) by

    \(\mu_\alpha (A) := \lim_{\varepsilon \downarrow 0} \mu_{\alpha,\varepsilon} (A) \in [0, \infty].\)

    1. Prove that \(\mu_\alpha\) is an outer measure on \(\R^d\).
    2. Verify that, for every subset \(A\) of \(\R^d\), there exists a (unique) real number \(\dim(A) \in [0, d]\), called the Hausdorff dimension of \(A\), such that, for every \(\alpha > 0\),

      \(\mu_\alpha (A) = \begin{cases} 0 & \text{if } \alpha > \dim(A) \\ \infty & \text{if } \alpha < \dim(A). \end{cases}\)

    3. Let \(\alpha > 0\) and let \(A\) be a Borel subset of \(\R^d\). Assume that there exists a measure \(\mu\) on \((\R^d, \mathcal{B}(\R^d))\) such that \(\mu(A) > 0\) and \(\mu(B) \leq r^\alpha\) for every open ball \(B\) of radius \(r > 0\) centered at a point of \(A\). Prove that \(\dim(A) \geq \alpha\).
  4. Suppose \(X\) is a random variable that with probability \(1/2\) takes the value \(1\), and with probability \(1/2\) is distributed as \(\Uniform([0,2])\). Find the cdf of \(X\). Does it have a density?

    1. Let \(\nu\) be a finite measure on the finite measure space \((\W,\mathcal{F},\mu)\). Show that there is a decomposition \(\nu = \nu_1 + \nu_2\) where \(\nu_1 \ll \mu\) and \(\nu_2 \perp \mu\), where we take the last \(\perp\) to mean there exist measurable sets \(A\), \(B\) such that \(A \cup B = \W\), \(\nu_2(A) = 0\), and \(\mu(B) = 0\). This is called the Lebesgue decomposition theorem. Hint: apply the Radon-Nikodym theorem to \(\nu\) and \(\lambda = \mu + \nu\), or look at Axler, Rudin or Royden.

    2. Give the Lebesgue decomposition of \(\mu_X\) with respect to Lebesgue measure on \([0,2]\).
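
The weighted AM-GM inequality of Problem 2 can be stress-tested on random inputs. A sketch (names are mine, not part of the assignment; `math.prod` requires Python 3.8+):

```python
import math
import random

def amgm_gap(xs, ps):
    """Arithmetic mean minus weighted geometric mean; nonnegative whenever
    the x_i are positive, the p_i are positive, and sum p_i = 1."""
    gm = math.prod(x ** p for x, p in zip(xs, ps))
    am = sum(p * x for x, p in zip(xs, ps))
    return am - gm

rng = random.Random(0)
ok = True
for _ in range(1000):
    m = rng.randint(2, 6)
    xs = [rng.uniform(0.01, 10.0) for _ in range(m)]
    ws = [rng.random() + 1e-9 for _ in range(m)]
    t = sum(ws)
    ps = [w / t for w in ws]  # normalize so the weights sum to 1
    ok = ok and amgm_gap(xs, ps) >= -1e-12  # tolerance for float rounding
print(ok)  # True
```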

Homework 2 due Feb 4

Solution for Problem 2, Scheffé’s Lemma

  1. Let \(\varphi \in L^1(\R, \mathcal{B}, \lambda)\), where \(\lambda\) is Lebesgue measure. Show that the Fourier transform \(\hat\varphi \colon \R \to \C\) defined by

    \(\hat\varphi(u) = \int \exp(i u x ) \varphi(x) \lambda (d x)\)

    is a continuous function on \(\R\). More generally, if \(\mu\) is a finite measure on \(\R\), show that

    \(\hat\mu(u) = \int \exp(i u x ) \mu (d x)\)

    is a continuous function on \(\R\).

  2. Let \((E, \mathcal{A}, \mu)\) be a measure space, and let \((f_n)_{n \in \N}\) be a sequence of nonnegative measurable functions on \(E\). Suppose \(f_n(x) \to f(x)\) \(\mu\)-a.e. and \(\int f d \mu < \infty\). Show that

    \(\int f_n d\mu \to \int f d \mu \Leftrightarrow \int \vert f_n - f \vert d\mu \to 0\)

  3. Let \(\phi \colon [0,1] \to [0,1]\) be the Cantor-Lebesgue function (Devil’s staircase). Let \(\psi(x) = \phi(x) + x\). Show that \(\psi\) is a bijective, continuous function from \([0,1]\) to \([0,2]\), and show that the image of the Cantor set \(\mathbf{C}\) under \(\psi\) has Lebesgue measure \(1\). We know that any set of positive Lebesgue measure contains a nonmeasurable set (by the Vitali construction). Thus \(\psi(\mathbf{C})\) must contain a nonmeasurable set \(V\), but \(\psi^{-1}(V)\) is Lebesgue measurable, being a subset of the null set \(\mathbf{C}\). Conclude that

    1. There is a continuous function \(g\) and a (Lebesgue measurable) set \(B\) such that \(g^{-1}(B)\) is nonmeasurable.
    2. There is a measurable set that is not Borel.
  4. A function \(f \colon I \to \R\), where \(I = [a,b]\), is called convex if \(f(\lambda x + (1 - \lambda) y) \leq \lambda f(x) + (1-\lambda) f(y)\) for all \(x,y \in I\) and all \(\lambda \in [0,1]\).

    1. Prove that if \(f'' \geq 0\) on \(I\) then \(f\) is convex on \(I\).
    2. Prove that \(f\) has left and right derivatives at every interior point; i.e., \(\lim_{\e \downarrow 0} (f(x \pm \e) - f(x))/\e\) exist for all \(x\) in the interior of \(I\). These limits equal \(\pm \partial_\pm f(x)\), where \(\partial_+ f\) and \(\partial_- f\) are the right and left derivatives, respectively. Prove that they are both nondecreasing, and that \(\partial_- f(x) \leq \partial_+ f(x)\).
    3. \(L(x) = a x + b\) is a supporting line of \(f(x)\) at \(x_0\) if \(f(x) \geq L(x)\) and \(f(x_0) = L(x_0)\). Show that \(f\) has a supporting line at every point \(x_0 \in I\).
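
The convexity definition in Problem 4 is directly checkable on a grid of points and mixing weights. A crude numerical sketch (names are mine; a grid check is evidence, not a proof):

```python
from itertools import product

def is_convex_on_grid(f, a, b, m=10):
    """Check f(l*x + (1-l)*y) <= l*f(x) + (1-l)*f(y) over a grid of points
    x, y in [a, b] and weights l in [0, 1]."""
    xs = [a + (b - a) * i / m for i in range(m + 1)]
    lams = [j / 10 for j in range(11)]
    for x, y, l in product(xs, xs, lams):
        if f(l * x + (1 - l) * y) > l * f(x) + (1 - l) * f(y) + 1e-12:
            return False  # found a violation of the defining inequality
    return True

print(is_convex_on_grid(lambda t: t * t, -1.0, 1.0))    # True
print(is_convex_on_grid(lambda t: -abs(t), -1.0, 1.0))  # False: concave kink at 0
```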

Homework 1 due Jan 28

  1. Construct a countable family of \(\sigma\)-algebras \(\{\mathcal{F}_i\}_{i=1}^\infty\) such that \(\mathcal{F}_i \subset \mathcal{F}_{i+1}\) and \(\sigma( \cup_{i=1}^\infty \mathcal{F}_i ) \neq \cup_{i=1}^\infty \mathcal{F}_i\). (3.2 from Khoshnevisan)

  2. Prove that Lebesgue measure on \(\R^d\) is regular. That is, for every \(A\) measurable, we have

    \(\begin{align*} \lambda(A) & = \inf \{ \lambda(U) \colon U \text{ is open }, A \subset U \}\\ & = \sup \{ \lambda(F) \colon F \text{ is compact }, F \subset A \}\\ \end{align*}\)

    In other words, Lebesgue measure is outer and inner regular respectively.

    Extend this to all metric spaces \((E,d)\): if \(\mu\) is a finite measure on \((E,\mathcal{B}(E))\), where \(\mathcal{B}(E)\) is the Borel \(\sigma\)-algebra on \(E\), then for every \(A \in \mathcal{B}(E)\),

    \(\begin{align*} \mu(A) & = \inf \{ \mu(U) \colon U \text{ is open }, A \subset U \}\\ & = \sup \{ \mu(F) \colon F \text{ is closed }, F \subset A \}\\ \end{align*}\)

    (Proposition 3.1 in Le Gall)

  3. A function \(F \colon \R \to [0,1]\) is a cumulative distribution function on \(\R\) if

    1. it is non-decreasing and right-continuous, and
    2. \(\lim_{x \to -\infty} F(x) = 0\) and \(\lim_{x \to \infty} F(x) = 1\).

    Prove that if \(\mu\) is a probability measure on \((\R,\mathcal{B}(\R))\); i.e., \(\mu(\R) = 1\), then \(F(x) = \mu((-\infty,x])\) defines a distribution function on \(\R\). Conversely, if \(F\) is a distribution function on \(\R\), then there is a unique probability measure \(\mu\) on \((\R,\mathcal{B}(\R))\) such that \(F(x) = \mu((-\infty,x])\).

    (3.9 from Khoshnevisan)
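
The correspondence \(F(x)=\mu((-\infty,x])\) in Problem 3 is concrete for a finite discrete measure: the cdf is a right-continuous step function. A sketch for the uniform measure on a finite multiset of points (names are mine, not part of the assignment):

```python
import bisect

def make_cdf(points):
    """CDF F(x) = mu((-inf, x]) of the uniform probability measure on a
    finite multiset of points: a right-continuous step function."""
    pts = sorted(points)
    n = len(pts)
    return lambda x: bisect.bisect_right(pts, x) / n

F = make_cdf([0.0, 1.0, 1.0, 3.0])
print(F(-1), F(0.0), F(1.0), F(2.5), F(3.0))  # 0.0 0.25 0.75 0.75 1.0
```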