Mean & Variance — Ultra Line‑by‑Line

Every substitution, Jacobian, boundary limit, Beta/Gamma identity, and mgf derivative made explicit. If it looks long… that’s the point.

Uniform [a,b]

From PDF to Var — no shortcuts

Support & PDF: \(f(x)=\frac{1}{b-a}\,\mathbf 1_{[a,b]}(x)\), with \(a<b\).

1. Mean:
\(\displaystyle E[X]=\int_{-\infty}^{\infty} x f(x)dx=\int_a^b x\frac{1}{b-a}dx\)
\(=\frac{1}{b-a}\Big[\tfrac{x^2}{2}\Big]_a^b=\frac{b^2-a^2}{2(b-a)}=\frac{(b-a)(b+a)}{2(b-a)}=\boxed{\tfrac{a+b}{2}}.\)
2. Second moment:
\(\displaystyle E[X^2]=\int_a^b x^2\frac{1}{b-a}dx=\frac{1}{b-a}\Big[\tfrac{x^3}{3}\Big]_a^b=\frac{b^3-a^3}{3(b-a)}=\boxed{\tfrac{b^2+ab+a^2}{3}}.\)
3. Variance (show definition):
\(\displaystyle Var(X)=E[X^2]-E[X]^2=\frac{b^2+ab+a^2}{3}-\left(\frac{a+b}{2}\right)^2=\frac{4(b^2+ab+a^2)-(a^2+2ab+b^2)}{12}=\boxed{\frac{(b-a)^2}{12}}.\)

Key idea: always compute variance as \(Var(X)=E[X^2]-(E[X])^2\).
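The boxed Uniform results can be sanity-checked numerically. A minimal sketch (the helper name `midpoint_moment` is ours, not a library routine) approximates \(E[X^r]=\int_a^b x^r\frac{1}{b-a}dx\) by the midpoint rule and compares against the closed forms:

```python
def midpoint_moment(a, b, r, n=100_000):
    """Approximate E[X^r] for X ~ Uniform[a, b] by the midpoint rule."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h          # midpoint of the i-th subinterval
        total += x**r * (1.0 / (b - a)) * h
    return total

a, b = 2.0, 5.0
mean = midpoint_moment(a, b, 1)
second = midpoint_moment(a, b, 2)
var = second - mean**2

assert abs(mean - (a + b) / 2) < 1e-7              # (a+b)/2
assert abs(second - (b**2 + a*b + a**2) / 3) < 1e-6
assert abs(var - (b - a)**2 / 12) < 1e-6           # (b-a)^2/12
```

The midpoint rule is exact for linear integrands, so the mean matches to floating-point precision; the quadratic moment picks up only an \(O(h^2)\) error.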

Normal (\(\mu,\sigma\))

MGF via completing the square → differentiate

PDF \(\phi_{\mu,\sigma}(x)=\frac{1}{\sqrt{2\pi}\sigma}\exp\!\big(-\frac{(x-\mu)^2}{2\sigma^2}\big)\).

1. Start with \(M_X(t)=E[e^{tX}]\):
\(\displaystyle M_X(t)=\int_{-\infty}^{\infty} e^{tx}\,\phi_{\mu,\sigma}(x)dx= e^{\mu t}\int \frac{1}{\sqrt{2\pi}\sigma}\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}+t(x-\mu)\right)dx.\)
2. Complete the square explicitly:
\( -\frac{(x-\mu)^2}{2\sigma^2}+t(x-\mu) = -\frac{1}{2\sigma^2}\big[(x-\mu)^2-2\sigma^2 t (x-\mu)\big] \)
\(= -\frac{1}{2\sigma^2}\big[(x-\mu-\sigma^2 t)^2-\sigma^4 t^2\big] = -\frac{(x-\mu-\sigma^2 t)^2}{2\sigma^2}+\frac{\sigma^2 t^2}{2}.\)
3. Pull out constants; recognize a centered Normal pdf:
\( M_X(t)= e^{\mu t+\tfrac{1}{2}\sigma^2 t^2}\int \frac{1}{\sqrt{2\pi}\sigma}\exp\!\left(-\frac{(x-\mu-\sigma^2 t)^2}{2\sigma^2}\right)dx = e^{\mu t+\tfrac{1}{2}\sigma^2 t^2}. \)
4. Differentiate at 0:
\( E[X]=M'_X(0)=\mu,\quad E[X^2]=M''_X(0)=\sigma^2+\mu^2 \Rightarrow Var(X)=\sigma^2. \)

Alternative: symmetry integral

Show \(\int (x-\mu)\phi_{\mu,\sigma}(x)dx=0\) by oddness in the shifted variable; then compute \(E[(X-\mu)^2]\) directly.
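The completed-square mgf \(M_X(t)=e^{\mu t+\sigma^2 t^2/2}\) can also be checked by brute force: integrate \(e^{tx}\phi_{\mu,\sigma}(x)\) over a wide truncated range (tails beyond \(12\sigma\) are negligible). The helper name below is illustrative:

```python
import math

def normal_mgf_numeric(mu, sigma, t, n=200_000, width=12.0):
    """Approximate E[e^{tX}] for X ~ N(mu, sigma^2) by midpoint quadrature."""
    lo, hi = mu - width * sigma, mu + width * sigma
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        pdf = math.exp(-(x - mu)**2 / (2 * sigma**2)) / (math.sqrt(2 * math.pi) * sigma)
        total += math.exp(t * x) * pdf * h
    return total

mu, sigma, t = 1.5, 2.0, 0.3
closed_form = math.exp(mu * t + 0.5 * sigma**2 * t**2)
assert abs(normal_mgf_numeric(mu, sigma, t) - closed_form) < 1e-5
```

Note the truncation width must comfortably cover the *shifted* center \(\mu+\sigma^2 t\) from step 3, which is exactly why the integral of the recentered Normal pdf equals 1.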

Exponential (\(\lambda>0\))

Two integrations by parts — with limits by L'Hôpital

PDF \(f(x)=\lambda e^{-\lambda x}\) on \([0,\infty)\).

Mean \(E[X]\) — every line

Take \(u=x\), \(dv=\lambda e^{-\lambda x}dx\) ⇒ \(du=dx\), \(v=-e^{-\lambda x}\).
\(\displaystyle E[X]=\int_0^{\infty} x\lambda e^{-\lambda x}dx = [uv]_0^{\infty}-\int_0^{\infty} v\,du.\)
\(= \Big[-x e^{-\lambda x}\Big]_0^{\infty}+\int_0^{\infty} e^{-\lambda x}dx.\)
Boundary: \(\lim_{x\to\infty} x e^{-\lambda x}=\lim \frac{x}{e^{\lambda x}}=\lim \frac{1}{\lambda e^{\lambda x}}=0\).
Remaining integral: \(\int_0^{\infty} e^{-\lambda x}dx = \big[-\tfrac{1}{\lambda}e^{-\lambda x}\big]_0^{\infty}=\tfrac{1}{\lambda}.\)
\(\boxed{E[X]=1/\lambda}.\)

Second moment \(E[X^2]\) — parts again + a sub‑proof

Use \(u=x^2\), \(dv=\lambda e^{-\lambda x}dx\) ⇒ \(du=2x\,dx\), \(v=-e^{-\lambda x}\).
\(\displaystyle E[X^2]=\Big[-x^2 e^{-\lambda x}\Big]_0^{\infty}+\int_0^{\infty} 2x e^{-\lambda x}dx.\)
Boundary: \(x^2 e^{-\lambda x}\to 0\) as \(x\to\infty\) (two applications of L'Hôpital, or: polynomials lose to exponentials).
Sub‑proof: compute \(I=\int_0^{\infty} x e^{-\lambda x}dx\) by parts with \(u=x\), \(dv=e^{-\lambda x}dx\) ⇒ \(du=dx\), \(v=-\tfrac{1}{\lambda}e^{-\lambda x}\). Then \(I=[-\tfrac{x}{\lambda}e^{-\lambda x}]_0^{\infty}+\tfrac{1}{\lambda}\int_0^{\infty} e^{-\lambda x}dx=0+\tfrac{1}{\lambda}\cdot\tfrac{1}{\lambda}=\tfrac{1}{\lambda^2}.\)
Thus \(E[X^2]=2I=\boxed{2/\lambda^2}\Rightarrow Var(X)=\boxed{1/\lambda^2}.\)
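Both boxed Exponential moments can be confirmed numerically; the sketch below (helper name `expo_moment` is ours) truncates the integral at a point where \(e^{-\lambda x}\) is negligible:

```python
import math

def expo_moment(lam, r, n=400_000, upper=60.0):
    """Approximate E[X^r] for X ~ Exp(lam) by midpoint quadrature on [0, upper/lam]."""
    hi = upper / lam                    # e^{-60} makes the discarded tail negligible
    h = hi / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x**r * lam * math.exp(-lam * x) * h
    return total

lam = 1.7
m1 = expo_moment(lam, 1)
m2 = expo_moment(lam, 2)
assert abs(m1 - 1 / lam) < 1e-6            # E[X]   = 1/lambda
assert abs(m2 - 2 / lam**2) < 1e-6         # E[X^2] = 2/lambda^2
assert abs((m2 - m1**2) - 1 / lam**2) < 1e-6
```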

Lognormal (underlying Normal \(m,s\))

Reduce to Normal moments of \(Y=\ln X\)

If \(Y\sim\mathcal N(m,s^2)\), then \(M_Y(t)=\exp(mt+\tfrac{1}{2}s^2 t^2)\).

1. Mean:
\(E[X]=E[e^{Y}]=M_Y(1)=\boxed{e^{m+\tfrac{1}{2}s^2}}.\)
2. Second moment:
\(E[X^2]=E[e^{2Y}]=M_Y(2)=\boxed{e^{2m+2s^2}}.\)
3. Variance:
\(Var(X)=E[X^2]-(E[X])^2= e^{2m+2s^2}-e^{2m+s^2}=\boxed{e^{2m+s^2}(e^{s^2}-1)}.\)

Option B: change of variables (every Jacobian)

Let \(x=e^y\Rightarrow dx=e^y dy\); the lognormal pdf is \(f_X(x)=\dfrac{1}{x s\sqrt{2\pi}}\exp\!\left(-\dfrac{(\ln x - m)^2}{2s^2}\right)\). Then for any real \(r\) (the Gaussian tail dominates \(e^{ry}\), so every moment exists)

\(\displaystyle E[X^r]=\int_0^{\infty} x^r f_X(x)dx=\int_{-\infty}^{\infty} e^{ry}\,\frac{1}{s\sqrt{2\pi}}\exp\!\left(-\frac{(y-m)^2}{2s^2}\right)dy = e^{rm+\tfrac{1}{2}r^2 s^2}.\)
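The same change of variables drives a numerical check: integrate \(e^{ry}\) against the Normal\((m,s^2)\) density of \(Y=\ln X\) and compare with \(e^{rm+r^2s^2/2}\). Helper name is illustrative:

```python
import math

def lognormal_moment(m, s, r, n=200_000, width=12.0):
    """E[X^r] for lognormal X = e^Y, Y ~ N(m, s^2), via quadrature in y."""
    lo, hi = m - width * s, m + width * s
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        y = lo + (i + 0.5) * h
        pdf = math.exp(-(y - m)**2 / (2 * s**2)) / (math.sqrt(2 * math.pi) * s)
        total += math.exp(r * y) * pdf * h
    return total

m, s = 0.4, 0.6
ex = lognormal_moment(m, s, 1)
ex2 = lognormal_moment(m, s, 2)
assert abs(ex - math.exp(m + 0.5 * s**2)) < 1e-5                 # M_Y(1)
assert abs(ex2 - math.exp(2*m + 2*s**2)) < 1e-5                  # M_Y(2)
assert abs((ex2 - ex**2) - math.exp(2*m + s**2) * (math.exp(s**2) - 1)) < 1e-5
```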

Gamma (\(\alpha,\theta\))

Master integral → moments with \(\Gamma\)

PDF: \(f(x)=\dfrac{x^{\alpha-1} e^{-x/\theta}}{\Gamma(\alpha)\,\theta^{\alpha}}\), \(x>0\).

1. Master integral via \(y=x/\theta\):
\(\displaystyle \int_0^{\infty} x^{r} e^{-x/\theta}dx=\theta^{r+1}\int_0^{\infty} y^r e^{-y}dy=\theta^{r+1}\Gamma(r+1).\)
2. Mean:
\(E[X]=\frac{1}{\Gamma(\alpha)\theta^{\alpha}}\int_0^{\infty} x^{\alpha}e^{-x/\theta}dx=\frac{\Gamma(\alpha+1)\theta^{\alpha+1}}{\Gamma(\alpha)\theta^{\alpha}}=\boxed{\alpha\theta}.\)
3. Second moment:
\(E[X^2]=\frac{\Gamma(\alpha+2)\theta^{\alpha+2}}{\Gamma(\alpha)\theta^{\alpha}}=(\alpha+1)\alpha\,\theta^2.\)
4. Variance:
\(Var(X)= (\alpha+1)\alpha\theta^2-(\alpha\theta)^2=\boxed{\alpha\theta^2}.\)

Why the \(\Gamma\) identities hold

Recall \(\Gamma(z)=\int_0^{\infty} t^{z-1}e^{-t}dt\). Then \(\Gamma(z+1)=z\Gamma(z)\) by integration by parts with \(u=t^{z}\), \(dv=e^{-t}dt\).
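Both the recursion \(\Gamma(z+1)=z\Gamma(z)\) and the master integral can be spot-checked with `math.gamma` and quadrature (helper name `master_integral` is ours):

```python
import math

# The recursion Gamma(z+1) = z * Gamma(z):
for z in (0.5, 1.3, 4.0):
    assert abs(math.gamma(z + 1) - z * math.gamma(z)) < 1e-9

def master_integral(r, theta, n=400_000, upper=80.0):
    """Approximate \int_0^inf x^r e^{-x/theta} dx, expected theta^{r+1} Gamma(r+1)."""
    hi = upper * theta
    h = hi / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x**r * math.exp(-x / theta) * h
    return total

theta, alpha = 1.5, 2.5
# E[X] = master_integral(alpha, theta) / (Gamma(alpha) theta^alpha) = alpha * theta
mean = master_integral(alpha, theta) / (math.gamma(alpha) * theta**alpha)
assert abs(mean - alpha * theta) < 1e-4
```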

Beta (\(\alpha,\beta\))

Beta ratios spelled out

PDF on \((0,1)\): \(f(x)=\dfrac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha,\beta)}\), with \(B(\alpha,\beta)=\dfrac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}\).

1. Mean:
\(\displaystyle E[X]=\frac{B(\alpha+1,\beta)}{B(\alpha,\beta)}=\frac{\Gamma(\alpha+1)\Gamma(\beta)}{\Gamma(\alpha+\beta+1)}\cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}=\frac{\alpha\,\Gamma(\alpha)}{(\alpha+\beta)\,\Gamma(\alpha+\beta)}\cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)}=\boxed{\tfrac{\alpha}{\alpha+\beta}}.\)
2. Second moment:
\(\displaystyle E[X^2]=\frac{B(\alpha+2,\beta)}{B(\alpha,\beta)}=\frac{\Gamma(\alpha+2)\Gamma(\beta)}{\Gamma(\alpha+\beta+2)}\cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\)
\(=\frac{\alpha(\alpha+1)\Gamma(\alpha)}{(\alpha+\beta)(\alpha+\beta+1)\Gamma(\alpha+\beta)}\cdot\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)}=\boxed{\frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)}}.\)
3. Variance:
\(Var(X)=E[X^2]-E[X]^2=\frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)}-\frac{\alpha^2}{(\alpha+\beta)^2}=\boxed{\frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}}.\)
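A quick numerical check of the Beta ratios, computing \(B(\alpha,\beta)\) from the \(\Gamma\) identity in the text (helper name is illustrative; \(\alpha,\beta>1\) keeps the integrand bounded at the endpoints):

```python
import math

def beta_moment(alpha, beta, r, n=200_000):
    """Approximate E[X^r] for X ~ Beta(alpha, beta) by midpoint quadrature on (0, 1)."""
    B = math.gamma(alpha) * math.gamma(beta) / math.gamma(alpha + beta)
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x**(alpha - 1 + r) * (1 - x)**(beta - 1) / B * h
    return total

alpha, beta = 2.0, 3.0
m1 = beta_moment(alpha, beta, 1)
m2 = beta_moment(alpha, beta, 2)
assert abs(m1 - alpha / (alpha + beta)) < 1e-6
assert abs(m2 - alpha * (alpha + 1) / ((alpha + beta) * (alpha + beta + 1))) < 1e-6
assert abs((m2 - m1**2) - alpha * beta / ((alpha + beta)**2 * (alpha + beta + 1))) < 1e-6
```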

Weibull (\(k,\lambda\))

Change of variables — Jacobian tracked

PDF: \(f(x)=\dfrac{k}{\lambda}(x/\lambda)^{k-1} e^{-(x/\lambda)^k}\), \(x>0\).

1. Let \(t=(x/\lambda)^k\Rightarrow x=\lambda t^{1/k}\) ⇒ \(dx=\lambda\,\tfrac{1}{k} t^{1/k-1} dt\).
2. r‑th moment:
\(\displaystyle E[X^r]=\int_0^{\infty} x^r f(x)dx = \int_0^{\infty} (\lambda t^{1/k})^r \frac{k}{\lambda} t^{(k-1)/k} e^{-t} \frac{\lambda}{k} t^{1/k-1} dt\)
Power of \(t\): \(\tfrac{r}{k}+\tfrac{k-1}{k}+\tfrac{1}{k}-1=\tfrac{r}{k}\). Constants collapse to \(\lambda^r\).
\(E[X^r]=\lambda^{r}\int_0^{\infty} t^{r/k} e^{-t} dt = \boxed{\lambda^{r}\,\Gamma\!\left(1+\tfrac{r}{k}\right)}.\)
3. Mean/Variance: set \(r=1,2\) and subtract as usual.
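The Jacobian bookkeeping above can be double-checked by integrating \(x^r f(x)\) directly, without the substitution, and comparing against \(\lambda^r\Gamma(1+r/k)\) (helper name `weibull_moment` is ours):

```python
import math

def weibull_moment(lam, k, r, n=400_000, upper=12.0):
    """Approximate E[X^r] for Weibull(k, lam) by midpoint quadrature on [0, upper*lam]."""
    hi = upper * lam            # e^{-(x/lam)^k} is negligible beyond this point
    h = hi / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        pdf = (k / lam) * (x / lam)**(k - 1) * math.exp(-(x / lam)**k)
        total += x**r * pdf * h
    return total

lam, k = 2.0, 1.5
for r in (1, 2):
    closed = lam**r * math.gamma(1 + r / k)
    assert abs(weibull_moment(lam, k, r) - closed) < 1e-4
```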

Chi‑square (\(\nu\))

Two ways: Gamma form and mgf as sum of squares

If \(X\sim\chi^2_{\nu}\), then \(X\sim\Gamma(\alpha=\nu/2,\theta=2)\).

1. From Gamma: \(E[X]=\alpha\theta=\nu\), \(Var(X)=\alpha\theta^2=2\nu\). For \(E[X^2]\), use \((\alpha+1)\alpha\theta^2=\nu(\nu+2)\).
2. Direct mgf as sum of squares (fully expanded)

If \(Z\sim N(0,1)\), then mgf of \(Z^2\) is \(M_{Z^2}(t)=(1-2t)^{-1/2}\) (compute \(E[e^{tZ^2}]\) by Gaussian integral). For independent \(Z_1,\dots,Z_{\nu}\), mgf of \(\sum Z_i^2\) is \((1-2t)^{-\nu/2}\). Differentiate: \(M'(t)=\nu(1-2t)^{-\nu/2-1}\), \(M''(t)=\nu(\nu+2)(1-2t)^{-\nu/2-2}\). At \(t=0\): \(E[X]=\nu\), \(E[X^2]=\nu(\nu+2)\), variance \(=2\nu\).
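Both routes can be verified numerically: the building-block mgf \(M_{Z^2}(t)=(1-2t)^{-1/2}\) by Gaussian quadrature (valid for \(t<1/2\); helper name is ours), and the moments via the Gamma form \(\alpha=\nu/2\), \(\theta=2\):

```python
import math

def mgf_z_squared(t, n=400_000, width=14.0):
    """Approximate E[e^{t Z^2}] for Z ~ N(0,1), t < 1/2, by midpoint quadrature."""
    h = 2 * width / n
    total = 0.0
    for i in range(n):
        z = -width + (i + 0.5) * h
        total += math.exp(t * z * z) * math.exp(-z * z / 2) / math.sqrt(2 * math.pi) * h
    return total

t = 0.2
assert abs(mgf_z_squared(t) - (1 - 2 * t)**-0.5) < 1e-6

# Moments from the Gamma(alpha = nu/2, theta = 2) form:
nu = 7
mean = (nu / 2) * 2                       # alpha * theta          = nu
second = ((nu / 2) + 1) * (nu / 2) * 4    # (alpha+1) alpha theta^2 = nu(nu+2)
assert mean == nu
assert second == nu * (nu + 2)
assert second - mean**2 == 2 * nu         # Var = 2 nu
```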