Quiz — Joint Probability (ENGR200)

True/False. Decide whether each statement is correct as written.

1) For a discrete joint pmf \(f_{XY}(x,y)\), the mass must be nonnegative and sum to 1:

$$\sum_x \sum_y f_{XY}(x,y) = 1,\quad f_{XY}(x,y)\ge 0.$$
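
A minimal numeric check of both conditions, using a made-up 2×3 pmf table (the values are illustrative, not part of the quiz):

```python
import numpy as np

# Hypothetical joint pmf table: rows index x in {0, 1}, columns index y in {0, 1, 2}.
f_xy = np.array([
    [0.10, 0.20, 0.10],
    [0.25, 0.15, 0.20],
])

# Both conditions from statement 1: nonnegativity and total mass 1.
print("nonnegative:", np.all(f_xy >= 0))          # True
print("sums to 1:  ", np.isclose(f_xy.sum(), 1))  # True
```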

2) For a continuous joint pdf \(f_{XY}(x,y)\), the total integral equals 1:

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy = 1.$$
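
A sketch of the same condition for a density, assuming the illustrative joint pdf \(f_{XY}(x,y)=e^{-x-y}\) for \(x,y\ge 0\) (two independent Exp(1) variables, not a quiz distribution):

```python
import numpy as np
from scipy.integrate import dblquad

# Assumed example density: f(x, y) = exp(-x - y) on x >= 0, y >= 0.
# dblquad integrates func(y, x) over x in [a, b] and y in [gfun, hfun].
total, err = dblquad(lambda y, x: np.exp(-x - y), 0, np.inf, 0, np.inf)
print(total)  # ~1.0, consistent with statement 2
```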

3) Discrete marginalization: sum the joint over the other variable.

$$f_X(x)=\sum_y f_{XY}(x,y),\qquad f_Y(y)=\sum_x f_{XY}(x,y).$$
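
Using the same hypothetical table as above, the marginals come from summing along each axis:

```python
import numpy as np

# Rows index x in {0, 1}; columns index y in {0, 1, 2}.
f_xy = np.array([
    [0.10, 0.20, 0.10],
    [0.25, 0.15, 0.20],
])

f_x = f_xy.sum(axis=1)  # sum over y for each fixed x -> f_X(x)
f_y = f_xy.sum(axis=0)  # sum over x for each fixed y -> f_Y(y)
print("f_X:", f_x)      # [0.40, 0.60]
print("f_Y:", f_y)      # [0.35, 0.35, 0.30]
print("each marginal sums to 1:", np.isclose(f_x.sum(), 1), np.isclose(f_y.sum(), 1))
```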

4) For densities, the marginal of \(Y\) is obtained by integrating out \(x\) (not \(y\)):

$$f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dx.$$
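
The continuous counterpart, again with the illustrative density \(e^{-x-y}\): integrating out \(x\) at a fixed \(y\) reproduces the analytic marginal \(f_Y(y)=e^{-y}\):

```python
import numpy as np
from scipy.integrate import quad

# Same assumed density as above: f(x, y) = exp(-x - y) for x, y >= 0.
def f_xy(x, y):
    return np.exp(-x - y)

y0 = 1.5
marginal_at_y0, _ = quad(lambda x: f_xy(x, y0), 0, np.inf)  # integrate out x
print(marginal_at_y0)  # ~0.2231
print(np.exp(-y0))     # analytic marginal f_Y(y0) = e^{-y0}, same value
```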

5) Discrete case: probability at a point equals the joint pmf at that point.

$$P(X=a, Y=b)=f_{XY}(a,b).$$

6) Continuous case: density is not probability; probability at a point is zero.

$$P(X=a, Y=b)=0 \quad\text{even though}\quad f_{XY}(a,b)\ \text{may be positive.}$$
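
A sketch contrasting statements 5 and 6, reusing the hypothetical table for the discrete lookup and the illustrative density \(e^{-x-y}\) for the continuous case:

```python
import numpy as np
from scipy.integrate import dblquad

# Discrete (statement 5): probability at a point is just a table lookup.
f_xy_table = np.array([
    [0.10, 0.20, 0.10],
    [0.25, 0.15, 0.20],
])
print("P(X=1, Y=2) =", f_xy_table[1, 2])  # 0.20, read directly from the pmf

# Continuous (statement 6): integrating the assumed density exp(-x - y) over the
# degenerate region {(a, b)} gives zero, even though the density there is positive.
a, b = 0.5, 0.5
density = np.exp(-a - b)
point_prob, _ = dblquad(lambda y, x: np.exp(-x - y), a, a, b, b)
print("f(a,b)      =", density)     # ~0.368 > 0
print("P(X=a,Y=b)  =", point_prob)  # 0.0
```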

7) If the support (where \(f_{XY}>0\)) is non-rectangular, \(X\) and \(Y\) cannot be independent.
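
One way to see this claim in a small case: a made-up pmf supported only where \(x\le y\) (a triangular, non-rectangular support) has zero joint mass at a point where both marginals are positive:

```python
import numpy as np

# Hypothetical pmf supported only where x <= y.
# Rows index x in {0, 1, 2}; columns index y in {0, 1, 2}.
f_xy = np.array([
    [1/6, 1/6, 1/6],
    [0.0, 1/6, 1/6],
    [0.0, 0.0, 1/6],
])
f_x = f_xy.sum(axis=1)
f_y = f_xy.sum(axis=0)

# At (x=2, y=0) the joint mass is 0, yet both marginals are positive there,
# so f_XY(2,0) != f_X(2) * f_Y(0) and independence fails.
print("f_XY(2,0)       =", f_xy[2, 0])       # 0.0
print("f_X(2) * f_Y(0) =", f_x[2] * f_y[0])  # 1/36 ≈ 0.0278 > 0
```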

8) Independence is equivalent to the product form for all valid pairs \((x,y)\):

$$f_{XY}(x,y)=f_X(x)\,f_Y(y)\ \ \text{for all }(x,y).$$

9) If the product form holds only for some \((x,y)\) but not all, we cannot claim independence.
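
A sketch covering statements 8 and 9: check the product form cell by cell for two made-up tables, one built as an outer product of its marginals and one where the product form holds only in part of the table:

```python
import numpy as np

def product_form_holds(f_xy):
    """Check f_XY(x,y) == f_X(x) * f_Y(y) cell by cell."""
    f_x = f_xy.sum(axis=1)
    f_y = f_xy.sum(axis=0)
    return np.isclose(f_xy, np.outer(f_x, f_y))

# Hypothetical independent table: built as an outer product, so every cell factors.
indep = np.outer([0.4, 0.6], [0.5, 0.3, 0.2])
print(product_form_holds(indep).all())  # True -> independent

# Hypothetical dependent table: the product form happens to hold in the first
# column but fails elsewhere, so independence cannot be claimed (statement 9).
dep = np.array([
    [0.1, 0.3, 0.1],
    [0.1, 0.1, 0.3],
])
checks = product_form_holds(dep)
print(checks)        # mixture of True and False
print(checks.all())  # False -> not independent
```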

10) Discrete expectation of a function of two variables:

$$E[g(X,Y)] = \sum_x \sum_y g(x,y)\,f_{XY}(x,y).$$
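
A sketch of the double sum, reusing the hypothetical table from above with the illustrative choice \(g(x,y)=(x-y)^2\):

```python
import numpy as np

# Rows index x in {0, 1}; columns index y in {0, 1, 2}.
f_xy = np.array([
    [0.10, 0.20, 0.10],
    [0.25, 0.15, 0.20],
])
x_vals = np.array([0, 1])
y_vals = np.array([0, 1, 2])

# E[g(X, Y)] as the double sum in statement 10, here with g(x, y) = (x - y)**2.
g = (x_vals[:, None] - y_vals[None, :]) ** 2
expectation = np.sum(g * f_xy)
print(expectation)  # ~1.05
```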

11) In general, \(E[XY]\neq E[X]E[Y]\) unless \(X\) and \(Y\) are independent.

$$E[XY]=E[X]\,E[Y]\ \ \text{iff}\ \ X\perp Y.$$

12) Covariance shortcut formula, with the sign and order of terms as written:

$$\operatorname{Cov}(X,Y)=E[XY]-E[X]E[Y].$$
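
A sketch relating statements 11 and 12 on the same hypothetical (and dependent) table: compute \(E[XY]\), \(E[X]E[Y]\), and the covariance from the shortcut formula:

```python
import numpy as np

# Rows index x in {0, 1}; columns index y in {0, 1, 2}.
f_xy = np.array([
    [0.10, 0.20, 0.10],
    [0.25, 0.15, 0.20],
])
x_vals = np.array([0, 1])
y_vals = np.array([0, 1, 2])
f_x = f_xy.sum(axis=1)
f_y = f_xy.sum(axis=0)

e_x = np.sum(x_vals * f_x)                      # 0.60
e_y = np.sum(y_vals * f_y)                      # 0.95
e_xy = np.sum(np.outer(x_vals, y_vals) * f_xy)  # 0.55

cov = e_xy - e_x * e_y                          # statement 12's formula
print("E[XY]     =", e_xy)       # 0.55
print("E[X]E[Y]  =", e_x * e_y)  # 0.57
print("Cov(X, Y) =", cov)        # ~-0.02, nonzero for this dependent table
```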

13) Correlation can achieve \(\pm 1\) under a perfect linear relationship.

$$-1\le \rho \le 1,\quad \rho=\pm1\ \text{is possible.}$$
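
A sketch with an exact linear relationship, assuming the made-up pair \(X\) uniform on \(\{0,1,2\}\) and \(Y=2X+3\), for which the upper bound \(\rho=+1\) is attained:

```python
import numpy as np

# Hypothetical perfectly linear pair: X uniform on {0, 1, 2} and Y = 2*X + 3.
x_vals = np.array([0, 1, 2])
p = np.full(3, 1/3)
y_vals = 2 * x_vals + 3

e_x, e_y = np.sum(x_vals * p), np.sum(y_vals * p)
cov = np.sum((x_vals - e_x) * (y_vals - e_y) * p)
sd_x = np.sqrt(np.sum((x_vals - e_x) ** 2 * p))
sd_y = np.sqrt(np.sum((y_vals - e_y) ** 2 * p))
rho = cov / (sd_x * sd_y)
print(rho)  # 1.0 (up to rounding): an exact increasing linear relation
```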

14) Zero covariance does not imply independence (in general).
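
The classic counterexample: \(X\) uniform on \(\{-1,0,1\}\) with \(Y=X^2\) is fully dependent yet has zero covariance:

```python
import numpy as np

# X uniform on {-1, 0, 1} and Y = X**2.
# Y is a deterministic function of X (clearly dependent), yet Cov(X, Y) = 0.
x_vals = np.array([-1, 0, 1])
p = np.full(3, 1/3)
y_vals = x_vals ** 2

e_x = np.sum(x_vals * p)            # 0
e_y = np.sum(y_vals * p)            # 2/3
e_xy = np.sum(x_vals * y_vals * p)  # E[X^3] = 0
print("Cov(X, Y) =", e_xy - e_x * e_y)  # 0.0, despite full dependence
```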

15) In a joint pmf table, columns sum to \(f_X(x)\) and rows sum to \(f_Y(y)\).
