Click an answer to see instant feedback. Score updates automatically.
1) The nonparametric bootstrap estimates uncertainty by resampling the observed data with replacement. (Sketch 1 after the quiz illustrates this.)
2) In a nonparametric bootstrap, resamples are drawn without replacement.
3) A parametric bootstrap assumes a model (e.g., Normal) and plugs in parameters estimated from the sample. (See sketch 2 after the quiz.)
4) Increasing the number of bootstrap resamples B reduces the estimator’s true sampling variability, even if the original sample size n is fixed. (See the note after sketch 1.)
5) With small n and skewed data, a normal-theory (t) CI is always better than a bootstrap percentile CI.
6) The bootstrap cannot be used to estimate the standard error of the median.
7) For a two-sample mean difference, “label-preserving” bootstrap means resampling within each group separately (A from A, B from B). (See sketch 3 after the quiz.)
8) A 95% bootstrap percentile CI takes the 2.5th and 97.5th percentiles of the bootstrap statistics.
9) In simple linear regression, MLE gives the same slope and intercept as OLS when errors are i.i.d. Normal with constant variance. (See sketch 4 after the quiz.)
10) OLS and MLE will coincide even with heavy-tailed or heteroskedastic errors.
11) Maximum likelihood means “choose parameters that make the observed data most likely” under the assumed model.
12) If regression errors are Normal, minimizing SSE is equivalent to maximizing the likelihood over the line’s parameters.
13) Bootstrap results are identical across runs regardless of the random seed.
14) The nonparametric bootstrap approximates the population by a Normal distribution with mean x̄ and SD s.
15) To use the bootstrap you must have a closed-form formula for your statistic’s standard error.
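Sketch 1: a minimal nonparametric bootstrap in Python, assuming NumPy; the skewed toy sample, B = 2000, and the seed are illustrative choices, not part of the quiz. It resamples with replacement, estimates the standard error of the median (a statistic with no simple closed-form SE), and builds a 95% percentile CI from the 2.5th and 97.5th percentiles of the bootstrap statistics.

```python
import numpy as np

rng = np.random.default_rng(42)          # fixed seed: without one, results vary run to run (question 13)

x = rng.exponential(scale=2.0, size=30)  # small, skewed toy sample (illustrative)
B = 2000                                 # number of bootstrap resamples

# Resample WITH replacement, each resample the same size n as the original (questions 1-2).
boot_medians = np.array([
    np.median(rng.choice(x, size=x.size, replace=True))
    for _ in range(B)
])

se_median = boot_medians.std(ddof=1)               # bootstrap SE of the median (question 6)
lo, hi = np.percentile(boot_medians, [2.5, 97.5])  # 95% percentile CI (question 8)
print(f"median={np.median(x):.3f}  SE~{se_median:.3f}  95% CI=({lo:.3f}, {hi:.3f})")
```

Note on question 4: raising B shrinks only the Monte Carlo noise in these bootstrap summaries; the estimator’s true sampling variability is governed by the original sample size n.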
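Sketch 2: a parametric bootstrap under an assumed Normal model; the toy data and parameter values are made up for illustration. The mean and SD estimated from the sample are plugged into the model, and resamples are simulated from that fitted model rather than drawn from the observed data.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(loc=5.0, scale=2.0, size=25)   # toy sample (illustrative)

mu_hat, sd_hat = x.mean(), x.std(ddof=1)      # parameters estimated from the sample
B = 2000

# Simulate each resample from the FITTED Normal model, not from the observed data.
boot_means = np.array([
    rng.normal(loc=mu_hat, scale=sd_hat, size=x.size).mean()
    for _ in range(B)
])
print("parametric-bootstrap SE of the mean:", boot_means.std(ddof=1))
```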
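Sketch 3: a label-preserving two-sample bootstrap for a mean difference; the group sizes and distributions are invented for the example. Each bootstrap replicate resamples A-values from group A and B-values from group B, so the group labels and group sizes are preserved rather than pooled.

```python
import numpy as np

rng = np.random.default_rng(0)
grp_a = rng.normal(10.0, 3.0, size=20)   # group A (toy data)
grp_b = rng.normal(12.0, 3.0, size=25)   # group B (toy data)
B = 2000

# Label-preserving: resample within each group separately, never across groups.
boot_diffs = np.array([
    rng.choice(grp_a, size=grp_a.size, replace=True).mean()
    - rng.choice(grp_b, size=grp_b.size, replace=True).mean()
    for _ in range(B)
])
lo, hi = np.percentile(boot_diffs, [2.5, 97.5])
print(f"95% percentile CI for mean(A) - mean(B): ({lo:.3f}, {hi:.3f})")
```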
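Sketch 4: a numerical check that Gaussian MLE reproduces OLS in simple linear regression, assuming SciPy is available; the simulated line and starting values are arbitrary. Minimizing the negative Normal log-likelihood is equivalent to minimizing the SSE in the slope and intercept, so the two fits should agree to optimizer tolerance.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, size=50)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, size=50)   # toy line with i.i.d. Normal errors

slope_ols, intercept_ols = np.polyfit(x, y, deg=1)  # closed-form OLS fit

def neg_log_lik(params):
    """Negative Normal log-likelihood (constants dropped); minimizing it maximizes the likelihood."""
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)                       # parameterize by log(sigma) to keep sigma > 0
    resid = y - (b0 + b1 * x)
    return 0.5 * np.sum(resid**2) / sigma**2 + y.size * np.log(sigma)

fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0])
print("OLS intercept/slope:", intercept_ols, slope_ols)
print("MLE intercept/slope:", fit.x[0], fit.x[1])   # matches OLS under i.i.d. Normal errors
```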