Stat 4382 Practice Exam 1 Answers And Solutions
Stat 4382, Practice Version of Exam 1: answers and solutions by Yuly Koshevnik. The problem set comprises six detailed questions covering Markov chains, joint and marginal densities, expected values, variances, and distributions arising from various stochastic processes. Key concepts include transition probability matrices, stationary distributions, conditional expectations, absorbing Markov chains, mixed-type distributions, and properties of the Poisson and Gamma distributions. The solutions apply formulas for stationary distributions, transition matrices, conditional densities, and moments, requiring analytical calculation and fundamentals of probability theory.
The first problem deals with a Markov chain tracking the number of umbrellas at the professor's residence, asking for its transition matrix, stationary distribution, and long-run expected number of umbrellas. It covers the probability that the professor carries an umbrella on a given walk, under the assumptions that rain occurs independently on each trip and that umbrellas are moved according to the stated rules. The transition probability matrix P assigns probabilities based on rain occurrence and the current umbrella count. Solving the balance equations πP = π together with the normalization condition yields the stationary distribution π = (π1, π2, π3, π4) with π1 = π2 = π3 = 1/(3 + q) and π4 = q/(3 + q), which sums to 1 as required. The probability that the professor carries an umbrella on a random trip combines the rain probability with these stationary probabilities, and the long-run expected number of umbrellas is the weighted sum Σ i·πi over the stationary distribution.
The second problem involves the joint density of continuous variables X and Y with the specified exponential structure. Marginal densities are computed by integrating over the appropriate bounds, giving X ~ Gamma(1, 1/5) (that is, Exponential with mean 1/5) and Y ~ Gamma(2, 1/5). The conditional densities f_{X|Y} and f_{Y|X} show that, given Y = y, X is uniform on [0, y], while given X = x, Y is x plus an independent Exponential(mean 1/5) increment. These densities yield E[X | Y = y] = y/2 and E[Y | X = x] = x + 1/5.
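The stated marginals and conditional means are consistent with the joint density f(x, y) = 25·e^{−5y} on 0 < x < y, reading 1/5 as the Gamma scale parameter; under that assumption, a quick Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Under f(x, y) = 25*exp(-5y) on 0 < x < y, X ~ Exponential(mean 1/5)
# and Y - X ~ Exponential(mean 1/5), independent of X.
x = rng.exponential(scale=0.2, size=n)
y = x + rng.exponential(scale=0.2, size=n)

# Marginal means: E[X] = 1/5 and E[Y] = 2/5 (Y ~ Gamma(2, scale 1/5)).
mean_x, mean_y = x.mean(), y.mean()

# E[X | Y = y] = y/2: average X over samples with Y in a narrow band near 0.5.
band = (y > 0.48) & (y < 0.52)
cond_mean = x[band].mean()   # should be close to 0.5 / 2 = 0.25
```

The band-conditioning trick is a crude but serviceable stand-in for the exact conditional density f_{X|Y}.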
The third problem centers on a hierarchical model with a density for Q and a conditional binomial for N given Q. The distribution of Q is Beta(2, 1), with density 2q on [0, 1], mean 2/3, and variance 1/18. Since N | Q = q is Binomial(3, q), the unconditional expectation is E[N] = 3·E[Q] = 2. The law of total variance gives Var[N] = E[3Q(1 − Q)] + Var[3Q] = 1/2 + 1/2 = 1. The conditional density of Q given N = 1 follows from Bayes' rule, revealing a Beta(3, 3) distribution for Q | N = 1.
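A simulation sketch of this hierarchy, with the prior taken as Beta(2, 1) (the distribution matching the stated mean 2/3 and variance 1/18):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hierarchy: Q ~ Beta(2, 1) (density 2q on [0, 1]), then N | Q = q ~ Binomial(3, q).
q = rng.beta(2.0, 1.0, size=n)
N = rng.binomial(3, q)

mean_N = N.mean()   # law of total expectation: E[N] = 3 * E[Q] = 2
var_N = N.var()     # law of total variance: Var[N] = 1/2 + 1/2 = 1

# Posterior check: Q | N = 1 should be Beta(3, 3), whose mean is 1/2.
post_mean = q[N == 1].mean()
```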
The fourth problem features a compound sum S = sum_{k=1}^{2N+1} X_k, where N follows a geometric distribution with parameter q = 1/3 (counting failures before the first success) and the X_k are i.i.d. Gamma(2, 0.5). Here E[N] = 2, Var[N] = 6, E[X_k] = 1, and Var[X_k] = 0.5, so the random number of summands satisfies E[2N + 1] = 5 and Var[2N + 1] = 24. The compound-sum formulas then give E[S] = E[2N + 1]·E[X_k] = 5 and Var[S] = E[2N + 1]·Var[X_k] + Var[2N + 1]·E[X_k]^2 = 2.5 + 24 = 26.5.
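The compound-sum moments can be verified by simulation. A sketch, assuming N counts failures before the first success (so E[N] = 2 for q = 1/3) and Gamma(2, 0.5) means shape 2 with scale 0.5:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# N = number of failures before the first success, success probability 1/3,
# so E[N] = 2 and Var[N] = 6. (rng.geometric counts trials, hence the -1.)
N = rng.geometric(1.0 / 3.0, size=n) - 1
M = 2 * N + 1   # random number of summands, always odd and >= 1

# A sum of M i.i.d. Gamma(2, scale 0.5) terms is Gamma(2M, scale 0.5),
# so each compound sum can be drawn in a single vectorized call.
S = rng.gamma(shape=2.0 * M, scale=0.5)

mean_S = S.mean()   # E[S] = E[2N+1] * E[X] = 5 * 1 = 5
var_S = S.var()     # Var[S] = E[2N+1]*Var[X] + Var[2N+1]*E[X]^2 = 2.5 + 24 = 26.5
```

Collapsing the inner sum into a single Gamma draw uses the closure of the Gamma family under summation with a common scale; it changes nothing statistically but avoids a Python-level loop.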
Problem five examines a three-state Markov chain in which state 0 stays at 0 or moves to state 1 with probability 1/2 each, state 1 moves to state 0 or state 2 with probability 1/2 each, and state 2 is absorbing. The transition matrix is squared and cubed to find the probabilities of occupying particular states after specific numbers of steps. The conditional probability P[X_3 = 1 | X_0 = 0] is the entry (P^3)_{0,1} = 1/4, and by time homogeneity P[X_5 = 1 | X_2 = 1] = (P^3)_{1,1} = 1/8. If X_1 is uniformly distributed over {0, 1, 2}, the law of total probability gives P[X_2 = 1] = (1/3)(P_{0,1} + P_{1,1} + P_{2,1}) = (1/3)(1/2) = 1/6.
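The multi-step probabilities follow directly from matrix powers. A sketch, assuming the transition matrix below (the chain consistent with the reported answers (P^3)_{1,1} = 1/8 and P[X_2 = 1] = 1/6):

```python
import numpy as np

# Assumed chain: state 0 stays or moves to 1 (1/2 each); state 1 moves to
# 0 or 2 (1/2 each); state 2 is absorbing.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5],
    [0.0, 0.0, 1.0],
])

P3 = np.linalg.matrix_power(P, 3)
p_x3_given_x0 = P3[0, 1]   # P[X_3 = 1 | X_0 = 0] = 1/4
p_x5_given_x2 = P3[1, 1]   # P[X_5 = 1 | X_2 = 1] = 1/8, by time homogeneity

# X_1 uniform on {0, 1, 2}: P[X_2 = 1] averages the middle column of P.
p_x2_eq_1 = P[:, 1].mean()   # = 1/6
```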
The sixth problem involves a variable W with an exponential density (rate λ = 3) and a Poisson distribution for N | W = w. Integrating W out shows that the marginal distribution of N is geometric with success parameter λ/(1 + λ) = 0.75. The conditional density of W given N = n is Gamma(n + 1, 1/4), i.e. shape n + 1 with scale 1/4, with posterior mean E[W | N = n] = (n + 1)/4. Unconditionally, E[N] = E[E[N | W]] = E[W] = (1 − 0.75)/0.75 = 1/3, and the posterior mean E[W | N = n] supplies the Bayesian point estimate, illustrating the hierarchical structure. The problem thus ties together the moments of W and N, emphasizing the connection between hierarchical and Bayesian models for paired continuous and discrete random variables.
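The stated answers pin down the exponential rate: a geometric parameter of 3/4 = λ/(1 + λ) and a posterior Gamma(n + 1, scale 1/4) both require λ = 3. A simulation sketch under that assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# W ~ Exponential(rate 3), then N | W = w ~ Poisson(w).
w = rng.exponential(scale=1.0 / 3.0, size=n)
N = rng.poisson(w)

p_zero = (N == 0).mean()      # geometric success parameter: P[N = 0] = 3/4
mean_N = N.mean()             # E[N] = E[W] = 1/3
post_mean = w[N == 1].mean()  # E[W | N = 1] = (1 + 1)/4 = 1/2
```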