# AMS207, Spring 2014, Section 01: Homework

1. Obtain the mean and the variance of the beta-binomial distribution and show that it addresses the overdispersion problem. Hint: use the formulas for conditional (iterated) expectations and variances.
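
    As a check on the derivation, iterated expectation and variance give the following, writing $\mu = \alpha/(\alpha+\beta)$ for a $\mathrm{Beta}(\alpha,\beta)$ prior on $p$ and $y \mid p \sim \mathrm{Bin}(n,p)$:

    ```latex
    E[y] = E\{E[y \mid p]\} = n\,E[p] = n\mu,
    \qquad
    \mathrm{Var}(y) = E\{\mathrm{Var}(y \mid p)\} + \mathrm{Var}\{E[y \mid p]\}
                    = n\mu(1-\mu)\left[1 + \frac{n-1}{\alpha+\beta+1}\right].
    ```

    Since the bracketed factor exceeds 1 whenever $n > 1$, the variance is strictly larger than the binomial variance $n\mu(1-\mu)$, which is the overdispersion claim.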
2. Obtain the Laplace approximation to the posterior expectations of logit(mu) and log(tau) in the cancer mortality rate example (data available in the LearnBayes package).
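
    A minimal sketch of the mechanics, assuming a generic log-posterior: the cancermortality data from LearnBayes are not reproduced here, so `log_post` below is a made-up bivariate normal stand-in used only to exercise the code. The first-order approximation returns the mode and inverse Hessian; the exercise's Tierney-Kadane (fully exponential) form refines the expectation beyond this.

    ```python
    # Sketch: first-order Laplace approximation to a posterior mean/covariance.
    # `log_post` is a placeholder density (assumed, not the real cancermortality
    # posterior) in the transformed parameters (logit(mu), log(tau)).
    import numpy as np
    from scipy import optimize

    def log_post(theta):
        # stand-in for the beta-binomial log posterior; mean/precision are made up
        m = np.array([-7.0, 8.0])
        prec = np.array([[2.0, 0.5], [0.5, 1.0]])
        d = theta - m
        return -0.5 * d @ prec @ d

    def laplace(log_post, start):
        res = optimize.minimize(lambda t: -log_post(t), start)
        mode = res.x
        # numerical Hessian of -log_post at the mode (central differences)
        k, h = len(mode), 1e-5
        H = np.zeros((k, k))
        for i in range(k):
            for j in range(k):
                ei, ej = np.eye(k)[i] * h, np.eye(k)[j] * h
                H[i, j] = (-log_post(mode + ei + ej) + log_post(mode + ei - ej)
                           + log_post(mode - ei + ej) - log_post(mode - ei - ej)) / (4 * h * h)
        return mode, np.linalg.inv(H)   # approximate posterior mean and covariance

    mode, cov = laplace(log_post, np.zeros(2))
    ```
    
    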
3. Obtain the results for the parameters of the posterior distribution in the multivariate normal-inverse-Wishart model on page 70 of the book.
4. Redo the Iris Sepal example.
5. Problems 3.1, 3.2, 3.5; 4.1, 4.2
6. Problems 5.7, 5.8, 5.10, 5.12
7. Repeat the SAT example with:
    1. Direct sampling from the posterior
    2. Gibbs sampling from the posterior
    3. The Abrams and Sansó (1998) approximations for the posterior moments
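
    The Gibbs option can be sketched as follows. For conjugacy this sketch puts a vague InvGamma(a, b) prior on tau^2 (an assumption; BDA's analysis uses a uniform prior on tau) and a flat prior on mu; the data are the SAT coaching values from BDA Section 5.5.

    ```python
    # Sketch of a Gibbs sampler for the SAT (8 schools) hierarchical model:
    # y_j ~ N(theta_j, sigma_j^2) with sigma_j known, theta_j ~ N(mu, tau^2).
    import numpy as np

    rng = np.random.default_rng(1)
    y = np.array([28., 8., -3., 7., -1., 1., 18., 12.])       # BDA Sec. 5.5 data
    sigma = np.array([15., 10., 16., 11., 9., 11., 10., 18.])
    J, a, b = len(y), 1.0, 1.0                                # a, b assumed

    mu, tau2 = 0.0, 10.0
    draws = []
    for it in range(5000):
        # theta_j | mu, tau2, y : precision-weighted normal
        prec = 1 / sigma**2 + 1 / tau2
        theta = rng.normal((y / sigma**2 + mu / tau2) / prec, np.sqrt(1 / prec))
        # mu | theta, tau2 : normal, flat prior
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))
        # tau2 | theta, mu : inverse-gamma by conjugacy
        tau2 = 1 / rng.gamma(a + J / 2, 1 / (b + 0.5 * np.sum((theta - mu)**2)))
        if it >= 1000:
            draws.append(mu)
    print(np.mean(draws))   # posterior mean of mu under this prior
    ```
    
    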
8. Problems 5.14, 5.15
9. Write the Bayes factor, BIC, and DIC to compare a model where n observations are assumed to follow a Poisson distribution with a gamma prior against a model where the observations follow a binomial distribution with a known number of trials and a beta prior on the probability of success.
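
    By conjugacy, both marginal likelihoods are available in closed form; with hypothetical hyperparameters $\mathrm{Gamma}(a, b)$ for the Poisson rate and $\mathrm{Beta}(\alpha, \beta)$ for the success probability ($N$ trials per observation), the Bayes factor is the ratio of

    ```latex
    m_1(y) = \frac{b^a \,\Gamma\!\left(a + \sum_i y_i\right)}
                  {\Gamma(a)\,(b+n)^{a+\sum_i y_i}\,\prod_i y_i!},
    \qquad
    m_2(y) = \prod_i \binom{N}{y_i}\;
             \frac{B\!\left(\alpha + \sum_i y_i,\; \beta + nN - \sum_i y_i\right)}{B(\alpha,\beta)}.
    ```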
10. Consider the SAT example; use the DIC to compare the models with no pooling, complete pooling, and partial pooling based on a hierarchical model with unknown variance.
11. Problems 6.2, 6.6; 7.4, 7.5
12. Problems 8.11, 8.14
13. For each of the examples considered in class regarding the censored and truncated weights data, develop an approach based on MCMC with auxiliary variables and write the full conditionals.
14. Consider the following data on the heights, in inches, of male students at a college. Assume that the heights are normally distributed, and assume a non-informative prior distribution for the parameters of the normal.

    | Interval (in) | Count |
    |---------------|-------|
    | < 66          | 14    |
    | 66 to 68      | 30    |
    | 68 to 70      | 49    |
    | 70 to 72      | 70    |
    | 72 to 74      | 33    |
    | > 74          | 15    |
    1. Use Metropolis-Hastings to estimate the parameters of the normal using a multinomial likelihood.
    2. Introduce latent variables and use Gibbs sampling to do the estimation. Compare the two approaches.
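
    The Metropolis-Hastings part can be sketched as below, with multinomial cell probabilities from the normal CDF and a flat prior on (mu, log sigma); the random-walk step sizes are assumptions tuned roughly to the sample size.

    ```python
    # Sketch: random-walk Metropolis-Hastings for the grouped heights data,
    # multinomial likelihood with bin probabilities from the normal CDF.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    cuts = np.array([-np.inf, 66, 68, 70, 72, 74, np.inf])
    counts = np.array([14, 30, 49, 70, 33, 15])

    def log_post(mu, log_sig):
        p = np.diff(norm.cdf(cuts, mu, np.exp(log_sig)))
        return np.sum(counts * np.log(p))   # multinomial kernel, flat prior

    mu, ls = 70.0, 1.0
    cur = log_post(mu, ls)
    keep = []
    for it in range(20000):
        mu_p, ls_p = mu + 0.1 * rng.normal(), ls + 0.05 * rng.normal()
        prop = log_post(mu_p, ls_p)
        if np.log(rng.uniform()) < prop - cur:      # accept/reject step
            mu, ls, cur = mu_p, ls_p, prop
        if it >= 2000:
            keep.append((mu, np.exp(ls)))
    mu_hat, sig_hat = np.mean(keep, axis=0)         # posterior means
    ```
    
    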
15. Problems 13.5, 13.6
16. Consider an exponential sample subject to right censoring. Use the EM algorithm to find the mode of the posterior distribution of the mean.
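
    A sketch of the EM iteration under a flat prior, where the E-step uses the memorylessness of the exponential, E[x | x > c] = c + theta; the data below are made up for illustration.

    ```python
    # Sketch of EM for a right-censored exponential sample with mean theta.
    import numpy as np

    x_obs = np.array([0.5, 1.2, 0.3, 2.1, 0.9])   # fully observed lifetimes (made up)
    c = np.array([1.0, 1.5, 2.0])                  # right-censoring times (made up)
    n = len(x_obs) + len(c)

    theta = 1.0
    for _ in range(200):
        filled = c + theta                          # E-step: expected censored lifetimes
        theta = (x_obs.sum() + filled.sum()) / n    # M-step: complete-data MLE
    # the fixed point is the closed-form answer: total exposure / observed events
    print(theta)   # -> (x_obs.sum() + c.sum()) / len(x_obs) = 9.5 / 5 = 1.9
    ```
    
    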
17. Find the marginal distribution of the regression coefficients in a linear normal model.
18. Show that the posterior predictive distribution of a new observation corresponding to a regression model is a Student-t distribution, as indicated in the slides.
19. Show that the posterior distribution of the regression parameters using an informative prior is the same when the quadratic forms are completed directly and when the prior is treated as additional data.
20. Consider a normal linear model where the errors have a covariance matrix that is a multiple of the identity. Is it possible to obtain a conjugate prior for the regression coefficients and the variance? If so, find the posterior distribution.
21. Obtain the expressions for the marginal distributions of the data and the Bayes factors for linear models using g-priors.
22. Use the EM algorithm to estimate the mode of the posterior distribution of the location and scale parameters in a model where the observations follow a Student-t distribution with fixed degrees of freedom and the prior is non-informative.
23. Show that a gamma-Poisson (negative binomial) likelihood tackles the overdispersion problem in models for Poisson count data.
24. Problems 17.2, 17.3
25. Consider data corresponding to a mixture of M exponential densities. Consider appropriate conjugate priors for the parameters of the model. Find the full conditionals needed to explore the posterior distribution using a Gibbs sampler.
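
    One hedged sketch of the answer, introducing latent labels $z_i \in \{1,\dots,M\}$, weights $w \sim \mathrm{Dirichlet}(\alpha_1,\dots,\alpha_M)$, and rates $\lambda_m \sim \mathrm{Gamma}(a_m, b_m)$ (the hyperparameters are illustrative):

    ```latex
    P(z_i = m \mid \cdot) \propto w_m\,\lambda_m e^{-\lambda_m x_i},
    \qquad
    w \mid \cdot \sim \mathrm{Dirichlet}(\alpha_1 + n_1, \dots, \alpha_M + n_M),
    \qquad
    \lambda_m \mid \cdot \sim \mathrm{Gamma}\Big(a_m + n_m,\; b_m + \textstyle\sum_{i:\, z_i = m} x_i\Big),
    \quad n_m = \#\{i : z_i = m\}.
    ```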