This course is an introduction to applied Bayesian data analysis. We will cover the probability theory necessary to understand Bayes' rule and to derive posterior distributions for simple models. We will discuss MCMC approaches to estimating posteriors, including Gibbs sampling and the basics of Hamiltonian Monte Carlo. In the latter half of the course, we will implement and explore a number of variations of generalized linear models, ranging in complexity from t-tests to hierarchical ordinal regression.
Restrictions: May not be enrolled in one of the following levels: Undergraduate. Must be enrolled in one of the following programs: Human Factors/Ind Psych - PHD.