Bayesian Statistics (BYS)

Bayesian statistics consists of data-analysis methods derived from the principle of Bayesian inference, that is, the process of inductive learning via Bayes' rule. Whereas frequentist inference relies on the data alone, the Bayesian paradigm combines information from the data with expert judgement, giving a more comprehensive picture. We often use probabilities informally to express our information and beliefs about unknown quantities; the Bayesian approach makes this uncertainty formal within the modelling process. The Bayesian approach can therefore be viewed as more flexible and realistic than the frequentist approach for representing and modelling the phenomenon of interest.
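To make the prior-to-posterior update concrete, here is a minimal sketch in Python (using SciPy and entirely hypothetical counts, not material from the module): a Beta prior for a success probability combined with Binomial data yields a Beta posterior by conjugacy.

```python
# Minimal sketch with hypothetical numbers: Beta prior + Binomial data -> Beta posterior.
from scipy import stats

a, b = 2, 2            # assumed Beta(2, 2) prior on the success probability theta
y, n = 14, 20          # hypothetical data: y successes in n trials

# By conjugacy, the posterior is Beta(a + y, b + n - y).
posterior = stats.beta(a + y, b + n - y)

print("Posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```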

Prerequisites: GLM, SFDS, ADM, MFDS

Audience:

Objectives/Content:

Upon finishing this module, participants are expected to achieve the following learning outcomes:

  1. Participants understand the principle of Bayesian inference and the role of each component of the Bayesian approach: the prior distribution, the likelihood function, and the posterior distribution.
  2. Participants are able to determine an appropriate distribution for the prior specification given particular data/models, and are able to analyze the impact of the chosen prior on the inference.
  3. Participants can perform MCMC simulation using the Gibbs and Metropolis-Hastings sampling algorithms (a minimal Metropolis-Hastings sketch follows this list).
  4. Participants can employ diagnostic tools to evaluate the convergence of the MCMC simulation and assess the goodness of fit of the model.
  5. Participants understand and can apply the normal model, linear regression, generalized linear models, hierarchical models, and clustering with finite mixture models using the Bayesian approach.
  6. Participants can compare the standard (frequentist) approach and the Bayesian approach, and are able to decide which approach is preferable under given conditions/data/case studies.
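As a companion to outcome 3, the sketch below shows a random-walk Metropolis-Hastings sampler in plain NumPy; the target log-density (a standard normal) and all settings are illustrative assumptions, not the module's reference implementation.

```python
import numpy as np

def log_target(theta):
    """Log-density of the (hypothetical) target, here a standard normal up to a constant."""
    return -0.5 * theta**2

def metropolis_hastings(n_iter=5000, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_iter)
    theta = 0.0                                   # arbitrary starting value
    for t in range(n_iter):
        proposal = theta + step * rng.normal()    # random-walk proposal
        # Accept with probability min(1, target(proposal) / target(current)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(theta):
            theta = proposal
        samples[t] = theta
    return samples

draws = metropolis_hastings()
print("Sample mean and standard deviation:", draws.mean(), draws.std())
```

Convergence of such a chain is then assessed with the diagnostics listed in outcome 4 (trace plots, autocorrelation plots, and so on).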

References:

  1. Hoff, Peter D. A First Course in Bayesian Statistical Methods. New York: Springer, 2009.
  2. Congdon, Peter. Applied Bayesian Modelling. John Wiley & Sons, 2014.
  3. Barber, David. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.
Topics (Topic ID, Topic Title, and Lessons):

BYS1: Introduction and examples
  - Bayes rule, probability theory: (joint, marginal, conditional) distributions
  - Some simple examples of Bayesian analysis in probability theory using hypothetical data
  - Some examples of Bayesian inference in data analysis and models, and their corresponding analysis in frequentist inference

BYS2: Random variables and distributions
  - Discrete: Binomial, Poisson, geometric, hypergeometric
  - Continuous: Normal, Gamma, Beta, Uniform

BYS3: Bayes rule, model building, and model diagnostics
  - Specification of prior distributions
  - Likelihood construction
  - Forming the posterior and MCMC for sampling from the posterior distribution (a minimal Gibbs-sampler sketch follows this table)
  - Model summary and diagnostics: credible intervals, trace plots, autocorrelation plots, Geweke tests, distribution plots, chain mixing

BYS4: BYS Grouped Discussions 1
  - Recap discussions of BYS 1-3

BYS5: The normal model
  - Model building and parameter estimation
  - Inference on the mean
  - The normal model for non-normal data

BYS6: Group comparisons and hierarchical modeling
  - Comparing two groups
  - Comparing multiple groups
  - The hierarchical normal model

BYS7: Linear regression
  - Model building
  - Bayesian estimation for a regression model
  - Model selection and model averaging

BYS8: Generalized linear model
  - Model building
  - Bayesian estimation for a generalized linear model
  - Model selection and model averaging

BYS9: BYS Grouped Discussions 2
  - Recap discussions of BYS 5-8

BYS10: Finite Mixture Model
  - Model building
  - Bayesian estimation
  - Model selection and group profiling
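As a companion to BYS3 and BYS5, the sketch below implements a Gibbs sampler for the semi-conjugate normal model with unknown mean and variance, in the spirit of Hoff (2009); the data, hyperparameter values, and chain settings are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data assumed to come from a normal model.
y = rng.normal(loc=10.0, scale=2.0, size=30)
n, ybar = y.size, y.mean()

# Assumed hyperparameters for the semi-conjugate prior.
mu0, tau0_sq = 0.0, 100.0          # theta ~ Normal(mu0, tau0_sq)
nu0, sigma0_sq = 1.0, 1.0          # 1/sigma^2 ~ Gamma(nu0/2, rate = nu0*sigma0_sq/2)

n_iter = 5000
theta_draws = np.empty(n_iter)
sigma_sq_draws = np.empty(n_iter)

theta, sigma_sq = ybar, y.var()    # starting values

for s in range(n_iter):
    # Full conditional of theta given sigma^2: a normal distribution.
    tau_n_sq = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    mu_n = tau_n_sq * (mu0 / tau0_sq + n * ybar / sigma_sq)
    theta = rng.normal(mu_n, np.sqrt(tau_n_sq))

    # Full conditional of the precision 1/sigma^2 given theta: a gamma distribution.
    shape = 0.5 * (nu0 + n)
    rate = 0.5 * (nu0 * sigma0_sq + np.sum((y - theta) ** 2))
    sigma_sq = 1.0 / rng.gamma(shape, 1.0 / rate)   # NumPy's gamma takes scale = 1/rate

    theta_draws[s], sigma_sq_draws[s] = theta, sigma_sq

# Discard an initial burn-in before summarizing the posterior.
print("Posterior mean of theta:", theta_draws[1000:].mean())
print("95% credible interval:", np.quantile(theta_draws[1000:], [0.025, 0.975]))
```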