Bayesian statistics consists of methods for data analysis derived from the principle of Bayesian inference, that is, inductive learning via Bayes' rule. Whereas frequentist inference relies on the data alone, the Bayesian paradigm combines information from the data with expert judgement, yielding a more comprehensive picture. We often use probabilities informally to express our information and beliefs about unknown quantities; the Bayesian approach makes this uncertainty formal within the modelling process. Bayesian methods can therefore be viewed as a more flexible and realistic way, compared with the frequentist approach, of representing and modelling the phenomenon of interest.
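As a minimal illustration of updating a prior belief with data via Bayes' rule, the sketch below uses the standard Beta-Binomial conjugate pair on hypothetical data (the prior parameters and observed counts are invented for the example, not taken from the module):

```python
# Beta-Binomial conjugate updating (hypothetical numbers for illustration).
# A Beta(a, b) prior on a success probability theta, combined with k successes
# in n Bernoulli trials, gives a Beta(a + k, b + n - k) posterior.

def posterior_params(a, b, k, n):
    """Return the Beta posterior parameters after observing k successes in n trials."""
    return a + k, b + n - k

# Weak prior Beta(2, 2) centred at 0.5; data: 7 successes in 10 trials.
a_post, b_post = posterior_params(2, 2, 7, 10)
post_mean = a_post / (a_post + b_post)  # posterior mean of theta
print(a_post, b_post, round(post_mean, 3))  # 9 5 0.643
```

The posterior mean (about 0.643) sits between the prior mean (0.5) and the sample proportion (0.7), showing how the two sources of information are blended.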

**Prerequisites**: GLM, SFDS, ADM, MFDS

**Audience**:

**Objectives/Content**:

Upon finishing this module, the expected learning outcomes are:

- Participants understand the principle of Bayesian inference and the role of each component of the Bayesian approach: the prior distribution, the likelihood function, and the posterior distribution.
- Participants are able to determine a proper distribution for the prior specification given data/models, and are able to analyze the impact of the chosen distribution on the inference.
- Participants can perform MCMC simulation using the Gibbs and Metropolis-Hastings sampling algorithms.
- Participants can employ diagnostic tools to evaluate the convergence of an MCMC simulation and assess the goodness of fit of the model.
- Participants understand and can apply the normal model, linear regression, generalized linear models, hierarchical models, and clustering with finite mixture models using the Bayesian approach.
- Participants can compare the standard (frequentist) approach with the Bayesian approach, and are able to decide which approach is preferable given particular conditions/data/case studies.
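To give a concrete flavour of the Metropolis-Hastings algorithm named in the outcomes above, here is a minimal random-walk sketch targeting a standard normal distribution (the target, step size, and seed are chosen purely for illustration):

```python
# Minimal random-walk Metropolis-Hastings sketch (illustrative only):
# draw samples from a standard normal target N(0, 1).
import math
import random

def log_target(x):
    return -0.5 * x * x  # log-density of N(0, 1), up to an additive constant

def metropolis_hastings(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)          # symmetric random-walk proposal
        log_alpha = log_target(proposal) - log_target(x)
        if rng.random() < math.exp(min(0.0, log_alpha)):  # accept with prob min(1, ratio)
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(5000)
mean = sum(samples) / len(samples)  # should be close to 0 for the N(0, 1) target
```

Because the proposal is symmetric, the Hastings correction cancels and only the ratio of target densities enters the acceptance probability; a Gibbs sampler would instead draw each parameter exactly from its full conditional distribution.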

**Reference:**

- Hoff, Peter D. *A First Course in Bayesian Statistical Methods*. Vol. 580. New York: Springer, 2009.
- Congdon, Peter. *Applied Bayesian Modelling*. Vol. 595. John Wiley & Sons, 2014.
- Barber, D. *Bayesian Reasoning and Machine Learning*. Cambridge University Press, 2012.

| Topic ID | Topic Title | Lessons |
| --- | --- | --- |
| BYS1 | Introduction and examples | Bayes' rule; probability theory: joint, marginal, and conditional distributions; simple examples of Bayesian analysis in probability theory using hypothetical data; examples of Bayesian inference in data analysis and models, with their corresponding frequentist analyses |
| BYS2 | Random variables and distributions | Discrete: binomial, Poisson, geometric, hypergeometric; continuous: normal, gamma, beta, uniform |
| BYS3 | Bayes' rule, model building and model diagnostics | Specification of prior distributions; likelihood construction; forming the posterior and MCMC for sampling from the posterior distribution; model summary and diagnostics: credible intervals, trace plots, autocorrelation plots, Geweke tests, distribution plots, chain mixing |
| BYS4 | BYS Grouped Discussions 1 | Recap discussions of BYS 1-3 |
| BYS5 | The normal model | Model building and parameter estimation; inference on the mean; the normal model for non-normal data |
| BYS6 | Group comparisons and hierarchical modeling | Comparing two groups; comparing multiple groups; the hierarchical normal model |
| BYS7 | Linear regression | Model building; Bayesian estimation for a regression model; model selection and model averaging |
| BYS8 | Generalized linear model | Model building; Bayesian estimation for a generalized linear model; model selection and model averaging |
| BYS9 | BYS Grouped Discussions 2 | Recap discussions of BYS 5-8 |
| BYS10 | Finite Mixture Model | Model building; Bayesian estimation; model selection and group profiling |
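Two of the diagnostics listed under BYS3, credible intervals and autocorrelation, can be computed directly from a chain of posterior draws. The sketch below uses an artificial stand-in chain of independent normal draws, since no real MCMC output is given here:

```python
# Hedged sketch of two MCMC summaries named under BYS3: an equal-tailed
# 95% credible interval and the lag-1 autocorrelation of a chain.
import random

def credible_interval(chain, level=0.95):
    """Equal-tailed credible interval from the empirical quantiles of the draws."""
    s = sorted(chain)
    lo_idx = int((1 - level) / 2 * len(s))
    hi_idx = int((1 + level) / 2 * len(s)) - 1
    return s[lo_idx], s[hi_idx]

def lag1_autocorrelation(chain):
    """Lag-1 sample autocorrelation; values near 0 indicate good mixing."""
    n = len(chain)
    mean = sum(chain) / n
    var = sum((x - mean) ** 2 for x in chain) / n
    cov = sum((chain[i] - mean) * (chain[i + 1] - mean) for i in range(n - 1)) / n
    return cov / var

rng = random.Random(1)
chain = [rng.gauss(0.0, 1.0) for _ in range(10000)]  # stand-in for MCMC output
lo, hi = credible_interval(chain)   # close to (-1.96, 1.96) for N(0, 1) draws
rho = lag1_autocorrelation(chain)   # close to 0 for independent draws
```

For real MCMC output, high lag-1 autocorrelation signals slow mixing, and trace plots and Geweke tests (also listed under BYS3) would be inspected alongside these numbers.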
