Part III: Bayesian methods

The first two parts of this book present a broad toolkit of classical statistical methods for analysing continuous, categorical, and count data. In Part III, we step back from the classical framework and take a fresh look at statistical inference through the Bayesian lens. Rather than treating parameters as fixed but unknown quantities, Bayesian methods treat them as random variables, allowing us to quantify uncertainty about them directly with probability distributions.

Bayesian approaches have grown rapidly in influence across many areas of research because they provide a flexible and unified way to incorporate prior information, adapt models to complex data structures, and make probabilistic statements about quantities of interest. They are particularly powerful when data are sparse, models are hierarchical, or conventional methods are difficult to apply.

In this part, we devote a single chapter to introducing Bayesian reasoning and practice in a way that is accessible yet comprehensive enough to stand on its own. We begin with the fundamental building blocks — priors, likelihoods, and Bayes’s theorem — before moving to practical applications in regression models. You will learn how modern computational tools such as Markov chain Monte Carlo make Bayesian analysis possible for complex models.
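
As a brief preview of how these building blocks fit together (the notation here, with \(\theta\) for the parameters and \(y\) for the data, is simply one common convention), Bayes’s theorem combines the prior and the likelihood into the posterior distribution:

\[
p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{p(y)} \;\propto\; p(y \mid \theta)\, p(\theta).
\]

Here \(p(\theta)\) is the prior, \(p(y \mid \theta)\) is the likelihood, and \(p(\theta \mid y)\) is the posterior; the chapter develops each of these in turn.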

By presenting Bayesian methods in one self-contained chapter, this part is designed to give you a clear conceptual overview, a practical workflow, and a direct path to applying Bayesian analysis in your own research. Whether you choose to adopt the Bayesian approach as your default or to keep it as a complementary perspective alongside classical methods, you will have the knowledge to do so with confidence.