Priors : Initial Beliefs

fieldnotes
2 min read · Jun 16, 2021

In this post, we will be introducing the likelihood’s neighbor:

[Image: a version of Bayes Rule presented by Ben Lambert in his text A Student’s Guide to Bayesian Statistics]
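For reference, Bayes Rule in its parameter form (written here in standard notation, not necessarily Lambert’s exact typesetting) is:

```latex
p(\theta \mid \text{data}) = \frac{p(\text{data} \mid \theta)\, p(\theta)}{p(\text{data})}
```

The prior, p(θ), is the one ingredient on the right-hand side that we must supply before seeing any data.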

Priors. They are exactly what they sound like: they represent our most current, basic understanding and interpretation of a phenomenon.

Here are a few ways of understanding what a prior is. We will go through each one.

Different ways of describing “priors”

1) p(theta): a probability distribution over a parameter

Why do we need a probability distribution for a parameter?

If you remember, we are on the lookout for the optimal parameter value, so it makes sense to have many candidate values to choose from.

By placing a probability distribution over the parameter, we give ourselves a whole range of candidate parameter values to work with, rather than committing to a single point estimate up front.

With a probability distribution, we are also acknowledging that there is some uncertainty about the “true parameter value”.
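As a minimal sketch of this idea (my own example, not from Lambert’s text): a Beta(2, 2) prior over a coin’s bias theta assigns a density to every value between 0 and 1, peaking at 0.5 — a gentle belief that the coin is probably close to fair, with uncertainty around that.

```python
import math

def beta_pdf(theta, a, b):
    """Density of a Beta(a, b) prior at theta -- our belief about a coin's bias."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * theta ** (a - 1) * (1 - theta) ** (b - 1)

# A Beta(2, 2) prior: theta is probably near 0.5, but every value gets some weight.
for theta in [0.1, 0.5, 0.9]:
    print(f"p(theta = {theta}) = {beta_pdf(theta, 2, 2):.2f}")
```

Note that no single theta has probability 1 — the distribution itself is the belief.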

2) initial beliefs

Bayes Rule is a way for us to update our current beliefs using new data.

Without an initial belief, we cannot advance!

So in order to make the most out of incoming data, we must establish and specify an initial belief through a prior.
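To make this concrete (an illustrative example of my own, using the standard Beta–Binomial conjugate pair rather than anything specific from the post): start with a Beta(2, 2) prior on a coin’s bias, observe some flips, and the updated belief is again a Beta distribution.

```python
# Initial belief: Beta(2, 2) prior, centered on theta = 0.5.
a, b = 2, 2

# New data: 10 coin flips.
heads, tails = 7, 3

# Conjugate update: posterior is Beta(a + heads, b + tails).
a_post, b_post = a + heads, b + tails

prior_mean = a / (a + b)
post_mean = a_post / (a_post + b_post)
print(f"prior mean {prior_mean:.2f} -> posterior mean {post_mean:.2f}")
```

Without the initial (a, b), there would be nothing for the data to update — which is exactly the point of this section.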

3) weighting given to each parameter value

This will be explored further in the next couple of posts.

What do we mean by this?

Since the parameters are expressed through a distribution, each parameter value is essentially given an individual “weight” that contributes to the shape of the posterior distribution.
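A rough grid-approximation sketch of this weighting (my own illustration, again using an assumed coin-flip example): the posterior at each candidate theta is the likelihood multiplied by that theta’s prior weight, then normalized.

```python
# Candidate parameter values and their prior weights (flat prior here,
# so every theta starts with equal weight).
thetas = [i / 10 for i in range(1, 10)]
prior = [1 / len(thetas)] * len(thetas)

# Likelihood of observing 7 heads and 3 tails at each candidate theta.
heads, tails = 7, 3
likelihood = [t ** heads * (1 - t) ** tails for t in thetas]

# Posterior shape = likelihood weighted by the prior, normalized to sum to 1.
unnorm = [l * p for l, p in zip(likelihood, prior)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]

best = thetas[posterior.index(max(posterior))]
print(f"posterior peaks at theta = {best}")
```

Changing the prior weights (say, favoring values near 0.5) would reshape the posterior — that is the “weighting given to each parameter value” in action.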

Main takeaway: Priors are a key part of Bayes Rule, ensuring that our pre-existing knowledge and assumptions are taken into account when we construct updated beliefs from new data.

Research done from: Lambert, Ben. “Priors.” A Student’s Guide to Bayesian Statistics. SAGE, 2018.
