In this post, you will learn about the difference between Frequentist and Bayesian probability. It is of utmost importance to understand these concepts if you are getting started with Data Science.
What is Frequentist Probability?
The probability of an event, when calculated as the relative frequency with which events of that type occur, is called Frequentist probability. For example, the probability of rolling a die (numbered 1 to 6) and getting a 3 is a Frequentist probability. Another example is the probability of a head occurring as a result of tossing a coin. Note that Frequentist probabilities are calculated by repeating the experiment a large number of times and counting the fraction of trials in which the event of interest occurred.
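As a quick illustration of this repeated-experiment view, here is a minimal Python sketch (the function name and trial count are illustrative choices, not part of any standard API) that estimates the probability of rolling a 3 by simulation:

```python
import random

def estimate_probability(trials=100_000):
    """Estimate P(rolling a 3) on a fair six-sided die by repeated trials."""
    # Count how many rolls come up 3, then take the relative frequency.
    hits = sum(1 for _ in range(trials) if random.randint(1, 6) == 3)
    return hits / trials

print(estimate_probability())
```

As the number of trials grows, the relative frequency converges toward the theoretical value of 1/6 ≈ 0.1667.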
What is Bayesian Probability?
The probability of an event, when calculated from a degree of belief (based on prior knowledge), is called Bayesian probability. Such events are not of a repeatable kind. For example, suppose a civil engineer is asked about the likelihood of a flyover bridge crashing down in the coming rainy season. The civil engineer can speak about the chances based on his or her degree of belief (given the data available about the age of the bridge, the construction material used, and so on). This is because an event such as the collapse of a flyover bridge cannot be repeated multiple times (it would not make sense in the first place) in order to calculate its probability.
Mathematically, a Bayesian probability is calculated using the Bayes Rule formula, which determines how strongly a set of evidence supports a hypothesis. In other words, it is used to calculate the conditional probability of a hypothesis given a set of evidence. Given a hypothesis H and evidence E, how strongly the hypothesis H is supported by the evidence E can be calculated as P(H|E). The following is the formula of Bayes Rule:

P(H|E) = P(E|H) × P(H) / P(E)
In the above formula,
P(H|E) is the probability that the hypothesis H is true given that the evidence E happened (or, E is true). It is also termed the Posterior Probability of the hypothesis H.
P(H) is the probability of the hypothesis before learning about the evidence E. It is also called the Prior Probability of the hypothesis H.
P(E|H) is the likelihood that the evidence E happened given that the hypothesis H is true.
P(E) is the probability of the evidence E occurring irrespective of whether the hypothesis H is true or false. It is also called the total probability of the evidence.
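Putting these terms together, Bayes Rule can be sketched as a small Python function (the function and parameter names below are illustrative, not from any library):

```python
def bayes_posterior(p_e_given_h, p_h, p_e):
    """Posterior P(H|E) = P(E|H) * P(H) / P(E)."""
    return p_e_given_h * p_h / p_e

def total_probability(p_e_given_h, p_e_given_not_h, p_h):
    """Total probability of the evidence: P(E) = P(E|H)P(H) + P(E|~H)P(~H)."""
    return p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
```

When P(E) is not directly known, it is usually expanded with the total-probability helper above, summing over the cases where H is true and where it is false.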
Using the above example, the Bayesian probability can be articulated as the probability of the flyover bridge crashing down given that it was built 25 years ago. It can also be read as how strongly the evidence that the flyover bridge was built 25 years ago supports the hypothesis that the flyover bridge will come crashing down. Here the hypothesis is "the flyover bridge crashes down" (let's call it BRIDGE_CRASHING_DOWN) and the evidence, or supporting fact, is "the flyover bridge was built 25 years ago" (let's call it BRIDGE_BUILT_25_YEARS_BACK).
The Bayesian probability can then be written as P(BRIDGE_CRASHING_DOWN | BRIDGE_BUILT_25_YEARS_BACK).
The terms in Bayes Rule could then be read as follows:
P(BRIDGE_CRASHING_DOWN) is the probability of the bridge crashing down before any evidence, such as the age of the bridge, was known.
P(BRIDGE_BUILT_25_YEARS_BACK | BRIDGE_CRASHING_DOWN) is the likelihood that the bridge is found to have been built 25 years ago given that the bridge came crashing down.
P(BRIDGE_BUILT_25_YEARS_BACK) is the probability that the bridge was built 25 years ago.
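To make the bridge example concrete, here is a worked calculation. All the numbers below (the prior and both likelihoods) are purely illustrative assumptions chosen for the sketch, not real engineering data:

```python
# Hypothetical inputs, for illustration only.
p_crash = 0.01                 # prior: P(BRIDGE_CRASHING_DOWN)
p_25yrs_given_crash = 0.80     # P(BRIDGE_BUILT_25_YEARS_BACK | crash)
p_25yrs_given_no_crash = 0.30  # P(BRIDGE_BUILT_25_YEARS_BACK | no crash)

# Total probability of the evidence: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
p_25yrs = (p_25yrs_given_crash * p_crash
           + p_25yrs_given_no_crash * (1 - p_crash))

# Posterior: P(BRIDGE_CRASHING_DOWN | BRIDGE_BUILT_25_YEARS_BACK)
posterior = p_25yrs_given_crash * p_crash / p_25yrs
print(round(posterior, 4))  # 0.0262
```

Note how the evidence updates the belief: the prior of 1% rises to roughly 2.6% once the age of the bridge is taken into account.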
In this post, you learned what Frequentist probability and Bayesian probability are, with examples, and the differences between them.
Did you find this article useful? Do you have any questions or suggestions about it? Leave a comment and I shall do my best to address your queries.