Bayes Theorem in Machine Learning - Table of Contents
- What is Machine Learning
- What is Bayes Theorem
- Introduction to Bayes Theorem in Machine Learning
- Prerequisites for Bayes Theorem
- Bayes rule in Machine Learning
- Advantages of Bayes rule in Machine Learning
- Disadvantages of Bayes rule in Machine Learning
- Conclusion
What is Machine Learning
Machine Learning, as the name suggests, refers to a machine, i.e. a computer system, learning the way a human being does, using algorithms and various statistical methods. It is a branch of Artificial Intelligence in which systems learn from the data they receive and make decisions based on it with minimal human intervention.
The systems apply algorithms iteratively to all the data they come across, learn from it, and gradually produce more powerful insights. These insights are used by large organisations that hold huge amounts of data to make important, efficient decisions and stay ahead in a competitive market.
Once the machines have learned enough, they continue improving on their own with the help of previous iterations. The applications thus become more independent and reliable as they start using pattern recognition to make decisions.
What is Bayes Theorem
In plain words, Bayes Theorem is a mathematical formula used in probability and statistics to determine conditional probability. To understand it better, you first need to know what conditional probability is.
Conditional probability is the probability of something happening, provided something else has already happened, i.e. the probability of one event occurring given that another event has already occurred.
To understand the concept of conditional probability with Bayes Theorem better, let’s take an example:
There are three containers, and each container has 5 marbles in it. The first container has 3 white and 2 black marbles, the second has 2 white and 3 black marbles, and the third has 4 white and 1 black marble. Each container is equally likely to be chosen. Find the probability that a marble picked at random is white.
Assume E1, E2, and E3 are the events of choosing the containers. Since they have an equal probability of being chosen,
P(E1) = P(E2) = P(E3) = 1/3
Now, assume E is the event when a white marble is drawn from the container. So,
P(E|E1) = 3/5
P(E|E2) = 2/5
P(E|E3) = 4/5
Now, applying the total probability theorem: P(E) = ∑i P(E|Ei) · P(Ei)
P(E) = P(E|E1) · P(E1) + P(E|E2) · P(E2) + P(E|E3) · P(E3)
P(E) = (3/5 × 1/3) + (2/5 × 1/3) + (4/5 × 1/3)
P(E) = 9/15 = 3/5
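The same calculation can be checked with a short Python snippet. This is a minimal sketch, not part of the original example; the container contents are taken straight from the problem above:

```python
# Total probability of drawing a white marble.
# Each container is chosen with equal probability 1/3.
containers = {
    "E1": {"white": 3, "black": 2},
    "E2": {"white": 2, "black": 3},
    "E3": {"white": 4, "black": 1},
}

p_container = 1 / len(containers)  # P(Ei) = 1/3

p_white = sum(
    (c["white"] / (c["white"] + c["black"])) * p_container  # P(E|Ei) * P(Ei)
    for c in containers.values()
)

print(p_white)  # ~0.6, i.e. 3/5
```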
Now that you know about Conditional Probability and Bayes Theorem, you are all set to learn about Bayes Theorem in Machine Learning.
Introduction to Bayes Theorem in Machine Learning
Bayes Theorem utilises the Bayesian method to calculate conditional probabilities in Machine Learning systems. The theorem is widely used to help systems produce more reliable probabilities and predictions.
There are times when a simplified version of the Bayes Theorem, called Naïve Bayes classification, is preferred, as it reduces computation time and overall project cost. There are several other names, like Bayes Law or Bayes Rule, that you will often come across instead of Bayes Theorem.
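To make the Naïve Bayes idea concrete, here is a minimal classification sketch. It assumes scikit-learn is installed, and the iris dataset is used purely for illustration:

```python
# Minimal Naive Bayes classification sketch (assumes scikit-learn is available).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = GaussianNB()  # assumes features are conditionally independent given the class
model.fit(X_train, y_train)
print("Accuracy:", model.score(X_test, y_test))
```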
The basic idea of the theorem is to determine the probability of something based on some random or uncertain evidence. Though it may sound complex, it is an extremely simple calculation that links conditional probability with marginal probability.
Since it is based on data and numbers rather than gut feeling, it tends to outperform decisions made purely on intuition. Being all about probability does not mean Bayes Theorem is only used in the financial industry; it is widely utilised in the health, medical, research, and survey industries too.
By now you know the basics of the Bayes Theorem, but before diving deep into it there are certain prerequisites that you must know.
Prerequisites for Bayes Theorem
The important prerequisites for diving deep into the Bayes rule in Machine Learning are:
Experiment
An experiment might sound like a huge scientific word, but it just means an activity that is performed under controlled or foreseeable circumstances.
For example: tossing a coin or pulling a ball out of a box is an experiment.
Sample Space
The set of all possible outcomes of an experiment is referred to as the sample space, while a single result of the experiment is called an outcome.
For example: when you toss a coin, there are two possibilities, a head or a tail, so the sample space of the coin toss experiment is {head, tail}.
Event
An event can be termed as the set of outcomes, which makes it the subset of the sample space.
For example: if a dice is rolled there can be 6 possible outcomes which means the sample space will be {1, 2, 3, 4, 5, 6}.
Here, two events can be defined: the value is even, or the value is odd.
So, the two possible events are:
E : even numbers = {2, 4, 6}
O : odd numbers = {1, 3, 5}
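These definitions translate directly into code. The snippet below is a simple illustration (not from the original article) that lists the sample space of a die roll and the two events defined on it:

```python
# Sample space of a single die roll and two events defined on it.
sample_space = {1, 2, 3, 4, 5, 6}
E = {x for x in sample_space if x % 2 == 0}  # even numbers
O = {x for x in sample_space if x % 2 == 1}  # odd numbers

# For equally likely outcomes, P(event) = |event| / |sample space|.
print(E, len(E) / len(sample_space))  # {2, 4, 6} 0.5
print(O, len(O) / len(sample_space))  # {1, 3, 5} 0.5
```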
Random Variable
A random variable is a variable that takes on values determined by the outcome of an experiment, each with some probability.
For example: if you toss a coin, there can be two random variables - for heads it could be +1 and for tails, it could be -1. Both of these random variables will have a probability of 1/2.
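A small simulation makes this mapping explicit. This is a hypothetical sketch; the +1/-1 values are simply the labels chosen in the example above:

```python
import random

# Random variable: maps the outcome of a coin toss to +1 (heads) or -1 (tails).
def coin_rv():
    return 1 if random.random() < 0.5 else -1

samples = [coin_rv() for _ in range(100_000)]
print("P(X = +1) ~", samples.count(1) / len(samples))   # close to 0.5
print("P(X = -1) ~", samples.count(-1) / len(samples))  # close to 0.5
```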
Exhaustive Events
Exhaustive Events refer to the set of events out of which one is bound to happen when the experiment takes place.
For example: when you toss a coin, there can be two possibilities, heads or tails. Together, these two form an exhaustive set of events.
Independent Events
As the name suggests Independent Events are the ones that do not depend on the occurrence of other events.
For example: when you flip the coin once it could be heads or tails, and when you flip it again it could be heads or tails again, independent of the previous flip outcome.
Conditional Probability
Conditional Probability refers to the probability of an event depending on the occurrence of some other event.
For example: the probability that a card drawn from the deck is a three, given that the card drawn is red.
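That example can be worked out by simple counting. Here is a minimal sketch built around a standard 52-card deck:

```python
# P(three | red): count threes among the red cards of a standard deck.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]  # hearts and diamonds are red
deck = [(rank, suit) for rank in ranks for suit in suits]

red_cards = [c for c in deck if c[1] in ("hearts", "diamonds")]
red_threes = [c for c in red_cards if c[0] == "3"]

print(len(red_threes) / len(red_cards))  # 2/26 = 1/13, about 0.077
```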
Marginal Probability
Marginal Probability is the probability of an event occurring independently of the probability of another event.
For example: the probability that a card drawn from the deck is red (1/2), irrespective of any other event.
Bayes rule in Machine Learning
Bayes rule is one of the most important and popular concepts in Machine Learning. The theorem focuses on finding the probability of an event using incomplete knowledge, given that another related event has already occurred.
The above can be found using the product rule and conditional probability. Let’s say there are two events X and Y.
By the product rule, the probability of X and Y both happening can be written as:
P(X ∩ Y) = P(X|Y) P(Y)
Equivalently, conditioning on X instead:
P(X ∩ Y) = P(Y|X) P(X)
Now, to get the Bayes equation, equate the two expressions and solve for P(X|Y):
P(X|Y) = P(Y|X) P(X) / P(Y)
To understand the above Bayes equation better, you must know the following:
Posterior Probability
Here P(X|Y) is the Posterior Probability, i.e. the probability of the hypothesis X after the new evidence Y has been considered.
Likelihood
Here P(Y|X) is the likelihood, i.e. evidence’s probability provided the hypothesis is correct.
Prior Probability
Here P(X) is the Prior Probability, i.e. hypothesis’ probability before the evidence was considered.
Marginal Probability
Here P(Y) is the Marginal Probability, i.e. the probability of the evidence irrespective of the hypothesis.
Hence, Bayes Theorem can be formulated as the product of the likelihood and the prior probability, divided by the probability of the evidence.
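Returning to the marble example from earlier, Bayes rule lets you reverse the question: given that a white marble was drawn, which container did it most likely come from? The following is a minimal sketch reusing the numbers from that example:

```python
# P(Ei | white) = P(white | Ei) * P(Ei) / P(white)
priors = {"E1": 1/3, "E2": 1/3, "E3": 1/3}        # P(Ei)
likelihoods = {"E1": 3/5, "E2": 2/5, "E3": 4/5}   # P(white | Ei)

# P(white) via the total probability theorem.
evidence = sum(likelihoods[e] * priors[e] for e in priors)

posteriors = {e: likelihoods[e] * priors[e] / evidence for e in priors}
print(posteriors)  # E1: ~0.33, E2: ~0.22, E3: ~0.44 -> the third container is most likely
```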
Bayes Theorem makes the entire Machine Learning system more accurate and efficient. Knowing how important conditional probability is for Machine Learning, it won’t be wrong to conclude that the more the data or information, the more accurate the result will be.
Now that we know what Bayes Theorem is all about and its relevance, especially with respect to Machine Learning, let's have a look at its advantages.
Advantages of Bayes rule in Machine Learning
Bayes Theorem proves its worth time and again in Machine Learning. Some of the biggest advantages of the Bayes rule are:
- It is a simple and impactful method to calculate conditional probability and make the overall classification process effective and efficient.
- Bayes law is far simpler than most other rules and concepts wherever the assumption of independent predictors holds true.
- It is easy to implement the theorem alongside other Machine Learning concepts.
- Even a small training set can make a lot of difference when it comes to estimating test data, which reduces the overall training time.
- It improves the accuracy of Machine Learning systems with the help of conditional probability, which would not have been possible with intuition alone.
Knowing just the advantages would be one-sided, so let's have a look at the disadvantages of the Bayes rule too.
Disadvantages of Bayes rule in Machine Learning
Some of the disadvantages of the Bayes rule in Machine Learning are:
- The biggest disadvantage of the Bayes Theorem is that it assumes all the features are independent, which is rarely true in practice; in real-life scenarios, features usually show some dependency on one another.
- The posterior probability is greatly influenced by the prior probability, and the priors are not always convincing to the parties involved, who may find them invalid. Even a slight error in the priors may lead to wrong results.
- The computation costs are high as the model requires a lot of parameters.
Conclusion
By now, you would be well versed with everything you need to know about Bayes Theorem in Machine Learning. You began with learning the basics of Machine Learning followed by the basics of Bayes Theorem or Bayes Law.
Once you were done with the basics of the two you learnt a little about how Bayes Theorem calculates conditional probabilities and makes Machine Learning even more effective. Followed by that, you learnt about the different prerequisites necessary to understand the overall relevance of the Bayes Theorem. These prerequisites focus on several terms related to probability like Exhaustive Events, Independent Events, Conditional Probability, and Marginal Probability.
Being well versed with the prerequisites of the Bayes Theorem, you went on to understand the complete picture of the Bayes rule in Machine Learning, where you focused not only on its formula but also on how it makes the overall process impactful. In the process, you learned about different terms like prior probability and posterior probability.
Finally, when you had an elaborate idea of Bayes Theorem in Machine Learning, you went on to understand its advantages and disadvantages. While at it, you saw how the Bayes rule is simple and effective, but also how its independence assumption can fall short in real-life scenarios.
Related Article:
Classifications in Machine Learning
FAQs
Bayes' theorem in Artificial Intelligence
In probability theory and statistics, Bayes' theorem (also known as Bayes' law or Bayes' rule, and more recently as the Bayes–Price theorem), named after Thomas Bayes, describes the probability of an event based on prior knowledge of conditions that might be related to the event.
Bayes' theorem allows you to update the predicted probability of an event by incorporating new information. It is often employed in finance to update risk evaluations. Bayes' rule is also used on various other occasions, including medical testing for a rare disease: with Bayes' rule, we can estimate the probability of having the condition given that the test comes out positive.
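As a rough illustration of that medical-testing use case, here is a minimal sketch. The disease prevalence and test accuracy figures below are assumed values chosen purely for illustration, not real statistics:

```python
# Assumed illustrative numbers, not real test statistics.
p_disease = 0.001            # prior: 0.1% of people have the condition
p_pos_given_disease = 0.99   # sensitivity: P(positive | disease)
p_pos_given_healthy = 0.05   # false positive rate: P(positive | no disease)

# P(positive) via the total probability theorem.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes rule: P(disease | positive)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # about 0.0194, i.e. roughly 2%
```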