How to implement Naive Bayes from scratch with Python

28,523 views

AssemblyAI


In the 6th lesson of the Machine Learning from Scratch course, we will learn how to implement the Naive Bayes algorithm.
You can find the code here: github.com/Ass...
Previous lesson: • How to implement Rando...
Next lesson: • How to implement PCA (...
Welcome to the Machine Learning from Scratch course by AssemblyAI.
Thanks to libraries like Scikit-learn, we can use most ML algorithms with a couple of lines of code. But knowing how these algorithms work under the hood is very important, and implementing them hands-on is a great way to get there.
And often they are easier to implement than you'd think.
In this course, we will learn how to implement these 10 algorithms.
We will quickly go through how the algorithms work and then implement them in Python with the help of NumPy.
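The lesson's own code is linked above; as a rough sketch of what a NumPy-only Gaussian Naive Bayes of this kind can look like (class and variable names here are my own, not necessarily the video's):

```python
import numpy as np

class GaussianNaiveBayes:
    """Gaussian Naive Bayes: per-class feature means/variances plus class priors."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        # One row of per-class statistics per class label
        self.mean = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) for c in self.classes])
        self.priors = np.array([np.mean(y == c) for c in self.classes])
        return self

    def _log_likelihood(self, x):
        # Sum of log Gaussian densities over features, one value per class
        return -0.5 * np.sum(
            np.log(2 * np.pi * self.var) + (x - self.mean) ** 2 / self.var, axis=1
        )

    def predict(self, X):
        # Pick the class maximizing log prior + log likelihood
        log_prior = np.log(self.priors)
        return np.array(
            [self.classes[np.argmax(log_prior + self._log_likelihood(x))] for x in X]
        )

# Toy usage on two well-separated clusters
X = np.array([[1.0, 1.0], [1.2, 0.9], [0.9, 1.1],
              [5.0, 5.0], [5.2, 4.9], [4.8, 5.1]])
y = np.array([0, 0, 0, 1, 1, 1])
clf = GaussianNaiveBayes().fit(X, y)
pred = clf.predict(np.array([[1.1, 1.0], [5.1, 5.0]]))  # → [0, 1]
```

Working in log space (rather than multiplying raw probabilities) is the standard way to keep the per-feature products numerically stable.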
▬▬▬▬▬▬▬▬▬▬▬▬ CONNECT ▬▬▬▬▬▬▬▬▬▬▬▬
🖥️ Website: www.assemblyai...
🐦 Twitter: / assemblyai
🦾 Discord: / discord
▶️ Subscribe: www.youtube.co...
🔥 We're hiring! Check our open roles: www.assemblyai...
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#MachineLearning #DeepLearning

Comments: 14
@azharafridi9619 · 5 months ago
The coding is really awesome, I love it! Please, would you also cover the rest of the machine learning algorithms, like AdaBoost, XGBoost, etc., and also deep learning?
@ivanmateev · 1 year ago
Great job
@AssemblyAI · 1 year ago
thank you!
@nelsondelarosa5490 · 6 months ago
Excellent
@rashidkhan8161 · 7 months ago
This won't work on an imbalanced dataset: only the class with more samples dominates, because its mean and variance estimates would always outweigh the other class's.
@weebprogrammer2979 · 3 months ago
We need to apply techniques to handle imbalanced datasets.
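For what it's worth, the estimated class priors P(y) are exactly where the imbalance enters the decision rule, so one simple mitigation (my suggestion, not something covered in the video) is to replace the empirical priors with uniform ones:

```python
import numpy as np

y = np.array([0] * 90 + [1] * 10)  # a 90/10 imbalanced label vector
classes = np.unique(y)

# Empirical priors favour the majority class...
empirical_priors = np.array([np.mean(y == c) for c in classes])  # [0.9, 0.1]

# ...forcing uniform priors removes that bias from the log-prior term
uniform_priors = np.full(len(classes), 1 / len(classes))         # [0.5, 0.5]
```

Resampling the training data or reweighting samples are other common options.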
@RastiMuzic · 1 year ago
This video is way too fast and not explained at all.
@ChristianRoland7 · 6 months ago
You probably need to learn the math beforehand; you can't just go into this without studying the math for a while. It takes time, and a 14-minute video will never cut it.
@business_central · 1 year ago
"Then we can simplify this a little bit: we can first get rid of P(X) because this doesn't depend on y at all, so just throw this away." What?? That's totally not the same formula then; no clue what's happening here. "All these probabilities here are values between 0 and 1, and if we multiply them the number can become very small and we can run into inaccuracies. So for this we apply a little trick: instead of the product we do a sum, and then we apply the logarithm. If you apply the logarithm, we can change the product into a sum, and this is the final formula to get y." My man, you just created a completely different formula and didn't provide any actual explanation or even math references to check. This is heavily rushed and poorly explained. :/
@gokul.sankar29 · 1 year ago
I agree, but as an explanation for the first part: we get rid of P(X) because for all possible classes y, X (the test sample) remains the same. For the logarithm part, log(a·b) = log(a) + log(b), and since the probabilities are between 0 and 1, log(P(x_n | y)) will be negative. Since we only take the argmax, i.e. the value of y for which the formula gives the maximum output, the formula does more or less the same thing. Note: I'm only saying this holds once we apply the argmax; you are 100% correct that the formula has been changed and a better explanation should have been provided.
@aidenbromaghin7303 · 1 year ago
The goal is to find the class with the highest posterior. P(X) has little/no impact on determining which class has the larger posterior, so it can be removed. The product is converted to a sum of logs by a property of logarithms called the logarithmic product identity. This avoids the issue of numerical underflow: if you multiply a long list of floats, the product converges towards 0, and using this property of logarithms lets us avoid that. The video creator is assuming some familiarity with math, stats, and probability, but these are foundational to ML, so it's a fair assumption to make. I'd definitely recommend starting with those topics before tackling ML, or, if you've studied them in the past, reviewing any math you don't understand along the way. Actually going through and coding it yourself will also help a lot. Even though the solution is given, it gives you the opportunity to modify the code, dive deeper into anything you don't understand, review those concepts, and develop a stronger understanding. I would recommend using a notebook environment so that you can write notes and formulas as well; that gives you a resource to look back on if you need to review it in the future. Happy learning!
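The two steps being debated here are easy to check numerically (a quick sanity check of my own, not the video's code):

```python
import numpy as np

# 1) A monotonic transform (log) doesn't change the argmax:
#    multiplying likelihoods by priors and summing their logs agree on the winner
likelihoods = np.array([0.2, 0.7, 0.1])   # made-up per-class scores
priors = np.array([0.5, 0.3, 0.2])
posterior_numerators = likelihoods * priors
assert np.argmax(posterior_numerators) == np.argmax(np.log(likelihoods) + np.log(priors))

# 2) Why the log trick is needed: a long product of probabilities underflows
#    to 0.0 in float64, while the sum of logs stays well within range
probs = np.full(1000, 0.1)
product = np.prod(probs)              # 1e-1000 underflows to 0.0
log_sum = np.sum(np.log(probs))       # ≈ -2302.6, perfectly representable
```

Dividing by P(X) would rescale every class's score by the same constant, which is why dropping it leaves the argmax untouched.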
@jongxina3595 · 1 year ago
I agree that the math part is rushed, but you can find it in other resources. Look for explanations of "maximum a posteriori estimation".
@lubosnagy2741 · 1 year ago
And what about heading over to implement a Kalman filter or particle filter from scratch? Let's take forex data :)
@ManuelInfoSec · 2 years ago
What recording software do you use?