Neural Network RMSE and Log Loss Error Calculation from Scratch (4.5)

  6,174 views

Jeff Heaton

days ago

Comments: 11
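As a quick reference for the two error measures named in the video's title, RMSE can be computed from scratch like this (a minimal sketch with toy numbers, not the video's own worked example):

```python
import numpy as np

# Toy regression targets and predictions (assumed for illustration only).
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])

# RMSE: square the errors, average them, then take the square root.
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
print(rmse)  # ~0.1414
```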
@anuvratanand5610 5 years ago
Hey, I'm a student from India. You're an amazing person for putting this on YouTube, and an even better teacher. Thanks a ton!
@BiancaAguglia 5 years ago
A great video, Jeff. Going through these calculations by hand is super useful. It's another step in truly understanding ML models and, consequently, becoming an expert at data science and ML. 😊
@pardeepsangruri 4 years ago
Hi Jeff, I have a problem understanding this example in the sklearn documentation:

from sklearn.metrics import log_loss
log_loss(["spam", "ham", "ham", "spam"],
         [[.1, .9], [.9, .1], [.8, .2], [.35, .65]])
# 0.21616...

Can you please explain this, and extend the log_loss idea to multiclass classification, where they take np.argmax of the true values and subtract the predicted? Can you please make a video explaining this concept? Thanks a lot for making this great video.
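The sklearn number in the question above can be reproduced by hand. The key detail is that scikit-learn sorts the string labels alphabetically, so probability column 0 corresponds to "ham" and column 1 to "spam". A sketch:

```python
import numpy as np

# The example from the sklearn docs, reproduced without sklearn.
y_true = ["spam", "ham", "ham", "spam"]
probs = np.array([[.1, .9], [.9, .1], [.8, .2], [.35, .65]])

classes = sorted(set(y_true))                 # ["ham", "spam"]
idx = [classes.index(y) for y in y_true]      # column index of each true class
p_true = probs[np.arange(len(y_true)), idx]   # [0.9, 0.9, 0.8, 0.65]

# Log loss is the mean negative log of the probability assigned
# to the true class.
loss = -np.mean(np.log(p_true))
print(round(loss, 5))  # 0.21616
```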
@anandtewari8014 4 years ago
How do you pass a class label as an input to the model in classification problems?
@HeatonResearch 4 years ago
If they are inputs, they need to be encoded as dummy/one-hot values.
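One-hot encoding as described in the reply can be sketched in a few lines (hypothetical color labels, assumed for illustration):

```python
import numpy as np

# Hypothetical categorical input column.
labels = np.array(["red", "green", "blue", "green"])

# np.unique returns the sorted class list plus each label's index into it;
# indexing an identity matrix with those indices yields one-hot rows.
classes, idx = np.unique(labels, return_inverse=True)
one_hot = np.eye(len(classes))[idx]
print(classes)     # ['blue' 'green' 'red']
print(one_hot[0])  # [0. 0. 1.]  -> "red"
```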
@davidrfarris 4 years ago
What whiteboard software do you use for your lessons? I am thinking of doing a YouTube series on database design and setup.
@HeatonResearch 4 years ago
For that one I just used an iPad with an Apple pencil and recorded from a laptop.
@davidrfarris 4 years ago
@HeatonResearch Thank you!
@subhashpujari 4 years ago
Hi Jeff, thank you very much for such a nice explanation. How can one extend it to a multiclass problem?
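The multiclass extension asked about above follows directly from the binary case: with one-hot targets, only the predicted probability of the true class contributes to each row's loss. A sketch with assumed 3-class toy numbers:

```python
import numpy as np

# Hypothetical 3-class example: rows of y_true are one-hot targets and
# rows of y_pred are softmax outputs (each row sums to 1).
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1]])
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.2, 0.3, 0.5]])

eps = 1e-15  # clip so log(0) can never occur
y_pred = np.clip(y_pred, eps, 1 - eps)

# Multiclass cross entropy: the one-hot mask keeps only the log-probability
# of the true class in each row, then we average over rows.
loss = -np.mean(np.sum(y_true * np.log(y_pred), axis=1))
print(round(loss, 5))  # 0.42432
```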
@ronmedina429 5 years ago
I found that writing things by hand has helped me understand the algorithms more. Thanks, Mr. Heaton. Also I now know that you don't have to have good handwriting to become successful in academia. ;-)
@HeatonResearch 5 years ago
Good handwriting can actually hold you back! :-) My bad handwriting actually caused one of my teachers in grade school to suggest to my parents that they should buy a computer so I could type my reports.