Binary Cross Entropy Explained | What is Binary Cross Entropy | Log loss function explained

  25,673 views

Unfold Data Science


Binary Cross Entropy Explained | What is Binary Cross Entropy | Log loss function explained
#BinaryCrossEntropy #LogLoss #UnfoldDataScience
Hello,
My name is Aman and I am a Data Scientist.
About this video:
This video explains the concept of binary cross entropy, also known as log loss, with a stepwise mathematical calculation. The topics below are covered in this video.
1. Binary Cross Entropy explained
2. What is Binary Cross Entropy
3. Log loss function explained
4. Understanding Binary cross entropy loss function
5. Loss function Binary cross entropy
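Since the description promises a stepwise calculation, here is a minimal Python sketch of the same idea. The labels and predicted probabilities below are made-up illustrative values, not the ones worked through in the video:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average negative log of the probability assigned to the actual class."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        # "corrected probability": probability of the class that actually occurred
        corrected = p if y == 1 else 1 - p
        total += -math.log(corrected)
    return total / len(y_true)

y_true = [1, 1, 0, 0]          # actual classes (hypothetical)
y_pred = [0.9, 0.6, 0.3, 0.2]  # model's predicted probability of class 1
print(round(binary_cross_entropy(y_true, y_pred), 4))
```

Note the convention here is the natural log and an average (not a plain sum) over samples, which is how most libraries report the loss.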
About Unfold Data Science: This channel helps people understand the basics of data science through simple examples. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature, so they can be easily grasped by viewers from different backgrounds.
If you need Data Science training from scratch, please fill this form (Please note: training is chargeable)
docs.google.co...
Book recommendation for Data Science:
Category 1 - Must Read For Every Data Scientist:
The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
Python Data Science Handbook - amzn.to/31UCScm
Business Statistics By Ken Black - amzn.to/2LObAA5
Hands-On Machine Learning with Scikit Learn, Keras, and TensorFlow by Aurelien Geron - amzn.to/3gV8sO9
Category 2 - Overall Data Science:
The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
Predictive Analytics By Eric Siegel - amzn.to/3nsQftV
Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
Category 3 - Statistics and Mathematics:
Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
Practical Statistics for Data Scientist By Peter Bruce - amzn.to/37wL9Y5
Category 4 - Machine Learning:
Introduction to machine learning by Andreas C Muller - amzn.to/3oZ3X7T
The Hundred Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
Category 5 - Programming:
The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
Clean Code by Robert C. Martin - amzn.to/3oYOdlt
My Studio Setup:
My Camera : amzn.to/3mwXI9I
My Mic : amzn.to/34phfD0
My Tripod : amzn.to/3r4HeJA
My Ring Light : amzn.to/3gZz00F
Join Facebook group :
www.facebook.c...
Follow on medium : / amanrai77
Follow on quora: www.quora.com/...
Follow on twitter : @unfoldds
Get connected on LinkedIn : / aman-kumar-b4881440
Follow on Instagram : unfolddatascience
Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
Watch python for data science playlist here:
• Python Basics For Data...
Watch statistics and mathematics playlist here :
• Measures of Central Te...
Watch End to End Implementation of a simple machine learning model in Python here:
• How Does Machine Learn...
Learn Ensemble Model, Bagging and Boosting here:
• Introduction to Ensemb...
Build Career in Data Science Playlist:
• Channel updates - Unfo...
Artificial Neural Network and Deep Learning Playlist:
• Intuition behind neura...
Natural Language Processing playlist:
• Natural Language Proce...
Understanding and building recommendation system:
• Recommendation System ...
Access all my codes here:
drive.google.c...
Have a different question for me? Ask me here : docs.google.co...
My Music: www.bensound.c...

Comments: 29
@satyashreerath4945 3 years ago
Hi Aman, the corrected probability column is not written correctly. You explained it correctly that the corrected probability is the probability of the actual class: if you take class 1, then for the class-0 rows the corrected probabilities should be 0.7 and 0.8 in your example. Thank you.
@UnfoldDataScience 3 years ago
Thanks Satya, I will check it out.
@harshdwivedi7102 3 years ago
Best video on binary cross-entropy. I have watched so many videos, but nobody explained this specific point: that we use the probability for either of the classes, not just the corrected probability.
@UnfoldDataScience 3 years ago
Thanks Harsha for watching.
@ajiteshthawait4771 1 year ago
The loss function helps to optimize the problem; this is the loss function for classification. Calculate the actual probability and the corrected probability. Entropy = randomness in your data. Each row is used to calculate a corrected probability, i.e., the probability of falling in the actual class. Find the log values, and the summation of all log values gives the binary cross entropy. If the probability of the correct classification is higher, there is less randomness.
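The steps summarized in the comment above (corrected probability per row, log values, then combining them) can be sketched in Python. The rows below are hypothetical predictions, not the actual table from the video:

```python
import math

# Each row: (actual class, predicted probability of class 1) — illustrative values
rows = [(1, 0.94), (0, 0.30), (0, 0.20), (1, 0.85)]

total = 0.0
for actual, p_class1 in rows:
    # corrected probability = probability assigned to the class that occurred
    corrected = p_class1 if actual == 1 else 1 - p_class1
    log_val = math.log(corrected)
    total += log_val
    print(f"actual={actual}  p(class 1)={p_class1:.2f}  "
          f"corrected={corrected:.2f}  log={log_val:.4f}")

# negative average of the log values is the binary cross entropy
bce = -total / len(rows)
print(f"binary cross entropy = {bce:.4f}")
```

A model that assigns high corrected probabilities to every row produces log values near 0 and hence a small loss.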
@letslearndatasciencetogeth479 3 years ago
Thank you so much, sir. I was eagerly waiting for your explanation of this topic.
@AnishManGurung 7 months ago
Another amazing explanation.
@alwaysaditi2001 2 years ago
thank you so much for the great explanation!!
@UnfoldDataScience 2 years ago
Thanks Aditi
@jk-sm6qr 5 months ago
Nice explanation
@shubha07m 1 year ago
Good explanation!
@jasonbourn29 1 year ago
Couldn't understand it, but I appreciate the effort.
@technicalraman8848 1 year ago
Sir, thanks for this nice explanation. Sir, could you please make a video on the purpose of hidden layers and what they are used for, with real-life examples?
@sandipansarkar9211 2 years ago
finished watching
@UnfoldDataScience 2 years ago
Thanks Sandipan again.
@navoditmehta8833 3 years ago
Well explained. Showing the graph of this function could be even more helpful.
@sanjeevkumarb8416 10 days ago
What is the base of the log to be used?
@rajkir2852 3 years ago
Thanks, sir, you are doing an amazing job. Please also make the whiteboard a bit bigger, sir, it will be great to watch.
@skvali3810 2 years ago
Hi Aman, how do we optimize the loss for a classification problem? In regression we use gradient descent for the M and C values; is there a similar optimizer for classification? This is an interview question, can you please help me with it?
@jackiemease4018 2 years ago
Great video! I am not 100% sure I understand which probability the log on the right side of your table is taken of. Is it the original, the corrected, or some combination of both? Thanks in advance!
@UnfoldDataScience 2 years ago
I will have to check, I can't answer immediately.
@letslearndatasciencetogeth479 3 years ago
Hello, sir! Please explain in detail why the loss function used in linear regression cannot be used here. Also, how is the loss function used to reach the global minimum? Thanks a lot for the wonderful explanation.
@rajkir2852 3 years ago
The job of the loss function is to calculate how far you are from the actual value, i.e., predicted minus actual gives you that. Then use gradient descent: it is an optimization algorithm that takes the value of the loss function and some parameters, and it helps reach the global minimum.
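The recipe in the comment above (a loss plus gradient descent) can be sketched for a one-feature logistic regression trained on log loss. The toy data, learning rate, and iteration count here are illustrative choices, not anything from the video:

```python
import math

# Toy 1-D dataset: points below x=2 are class 0, above are class 1
xs = [0.5, 1.5, 2.5, 3.5]
ys = [0, 0, 1, 1]

w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    grad_w = grad_b = 0.0
    for x, y in zip(xs, ys):
        p = 1 / (1 + math.exp(-(w * x + b)))  # predicted probability of class 1
        grad_w += (p - y) * x                  # d(log loss)/dw for this sample
        grad_b += (p - y)                      # d(log loss)/db for this sample
    w -= lr * grad_w / len(xs)                 # step down the averaged gradient
    b -= lr * grad_b / len(xs)

# after training, points on either side of x=2 should get opposite predictions
print(f"p(x=0.5) = {1 / (1 + math.exp(-(w * 0.5 + b))):.3f}")
print(f"p(x=3.5) = {1 / (1 + math.exp(-(w * 3.5 + b))):.3f}")
```

The gradient of the log loss with respect to the weights has the same simple (prediction minus actual) form the comment describes, which is why the same gradient-descent machinery works for classification.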
@letslearndatasciencetogeth479 3 years ago
@rajkir2852 thanks for the explanation
@rajkir2852 3 years ago
Sir, also waiting for categorical cross entropy, please.
@dr.bheemsainik4316 2 years ago
Sir, what is the range of cross-entropy, and what is its unit?
@AftabGaming-st3tq 2 years ago
Brother, what would it have cost you to speak in Hindi?
@UnfoldDataScience 2 years ago
:D