Cross Entropy vs. MSE as Cost Function for Logistic Regression for Classification [Lecture 2.5]

4,185 views

AMILE - Machine Learning with Christian Nabert

1 day ago

Comments: 8
@elonchan9675
@elonchan9675 2 years ago
Amazing video. I was wondering about the difference between MSE and cross entropy, and nothing else on the internet gives as clear and detailed an explanation as you do. So sad you stopped producing new videos; this kind of high-quality explanation is what we need. May I ask how you learned all this stuff?
@amile-machinelearningwithc4547
@amile-machinelearningwithc4547 2 years ago
Thank you for your nice comment. I hope to find time to make new videos, but the lectures are just a side project beside my daily job, for fun... and fun time is rare at the moment ;-) But to your question: After my PhD in physics, I started a job as a Data Scientist in industry. I wanted to learn all the ML basics and read some books. To reflect on my knowledge, I wrote a lecture on ML for my former university as a side project. I held the lecture and realized that I was missing a lot of explanations and proofs. And, as you also mentioned, for some of these it is difficult to find anything on the internet. But often this does not matter (if you have some knowledge about mathematical proofs - I took a lot of math and physics courses at university). Start by looking up how a statement like "convex shape" is defined. Then write down the mathematical formulations of all your "assumptions", like the cost function and so on. These have to be substituted into the "left side" of the statement. Now, of course, the tricky part is calculating that the "right side" of the statement shows up. This takes some experience, which you can gain by studying math, physics or something similar. But I think the most important thing is to start by writing the first steps down on paper and not to be afraid that the problem is too difficult. Hope this gives you a quick insight... Best regards
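As a generic sketch of that recipe (the notation here is my own and may differ from the lecture's slides): the convexity definition you would look up, and the cross-entropy cost of logistic regression that gets substituted into it. A function $J(\mathbf{w})$ is convex if, for all weights $\mathbf{w}_1, \mathbf{w}_2$ and all $t \in [0, 1]$,

    $$ J\big(t\,\mathbf{w}_1 + (1-t)\,\mathbf{w}_2\big) \;\le\; t\,J(\mathbf{w}_1) + (1-t)\,J(\mathbf{w}_2). $$

With the sigmoid $\sigma(z) = 1/(1+e^{-z})$, predictions $\hat{y}_i = \sigma(\mathbf{w}^\top \mathbf{x}_i)$ and labels $y_i \in \{0, 1\}$, the cross-entropy cost to substitute into this definition is

    $$ J(\mathbf{w}) = -\frac{1}{N} \sum_{i=1}^{N} \big[\, y_i \log \hat{y}_i + (1-y_i) \log(1-\hat{y}_i) \,\big]. $$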
@Omar-kw5ui
@Omar-kw5ui 2 years ago
Fantastic video. Well done!
@veerjisingh6401
@veerjisingh6401 2 years ago
great job
@sophiayue2484
@sophiayue2484 2 years ago
Enjoyed the video. Is it possible to get the slides of the presentation?
@amile-machinelearningwithc4547
@amile-machinelearningwithc4547 2 years ago
Hi, sorry, right now there is no website where you can download the slides. If they become available, I will give a note here on YouTube.
@siliconvalley-z4j
@siliconvalley-z4j 2 years ago
Why must we use the sigmoid? If we don't use the sigmoid, the loss function will be a convex function.
@amile-machinelearningwithc4547
@amile-machinelearningwithc4547 2 years ago
Thanks for your question/comment. Yes, in general, you can modify either the loss function or the classification function to obtain a convex optimization task for classification. Logistic regression makes use of the sigmoid function for solving classification tasks; it is a very commonly used function for classification. But indeed, you can use different functions for classification as well. For example, a trivial choice is the linear function, which gives you a convex optimization problem. However, this function has problems when it comes to outliers (kzbin.info/www/bejne/b5e3eaSvapaBldE). Because the logistic function has nice properties when it comes to classification, the easy way is to modify the loss function.
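To make this concrete, here is a minimal Python sketch (not from the video; the toy data set and variable names are assumptions chosen purely for illustration) that sweeps a single weight w of a sigmoid model over some 1-D binary data and computes both costs. Plotting the two curves shows that MSE combined with the sigmoid is non-convex, while the cross-entropy cost stays convex:

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical 1-D binary classification data (illustration only).
    x = np.array([-4.0, -3.0, -2.0, 2.0, 3.0, 4.0])
    y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Sweep a single weight (no bias term) and record both costs.
    ws = np.linspace(-10.0, 10.0, 400)
    eps = 1e-12  # guards the logarithms against log(0)
    mse, xent = [], []
    for w in ws:
        p = sigmoid(w * x)
        mse.append(np.mean((p - y) ** 2))
        xent.append(-np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))

    plt.plot(ws, mse, label="MSE with sigmoid (non-convex)")
    plt.plot(ws, xent, label="cross-entropy (convex)")
    plt.xlabel("weight w")
    plt.ylabel("cost J(w)")
    plt.legend()
    plt.show()

The MSE curve flattens into a plateau for large negative w (saturated sigmoid, vanishing gradient), which is exactly the non-convexity discussed in the video, while the cross-entropy curve bends upward everywhere.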
What is the Meaning of Cross Entropy/ Log Loss as Cost Function for Classification? [Lecture 2.6]
10:19
AMILE - Machine Learning with Christian Nabert
829 views
Categorical Cross-Entropy Loss Softmax
8:15
Matt Yedlin
17K views
Log Loss or Cross-Entropy Cost Function in Logistic Regression
8:42
Support Vector Machines - Main Ideas! [Lecture 3.1]
14:19
AMILE - Machine Learning with Christian Nabert
1.9K views
Cross Entropy Loss Error Function - ML for beginners!
11:15
Python Simplified
41K views
Loss Functions - EXPLAINED!
8:30
CodeEmporium
142K views
Logistic Regression 4 Cross Entropy Loss
8:00
From Languages to Information
10K views
Entropy (for data science) Clearly Explained!!!
16:35
StatQuest with Josh Starmer
656K views
Explaining logistic regression
12:00
Very Normal
18K views