Logistic Regression Part 4 | Loss Function | Maximum Likelihood | Binary Cross Entropy

61,535 views

CampusX

Comments: 75
@Cave_man_22 · a month ago
You are the greatest teacher I have ever had in my life. I always had questions about why this and why that, but never found satisfying answers. After watching your videos, everything feels like it should be the way it is. You are really amazing, sir. You are the only one who satisfies my hunger for knowledge and for what is happening behind the scenes. Hats off, sir. I wish to meet you in person one day. Again, thank you very much for making such informative videos.
@harkiratmakkar9202 · 10 months ago
Absolutely love your content, sir. Just one correction: we minimise the cross entropy because we add an extra minus sign, so to maximise the likelihood we minimise the cross entropy. What you said, that the log of a smaller number has a higher value, is wrong. The logarithm is a continuously increasing function, so to maximise f(x) you can maximise log(f(x)), which is equivalent to minimising -log(f(x)).
@siddheshkalgaonkar2752 · 28 days ago
He said exactly the same thing. Maybe you misinterpreted it.
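For readers following this thread: the equivalence being discussed is the standard identity that, because the logarithm is strictly increasing, maximizing the likelihood is the same as maximizing the log-likelihood, which is the same as minimizing the negative log-likelihood (generic notation, not necessarily the symbols used in the video):

$$
\hat{\theta}
= \arg\max_{\theta} \prod_{i=1}^{n} P(y_i \mid x_i; \theta)
= \arg\max_{\theta} \sum_{i=1}^{n} \log P(y_i \mid x_i; \theta)
= \arg\min_{\theta} \left( -\sum_{i=1}^{n} \log P(y_i \mid x_i; \theta) \right)
$$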
@waseemrandhawa5658 · 3 years ago
The first person on YouTube whose teaching style is this impressive. You are totally awesome. 😍 Sir, I have probably liked only 10 to 20 videos in total on YouTube, but your teaching style compels me to like the video, and from the heart comes the feeling "Jo paaji, tussi great ho" (you are great). 😇
@bluestone2523 · 2 years ago
same here
@saiprashanth1587 · 2 years ago
This is the best video on logistic regression, and this is the best ML playlist on YouTube. Thank you so much, sir.
@binilrajadhikari2643 · 3 years ago
This is one of the best tutorials on logistic regression on YouTube.
@sanaayakurup5453 · 2 years ago
I have never seen such a concise playlist on Logit. Thank you sir, all my doubts are solved!
@arslanahmed1311 · a year ago
We really appreciate the amount of effort you put into making this playlist
@singnsoul6443 · a year ago
I was looking for content to study machine learning for GATE DA. Thank you so much for, hands down, THE BEST content.
@nitinchityal583 · 2 years ago
Sir, you have made my machine-learning journey feel like a roller coaster... thoroughly enjoyed it.
@life3.088 · 2 years ago
I have all your videos downloaded because of internet issues, but when I do have internet I prefer to watch online. This is a small gift from my side.
@priyanshutanwar · 3 years ago
The best explanation of cross entropy I've come across!
@HA-bj5ck · a year ago
Give this man a Medal!!!
@purubhatnagar483 · 11 months ago
You deserve appreciation. Best content I have come across so far. You have my blessing; I will buy your mentorship program too.
@dakshbhatnagar · 2 years ago
Fantastic explanation, brother!! It's interesting to see how the math is worked out.
@abdulwahabkhan1086 · 3 months ago
You are an amazing person, my friend!! I want to know how much effort you put into making a single video.
@aiml66_bangerashashankchan81 · a year ago
The most GOATed logistic regression explanation ever.
@manojssonawane · 2 years ago
Salute to you, sir. The depth at which you deliver machine learning knowledge is greatly appreciated. Wishing you great fortune, sir.
@ParthivShah · 10 months ago
Thank You Sir.
@akhilp6263 · a year ago
This brought meaning to my life ❤🥂
@tanujrana1212 · 2 years ago
You are delivering much better content than some self-proclaimed "leading provider of ML content". Kudos! ✌️✌️
@anuragpandey5748 · 5 months ago
Best in-depth explanation on the internet 👍
@PathakMoee · a month ago
Thank you sir
@pranalipatle2215 · 2 years ago
What an explanation... it felt clear and awesome.
@YashJaiswal-lr7vi · 8 months ago
Great explanation. I want to add one correction: -1 < -0.04.
@Ravi-sl5ms · 3 years ago
Thank you. Wonderfully explained.
@pritamrajbhar9504 · 8 months ago
The explanation was simple and on point.
@Thebeautyoftheworld1111 · a year ago
Great work. Nice explanation!!! God bless.
@anilkathayat1247 · 9 months ago
Best Explanation Sir❤
@rockykumarverma980 · 4 months ago
Thank you so much sir🙏🙏🙏
@MD.SAFANURISLAM · a month ago
Timestamps
00:02 - The perceptron algorithm failed to produce the optimal solution in logistic regression.
02:33 - In logistic regression, we need to find a loss function that minimizes the mistakes made by the model.
04:51 - The loss function helps determine the quality of a model and choose the best one.
07:37 - Calculating prediction probabilities and understanding the logistic regression model.
10:20 - The maximum likelihood method calculates the product of the probabilities for each data point to determine the better model.
13:01 - Replacing the product with a sum using the logarithm.
15:44 - Cross entropy is a key concept in logistic regression.
18:45 - The goal of maximizing the likelihood is equivalent to minimizing the binary cross entropy loss function.
21:14 - The formula used in logistic regression for binary cross entropy may not work for all cases.
24:12 - Introduction to the loss function and maximum likelihood.
27:07 - Logistic regression loss function and maximum likelihood.
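A minimal Python sketch of the pipeline these timestamps describe (product of probabilities → log → negative sum/mean); the numbers are made up for illustration and the variable names are mine, not from the video:

```python
import math

# True labels and the model's predicted probabilities for class 1 (illustrative values).
y_true = [1, 0, 1, 1]
y_prob = [0.9, 0.2, 0.8, 0.6]

# Likelihood: product of the probabilities assigned to the observed labels.
likelihood = 1.0
for y, p in zip(y_true, y_prob):
    likelihood *= p if y == 1 else (1 - p)

# Binary cross entropy: negative mean log of those same probabilities.
bce = -sum(
    y * math.log(p) + (1 - y) * math.log(1 - p)
    for y, p in zip(y_true, y_prob)
) / len(y_true)

print(f"likelihood = {likelihood:.4f}")  # maximize this ...
print(f"BCE        = {bce:.4f}")         # ... or, equivalently, minimize this
```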
@GauravKumarGupta-fn8pw · a year ago
Best explanation. Thanks, sir.
@kalluriramakrishna5732 · 4 months ago
Another level 💯
@justcodeitbro1312 · a year ago
Wow, thanks for this great explanation.
@sahilvimal8795 · 7 months ago
00:02 The perceptron algorithm failed to produce the optimal solution in logistic regression.
02:33 In logistic regression, we need to find a loss function that minimizes the mistakes made by the model.
04:51 The loss function helps determine the quality of a model and choose the best one.
07:37 Calculating prediction probabilities and understanding the logistic regression model.
10:20 The maximum likelihood method calculates the product of the probabilities for each data point to determine the better model.
13:01 Replacing the product with a sum using the logarithm.
15:44 Cross entropy is a key concept in logistic regression.
18:45 The goal of maximizing the likelihood is equivalent to minimizing the binary cross entropy loss function.
21:14 The formula used in logistic regression for binary cross entropy may not work for all cases.
24:12 Introduction to the loss function and maximum likelihood.
27:07 Logistic regression loss function and maximum likelihood.
Crafted by Merlin AI.
@arslanahmed1311 · a year ago
Understood it 100%.
@hasanrants · 4 months ago
Thank you, sir. Completed on 17th September 2024 at 10:25 PM.
@sujithsaikalakonda4863 · 2 years ago
Great Explanation sir.
@unkownboy9962 · a year ago
Why is this channel so underrated?
@rambaldotra2221 · 3 years ago
Loved it, mind-blowing ✨
@vrushabhjain1542 · 3 years ago
Awesome explanation.
@Jayadev218 · 8 months ago
Jai ho, Nithish sir, the GOAT.
@neelulalchandani7429 · 2 months ago
best ever
@AnjaliSharma-lq8ut · 10 months ago
I wish I could give more than 1 like.
@dikshabatra8657 · a year ago
Thank you so much 😄😄
@adityabhatt04 · 3 years ago
Great video.
@MehulKumar-xw9rx · 4 months ago
Sir, please explain the mathematics behind the maximum likelihood formula in a new video.
@revathivamshi7621 · 7 months ago
The sum of the negative logs of the predicted probabilities (the negative log-likelihood) is called cross entropy.
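For reference, writing $\hat{y}_i$ for the predicted probability of class 1 and $y_i \in \{0, 1\}$ for the true label, the binary cross entropy over $n$ points is usually written as:

$$
\text{BCE} = -\frac{1}{n} \sum_{i=1}^{n} \Big[ y_i \log(\hat{y}_i) + (1 - y_i) \log(1 - \hat{y}_i) \Big]
$$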
@ajinkyahiwale5748 · a month ago
23:58 How are y2 = 0 and y4 = 0? Why did you take the value y2 = 0?
@chetanshetty8368 · 2 years ago
Hi sir, awesome videos. Can you please help me understand: when we are in a multi-dimensional space (say 10 independent variables and a binary dependent variable), how can we determine whether the data is linearly separable? Thanks in advance.
@COMEDY_007.. · a year ago
Sir, why are we applying the perceptron at all when we already have a predefined model in sklearn? 😢😢
@preetisrivastava1624 · a year ago
Can we say that uncertainty is inversely related to probability, i.e. entropy falls as the probability (or log p) rises, or equivalently that entropy is directly related to -log p, and that the solution therefore lies in minimizing the loss / entropy / uncertainty?
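That intuition can be checked numerically: the "surprise" $-\log p$ grows sharply as the probability assigned to the observed outcome shrinks. A tiny sketch (values chosen arbitrarily):

```python
import math

# Surprise (-log p) rises sharply as the probability of the observed outcome falls.
for p in (0.99, 0.9, 0.5, 0.1, 0.01):
    print(f"p = {p:5.2f}  ->  -log(p) = {-math.log(p):.3f}")
```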
@o2protectordm909 · a year ago
Exercise book @CampusX
@mohinigoyal3063 · a year ago
Sir, one more question. You said at the very beginning that the data should be linearly separable when building a logistic regression model. My question is: how do we check, before applying the regression, whether the data is linearly separable or not? Is there code for this, or how else can we see whether it is linearly separable?
@anupamprasad8632 · 18 days ago
Check the target variable to see whether it is a binary classification problem or not; with multi-class classification it won't be linearly separable.
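One practical heuristic for the question above, offered as a suggestion rather than something shown in the video: fit a (near) hard-margin linear classifier and check its training accuracy; close to 100% suggests the data is (almost) linearly separable, in any number of dimensions. A sketch with synthetic data:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Synthetic example: 10 features, binary target that is linearly separable by construction.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
w = rng.normal(size=10)
y = (X @ w > 0).astype(int)

# A very large C approximates a hard margin; perfect training accuracy
# is strong evidence the classes can be separated by a hyperplane.
clf = LinearSVC(C=1e6, max_iter=200_000).fit(X, y)
print("training accuracy:", clf.score(X, y))  # ~1.0 here
```

If training accuracy stays well below 100% even with a very large C, no single hyperplane separates the classes perfectly.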
@astikpatel6522 · 3 years ago
thank you sir
@rafibasha4145 · 2 years ago
How are the maximum likelihood and log loss cost functions related?
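They are two views of the same objective. For a label $y_i \in \{0,1\}$ with predicted probability $\hat{y}_i$, the Bernoulli likelihood of the observed label is $\hat{y}_i^{\,y_i}(1-\hat{y}_i)^{1-y_i}$, and the negative log of the product over all points is exactly the log loss (binary cross entropy):

$$
-\log \prod_{i=1}^{n} \hat{y}_i^{\,y_i} (1 - \hat{y}_i)^{1 - y_i}
= -\sum_{i=1}^{n} \Big[ y_i \log(\hat{y}_i) + (1 - y_i) \log(1 - \hat{y}_i) \Big]
$$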
@mohinigoyal3063 · a year ago
Sir, please explain why we use maximum likelihood in logistic regression. In linear regression we compute the sum of squared errors, where we minimize the error, so why in logistic regression do we instead maximize something, the likelihood?
@saturdaywedssunday · a year ago
How can I get a copy of the OneNote notes he has written? Did he upload them anywhere?
@rajeshkr0021 · a year ago
Sir, the code is not downloading from GitHub, and neither are the datasets. How do I download them?
@shravanshukla5352 · 2 years ago
Please upload a real-application video on interpreting logistic regression. Suppose we analyze insurance data and only 9% of the predictions are eligible for a personal loan.
@radhikawadhawan4235 · 11 months ago
Why did we consider the product of probabilities as the loss function? In MSE, minimising y - y_hat is logical, but I didn't get the logic for the same here.
@abdulwahabkhan1086 · 3 months ago
It's a bit of a probability game here. In MSE we take the literal values, and there is no probability involved.
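A small illustration of why the product of probabilities works as a score for comparing models (the probabilities and model names below are hypothetical, not taken from the video): whichever model assigns higher probability to the labels that actually occurred ends up with a larger product, i.e. a higher likelihood.

```python
# Hypothetical predicted probabilities for class 1 from two candidate models
# on the same five points, together with the true labels.
y_true  = [1, 0, 1, 1, 0]
model_a = [0.9, 0.2, 0.8, 0.7, 0.1]
model_b = [0.6, 0.4, 0.5, 0.6, 0.5]

def likelihood(y, p_class1):
    """Product of the probabilities the model assigns to the observed labels."""
    prod = 1.0
    for yi, p in zip(y, p_class1):
        prod *= p if yi == 1 else (1 - p)
    return prod

print("model A:", likelihood(y_true, model_a))  # larger product -> better fit
print("model B:", likelihood(y_true, model_b))
```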
@subhanjalpant8824 · 4 months ago
22:00
@ThanosAtWork · a year ago
Has anyone made notes for this deep learning playlist?
@ABHISHEKPRASADPRAJAPATI-jq3cg · 8 months ago
A teacher should also have teaching quality, and your effort demonstrates that...
@ProgramerSalar · a year ago
Gazab (amazing) 😀😄
@kindaeasy9797 · 8 months ago
Maza aaya (really enjoyed it) ++
@RohanOxob · 2 years ago
9:00
@jroamindia1754 · a year ago
How are y2 = 0 and y4 = 0? Why did you take the value y2 = 0?
@ajinkyahiwale5748 · a month ago
Did you get the answer?
@ajayraho · 5 months ago
Andrew Ng who? 😂
@RohitkumarDanda · 7 days ago
Exactly.
Derivative of Sigmoid Function · 5:57 · CampusX · 31K views
Logistic Regression Part 1 | Perceptron Trick · 47:06 · CampusX · 138K views
Tips Tricks 15 - Understanding Binary Cross-Entropy loss · 18:29 · DigitalSreeni · 23K views
Maximum Likelihood, clearly explained!!! · 6:12 · StatQuest with Josh Starmer · 1.5M views
The Key Equation Behind Probability · 26:24 · Artem Kirsanov · 172K views
Logistic Regression with Maximum Likelihood · 15:51 · Endless Engineering · 35K views
Logistic Regression - THE MATH YOU SHOULD KNOW! · 9:14 · CodeEmporium · 160K views