Multiple Linear Regression | Part 2 | Mathematical Formulation From Scratch

47,592 views

CampusX

1 day ago

Dive into the mathematical foundation of Multiple Linear Regression in this second part of our series. We'll guide you through the formulation from scratch, making it easy to grasp the concepts behind this powerful regression technique. Build a solid foundation for your regression modeling skills.
============================
Do you want to learn from me?
Check my affordable mentorship program at: learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Intro
00:30 - Types of Linear Regression
03:31 - Mathematical Formulation
26:45 - Detouring for some time
43:20 - Why Gradient Descent?
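Since the video derives the closed-form (OLS) solution for multiple linear regression, here is a minimal sketch of where that derivation lands: beta = (X^T X)^-1 X^T y. The code and variable names are my own illustration, not taken from the video.

```python
import numpy as np

def fit_ols(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Closed-form multiple linear regression: beta = (X^T X)^-1 X^T y."""
    Xb = np.c_[np.ones(len(X)), X]              # prepend a column of ones for the intercept b0
    return np.linalg.inv(Xb.T @ Xb) @ Xb.T @ y  # solve the normal equations

# Toy example: y = 4 + 2*x1 - 3*x2 plus a little noise.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = 4 + 2 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=0.05, size=200)
print(fit_ols(X, y))  # expect roughly [4.0, 2.0, -3.0]
```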

Comments: 76
@saptarshisanyal6738 • 1 year ago
I don't think a mathematical explanation like the one in this video exists anywhere else on YouTube. I found this material in the book "Mathematics for Machine Learning" by Marc Peter Deisenroth. This is simply brilliant. Although the matrix differentiation part is absent, it is still extraordinary stuff.
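For readers who, like this commenter, want the matrix-differentiation step spelled out, here is a hedged sketch of the two standard identities the derivation relies on (stated in denominator layout; this is textbook matrix calculus, not a transcript of the video):

```latex
% Loss: E(\beta) = (y - X\beta)^T (y - X\beta)
%                = y^T y - 2\, y^T X \beta + \beta^T X^T X \beta.
% Identities (denominator layout):
\[
\frac{\partial}{\partial \beta}\, y^T X \beta = X^T y,
\qquad
\frac{\partial}{\partial \beta}\, \beta^T X^T X \beta = 2\, X^T X \beta .
\]
% Setting the gradient to zero yields the normal equations:
\[
\nabla_{\beta} E = -2 X^T y + 2 X^T X \beta = 0
\;\Longrightarrow\;
X^T X \beta = X^T y .
\]
```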
@sidindian1982 • 1 year ago
Yes, well said. Brilliantly explained :-) 😍😍😍😇😇
@kashifhabib290 • 11 months ago
Sir, I can tell you that literally no one's videos compare to your teaching. I have watched Coding Ninjas videos and other paid lectures, but nobody has gone into this depth. I can literally see machine learning in front of my imagination. Thank you so much, sir 🙏
@sudhanshumishra3677 • 5 months ago
Literally the greatest explanation I have ever seen on YouTube. Hats off, sir.
@Ganeshkakade454 • 1 year ago
Hi sir, you are truly a gem of a person; sharing such great knowledge for free is a blessing for the new generation. God bless you, sir. You are our guru from today.
@karthikmanthitta6362 • 2 years ago
Such a wonderful explanation, sir. Really, thanks a lot ♥️♥️♥️ You were able to explain something that hundreds of videos couldn't explain to me.
@ranirathore4176 • 1 year ago
Most underrated channel on YouTube 🥲
@core4032 • 2 years ago
A step-by-step series, in great detail. Superb.
@TheAparajit • 10 months ago
This was just brilliant. You are an incredible teacher. Thank you.
@gauravpundir97 • 1 year ago
Thank you for making such fantastic videos!
@nikhildonthula1395 • 1 month ago
Top-class stuff, man. I had been trying to find this mathematical derivation on YouTube for many days.
@user-qo1qe9wq4g • 4 months ago
Thank you very much for the explanation, sir. I searched the whole of YouTube to find this mathematical explanation!
@lvilligosalvs2708 • 10 months ago
You are a gem, sir. Keep it up. Thank you!
@akash.deblanq • 2 years ago
I jumped when I understood the e^T e concept. Thank you so much!!!
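For context, the e^T e idea this comment refers to is the sum of squared errors written as an inner product; a quick sketch (standard algebra, my notation, not a quote from the video):

```latex
% With residuals e = y - X\beta, the total squared error is the scalar
\[
E = e^T e = \sum_{i=1}^{n} e_i^2
  = (y - X\beta)^T (y - X\beta)
  = y^T y - 2\, y^T X \beta + \beta^T X^T X \beta ,
\]
% where the two cross terms merge because y^T X\beta = (X\beta)^T y
% (each is a 1x1 matrix, i.e. a scalar, and a scalar equals its transpose).
```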
@krithwal1997 • 2 years ago
Making complex things look easy: that's CampusX.
@ArpanChandra-vv8cg • 7 days ago
GOD-LEVEL TEACHING SKILL 💡💡
@anant1803 • 1 year ago
Really amazing video, sir.
@manujkumarjoshi9342 • 10 months ago
Beautiful. Luckily I knew it already, but awesome teaching skills.
@ADESHKUMAR-yz2el • 11 months ago
Love you, sir, with all respect.
@anshulsharma7080 • 1 year ago
How did you manage to study so much of this beforehand, bhaiya? This is beyond superb. Wow!
@balrajprajesh6473 • 2 years ago
Thank you for this, sir!
@sameergupta3067 • 1 year ago
This ML series is making me interested in the maths behind machine learning algorithms.
@nirjalkumarmahato330 • 1 year ago
Boss, you are great ❤️ I have been struggling for a month straight 🙃
@usmanriaz6241 • 7 months ago
You are an amazing teacher. I have never seen such good explanations on YouTube. Love from Pakistan.
@ParthivShah • 4 months ago
Thank you, sir.
@beethoven6185 • 9 months ago
Sir, you are the best teacher.
@rubalsingh4018 • 6 months ago
Thank you so much.
@sovansahoo27 • 10 months ago
Superb content; easily my semester saviour at IIT Kanpur. Thanks, sir.
@ashishraut7526 • 1 year ago
Boss, that was awesome.
@ritujawale10 • 1 year ago
Thank you, sir 👍
@maths_impact • 1 year ago
Wonderful, sir.
@todaystrending1992 • 3 years ago
Eagerly waiting for the next video 😁😉 Thank you so much for this, sir 🙏❤️
@dheerajrajput7058 • 1 year ago
Have you achieved your dream now? Please say yes.
@kiran__jangra • 1 month ago
No doubt, sir, your teaching is fantastic, and I am following your videos. I have one doubt about the step where you go from B^T (X^T X) = y^T X to B^T = y^T X (X^T X)^-1. Shouldn't it be B^T = (X^T X)^-1 y^T X? Basically, the inverse term must be pre-multiplied, because we pre-multiply by the inverse to cancel the term on the left-hand side, and matrix multiplication is not commutative, so we can't write it on the other side. Please clear my doubt.
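A hedged note on this doubt (standard linear algebra, not a quote from the video): whether the inverse lands on the left or the right depends on which side of beta the (X^T X) factor sits, precisely because matrix multiplication is not commutative.

```latex
% Row form: X^T X multiplies \beta^T from the RIGHT, so POST-multiply by the inverse:
\[
\beta^T (X^T X) = y^T X
\;\Longrightarrow\;
\beta^T = y^T X\, (X^T X)^{-1} .
\]
% Column form: X^T X multiplies \beta from the LEFT, so PRE-multiply instead:
\[
(X^T X)\, \beta = X^T y
\;\Longrightarrow\;
\beta = (X^T X)^{-1} X^T y .
\]
% Transposing either result gives the other, using ((X^T X)^{-1})^T = (X^T X)^{-1}.
```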
@mr.deep. • 2 years ago
Best explanation.
@kidscreator2268 • 3 years ago
Number one, sir.
@rupeshramancreation5554 • 6 months ago
You said something really profound today.
@shubhankarsharma2221 • 1 year ago
y^T(XB) = (XB)^T y can be proved by taking XB = y-hat and writing both with their respective matrices: y = [y1 y2 ... yn] and y-hat = [yh1 yh2 ... yhn]. Substituting these into the equation shows both sides are the same.
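A compact version of the argument sketched in this comment (a standard fact about 1x1 matrices; the notation is mine):

```latex
% y^T (X\beta) is a 1x1 matrix, i.e. a scalar, and any scalar equals its transpose:
\[
y^T (X\beta) = \left( y^T (X\beta) \right)^T = (X\beta)^T y
             = \sum_{i=1}^{n} y_i \hat{y}_i ,
\qquad \text{where } \hat{y} = X\beta .
\]
```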
@user-oc6lw2rd1q • 8 months ago
34:18 Shouldn't its differentiation be equal to X^T y (the transpose of X times y) instead of y^T X (the transpose of y times X)?
@sachin2725 • 1 year ago
Hello sir, XGBoost is not included in the playlist. Could you please make a video on XGBoost?
@ronylpatil • 1 year ago
Please make a detailed video on matrix differentiation.
@ashishmhatre2846 • 1 year ago
My master's professor can't explain things better than you! Thank you for making such awesome videos!
@AbdurRahman-lv9ec • 1 year ago
Awesome.
@abuboimofo6605 • 3 months ago
A doubt, sir ji 🙏 At 36:00, when you differentiate the matrix expression, considering Y = A^T X A, your answer is 2XA^T, while the answer should be dY/dA = 2XA, not the transpose. Correct me if I am wrong, please.
@priyadarshichatterjee7933 • 6 months ago
Sir, shouldn't we be left with y^T = B^T X^T after differentiation and reduction, which transposed again gives y = XB, and therefore B = X^-1 y?
@messi0510 • 1 year ago
0:30 --> 2:45: intuition behind MLR
43:20 --> 47:45: why gradient descent is more effective compared to OLS
@abdulmanan17529 • 1 year ago
A maths guru as well as a machine learning guru.
@ayushbaranwal1094 • 5 months ago
Sir, I actually had a doubt: d/dA of A^T X A is 2XA, but you have written 2XA^T. Can you explain?
@MAyyan-gb6hi • 9 months ago
ily!
@Sara-fp1zw • 2 years ago
36:00 Bhaiya, kindly upload a video on matrix differentiation.
@animatrix1631 • 1 year ago
Please upload the matrix differentiation video, sir @campusx
@rahulpathak8415 • 1 month ago
Sir, does the loss function start with 1/2m?
@BAMEADManiyar • 7 months ago
37:23 Sir, I have a doubt: if I multiply the LHS by (X^T X)^-1, I will be left with B^T on the LHS, so I need to multiply the RHS by the same term too, right? But if I do so, I get a different answer. Why is that, sir?
@mr.deep. • 2 years ago
campusX > MIT
@harshitmishra394 • 4 months ago
@campusx Sir, we had Y(hat) = B0 + B1.X1 + B2.X2 + ... + Bn.Xn, so how did the terms become different elements in a matrix?
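A hedged sketch of what the matrix form packs together, which may resolve this doubt (the standard construction with a leading column of ones for the intercept; the notation is mine, not necessarily the video's):

```latex
% Stacking the n training rows \hat{y}_i = B_0 + B_1 x_{i1} + \dots + B_m x_{im}
% turns the coefficients into a single column vector:
\[
\underbrace{\begin{pmatrix} \hat{y}_1 \\ \vdots \\ \hat{y}_n \end{pmatrix}}_{\hat{y}}
=
\underbrace{\begin{pmatrix}
1 & x_{11} & \cdots & x_{1m} \\
\vdots & \vdots & \ddots & \vdots \\
1 & x_{n1} & \cdots & x_{nm}
\end{pmatrix}}_{X}
\underbrace{\begin{pmatrix} B_0 \\ B_1 \\ \vdots \\ B_m \end{pmatrix}}_{\beta}
\qquad\Longleftrightarrow\qquad
\hat{y} = X\beta .
\]
```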
@roktimjojo5573 • 2 years ago
When will the matrix differentiation video come out?
@Khan-ho5yd • 7 months ago
Hello sir, do you have notes?
@abhishekkukreja6735 • 2 years ago
Hi Nitish sir, while calculating the error function we used differentiation to get the expression, but at the very beginning you said we don't use calculus for OLS, only for gradient descent. Yet we used differentiation in both, so how is one closed-form and the other not? I got the concept, but how do the closed and non-closed forms differ if we're differentiating in both? Thanks for these videos.
@spynom3070 • 1 year ago
He used calculus to show how the OLS equation is derived from scratch. With OLS, the machine plugs into the final closed-form equation to compute the best-fit line directly, whereas gradient descent uses calculus iteratively to reach the minimum.
@abhishekkukreja6735 • 1 year ago
@spynom3070 Thanks for this.
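To make the distinction discussed in this thread concrete, a minimal sketch (my own illustration, not the video's code): the closed form solves the normal equations once, while gradient descent takes repeated calculus-guided steps toward the same coefficients.

```python
import numpy as np

# Toy data: 100 samples, 2 features, a known set of true coefficients.
rng = np.random.default_rng(0)
X = np.c_[np.ones(100), rng.normal(size=(100, 2))]   # leading column of ones = intercept
beta_true = np.array([3.0, 1.5, -2.0])
y = X @ beta_true + rng.normal(scale=0.1, size=100)

# Closed form (OLS): one-shot solve of the normal equations X^T X beta = X^T y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent: iterate beta <- beta - lr * gradient of the mean squared error.
beta_gd = np.zeros(3)
lr = 0.05
for _ in range(2000):
    grad = 2 * X.T @ (X @ beta_gd - y) / len(y)      # calculus applied at every step
    beta_gd -= lr * grad

print(beta_ols)  # both end up close to [3.0, 1.5, -2.0]
print(beta_gd)
```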
@vanshshah6418 • 1 year ago
(best best best best ......best)^best
@Jc12x06 • 1 year ago
@CampusX Can someone explain what would happen if the inverse doesn't exist for that particular matrix in the last step, (X^T X)^-1, i.e. if the determinant is 0?
@yashwanthyash1382 • 1 year ago
Very nice question, but I don't know the answer either.
@rounaksarkar3084 • 1 year ago
See, X is the matrix of features. (X^T X) fails to be invertible exactly when the columns of X are linearly dependent, for example when one feature duplicates another, is an exact linear combination of other features, or when there are more features than samples. In that case det(X^T X) = 0 and the normal equations no longer have a unique solution; in practice you drop the redundant features, add regularization (ridge), or use the Moore-Penrose pseudoinverse in place of the plain inverse.
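A small sketch of that failure mode and the pseudoinverse workaround (my illustration; np.linalg.pinv computes the Moore-Penrose pseudoinverse):

```python
import numpy as np

# A duplicated (scaled) column makes the columns of X linearly dependent,
# so X^T X is singular and the plain inverse does not exist.
x1 = np.arange(5, dtype=float)
X = np.c_[np.ones(5), x1, 2 * x1]        # third column = 2 * second column
y = 1 + 2 * x1                           # y lies exactly in X's column space

print(np.linalg.det(X.T @ X))            # ~0: the determinant vanishes

# The pseudoinverse still returns a (minimum-norm) solution of the normal equations.
beta = np.linalg.pinv(X) @ y
print(beta)
print(X @ beta)                          # fitted values still reproduce y
```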
@abdulmanan17529 • 1 year ago
🎉🎉🎉❤
@Star-xk5jp • 6 months ago
Day 4. Date: 12/1/24.
@souviknaskar631 • 9 months ago
(A^T)^-1 = (A^-1)^T. Using this formula, together with the fact that X^T X is symmetric, you can prove the last part: [(X^T X)^-1]^T = (X^T X)^-1.
@moizk8223 • 2 years ago
The video is awesome. I have a doubt though: at 37:35 you pre-multiply the inverse on the LHS but post-multiply it on the RHS. Isn't that wrong? Correct me if I am missing something.
@readbhagwatgeeta3810 • 1 year ago
Yes, correct. Done carefully, the final value of beta comes out as (X^T X)^-1 (X^T y), or equivalently beta^T = y^T X (X^T X)^-1.
@moizk8223 • 1 year ago
@readbhagwatgeeta3810 Thanks!
@HirokiKudoGT • 1 year ago
Sir, at 36:26 I think you used d/dA (A^T X A) = 2XA^T, but it should be 2XA... so I'm a little confused about the final result 🫤. Only this one thing; everything else is great, sir. Love your videos.
@rounaksarkar3084 • 1 year ago
Actually, the derivative is (X + X^T)A in denominator layout, which becomes 2XA when X = X^T; in numerator layout it is the transpose, A^T(X + X^T) = (2XA)^T. So 2XA and the transposed form are both right, just under different conventions.
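Since this transpose question recurs across several comments (36:00 and 36:26 above), here is a hedged summary of the identity under the two common layout conventions (standard matrix calculus, not a transcript of the video):

```latex
% For f(A) = A^T X A with A a column vector:
% Denominator layout (gradient written as a column vector):
\[
\frac{\partial f}{\partial A} = (X + X^T)\, A
\;\overset{X = X^T}{=}\; 2 X A .
\]
% Numerator layout (gradient written as a row vector) gives the transpose:
\[
\frac{\partial f}{\partial A} = A^T (X + X^T)
\;\overset{X = X^T}{=}\; 2 A^T X = (2 X A)^T .
\]
% So "2XA" versus its transpose is a choice of convention, not a contradiction;
% what matters is applying one convention consistently through the derivation.
```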