Linear Regression using Gradient Descent in Python - Machine Learning Basics

81,564 views

Adarsh Menon

1 day ago

Comments: 166
@iamdurgeshk 4 years ago
Man, I loved the explanation, which I had tried multiple times from different sources to understand. Why did you stop making such videos? I would love to learn more about Machine Learning. Please start a playlist explaining the A to Z of Machine Learning.
@dileepn2479 1 month ago
It took me almost a week to find such an informative video to understand gradient descent...
@sairasiddique5116 4 years ago
Such fluent, clear concepts. These were like a big rock in front of me when our lecturer was giving the lecture and I felt lost, but now I get everything with each detail you have explained. Great work!
@AdarshMenon 3 years ago
Glad it was helpful!
@sathish77j90 4 years ago
Really fantastic, sir... I have been struggling to understand this concept for 3 days and finally now I get it... Thank you very much, sir.
@charlescoult 1 year ago
Simple and straightforward. Thank you Adarsh.
@syedniamath5841 4 years ago
Great video Adarsh, short and clear to understand and work through.
@AdarshMenon 4 years ago
Thank you, Syed!
@umersalman2506 5 years ago
This is amazing, man! Please upload more videos on machine learning algorithms with this kind of proper intuition and implementation.
@AdarshMenon 5 years ago
Thank you so much! Yes, I am working on more such videos. Thank you for subscribing!
@turtlepedia5149 4 years ago
Sir, very many thanks. After I get my job, a big donation is coming your way.
@late_nights 4 years ago
Good explanation. Your 1000th subscriber here!
@AdarshMenon 4 years ago
Awesome, thank you!
@modernman00 4 years ago
You are the best teacher. I recommend this to anyone studying machine learning on Coursera. Please make more videos and continue with your teaching approach.
@kalaivananr 3 years ago
Thank you, Adarsh. You made it very simple.
@nisarahmed4487 2 years ago
@Adarsh ... no doubt you have done a great job, but can you tell me why you are running the for loop over the length of x and not over the epochs, as you already said you were going to use them?
@aman5534 6 months ago
In his Git repo he is using epochs, as he said.
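The corrected loop this thread describes can be sketched as follows. This is a minimal sketch, not the repo's exact code: the synthetic data here is a hypothetical stand-in for the video's data.csv, and the learning rate and epoch count are illustrative choices.

```python
import numpy as np

# Hypothetical synthetic data: y = 2x + 3 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, 100)
Y = 2 * X + 3 + rng.normal(0, 0.1, 100)

m, c = 0.0, 0.0    # initial slope and intercept
L = 0.005          # learning rate
epochs = 10000     # number of gradient descent iterations
n = float(len(X))

for _ in range(epochs):                      # iterate over epochs, not len(X)
    Y_pred = m * X + c
    dm = (-2 / n) * np.sum(X * (Y - Y_pred))  # partial derivative w.r.t. m
    dc = (-2 / n) * np.sum(Y - Y_pred)        # partial derivative w.r.t. c
    m = m - L * dm
    c = c - L * dc

print(m, c)   # should be close to 2 and 3
```

The only change from the video is that the loop runs for `epochs` iterations instead of `len(X)`; the update equations are unchanged.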
@kruti2008 1 year ago
Very well explained, step by step with clarity 👍
@sayandey1478 5 years ago
Impressive! That was a very clarifying one. But in real-world situations we don't know how many iterations are suitable for a given problem. Moreover, you are not checking whether you have reached the minimum; you are concluding it from the data visualization. A walkthrough of a training procedure that also fixes the learning rate, which I have been looking for, would be very helpful.
@prathamgenius 3 years ago
Thanks a lot, @Adarsh. You helped me complete my ML assignment easily... Cheers 👍
@AdarshMenon 3 years ago
Glad to hear that!
@rashidmehmood5008 3 years ago
Where did you use epochs in the code? You declared it, but I don't think you used it.
@shubhamgoyal977 3 years ago
Great video man!! Loved it!! Helped me understand this concept really well. Thanks!!
@AdarshMenon 3 years ago
Glad to hear it!
@adarshsingh401 2 years ago
Please also make a video on multiple linear regression! This one was amazing.
@igorjaqueira5788 4 years ago
I believe there is a mistake in the code. First, as you already pointed out in other comments, the variable epochs should be used instead of n. There should also be an error limit, so the loop should run as long as the error is greater than some parameter (epsilon).
@AdarshMenon 4 years ago
Yeah, that is a good idea when training times are long and the model does not overfit. A good best practice I could have mentioned, thanks!
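The epsilon-based stopping criterion suggested above can be sketched like this. It is a minimal illustration, not the video's code: the synthetic data, the learning rate, and the tolerance `epsilon` are all hypothetical choices, and the loop stops once one more epoch no longer improves the loss by a meaningful amount.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, 100)
Y = 2 * X + 3 + rng.normal(0, 0.1, 100)

m, c = 0.0, 0.0
L = 0.005
max_epochs = 100000
epsilon = 1e-9       # stop when the loss improves by less than this

prev_loss = float("inf")
for epoch in range(max_epochs):
    Y_pred = m * X + c
    loss = np.mean((Y - Y_pred) ** 2)     # mean squared error
    if prev_loss - loss < epsilon:        # converged: improvement is negligible
        break
    prev_loss = loss
    dm = (-2 / len(X)) * np.sum(X * (Y - Y_pred))
    dc = (-2 / len(X)) * np.sum(Y - Y_pred)
    m -= L * dm
    c -= L * dc

print(epoch, m, c)   # stops long before max_epochs on this data
```

The loss can never reach exactly 0 on noisy data, which is why the criterion compares successive losses rather than waiting for loss == 0.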
@veerasekhar8551 3 years ago
Thanks man :) Subscribed, expecting you to explain more ML algorithms.
@AdarshMenon 3 years ago
More to come!
@sayonsai8006 3 years ago
Very easy to understand and implement
@sathishkumaranandashekaren4313 3 years ago
Hi, what is the need for adding epochs in the program? My program runs without epochs too.
@janhvibhosle3852 5 years ago
Excellent video, sir. Please make more videos on machine learning with such easy explanations.
@AdarshMenon 5 years ago
Thanks Ritesh! Will surely do!
@davidthunder9804 2 years ago
Hey, how did you make that gradient descent converging animation in Colab? Great video, though.
@chetansahu1505 5 years ago
Please upload more videos covering all the models - KNN, Naive Bayes, Decision Tree, Random Forest, dimensionality reduction techniques, etc., including deep learning from scratch. The way you explain, even a complete layman will easily understand. Awesome explanation, bro. Became a big fan with just one video.
@MOHSINALI-bk2qo 5 years ago
Please give me a two-variable example of gradient descent.
@narottamaswal3978 3 years ago
I liked and subscribed at this moment 2:47 🙏
@swapnamondal7161 3 years ago
Very intuitive video.
@nano7586 4 years ago
Best explanation so far. Thanks a lot!
@tejaskulkarni5795 3 years ago
Nice explanation, very clear.
@desavera 3 years ago
Fantastic... really solid!!! Thanks...
@raghavgaur8901 5 years ago
Sir, where did you use epochs in your code? It is just initialized.
@AdarshMenon 5 years ago
Hi, epochs was to be used in the for loop instead of n. Please see the code on GitHub (link in the description).
@mohammeddanishreza4902 4 years ago
Why am I getting the m and c values as arrays? Can you please help me?
@neehanthreddy1868 8 months ago
Could you provide an example with multiple independent variables?
@jyotshanajha8316 1 year ago
Very well explained. Thank you!
@sudeshnasen2731 2 years ago
What is an epoch? Why are we declaring it without using it? Why run the loop for the length of X instead of the epochs?
@muhammadalikhan5003 1 year ago
Thank you. It was very useful.
@bibekanandasahoo3497 2 years ago
Adarsh, can you please make a video explaining how gradient descent works in multiple linear regression? Please!
@MinaNaghshnejad 3 years ago
Nice explanation. In gradient descent, shouldn't the for loop be changed to for i in range(epochs)?
@AdarshMenon 3 years ago
Hi, yes, you are right. It was a mistake, but it worked because the dataset converged quickly. Thank you for pointing it out!
@DB-in2mr 5 years ago
Hi, just a note in case somebody hits an issue loading the data from your GitHub data.csv, such as "Error tokenizing data. C error: Expected ...". Ideally, load the data in raw format from your Git area here ( raw.githubusercontent.com/chasinginfinity/ml-from-scratch/master/03%20Linear%20Regression%20in%202%20minutes/data.csv ), adding ", header=None".
@vinaywaingankar6133 4 years ago
Great explanation
@AdarshMenon 4 years ago
Glad you liked it
@SaurabhKumar-gj5mp 5 years ago
One of the best videos. Nice explanation, man!
@AdarshMenon 5 years ago
Thank you!
@aishwaryaravi5249 3 years ago
This video really helped me! One question though: it gives errors for higher learning rates. Why is that so?
@sreeprakashneelakantan5051 5 years ago
Very well explained 👌
@sjback2145 2 years ago
Great thanks to you. It's a nice and clear explanation.
@Christian-mn8dh 5 years ago
Hey man, love this vid. I tried to build a gradient descent algorithm from scratch too. Why isn't mine working? Why can't you multiply the learning_rate directly by the error? Here's my code:

for i in range(4):
    ypred = m * x + b
    error = (ypred - y) ** 2
    m = m - (0.001 * error)
    b = b - (0.001 * error)
    m = m.sum()
    b = b.sum()

# My 'm' and 'b' values decrease infinitely
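The loop in the comment above subtracts learning_rate times the squared error, which is always non-negative and is not a gradient, so m and b can only shrink forever. A working update needs the partial derivatives of the mean squared error, which carry a sign that points toward the minimum. A minimal sketch of the fix, using hypothetical data in place of the commenter's x and y:

```python
import numpy as np

# Hypothetical data: y = 2x + 1 exactly
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])

m, b = 0.0, 0.0
lr = 0.01

for i in range(5000):
    ypred = m * x + b
    # Gradients of the mean squared error, not the error itself.
    # Their sign tells the update which direction to move.
    dm = (-2 / len(x)) * np.sum(x * (y - ypred))
    db = (-2 / len(x)) * np.sum(y - ypred)
    m = m - lr * dm
    b = b - lr * db

print(m, b)   # approaches 2 and 1
```

Because `error` in the original code was squared, its sign information was destroyed; the gradient form keeps it, so the updates can move m and b up or down as needed.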
@kaustubhkapare807 1 year ago
The length of the training dataset was typecast to float... and that float was given as a parameter to the range function...
@Mason-yj1ip 3 years ago
This is better than my stupid university lecture. Bro you should take my tuition.
@ROHANJUNEJABAI 4 years ago
We repeat this process until our loss function is a very small value, or ideally 0 (which means 0 error, or 100% accuracy). The values of m and c that we are left with will then be the optimum values. That is the 4th step of gradient descent. But according to your code, we are just updating the values until epochs = 100. How does this guarantee that we are updating the values until the loss function is minimized?
@hirakmondal6174 4 years ago
Exactly my point. This code is wrong, and some people with little to no knowledge who are just learning are calling it GOD LEVEL... smh
@2020-h2s 3 years ago
Have you implemented Gaussian kernel regression from scratch for a dataset with two independent features?
@destinedtotrade3214 3 years ago
I liked & subscribed. Thanks!
@harshmankodiya9397 4 years ago
loved the explanation
@AdarshMenon 4 years ago
Glad to hear!
@navinbondade5365 4 years ago
I think you need to add: for i in range(iterations)
@namanjain1164 5 years ago
The best!! Loved it!!
@sarthak810 3 years ago
After plotting the regression line, can we apply the linear regression algorithm from sklearn to it?
@agirlfrom_mountains 4 years ago
Where are we using epochs here? And please explain how we are plotting the gradient descent line using max and min.
@tomtrask_YT 4 years ago
You really should do something like this to read your CSV (the way you do it, pandas uses the first row as column names):

data = pd.read_csv('data.csv', header=0, names=["x_val", "y_val"])

You don't need to assign names, but it's "nice" to do: that way you don't have to use indexes (e.g. use data["x_val"] instead of data.iloc[:, 0]).
@AdarshMenon 4 years ago
Thanks Tom, yes, I totally agree with you. Will keep this in mind from now on!
@usamazahid1 2 years ago
Excellent tutorial... kudos!
@praveenbhatt3127 3 years ago
Great video. Can someone tell me how he made that animation of the different predicted lines at the start of the video?
@AdarshMenon 3 years ago
I generated images using OpenCV for all the values and then used Premiere Pro to make a GIF from the images (lots of other GIF-generating tools are available online).
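The frame-by-frame approach described above can be sketched with matplotlib instead of OpenCV: run gradient descent, keep a snapshot of (m, c) every few hundred epochs, and save one PNG per snapshot, which any GIF tool can then stitch together. This is a minimal sketch with hypothetical synthetic data and illustrative hyperparameters, not the author's actual script.

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")   # render off-screen; no display needed
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 10, 50)
Y = 2 * X + 3 + rng.normal(0, 1, 50)

m, c, L = 0.0, 0.0, 0.005
outdir = tempfile.mkdtemp()
snapshots = []

for epoch in range(2000):
    Y_pred = m * X + c
    dm = (-2 / len(X)) * np.sum(X * (Y - Y_pred))
    dc = (-2 / len(X)) * np.sum(Y - Y_pred)
    m -= L * dm
    c -= L * dc
    if epoch % 200 == 0:          # keep every 200th fit as a frame
        snapshots.append((m, c))

# One PNG per snapshot: scatter of the data plus the current fitted line
for i, (mi, ci) in enumerate(snapshots):
    plt.figure()
    plt.scatter(X, Y, s=10)
    plt.plot(X, mi * X + ci, color="red")
    plt.title(f"frame {i}")
    plt.savefig(os.path.join(outdir, f"frame_{i:03d}.png"))
    plt.close()

print(len(os.listdir(outdir)), "frames written to", outdir)
```

The resulting numbered PNGs can be combined into a GIF with Premiere Pro, ImageMagick, or any online converter, matching the workflow the reply describes.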
@gchumbes 3 years ago
Excellent, thanks for sharing!
@tejaskulkarni5795 3 years ago
Can you also give an example of how to use gradient descent to find the best alpha for ridge and lasso regression, please?
@firstkaransingh 2 years ago
Good explanation. Thanks, bro ☺️
@turtlepedia5149 4 years ago
Sir, if epochs is used in place of n, then i will go beyond the length of the data, so what will the interpretation be?
@turtlepedia5149 4 years ago
Sorry sir, I got it. Thanks!
@merohan619 4 years ago
Is it necessary that the line fitted by gradient descent will be straight? What if the scatter plot is in the form of a tan function?
@myentropyishigh8708 4 years ago
Gradient descent is not a line; it is a technique. It is used here to find the optimum values of the slope m and the constant c.
@vivekpuurkayastha1580 5 years ago
Could you please explain how the partial derivative equations change for multiple predictors?
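For multiple predictors the prediction becomes y_pred = w·x + c, and each weight gets a partial derivative of the same form as the video's dm, with that weight's feature in place of x. A minimal vectorized sketch, using hypothetical synthetic data with two predictors (the learning rate and epoch count are illustrative, not prescribed):

```python
import numpy as np

# Hypothetical data with two predictors: y = 2*x1 - 1*x2 + 3
rng = np.random.default_rng(3)
X = rng.uniform(0, 5, size=(200, 2))
y = 2 * X[:, 0] - 1 * X[:, 1] + 3

w = np.zeros(2)   # one weight per predictor
c = 0.0
L = 0.01
n = len(y)

for _ in range(20000):
    y_pred = X @ w + c
    error = y - y_pred
    dw = (-2 / n) * (X.T @ error)    # vector of partials, one per feature
    dc = (-2 / n) * np.sum(error)    # same intercept derivative as before
    w -= L * dw
    c -= L * dc

print(w, c)   # approaches [2, -1] and 3
```

With one feature, `X.T @ error` reduces to the video's `sum(x * (y - y_pred))`, so the single-variable case is just the k = 1 instance of this update.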
@anthonybeaulamaryj5372 3 years ago
Is there any way to find the accuracy of this manual implementation?
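Regression models are usually judged with error metrics rather than classification-style accuracy; a common choice is the coefficient of determination, R². A small sketch of computing it by hand (the data and the fitted m, c below are hypothetical examples, not values from the video):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 is a perfect fit,
    0 means no better than always predicting the mean of y."""
    ss_res = np.sum((y_true - y_pred) ** 2)               # residual sum of squares
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)      # total sum of squares
    return 1 - ss_res / ss_tot

# Hypothetical data and fitted parameters from a gradient descent run
X = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.9, 5.1, 7.0, 8.9])
m, c = 2.0, 1.0

print(r2_score(y, m * X + c))   # close to 1 for a good fit
```

The same metric is available as sklearn.metrics.r2_score, so a manual implementation can be checked against the library.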
@hreedishkakoty6771 2 years ago
Why is your for loop over len(X) and not over epochs?
@kanokyusuki 5 years ago
Thanks a lot! It was a great tutorial.
@sivaramakrishna2248 5 years ago
Simply awesome ....
@sankojushivasai8028 3 years ago
Where are the epochs used?
@AdarshMenon 3 years ago
Hi, epochs was to be used in the for loop instead of n. I made that mistake in the video. Please see the code on GitHub (link in the description).
@datascientist2958 3 years ago
Can you continue on the same data with stochastic gradient descent and mini-batch?
@AdarshMenon 3 years ago
Will definitely add it to my list.
@turtlepedia5149 4 years ago
What is the use of the iteration variable? Please tell us, sir; you didn't use the epochs variable at all.
@AdarshMenon 4 years ago
Yes, correct. I made a small mistake there: instead of epochs in the loop, I used n. The program still worked because the values were small and it didn't make much difference.
@SahibzadaIrfanUllahNaqshbandi 4 years ago
Thank you for the great video. I have a question: if I multiply X by -1, then gradient descent and linear regression fail to obtain the negative slope. Can you please tell me what the problem is?
@AdarshMenon 4 years ago
I think you would also need to reverse the signs in the calculated derivatives. Try calculating the derivatives with negative x and use those equations to do the regression.
@JainmiahSk 3 years ago
To find the best-fit line, should I include the gradient descent code, or does the sklearn library do that automatically with its linear models?
@AdarshMenon 3 years ago
Doing it with sklearn is the easiest way.
@JainmiahSk 3 years ago
@@AdarshMenon To clarify doubts, how can I contact you? WhatsApp or Telegram?
@JainmiahSk 3 years ago
@@AdarshMenon Thank you.
@Saimelodies2512 4 years ago
Can you explain the backpropagation derivation of a simple neural network, please? Step by step, using matrix calculus.
@AdarshMenon 4 years ago
I made an entire series on it - kzbin.info/aero/PLP3ANEJKF1TwHRDS9sPANOzYaAIfJIuam
@siddhantdutta612 3 years ago
I guess you forgot to use epochs; instead you used len(X) in the part where we find the values of m and c. I may be wrong, but I just spotted this.
@AdarshMenon 3 years ago
You are right, I forgot to use the epochs variable. The code works because n iterations were enough for it to converge. Thanks for pointing this out!
@siddhantdutta612 3 years ago
@@AdarshMenon Yeah, exactly 😅... By the way, the video was very helpful. Good job 👍 One more thing: it would be even better if you made videos on multivariate regression too.
@NavneetKaur-ps8bn 4 years ago
Why am I getting the values of m and c as nan, along with this error: FloatingPointError: overflow encountered in double_scalar? I tried adjusting the learning rate and epochs, but it doesn't work.
@AdarshMenon 4 years ago
Inside the loop, you can try printing the values of m and c to see exactly at which iteration it is overflowing. If it is happening from the first iteration itself, it may be a problem with the dataset; otherwise it could be the learning rate / epoch values.
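The nan values asked about in this thread usually mean the learning rate is too high for the scale of the features: each update overshoots, m grows in magnitude every step, and after enough steps it overflows to inf/nan. A minimal sketch of detecting this, with hypothetical large-scale data and illustrative learning rates:

```python
import numpy as np

# Hypothetical data with large feature values (scale ~100-400)
X = np.array([100.0, 200.0, 300.0, 400.0])
Y = 2 * X + 3

def train(X, Y, L, epochs=300):
    """Gradient descent that reports the epoch at which m or c blows up."""
    m, c = 0.0, 0.0
    for epoch in range(epochs):
        Y_pred = m * X + c
        dm = (-2 / len(X)) * np.sum(X * (Y - Y_pred))
        dc = (-2 / len(X)) * np.sum(Y - Y_pred)
        m -= L * dm
        c -= L * dc
        if not (np.isfinite(m) and np.isfinite(c)):
            return epoch, m, c        # diverged: overflowed to inf/nan
    return epochs, m, c

print(train(X, Y, L=0.001))   # diverges partway: 0.001 is too big at this scale
print(train(X, Y, L=1e-6))    # a much smaller rate stays finite
```

The usual fixes are to lower the learning rate or to normalize the features so that a rate like 0.001 or 0.0001 is stable, which is why the same code can work on one dataset and produce nan on another.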
@tejaltatiwar4682 1 year ago
Amazing
@vihaangoyal5134 3 years ago
Nice and informative video!
@swaroopsasikumar 4 years ago
Great Video.
@AdarshMenon 4 years ago
Thanks!
@shivangchauhan8609 2 months ago
Good one, thanks
@himanshusoni9876 5 years ago
If the dataset has multiple features, is gradient descent applicable or not? Please reply... Thank you!
@AdarshMenon 5 years ago
Yes, it can be applied: towardsdatascience.com/machine-learning-bit-by-bit-multivariate-gradient-descent-e198fdd0df85
@nikhil7129 4 years ago
Sir, can we use it when we have multiple independent features?
@AdarshMenon 4 years ago
Yes, you can. Use sklearn, because it is easier that way.
@ACC861 4 years ago
If you use a different learning rate, the results will vary, right?
@AdarshMenon 4 years ago
Technically yes, but for such a small dataset I don't think there will be a huge difference. The learning rate determines how fast the function converges, so it is important to experiment with values and choose a suitable one for your use case.
@turtlepedia5149 4 years ago
Sir, I am currently doing the Andrew Ng course on Coursera. After I complete it, can you suggest some advanced-level courses to follow, as I want to get better at ML?
@AdarshMenon 4 years ago
To get better at ML, after doing the courses and understanding the basics, I would suggest you try to solve real problems. Kaggle would be a good place to start.
@bhavyanavuluri9177 5 years ago
Hi, I am getting the output as NaN. Why?
@balubalaji9956 5 years ago
Thanks, Adarsh, for making me understand 😅
@ManjeetSingh-ly3fk 1 year ago
Please share code for automatically updating the learning rate in this from-scratch linear regression.
@Viratfanpage369 3 months ago
Please give the code for the graph that was at the start of the video.
@Yzyou11 2 years ago
Thank you bro 👍🙏
@ahmednabil2181 5 years ago

      6 for i in range(len(x)):
----> 7     y_pred = m*x + c
      8     dm = (-2/len(x))*sum(x*(y - y_pred))
      9     dc = (-2/len(x))*sum(y - y_pred)

It gives me an error on line 7, telling me that it "can only concatenate list (not "int") to list". How do I fix that?
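That error message means x is a plain Python list, for which `*` repeats the list and `+` means concatenation, so `m*x + c` fails as soon as c is a number. Converting x (and y) to numpy arrays makes the arithmetic elementwise. A minimal sketch with hypothetical values for x, m, and c:

```python
import numpy as np

x_list = [1, 2, 3]
m, c = 2.0, 1.0

# With a plain list, 2 * x_list repeats the list ([1, 2, 3, 1, 2, 3])
# and adding an int then raises the TypeError from the traceback.
try:
    y_pred = 2 * x_list + 1
except TypeError as e:
    print("list arithmetic failed:", e)

# Converting to a numpy array makes * and + elementwise
x = np.array(x_list)
y_pred = m * x + c
print(y_pred)   # [3. 5. 7.]
```

Reading the data with pandas and taking a column (a Series) works for the same reason: Series, like numpy arrays, support elementwise arithmetic.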
@navinbondade5365 4 years ago
You did not use iterations or epochs in this code. What is the use of them?
@vrstudios9978 2 years ago
Hi Adarsh, if possible, can you tell me how to present the output as an animated GIF with a moving line for every iteration? If anyone knows, let me know.
@nikhil7129 4 years ago
I think there should be a condition to stop when we find the optimal value.
@AdarshMenon 4 years ago
You can adjust the value of epochs and stop where the loss approaches 0.
@riyazbagban9190 2 years ago
Where did you use epochs?
@IndiCEO 5 years ago
Thanks man.........
@jayaprakashkumar4573 3 years ago
The output after printing m and c is "nan nan" for me. Please help!
@christiankentorasmussen7492 3 years ago
Super good video :)
@suritchakraborty9055 4 years ago
Why am I getting the values of m and c as nan? I didn't have a CSV dataset; instead, I created two lists and converted them into pandas Series. I don't know where I am going wrong.
@AdarshMenon 4 years ago
I think you should use the CSV or some other real-world data. If the data is already at the optimum value, gradient descent won't change much. You can also experiment with other values of the learning rate and epochs. Maybe also try printing the values of m and c to see how many times the loop runs before they become nan.
@suritchakraborty9055 4 years ago
@@AdarshMenon It worked out. I needed to fix the learning rate, as my data was small. Thanks a lot!
@sathishkumaranandashekaren4313 3 years ago
Hi, your program doesn't show the iterations and is not clear about the global minimum.
@kassa1384 4 years ago
Nice video!
@AdarshMenon 4 years ago
Thanks!