I need to say this: you are the game changer here!! As a data scientist with 2+ years of experience, I ALWAYS learn something new from your content! Please Nich, never stop doing these things, and never lose that smile on your face, even when you're hitting bugs!! Thanks for everything
@NicholasRenotte 2 years ago
Thank you so much for your kind words @javierjdaza!
@lakshman587 2 years ago
Set the time limit to 20 mins next time, because you're explaining things to us as you go. This is really awesome!!
@NicholasRenotte 2 years ago
Thanks a million @Lakshman!! I try to keep it pretty tight so it’s a good challenge otherwise I know I’ll just talk for 22 minutes anyway😅
@alyt9870 2 years ago
Love the channel Nicholas, have recently graduated from an NLP Master's degree and seeing you explain stuff in a simpler way and your coding challenges is really helping me connect with the material I've learned! Keep it up and I'll keep watching!
@NicholasRenotte 2 years ago
Woah congrats @Ally 🎊 🎉 glad you’re enjoying the challenges, plenty more to come!!
@Powercube7 2 years ago
The zoom-in on the unsaved icon was personal 💀 one of the reasons I use autosave
@NicholasRenotte 2 years ago
😅 I was angry at myself when editing, I had to make a point of it lol😂
@nikitaandriievskyi3448 2 years ago
Once you initialized lr to 0.0, I knew you were going to forget to change it lol. Love the challenges tho, keep doing them, I think it would be cool to see how you implement a neural network from scratch
@NicholasRenotte 2 years ago
I'm still kicking myself that it was the lr that tripped me up 😅, it's literally so different coding under pressure, stuff that should just flow goes out the window. OHHH yeah, I thought about a good challenge building NNs while I was at the gym, stay tuned!
@thegeeksides 1 year ago
@@NicholasRenotte did you ever make that video? A NN from scratch for handwritten digit (MNIST) classification would be so awesome!
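For anyone wanting to try the challenge themselves, here is a minimal sketch of the kind of from-scratch gradient descent the video builds. The toy data and hyperparameters below are my own invention, not the video's exact code; note the non-zero learning rate, the detail that tripped up the run.

```python
import numpy as np

# Toy data following y = 2x + 1 with a little noise (made up for this sketch)
rng = np.random.default_rng(0)
x = rng.standard_normal(50)
y = 2 * x + 1 + 0.1 * rng.standard_normal(50)

w, b = 0.0, 0.0
lr = 0.01  # must be non-zero, otherwise the weights never move

for _ in range(2000):
    y_pred = w * x + b
    # Gradients of the mean squared error with respect to w and b
    dldw = -2 * np.mean(x * (y - y_pred))
    dldb = -2 * np.mean(y - y_pred)
    w -= lr * dldw
    b -= lr * dldb

print(w, b)  # w approaches 2, b approaches 1
```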
@spencerbertsch7375 2 years ago
Hey Nicholas! Love your channel and I'm really appreciating these 15 minute coding challenges - please keep it up! Also, you can disable those annoying VS Code popups you ran into at 8:35 by going to Code > Preferences > Settings, then typing "editor.hover.enable", then unchecking the "Editor > Hover" option. Hope that's useful!
@NicholasRenotte 2 years ago
You are a lifesaver @Spencer, will do it next time I'm on the streaming rig!
@spicytuna08 1 year ago
Wow. You make the subject come alive with excitement and simplicity. You are really gifted. I'll take you over hard-to-understand but smart Ph.D. professors from the Ivy League any day.
@MiguelNFer 2 years ago
Awesome video!! It's pretty cool to see such theoretical concepts coded and explained like this. Keep going Nich!!
@NicholasRenotte 2 years ago
YESSSS, right?! Glad you liked it Miguel!
@Beowulf245 1 year ago
I've been following your channel for a while now and I always find new cool stuff here. Keep up the good work, it's really helpful. Also, I love your positive personality, you really make complex stuff look entertaining.
@cavaliereoscuro1098 2 years ago
The essence of deep learning in a few lines of code... awesome
@leonardputtmann8404 2 years ago
This was oddly intense. Great job Nicholas! Even though you ran out of time, this video is still a win to me. 😉
@NicholasRenotte 2 years ago
It definitely felt intense at the time Leonard 😅, the pressure is definitely real. I don't know what it is, but coding under pressure is just a completely different beast. Thanks a million, I'll take the win and thanks for checking it out!
@ibrahim47x 1 year ago
ChatGPT won this challenge instantaneously lol:

import numpy as np

# Set the learning rate
learning_rate = 0.01

# Set the number of iterations
num_iterations = 1000

# Define the data points
X = np.array([[0, 1], [1, 0], [1, 1], [0, 0]])
y = np.array([1, 1, 0, 0])

# Initialize the weights
weights = np.zeros(X.shape[1])

# Train the model
for i in range(num_iterations):
    # Compute the predicted values
    y_pred = 1 / (1 + np.exp(-1 * np.dot(X, weights)))

    # Compute the error
    error = y - y_pred

    # Update the weights
    weights += learning_rate * np.dot(X.T, error)

# Print the weights
print("Weights:", weights)

A.I. description of the code: "This script defines a simple dataset with four data points and trains a model using the gradient descent algorithm to learn the weights that minimize the error between the predicted values and the true values. The model uses a sigmoid activation function to make predictions. The script initializes the weights to zeros, and then iteratively updates the weights using the gradient descent algorithm, computing the predicted values, the error, and the gradient of the error with respect to the weights. The learning rate determines the size of the step taken in each iteration. After training the model, the final weights are printed out. You can use these weights to make predictions on new data points by computing the dot product of the data points and the weights, and applying the sigmoid function."
@williamstephenjones3863 1 year ago
This is a very novel and cool way to teach coding. I really enjoyed it, and it was good to see you troubleshoot and get stuff wrong.
@brunospfc8511 2 years ago
I'll give you half a win, since it was a small detail
@NicholasRenotte 2 years ago
Cheers @brunospfc!!
@Mohacks 1 year ago
Wow. This YouTuber has only 197k subscribers for such absolutely high-quality videos. You deserve more than 1M+. Only thing to say is keep grinding, and you'll get there.
@juliansteden2980 2 years ago
Great video! It would be cool to come back to this and add visualization during gradient descent using matplotlib to show what is actually happening. For example, drawing the data points, the regression line, and the individual losses between the line and the data points, and showing stats like the current step, w, b, and total loss! :)
@NicholasRenotte 2 years ago
OHHHH MANNN, I thought about doing that but I was debating whether I'd hit the 15 minute deadline already. Good suggestion @Julian!
@einsteinsboi 2 years ago
Amazing! I'm learning so much watching you code. Thank you for sharing.
@NicholasRenotte 2 years ago
Thanks a mil @einsteinboi!!
@sergioquijano7721 2 years ago
You are so good at explaining these complicated concepts. Also, if you want to close the explore tab in VSCode try: Ctrl + b
@NicholasRenotte 2 years ago
Legend, thanks a million @Sergio!!
@sergioquijano7721 2 years ago
@@NicholasRenotte :D I can give you more shortcuts if you tell me where I can learn more machine learning concepts explained the way you explain them
@NicholasRenotte 2 years ago
@@sergioquijano7721 DONE, fair trade!! Been studying this book in a ton of depth this week: themlbook.com/. I threw my own spin on the grad descent example but the fundamentals are in there!
@dipendrathakuri6429 1 year ago
I think you missed dividing the derivative by 2. In the formula for the cost function we have (1/(2 * no. of training examples)) * sum of squared errors, so when we take the derivative, the 2 from dldw and the 1/2 from the cost function cancel each other out. Anyway, it was a cool video, keep up the good work brother
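That cancellation is easy to verify numerically; the toy numbers below are mine, not from the video:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
w = 0.5

def loss_half(w):
    # Cost with the 1/2 convention: (1/(2n)) * sum of squared errors
    return np.mean((y - w * x) ** 2) / 2

# Analytic gradient under that convention: the chain-rule 2 and the 1/2
# cancel, so no factor of 2 appears in the expression
grad = -np.mean(x * (y - w * x))

# Central-difference numerical gradient for comparison
eps = 1e-6
num_grad = (loss_half(w + eps) - loss_half(w - eps)) / (2 * eps)
print(grad, num_grad)  # both are approximately -7.0
```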
@VictorGiustiniPerez_ 1 year ago
Really nice video! Love the energy and the enthusiasm. Thanks for the help!
@darshitgoyani2094 1 year ago
Lots of Thanks, Nick :)
@patrickm.39 2 years ago
Are you reading my mind or something? Every time I'm stuck on a topic, you drop a video about it...
@NicholasRenotte 2 years ago
Ayyyy, so glad you like it @Patrick. For the last two weeks I've just been making videos on stuff I find hard or want to get my head around. I figure it's not just me staring at some of these concepts like huh?!? Thanks for checking it out!!
@MSCAIMLRBRITHANYA 2 years ago
Oh god! You forgot to save and I involuntarily kept shouting SAVE IT! SAVE IT!
@abdulbary3668 1 year ago
You should create a model to reduce the pressure during the last minutes, such as finding an optimal time tolerance (±): (15 ± b) 😂😂😂😂. 😢 But we need more videos like this to have a good dataset 😂😂🎉. Thanks man
@akumlonglongkumer3824 1 year ago
Pretty impressive. This is awesome. Cheers
@luis96xd 2 years ago
Great video, I like this kind of video where you code some AI task against the clock, teach us the concepts, and show us the reality of implementing it👏 Well explained 😄👍
@11harinair 2 years ago
Thanks for the video, subscribed! A suggestion: this small change to your code would demonstrate a real-world gradient descent solution for linear regression with noisy data. E.g.:

x = np.random.randn(20,1)
noise = np.random.randn(20,1)/10
# w = 5.8, b = -231.9
y = 5.8*x - 231.9 + noise
@ChangKaiHua300 2 years ago
Man, you actually made it, unless you count hyperparameter tuning as part of the challenge lol
@NicholasRenotte 2 years ago
You're my new best friend @Kai-Hua, I could've just written it off and said "So that's a regression model with gradient descent...and nooooowww, we'll tune it!"
@_danfiz 2 years ago
This is cool, seeing it in real time.
@NicholasRenotte 2 years ago
Glad you enjoyed it @NHMI!
@kashishrajput4934 2 years ago
That's so informative, thank you so much
@NicholasRenotte 2 years ago
Glad you enjoyed it @Kashish!!
@sana7388 2 months ago
Could you please provide the whole code, maybe in the description or elsewhere? Thank you! Your videos are a life saver.
@kartik_exe_ 1 year ago
How amazing is it that he set a timer for 15 mins and the vid is 22 mins long
@tomoki-v6o 2 years ago
I wonder how long the backpropagation algorithm would take?
@lvjianlvj4604 1 year ago
I really like this video. It is great!
@Felicia-126 2 years ago
Amazing video!! Thank you so much
@majdabualnour 2 years ago
I really love your video, the idea of the video is insane and I really like it
@NicholasRenotte 2 years ago
So stoked you liked it 🙏
@adipurnomo5683 2 years ago
Nice implementation bro
@jakekisiel7399 1 year ago
Are there any other machine learning/NVIDIA Jetson video tutorials you would recommend?
@SomebodythatIusetoknow123 8 months ago
Thee learning raaate haha cool vid !
@grahamfernando8775 2 years ago
Can you please do a TensorFlow instance segmentation video using Mask R-CNN? There isn't much YouTube content related to this online.
@rokunuzjahanrudro7571 1 year ago
Great video 🎉🎉
@aiforyounow 2 years ago
Nick, I thought there were existing algorithms that you can just feed your data into? I love the way you're doing it, but is it better to do it your way or use the existing ones??
@NicholasRenotte 2 years ago
100% use the prebuilt ones in sklearn, this is more to understand how they work and to provide intuition for tuning and preprocessing!! Good question 👍
@aiforyounow 2 years ago
@@NicholasRenotte That's why I call you the Khalid of deep learning
@alfathterry7215 8 months ago
this is gold!
@terrencejeffersoncimafranc100 2 years ago
Can you explain the NOTEARS algorithm? It would be a great help.
@quadropheniaguy9811 2 years ago
Could you please upload the corrected code to GitHub? I lost track of your logic after "def descend()" etc.
@NicholasRenotte 2 years ago
Correct code is on there @Quadrophenia, not working?
@rrrfamilyrashriderockers6891 2 years ago
So can you please do this algorithm for multiple variables?
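The single-variable version does extend directly. Here is a hedged sketch (data and learning rate made up for the demo) where w becomes a vector and the update is the same, just vectorized:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))      # 100 samples, 3 features
true_w = np.array([1.5, -2.0, 0.5])    # made-up ground-truth weights
y = X @ true_w + 3.0

w = np.zeros(3)
b = 0.0
lr = 0.05

for _ in range(1000):
    error = y - (X @ w + b)
    # Same MSE gradient as the 1D case, vectorized over all features at once
    w += lr * 2 * (X.T @ error) / len(y)
    b += lr * 2 * error.mean()

print(w, b)  # w approaches [1.5, -2.0, 0.5], b approaches 3.0
```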
@alexisjulianrojashuamani1582 1 year ago
U R GOD MAN, thanks so much
@vialomur__vialomur5682 2 years ago
Thanks, waiting for part 5, forza!
@msa7202 2 years ago
Please do a video building a NN from scratch!!
@MrElectrecity 2 years ago
Please check out Auto Save in the File drop-down list, it's a real time saver 😃 I need to watch the video many times to understand what you are doing, but great work, I love everything you do. Thumbs up 👍👍
@NicholasRenotte 2 years ago
Thanks for the suggestion @MrElectrecity!
@birgenc5961 1 year ago
Love it!
@ShiftKoncepts 8 months ago
Does gradient descent work for polynomial, multi-variable problems?
@Kishor_D7 4 months ago
yes
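To expand on that "yes": one common approach is to expand the inputs into polynomial features and then run the exact same update rule. The coefficients below are invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 200)
y = 1.0 + 2.0 * x - 3.0 * x ** 2   # made-up quadratic target

# Polynomial feature expansion: [1, x, x^2]; gradient descent is unchanged,
# it just fits a weight per feature column
X = np.stack([np.ones_like(x), x, x ** 2], axis=1)
coef = np.zeros(3)
lr = 0.1

for _ in range(5000):
    error = y - X @ coef
    coef += lr * 2 * (X.T @ error) / len(y)

print(coef)  # approaches [1.0, 2.0, -3.0]
```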
@lakshman587 2 years ago
Gift card not valid :( But it was fun! You are amazing!!
@NicholasRenotte 2 years ago
Got claimed super fast this time @Lakshman!!
@lakshman587 2 years ago
@@NicholasRenotte My bad, I have turned on notifications for your channel! Waiting for the next Code That challenge!!!! Hope you win next time! 🤞🤞🤞
@Pedrommelos 1 year ago
Hey man! I have a friend from Lyon and you guys have the same surname, haha. Any chance you have roots from there?
@miraculousladynoir1023 2 months ago
Man, I am new to this. Why are the updates not zero when the learning rate is? What does a learning rate of 0 mean? If it does not learn, what is the purpose of building it? Edit: Nvm. Saw the rest of the video, lol.
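For anyone else pausing at the same spot, the answer falls straight out of the update rule (the numbers here are arbitrary): with a learning rate of 0 the update term vanishes, so the model never moves from its starting weights.

```python
w = 0.3       # arbitrary starting weight
grad = 5.0    # arbitrary gradient; its value does not matter here
lr = 0.0

# Gradient descent update: with lr = 0 the step is 0 * grad = 0
w_new = w - lr * grad
print(w_new == w)  # True: the weights never change, so nothing is learned
```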
@carlosvasquez-xp8ei 2 years ago
Great video. Set time to 20 mins.
@विशालकुमार-छ7त 1 year ago
Why is it necessary for x and y to be lists of lists?
@adipurnomo5683 2 years ago
Bro, how would you implement gradient descent for the weights in K-nearest neighbors?