
MATLAB - Gradient Descent

15,175 views

ThatMathThing


A day ago

Comments: 31
@reddlno9517 · 2 years ago
Very helpful video, and interesting too. Keep doing what you're doing!
@JoelRosenfeld · 2 years ago
Thank you! I’m glad you liked it!
@josefradetzky8873 · 2 years ago
Awesome, thanks for the video!
@JoelRosenfeld · 2 years ago
You’re welcome!
@Taiyab0707 · 2 years ago
Thanks, sir, for explaining this method 👍
@JoelRosenfeld · 2 years ago
Sure thing! I hope it helped!
@rogueorangeindustries187 · 2 years ago
Awesome video!
@JoelRosenfeld · 2 years ago
Thank you! I’m glad you enjoyed it!
@user-to1hh8pd5b · 1 year ago
I'd like to know what model the keyboard is. Could you let me know the keyboard's model?
@JoelRosenfeld · 1 year ago
SteelSeries Apex 5 is the keyboard. Feels amazing.
@user-to1hh8pd5b · 1 year ago
@JoelRosenfeld Thank you!
@AbhishekSaini03 · 1 year ago
Here is the code snippet:

% Assume a quadratic objective function; its minimum is at (5, -1).
f = @(x) (x(1) - 5).^2 + (x(2) + 1).^2;

% Gradient of the objective: the first derivatives w.r.t. x(1) and x(2),
% kept as separate components. Notice the gradient is zero at (5, -1).
gradf = @(x) [2*(x(1) - 5); 2*(x(2) + 1)];

% Choose an initial guess, iteration count, and step size; feel free
% to play with all three.
initialGuess = [2; 3];
iterations = 50;
stepSize = 0.1;

recordGuesses = initialGuess;
nextGuess = initialGuess;
for i = 1:iterations
    % We move in the negative gradient direction (gradient DESCENT),
    % so we subtract from the previous guess rather than add.
    nextGuess = nextGuess - stepSize*gradf(nextGuess);
    % Record the guesses.
    recordGuesses = [recordGuesses, nextGuess];
end

% Distance from the true minimizer (5, -1): the smaller this number,
% the better the convergence.
disp('Distance from the minimum')
disp(norm([5; -1] - nextGuess))

figure;
plot(recordGuesses(1,:), recordGuesses(2,:), 'ro')
hold on;
plot(5, -1, 'b^');
hold off;
% NOTE: the plot shows that as the iterates approach the minimum,
% the steps between successive guesses also get smaller.
@meikeheldoorn372 · 2 years ago
Very helpful! I would now like to use this gradient in the form of a square instead of a line. Do you know what I should change or add to do this? Thank you!!
@soumialahfair3171 · 2 years ago
Thank you! Can you help me solve my exercise? It's about the ziggurat method: how can I find the point (x2, y2) if I have only (x1, y1), with the function f(x) = exp(-(x^2)/2)?
@40NoNameFound-100-years-ago · 1 year ago
Thanks for such a great video. How can we use constraints with this method?
@JoelRosenfeld · 1 year ago
It's been a while, but I think you'd need to incorporate a projection to make sure that you stay within the constrained area.
@40NoNameFound-100-years-ago · 1 year ago
@JoelRosenfeld Could you recommend a good reference for that?
@JoelRosenfeld · 1 year ago
@40NoNameFound-100-years-ago I don't know if it's a good reference, but this seems to have all the details about the methodology: www.stat.cmu.edu/~ryantibs/convexopt-F18/scribes/Lecture_23.pdf
@40NoNameFound-100-years-ago · 1 year ago
@JoelRosenfeld Thank you, I appreciate your help. I was just reading about those two methods now: projected gradient descent and the Frank-Wolfe algorithm 😃👍
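[Editor's note] A minimal MATLAB sketch of the projected gradient descent idea from this thread, assuming an illustrative box constraint; the objective, bounds, starting point, and step size are all made up for the example:

% Projected gradient descent: take a gradient step, then project the
% iterate back onto the feasible set (here, the box -2 <= x(i) <= 2).
gradf = @(x) [2*(x(1) - 5); 2*(x(2) + 1)];  % gradient of (x1-5)^2 + (x2+1)^2
proj  = @(x) min(max(x, -2), 2);            % projection onto the box
x = [0; 0];
stepSize = 0.1;
for i = 1:100
    x = proj(x - stepSize*gradf(x));        % descend, then project
end
disp(x)  % approaches (2, -1), the minimizer of the objective within the box

The unconstrained minimizer (5, -1) lies outside the box, so the projection clamps the first coordinate at the boundary value 2.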
@jorgemiguel7580 · 2 years ago
I have a question: how did you apply this method to the traveling salesman problem?
@arwasaed1025 · 2 years ago
How can I represent data as a function? E.g., I have ECG data (a 2014×1 double, in other words a numeric matrix) and I want to represent it as a function.
@JoelRosenfeld · 2 years ago
At this point you just have a time series. Typical approaches to approximate an underlying function are polynomial approximations, B-splines, and Fourier series.
@arwasaed1025 · 2 years ago
@JoelRosenfeld Thanks a lot. I only have the amplitude and I plotted the signal without time, but I can create a time axis. Could you give me more details about B-splines? Keep in mind that I can't use Python. More details please 🙏 I also already have gradient descent code, but it's so long and I don't get it.
@JoelRosenfeld · 2 years ago
@arwasaed1025 I have a video talking about splines on my channel already. Just browse around; it should not be hard to find.
@arwasaed1025 · 2 years ago
@JoelRosenfeld Okay, I really appreciate your efforts.
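[Editor's note] A minimal sketch of the polynomial and spline ideas mentioned above, with a synthetic signal standing in for the ECG data; the time axis, signal, and polynomial degree are illustrative assumptions:

% Synthetic stand-in for a sampled signal such as an N-by-1 ECG trace.
t = linspace(0, 1, 200)';                  % constructed time axis
y = sin(2*pi*5*t) + 0.1*randn(size(t));    % placeholder amplitude data

% Polynomial approximation of the underlying function.
p = polyfit(t, y, 10);                     % degree 10 chosen arbitrarily
yPoly = polyval(p, t);

% Cubic spline interpolation through the samples.
tFine = linspace(0, 1, 1000)';
ySpline = spline(t, y, tFine);

plot(t, y, 'k.', t, yPoly, 'r-', tFine, ySpline, 'b--')
legend('samples', 'polynomial fit', 'spline')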
@sivamullapudi6938 · 2 years ago
Please provide the code.
@Mixedit0 · 8 months ago
Hi, have you shared the code you wrote in the video on GitHub?
@__ALT__ · 2 years ago
Can you find the gradient of an image using the three-point formula?
@JoelRosenfeld · 2 years ago
If you want to approximate the gradient of an image, treating the image as samples of a function, then you can use any finite-difference scheme you want for each partial derivative. I think what is usually done is to multiply by a vector like [-1 0 1]^T, but ultimately you could design any finite-difference scheme to get an estimate.
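[Editor's note] A minimal sketch of that finite-difference approach, using conv2 from base MATLAB; the image file is a placeholder (cameraman.tif ships with the Image Processing Toolbox, but any grayscale image works):

% Three-point central-difference estimate of an image gradient.
I = double(imread('cameraman.tif'));  % placeholder grayscale image
k = [-1 0 1]/2;                       % central-difference kernel

% conv2 flips the kernel, so the sign convention is reversed relative
% to correlation; the gradient magnitude is unaffected.
Gx = conv2(I, k,  'same');            % derivative across columns (x)
Gy = conv2(I, k', 'same');            % derivative across rows (y)
Gmag = sqrt(Gx.^2 + Gy.^2);           % gradient magnitude

imagesc(Gmag); colormap gray; axis image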
@soumyadeeppatra2964 · 3 years ago
I am having an issue in this field. I want to optimize an objective function of my interest using gradient-based optimization techniques and I'm facing some difficulties. If you could help me with this, I would be really grateful.
@JoelRosenfeld · 3 years ago
What sort of objective function and what sort of data?
Numerical Analysis - Final Exam Review 1
1:01:38
ThatMathThing
2.6K views
MATLAB - Radon Transform and Imaging toolbox
23:29
ThatMathThing
8K views
MATLAB at Midnight - Interpolation using Vandermonde Matrices
39:58
Steepest Descent Method (Unconstrained Optimization)
44:21
Engineering Educator Academy
6K views
Gradient Descent, Step-by-Step
23:54
StatQuest with Josh Starmer
1.3M views
Applied Optimization - Steepest Descent
29:49
purdueMET
62K views
Oppenheimer's mistake that EVERYONE missed
12:15
ThatMathThing
33K views
Machine Learning with MATLAB
41:26
nan sun
16K views
Solve any equation using gradient descent
9:05
Edgar Programmator
53K views
Intro to Gradient Descent || Optimizing High-Dimensional Equations
11:04
Dr. Trefor Bazett
65K views
Multiple Linear Regression | MATLAB
7:20
Knowledge Amplifier
12K views