This is one of the few videos on YouTube which really provides an in-depth, formulaic tutorial on GD instead of a conceptual one. Thank you!
@NiftyShifty1 6 years ago
Great video. The matrix stuff on the side is really nice to see. Thank you
@dareeul 1 year ago
I was confused about this topic for so long, but now you've made me understand it. Thanks!
@JohnSmith-lf5xm 7 years ago
Hello, thank you for the explanation. You are the only one who has explained it clearly so far out of the 15 videos I have watched. Thank you, really. I have a question: I know of the Gauss-Newton, Gradient Descent, and Levenberg-Marquardt methods, but I can't really understand the difference between them. Can you explain that to me, or tell me where to find it if you have other videos?
@narasa12 7 years ago
Excellent video... Thanks for posting
@jamesspectre8492 6 years ago
An excellent presentation! Brilliant!
@吴志高-h9g 7 years ago
Thanks, this made gradient descent clear to me.
@43SunSon 4 years ago
Question: where does the equation X_(i+1) = X_i - a*f'(X_i) come from? Any resource I could take a look at? I wish to understand why the new value X_(i+1) is updated according to that equation. Thank you!
@Tothefutureand 3 years ago
They call it math cause you can never know where it comes from
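The update rule asked about above can be sketched in a few lines. This is a minimal illustration, not the video author's code: the function f(x) = (x - 3)^2, its derivative, and the step size `a` are all assumptions chosen for demonstration. The idea is that f'(x) points uphill, so subtracting a small multiple of it moves x downhill toward a minimum.

```python
def gradient_descent(f_prime, x0, a=0.1, steps=100):
    """Apply the update x_(i+1) = x_i - a * f'(x_i) repeatedly."""
    x = x0
    for _ in range(steps):
        # f'(x) gives the uphill direction; subtracting moves downhill.
        x = x - a * f_prime(x)
    return x

# Illustrative example: f(x) = (x - 3)**2, so f'(x) = 2*(x - 3),
# and the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges to 3.0
```

Each iteration shrinks the distance to the minimum by a constant factor here (x - 3 is multiplied by 0.8 per step), which is why the sequence settles at x = 3.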
@lindazuo9130 8 years ago
Hi Chieh, thanks for the video. Very clear. I just have one question: what is the step? Is the step along the x-axis, the y-axis, or the function curve? When you multiply the step by the first derivative of f(x), what do we get? I thought you were trying to work out the gap between x1 and x2, but mathematically it doesn't make sense to me. Many thanks if you could explain it.
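On the step question above: in the standard gradient-descent update, the quantity (step size) * f'(x) is a displacement along the x-axis, i.e. exactly the gap between x1 and x2. A tiny numeric sketch, with an illustrative f(x) = x^2 and a hypothetical step size of 0.25 (neither is from the video):

```python
def f_prime(x):
    """Derivative of the illustrative f(x) = x**2."""
    return 2 * x

step = 0.25      # hypothetical step size (learning rate)
x1 = 2.0
# step * f'(x1) = 0.25 * 4.0 = 1.0 is a distance along the x-axis,
# so x2 = x1 - 1.0 = 1.0. The gap x1 - x2 equals step * f'(x1).
x2 = x1 - step * f_prime(x1)
print(x2)  # 1.0
```

So the step itself is a scalar factor; multiplied by the slope f'(x), it becomes the horizontal move from x1 to x2, and the new height is simply f(x2) read off the curve.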