Professor, God bless you. We busted our butts trying to find a single decent video explaining subgradients. Thank you.
@CamilleKaze57 3 years ago
Thanks a lot, it really helped me! You're a much better teacher than mine at uni lol
@sanjaykrish8719 4 years ago
From Fourier to the Z-transform to SVMs... Prof, you are great.
@tolgauye7924 2 years ago
Barry Van Veen, a true man among men. Run for office, we'll vote for you!!!
@marofe 3 years ago
Can the subgradient result in a non-descent direction?
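A quick numerical check of why the answer is yes (a minimal NumPy sketch; the function f and the point x0 are illustrative choices of my own, not from the video): for a nondifferentiable convex function, a valid subgradient need not be a descent direction, even though the subgradient method still converges with a suitable step size.

import numpy as np

# f(x) = |x1| + 2*|x2| is convex but not differentiable on the axes.
def f(x):
    return abs(x[0]) + 2 * abs(x[1])

x0 = np.array([1.0, 0.0])
# g = (1, 2) is a valid subgradient at x0: since x2 = 0 sits at the kink
# of 2*|x2|, the second component may be anything in [-2, 2].
g = np.array([1.0, 2.0])

for t in [0.01, 0.05, 0.1]:
    # Stepping along -g *increases* f: f(x0 - t*g) - f(x0) = 3*t > 0.
    print(t, f(x0 - t * g) - f(x0))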
@赵宇川-n3q 1 year ago
After we get the optimized w, how do we get the optimized b?
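One common answer (a sketch under my own assumptions; the video may treat the bias differently): since the hinge loss is also convex in b, you can simply run subgradient descent on b alongside w. A hypothetical NumPy step, with the bias left out of the regularizer as usual:

import numpy as np

def svm_subgradient_step(w, b, X, y, lam, lr):
    # Objective: lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i*(w.x_i + b)).
    # Margin violators contribute -y_i*x_i to the w-subgradient and -y_i
    # to the b-subgradient; b gets no regularization term.
    viol = y * (X @ w + b) < 1
    n = len(y)
    g_w = lam * w - (y[viol][:, None] * X[viol]).sum(axis=0) / n
    g_b = -y[viol].sum() / n
    return w - lr * g_w, b - lr * g_b

If you instead solved the dual, the standard recipe is b = y_i - w.x_i, averaged over the margin support vectors (those with 0 < alpha_i < C).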
@morisjohn4167 4 years ago
Thank you so much, sir. After studying SVMs I wondered how we can actually find the optimal W; your video gave me an answer to that.
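For readers landing here with the same question, a minimal end-to-end sketch of the idea (my own NumPy rendering of batch subgradient descent on the regularized hinge loss; the video's notation may differ):

import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, n_iters=1000):
    # Minimize lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * (w . x_i)),
    # with labels y_i in {-1, +1}, by stepping along a negative subgradient.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        viol = y * (X @ w) < 1                       # margin violators
        # Subgradient: lam*w - (1/n) * sum over violators of y_i * x_i.
        # At a margin of exactly 1 any valid subgradient may be chosen;
        # this picks zero there, matching the choice discussed at 9:37.
        g = lam * w - (y[viol][:, None] * X[viol]).sum(axis=0) / n
        w -= lr * g
    return w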
@faridzabihian1694 1 year ago
Hi, what happens when we compute the gradient for one data point? Does it go through all the points or pick a random point?
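As far as I know (the video may only cover the batch case): the full subgradient sums over all points each step, while the stochastic variant, e.g. Pegasos (Shalev-Shwartz et al.), picks one random point per step; that single point's subgradient is an unbiased estimate of the full one. A hypothetical single step:

import numpy as np

rng = np.random.default_rng(0)

def pegasos_step(w, X, y, lam, t):
    # One stochastic subgradient step: sample a single random example and
    # use only its hinge term; in expectation this matches the full batch.
    i = rng.integers(len(y))
    lr = 1.0 / (lam * t)                 # Pegasos step-size schedule, t >= 1
    if y[i] * (X[i] @ w) < 1:
        return (1 - lr * lam) * w + lr * y[i] * X[i]
    return (1 - lr * lam) * w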
@anouarlahmdani3718 4 years ago
Is the function continuous at 1? 5:13
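Assuming the function at 5:13 is the hinge term h(z) = max(0, 1 - z): yes, it is continuous at z = 1, but not differentiable there, which is exactly why a subgradient is needed. In LaTeX:

\lim_{z \to 1^-} h(z) = \lim_{z \to 1^+} h(z) = h(1) = 0, \qquad
h'(1^-) = -1 \neq 0 = h'(1^+), \qquad
\partial h(1) = [-1,\, 0].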
@farhanhyder7304 3 years ago
Simple and easy to understand
@alonbegin8044 4 years ago
Great video! I just don't get the intuition for choosing any value you want within the constraints. What's stopping me from choosing -d*x_i instead of zero at 9:37? Wouldn't that give a different result in the loss function and potentially change my final result dramatically, for example when classifying whether people are sick with corona or not?
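On the choice-of-subgradient question, a hedged short answer (I am inferring the exact point discussed at 9:37): at the kink, every element of the subdifferential is equally valid, and the subgradient method's convergence guarantee holds for any choice, so picking zero is a convenience rather than a necessity. For a single hinge term at a point where y_i w^T x_i = 1, in LaTeX:

\partial_w \max(0,\, 1 - y_i\, w^\top x_i) = \{\, -t\, y_i\, x_i \;:\; t \in [0, 1] \,\},

so both 0 (the choice t = 0) and -y_i x_i (t = 1) are admissible. Different choices can alter the intermediate iterates, but with a diminishing step size all of them converge to the same set of minimizers of the convex objective, so the final classifier is not at the mercy of that choice.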
@isaacnewton1545 4 years ago
That's called a subgradient, NOT a subderivative. "Subderivative" is a name already taken for the Dini derivative; see the book by Rockafellar and Wets.