Intro to Clustering: 0:27, K-Means: 11:17, SOM: 42:00
@iliasp4275 3 years ago
Before coming here, I watched about 5 videos on SOM. No one pointed out that the algorithm is the same as K-means. You enlightened me! Thank you very much.
@intoeleven 4 years ago
The SOM part starts at 42:00.
@jm-px3mr 4 years ago
Thank you man
@nmana9759 3 years ago
Thank you!
@ravivarma5703 5 years ago
This is the best explanation I have ever found. Is there any way I can see more lectures from this professor on any other channel?
@Simzreid 4 years ago
Absolutely brilliant. Clear, concise, flowing, and enlightening. A great help in understanding the SOFM research I am thinking about doing. 👍
@subramaniamsrivatsa2719 4 years ago
My humble regards to the professor for simplifying complex concepts, explaining the intuition behind the algorithms, and encouraging us to understand 🙏
@MrQqqq2222 4 years ago
Wonderful professor. I can follow him even though I am far from the ML field. I am starting to love Mr. Hamid and also AI methods and techniques. Thanks a lot, my favorite virtual teacher.
@rezamonadi4282 4 years ago
I am really proud of you! You explained SOM like an exciting journey...
@ab8jeh 5 years ago
I like the way he explains things very clearly. Within machine learning there is a tendency to cloud things to make oneself seem more intelligent - this lecturer shows how simple some of these algorithms (and ML in general) truly are without dumbing things down.
@joecrowley630 5 years ago
So good, loved the enthusiasm and A-class white board usage of the lecturer. Thank you so much for sharing.
@derollo3 4 years ago
Excellent lecture on clustering. Thank you very much for sharing your knowledge.
@judealaitani1036 4 years ago
Amaaaaazing teaching skills!
@karannchew2534 a year ago
SOM 40:39, 1:14:25. Given an input x, the competition step finds the unit i whose weight vector is closest to x: i(x) = arg min_k ||x − w_k||, k = 1, 2, ..., m, where m is the number of units. Equivalently (for normalised weights), the winner maximises the dot product w_k^T x: the unit whose vector is most "aligned" with the input. If the vectors are misaligned, the dot product (think cos θ) can be near zero.
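To make that competition step concrete, here is a minimal NumPy sketch of finding the best-matching unit (my own illustration, not code from the lecture; the function names are made up):

```python
import numpy as np

def best_matching_unit(x, W):
    """Competition step of a SOM: index of the unit whose weight vector
    is closest to x.  (Illustrative sketch, not code from the lecture.)

    x : input vector, shape (d,)
    W : weight matrix, shape (m, d), one row per unit
    """
    # Minimise the Euclidean distance ||x - w_k|| over k = 1..m.
    return int(np.argmin(np.linalg.norm(W - x, axis=1)))

def best_matching_unit_dot(x, W):
    # Equivalent formulation for *normalised* weight vectors:
    # maximise the dot product w_k . x (the most "aligned" unit wins).
    return int(np.argmax(W @ x))

# Tiny usage example with random data.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 3))                      # 5 units, 3-dimensional inputs
W /= np.linalg.norm(W, axis=1, keepdims=True)    # normalise so both forms agree
x = rng.normal(size=3)
print(best_matching_unit(x, W), best_matching_unit_dot(x, W))
```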
@Mark-wb8ck 4 years ago
SOM starts at 40:42
@udbhavprasad3521 4 years ago
life saver
@Birdsneverfly 4 years ago
Let me take a moment to admire your handwriting :) Plus, "You are becoming your data" has to be a line from an AI movie. Cheers :)
@isuruvindula2346 4 years ago
Sir, you are a lifesaver. Thank you very much.
@karannchew2534 a year ago
Very neat handwriting for a professor.
@burakkara337 7 months ago
The professor says, "Nobody screams when I make mistakes." I was screaming at my monitor, but nobody could hear me :D 28:32
@homataha5626 5 years ago
Does anyone know of an implementation of SOM in Python 3? In all the code I have seen, they always use targets, but we don't have them. What should we do?
@pechhase 4 years ago
You could check Peter Wittek's somoclu library.
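For the question above: a SOM is trained without any targets, since each sample is simply matched to its closest unit. Here is a rough plain-NumPy sketch of the training loop (my own illustration under simple assumptions: Gaussian neighbourhood, exponentially decaying learning rate and radius; the function name train_som and its defaults are made up, and this is not somoclu's API):

```python
import numpy as np

def train_som(X, grid_shape=(10, 10), n_iter=1000,
              lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organising map trained without any targets.

    X : data matrix of shape (n_samples, d).
    Returns the weight grid with shape (rows, cols, d).
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    d = X.shape[1]
    W = rng.normal(size=(rows, cols, d))

    # Grid coordinates of every unit, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                  indexing="ij"), axis=-1)   # (rows, cols, 2)

    for t in range(n_iter):
        lr = lr0 * np.exp(-t / n_iter)          # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_iter)    # shrinking neighbourhood radius
        x = X[rng.integers(len(X))]             # random sample, no label needed

        # Competition: best-matching unit = closest weight vector.
        dists = np.linalg.norm(W - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)

        # Cooperation + adaptation: pull the BMU and its grid neighbours towards x.
        grid_dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None]
        W += lr * h * (x - W)

    return W

# Example: map 2-D points onto a 5x5 grid, entirely unsupervised.
X = np.random.default_rng(1).normal(size=(500, 2))
W = train_som(X, grid_shape=(5, 5), n_iter=500)
print(W.shape)   # (5, 5, 2)
```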
@jm-px3mr 4 years ago
Thanks for sharing this amazing lecture. I wish I could take his class in person someday.
@mohammadghorbani188 5 years ago
Very good lecture. Thanks for sharing.
@basics7930 4 years ago
great explanation
@venkateswaranvenkatraman9630 4 years ago
Why do people use the Mahalanobis distance rather than the Euclidean distance?
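For what it's worth: Euclidean distance treats all feature directions equally, while Mahalanobis distance rescales differences by the data covariance, so features with large variance or strong correlation do not dominate. A small toy comparison in Python (my own example, not from the lecture):

```python
import numpy as np

# Correlated 2-D data with much more variance along the first axis.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=[0, 0],
                            cov=[[4.0, 1.5], [1.5, 1.0]], size=1000)

mu = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))

def euclidean(a, b):
    return np.linalg.norm(a - b)

def mahalanobis(a, b, cov_inv):
    diff = a - b
    return np.sqrt(diff @ cov_inv @ diff)

p = np.array([2.0, 0.0])   # lies along the high-variance direction
q = np.array([0.0, 2.0])   # same Euclidean distance, low-variance direction

print(euclidean(p, mu), euclidean(q, mu))                         # roughly equal
print(mahalanobis(p, mu, cov_inv), mahalanobis(q, mu, cov_inv))   # q is much "further"
```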