Hierarchical Agglomerative Clustering [HAC - Complete Link]

153,536 views

Anuradha Bhatia

Comments: 102
@chaosNinja790 · 10 months ago
You saved me, I got 37/40 in data mining. Thank you❤
@AnuradhaBhatia · 10 months ago
Congratulations ❤️
@wessauder7708 · 7 years ago
Thank you for this walkthrough! It is very well done. I was looking everywhere to find an example of how to update the cluster matrix, and this really helps. It is extremely clear. Thank you for this series on clustering.
@AnuradhaBhatia · 7 years ago
Thank you Sir.
@김승준-j4m3z · 3 years ago
exactly!!
@theguywithcoolid · 1 year ago
Bro you look like Craig from Bearded Mechanic
@ShortJokes4U-p9t · 4 years ago
Good lecture; a small mistake at 4:05: the Euclidean distance formula should be sqrt((x - a)^2 + (y - b)^2).
@mrwoofer10 · 2 years ago
smart!
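For reference, the corrected formula in full: the Euclidean distance between points $(x, y)$ and $(a, b)$ is

$$d\big((x, y), (a, b)\big) = \sqrt{(x - a)^2 + (y - b)^2}$$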
@lancezhang892 · 2 years ago
Finally I understand the approach to merging these points.
@s.r.3924 · 2 years ago
You are saving my exam in data mining. Thank you very much!
@Mynameisjoof · 7 years ago
Best explanation of complete linkage I have found. Thank you so much!!!!
@dcharith · 5 years ago
Thank you very much for taking the time to post these really helpful videos!
@Kaidanten · 4 years ago
Does the formula at 3:44 contain a typo? Should it be (y - b)^2, not (x - b)^2?
@mahdi_shahbazi · 2 years ago
God bless you for this simple yet informative explanation of Agglomerative Clustering.
@mythicallygold1625 · 5 years ago
I like this video: good explanation, good step-by-step guide. I'd say it's more effective than what my teacher taught. Thumbs up :3
@rahulgarg2693 · 5 years ago
SAME HERE
@amarimuthu · 7 years ago
Hi, at 3:50 the Euclidean distance should have 'y - b' instead of 'x - b' in the second term. Thanks, and nice explanation.
@AnuradhaBhatia · 7 years ago
Yes Sir, Thanks
@amarimuthu · 7 years ago
@AnuradhaBhatia Thank you for your teaching; it helps students like me with its easily understandable steps.
@AnuradhaBhatia · 7 years ago
Thank you so very much for the motivation.
@mutebaljasem9734 · 4 years ago
She explained Hierarchical Agglomerative Clustering very well. Big thanks!
@mahakalm395 · 2 months ago
All my doubts are clear now, thank you so much. :)
@ashishdevassy · 5 years ago
This was very helpful. Thank you!
@k.kaushikreddy1792 · 4 years ago
Very lucid explanation. Keep up the great work!
@vanlalhriatsaka8054 · 5 years ago
Very clear explanation, ma'am. One big request: please also cover the centroid method.
@priyankkharat5686 · 7 years ago
It's very helpful for our ongoing exams. Thank you so much, ma'am.
@AnuradhaBhatia · 7 years ago
Thanks.
@mahmoudelkafafy9982 · 5 years ago
Hello, thanks a lot for the simple and clear explanation of the single linkage (previous video) and the complete linkage as well. I have two questions. 1) Looking at the dendrograms obtained from single linkage and complete linkage, one can see that they are different. How can we interpret that? If I cut the trees at the same value (I mean both the single and complete linkage trees), I would obtain different clustering results. 2) What is the idea behind searching for the maximum distance in the case of complete linkage?
@AnuradhaBhatia · 5 years ago
That's for complete linkage.
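For anyone with the same questions: the two linkages define the distance between clusters $C_i$ and $C_j$ differently, which is why their dendrograms, and the clusters obtained by cutting them at the same height, can differ:

$$d_{\text{single}}(C_i, C_j) = \min_{p \in C_i,\, q \in C_j} d(p, q), \qquad d_{\text{complete}}(C_i, C_j) = \max_{p \in C_i,\, q \in C_j} d(p, q)$$

The idea behind taking the maximum is that two clusters merge only when even their farthest members are close, which favors compact, roughly equal-diameter clusters; single link, by contrast, can chain loosely connected points together.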
@jmg9509 · 3 years ago
3:15 - Complete Linkage
@HugoRamirezSoto · 7 years ago
I would like to thank you for this video. Your explanation is magnificent and so clear. You helped me a lot to comprehend these complex subjects. Greetings from Mexico.
@AnuradhaBhatia · 7 years ago
Thank you so much.
@KalusivalingamThirugnanam · 5 years ago
Thanks, Ma'am, for explaining this. Very useful.
@hy8040 · 4 years ago
really helped! easy to understand the concept! thanks~
@SelenaFriend · 5 years ago
thank you so much for this, it really helped me!
@detox_daddy · 7 years ago
clean and precise video. Really helped. Thank you
@AnuradhaBhatia · 7 years ago
Deepak Patter Thank you Sir
@AnuradhaBhatia · 7 years ago
Deepak Patter 😊
@jules_tbl1010 · 3 years ago
Thank you. I've been studying from a manual, and its explanation is not even close to this one.
@guliteshabaeva8922 · 1 year ago
very very clear, thank you!
@alxjf · 6 years ago
Helped me a lot. Thank you.
@muhammadfirdaus7278 · 6 years ago
Thank you for the amazing explanation.
@akashr9973 · 4 years ago
Thank you madam, very convincing explanation!
@laxmivyshnavidokuparthi4164 · 6 years ago
Superb explanation.
@jobaidajarin356 · 3 years ago
Thank you ma'am. It helps a lot
@bhartinarang2078 · 7 years ago
COMPLETE LINK - it means that while updating the distance matrix we take the maximum value, right? SINGLE LINK - while updating the distance matrix, we take the minimum value?
@AnuradhaBhatia · 7 years ago
Yes. BEST OF LUCK.
@bhartinarang2078 · 7 years ago
Thanks :) means a lot, madam.
@benjaminmusasizi3777 · 5 years ago
Thanks, ma'am. Very well explained!!
@ameliaachung · 5 years ago
Lifesaver!! Thank you :)
@tranminhtien172 · 5 years ago
Pretty useful video! Can you share this slide?
@uarangat · 6 years ago
Thanks, clear explanation.
@farahadilah4694 · 6 years ago
Hi, why is it that we always have to find the minimum value in the lower bound, but at the last stage (11:53) we just find the smallest value in the whole distance matrix?
@AnuradhaBhatia · 6 years ago
@farahadilah4694 Lower bound, so smallest.
@farahadilah4694 · 6 years ago
Oh, so no matter whether it's complete/single/average linkage, we always take the smallest value?
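That is the key point of this thread, and a minimal hand-rolled sketch may make it concrete (plain Python with a toy matrix and function names of my own choosing, not code from the video): whatever the linkage, every step merges the pair with the smallest entry in the current distance matrix; the linkage only changes how the merged cluster's new row is computed, max for complete link, min for single link.

```python
def merge_step(D, labels, linkage=max):  # linkage=max -> complete link, min -> single link
    """One agglomerative step: merge the closest pair and rebuild the matrix."""
    n = len(D)
    # the pair to merge is ALWAYS the smallest off-diagonal entry
    i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
               key=lambda ab: D[ab[0]][ab[1]])
    keep = [k for k in range(n) if k not in (i, j)]
    # distance from the merged cluster to each remaining cluster k:
    # complete link keeps the larger of the two old distances, single link the smaller
    new_row = [linkage(D[i][k], D[j][k]) for k in keep]
    D2 = [[D[a][b] for b in keep] for a in keep]
    for r, dk in enumerate(new_row):
        D2[r].append(dk)            # merged cluster's column...
    D2.append(new_row + [0.0])      # ...and its row
    return D2, [labels[k] for k in keep] + [labels[i] + labels[j]]

# toy symmetric distance matrix over clusters A, B, C
D = [[0.0, 0.2, 0.9],
     [0.2, 0.0, 0.7],
     [0.9, 0.7, 0.0]]
print(merge_step(D, ["A", "B", "C"]))  # merges A and B; distance to C becomes max(0.9, 0.7)
```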
@sebastiantischler8410 · 3 years ago
What happens when you update your distance matrix and then there are two (or more) minimum values?
@bhaskarp1063 · 3 years ago
Quick question: how do we merge when the index of the min element is (1,0) or (0,1)?
@sumanthkumar4035 · 4 years ago
Why did we start with P3 and P6? Shouldn't we start with the pair which has the maximum distance between them?
@SamirAliyev771 · 5 years ago
Salam. Thanks a lot :). Excellent job.
@vmudivedu · 6 years ago
Thank you, Ma'am. That was a clean video and helped me a lot in understanding complete link. I have a few questions: 1. How does the merge criterion influence the merge decision? 2. Why is complete-link clustering called non-local, while the single-link criterion is called local?
@pratiksharma1655 · 5 years ago
Amazing... Cheers
@quicklook3908 · 4 years ago
Ma'am, I have a question: should we consider the least value in the whole matrix, or the least value from the lower bound of the distance matrix?
@annmaryjoseph684 · 3 years ago
Thank you very much
@kg3217 · 3 years ago
Why do we call it "complete" and "single" linkage? In both videos the difference was whether to take the minimum or the maximum distance after forming the cluster. Is there any other logical reason behind the naming?
@EvelynJenkins-yu5wi · 4 years ago
What do we do if there are two equal smallest elements?
@payelbanerjee9192 · 7 years ago
Dear ma'am, I would like to know about the cases where, after computing the similarity matrix, we find two equal lowest distances. We can choose either of the distances to merge at that step, but this decision may affect the cluster output at the final stage. Here I am talking about the case when a distance threshold is applied. Say, for example, {1, 2, 3, 4, 9, 8, 7}: if we take a threshold of 1, then the clusters are {1,2}, {3,4}, {9}, {8,7}. The clusters can also be {1,2}, {3,4}, {8,9}, {7}. Any solution to this problem? Please reply. Thanks.
@AnuradhaBhatia · 7 years ago
Hello Madam, both clusterings can be formed with threshold 1. When clustering is implemented on real-world problems, other factors are also considered along with these.
@payelbanerjee9192 · 7 years ago
OK, so both of them can be the answer, am I right? And if any other factors are taken into consideration, then we have to choose a single one.
@AnuradhaBhatia · 7 years ago
Right.
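A small demonstration of the tie issue from this thread, using the commenter's own set {1, 2, 3, 4, 9, 8, 7}. SciPy is an assumption here (the video works by hand); the point is that with equal minimum distances the merge order is an implementation choice, so a library's deterministic tie-break yields one of the valid dendrograms, while another valid tie-break can yield the other.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# pairs (1,2), (3,4), (7,8) and (8,9) are all at distance 1;
# which of them merges first is a tie-break, not part of the algorithm
pts = np.array([[1.0], [2.0], [3.0], [4.0], [9.0], [8.0], [7.0]])
Z = linkage(pts, method='complete')              # SciPy resolves ties deterministically
print(Z)                                         # each row: (cluster_i, cluster_j, distance, size)
print(fcluster(Z, t=1.0, criterion='distance'))  # flat clusters at threshold 1
```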
@pika3.14 · 4 years ago
Thanks a ton for a fantastic explanation, madam! When we first start the merging process, shouldn't we pick P5 and P6 to merge first, since they have the max value 0.39?
@MrChabonga · 4 years ago
The first cluster is determined by the most similar units. After that we define the distance from that cluster to the other data points through either single link (looking at the minimum distance) or complete link (looking at the maximum distance).
@nononnomonohjghdgdshrsrhsjgd · 4 years ago
Very good explanations! Can you please show an example of how to use the correlation matrix as a distance matrix in k-means? You applied the Euclidean distance in k-means to cluster. How does the calculation of the clusters work when taking not the original dataset but the correlation matrix? How can the correlation matrix be used to build k-means clusters? Thank you!
@georgygursky4303 · 5 years ago
Thank you!
@ashtonuranium2994 · 5 years ago
Thanks so much...
@yerramillihemanth2998 · 2 years ago
Can you please explain what to do when the matrix has two of the same low value (e.g., if P2 and P1 have 0.12 and P3 and P4 also have 0.12)? In that case, which points need to be considered?
@mahirkhan4124 · 6 years ago
Can the answer, i.e. the final dendrogram, of both complete and average link be the same?
@AnuradhaBhatia · 6 years ago
@mahirkhan4124 Yes, in a few cases.
@mahirkhan4124 · 6 years ago
@AnuradhaBhatia Thank you.
@atulgupta-sl1zw · 7 years ago
Dear ma'am, if we are given a similarity matrix instead of a distance matrix, what should the approach be? Regards, Atul
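One common approach to this (unanswered) question, offered as an assumption rather than as the video's method: convert similarities to distances first, e.g. D = 1 - S when similarities lie in [0, 1], then cluster D exactly as in the video. Equivalently, one can keep S and merge the most similar pair at each step.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

S = np.array([[1.0, 0.8, 0.1],   # toy similarity matrix: symmetric,
              [0.8, 1.0, 0.3],   # ones on the diagonal
              [0.1, 0.3, 1.0]])
D = 1.0 - S                      # now a distance matrix with a zero diagonal
Z = linkage(squareform(D), method='complete')  # squareform -> condensed vector
print(Z)
```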
@zixiaozong2048 · 5 years ago
Helps a lot.
@NtinosParas · 5 years ago
Well done... thank you :D
@ruler5408 · 4 years ago
Awesome
@shubhamnayak9369 · 7 years ago
Thank you madam
@payelbanerjee9192 · 7 years ago
Ma'am, everywhere it is written that the space complexity of the naive hierarchical complete-linkage clustering algorithm is O(n), but as far as I know, if all the pairwise distances are stored for calculating the distance matrix, then the space complexity should be O(n^2). Will you please let me know why the space complexity is O(n)?
@AnuradhaBhatia · 7 years ago
Because every distance is computed and used exactly once.
@payelbanerjee9192 · 7 years ago
Would you please clarify slightly?
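Some background that may help with this exchange (a known result about these algorithms, not something stated in the video): the naive implementation really does store every pairwise distance,

$$\binom{n}{2} = \frac{n(n-1)}{2} = O(n^2),$$

which matches the commenter's intuition. The O(n) space figure refers to algorithms in the SLINK (single-link) / CLINK (complete-link) family, which consume each distance exactly once as it is generated and keep only a few length-n working arrays, achieving O(n^2) time with O(n) memory.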
@fauzidaniqbal2564 · 4 years ago
What if there is more than one smallest value? Example: both the values at (1,4) and (2,5) are 1.
@satyamas3886 · 6 years ago
thanks
@tamilarsang.s1426 · 7 years ago
Please upload classification sums on Naive Bayes, Bayesian, and ID3. Your videos are very helpful for Mumbai University students. Try to solve them with the same method the university follows. Thanks!
@AnuradhaBhatia · 7 years ago
Thanks. Sure Sir.
@djzero669 · 5 years ago
Thanks! =D
@venkateshvelagapudi5240 · 7 years ago
Can anyone please share the code for hierarchical clustering?
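Since no code was posted in the thread, here is a minimal end-to-end sketch using SciPy and Matplotlib (an assumption; this is not the video's code, and the six coordinates are illustrative stand-ins for a small P1..P6 table like the one worked by hand):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# six 2-D points standing in for P1..P6; replace with your own data
points = np.array([[0.40, 0.53], [0.22, 0.38], [0.35, 0.32],
                   [0.26, 0.19], [0.08, 0.41], [0.45, 0.30]])
labels = ['P1', 'P2', 'P3', 'P4', 'P5', 'P6']

Z = linkage(pdist(points), method='complete')  # complete link, as in the video
dendrogram(Z, labels=labels)
plt.title('HAC - Complete Link')
plt.show()
```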
@mohammedsiraj673 · 7 years ago
Thank you for the clear explanation :)
@AnuradhaBhatia · 7 years ago
Thank you
@nileslystatozero9869 · 5 years ago
❤️
@СергейВакульчик-с6п · 4 years ago
Thanks!
@looploop6612 · 7 years ago
y-b
@nisajafernando3767 · 3 years ago
Thank you !
Hierarchical Agglomerative Clustering [HAC - Average Link]
12:39
Anuradha Bhatia
109K views

Hierarchical Cluster Analysis [Simply explained]
8:22
DATAtab
91K views

Don’t Choose The Wrong Box 😱
00:41
Topper Guild
62M views

K-Mean Clustering
11:40
Anuradha Bhatia
343K views

Clustering: K-means and Hierarchical
17:23
Serrano.Academy
206K views

StatQuest: K-means clustering
8:31
StatQuest with Josh Starmer
1.7M views

IAML19.5 Single-link, complete-link, Ward's method
8:51
Victor Lavrenko
68K views

All Machine Learning algorithms explained in 17 min
16:30
Infinite Codes
509K views

12. Clustering
50:40
MIT OpenCourseWare
309K views

StatQuest: Hierarchical Clustering
11:19
StatQuest with Josh Starmer
462K views

Agglomerative Clustering: how it works
8:48
Victor Lavrenko
166K views