You saved me, I got 37/40 in data mining. Thank you❤
@AnuradhaBhatia · 10 months ago
Congratulations ❤️
@wessauder7708 · 7 years ago
Thank you for this walkthrough! It is very well done. I was looking everywhere to find an example of how to update the cluster matrix, and this really helps. It is extremely clear. Thank you for this series on clustering.
@AnuradhaBhatia · 7 years ago
Thank you, Sir.
@김승준-j4m3z · 3 years ago
Exactly!!
@theguywithcoolid · 1 year ago
Bro you look like Craig from Bearded Mechanic
@ShortJokes4U-p9t · 4 years ago
Good lecture. A small mistake at 4:05: the Euclidean distance formula should be sqrt((x - a)^2 + (y - b)^2).
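The corrected formula is easy to sanity-check in Python (a generic sketch, not code from the video):

```python
import math

def euclidean(p, q):
    # sqrt((x - a)^2 + (y - b)^2), generalized to any dimension
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

print(euclidean((0, 0), (3, 4)))  # classic 3-4-5 triangle: prints 5.0
```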
@mrwoofer10 · 2 years ago
Smart!
@lancezhang892 · 2 years ago
Finally I understand how these points are merged.
@s.r.3924 · 2 years ago
You are saving my exam in data mining. Thank you very much!
@Mynameisjoof · 7 years ago
Best explanation of complete linkage I have found. Thank you so much!!!!
@dcharith · 5 years ago
Thank you very much for taking the time to post these really helpful videos!
@Kaidanten · 4 years ago
Does the formula at 3:44 contain a typo? Should it be (y - b)^2, not (x - b)^2?
@mahdi_shahbazi · 2 years ago
God bless you for this simple yet informative explanation of Agglomerative Clustering.
@mythicallygold1625 · 5 years ago
I like this video: good explanation, good step-by-step guide. I'd say it's more effective than what my teacher taught. Thumbs up :3
@rahulgarg2693 · 5 years ago
SAME HERE
@amarimuthu · 7 years ago
Hi, at 3:50 the Euclidean distance should have 'y - b' instead of 'x - b' for the second value. Thanks, and nice explanation.
@AnuradhaBhatia · 7 years ago
Yes Sir, thanks.
@amarimuthu · 7 years ago
Anuradha Bhatia, thank you for your teaching; it helps students like me with easily understandable steps.
@AnuradhaBhatia · 7 years ago
Thank you so very much for the motivation.
@mutebaljasem9734 · 4 years ago
She explained Hierarchical Agglomerative Clustering very well. Big thanks!
@mahakalm395 · 2 months ago
All my doubts are clear now, thank you so much. :)
@ashishdevassy · 5 years ago
This was very helpful, thank you.
@k.kaushikreddy1792 · 4 years ago
Very lucid explanation. Keep up the great work!
@vanlalhriatsaka8054 · 5 years ago
Very clear explanation, ma'am. And ma'am, one big request: please also cover the centroid method.
@priyankkharat5686 · 7 years ago
It's very helpful for our ongoing exams. Thank you so much, ma'am.
@AnuradhaBhatia · 7 years ago
Thanks.
@mahmoudelkafafy9982 · 5 years ago
Hello, thanks a lot for the simple and clear explanation of the single linkage (previous video) and the complete linkage as well. I have two questions. 1) Looking at the dendrograms obtained from single linkage and complete linkage, one can see that they are different. How can we interpret that? If I cut the trees at the same value, I would obtain different clustering results. 2) What is the idea behind searching for the maximum distance in the case of complete linkage?
@AnuradhaBhatia · 5 years ago
That's for complete linkage.
@jmg9509 · 3 years ago
3:15 - Complete Linkage
@HugoRamirezSoto · 7 years ago
I would like to thank you for this video. Your explanation is magnificent and very clear. You helped me a lot to comprehend these complex subjects. Greetings from Mexico.
@AnuradhaBhatia · 7 years ago
Thank you so much.
@KalusivalingamThirugnanam · 5 years ago
Thanks, ma'am, for explaining this. Very useful.
@hy8040 · 4 years ago
Really helped! Easy to understand the concept! Thanks~
@SelenaFriend · 5 years ago
thank you so much for this, it really helped me!
@detox_daddy · 7 years ago
Clean and precise video. Really helped. Thank you.
@AnuradhaBhatia · 7 years ago
Deepak Patter Thank you, Sir.
@AnuradhaBhatia · 7 years ago
Deepak Patter 😊
@jules_tbl1010 · 3 years ago
Thank you. I've been studying from a manual, and its method is not even close to this explanation.
@guliteshabaeva8922 · 1 year ago
very very clear, thank you!
@alxjf · 6 years ago
Helped me a lot. Thank you.
@muhammadfirdaus7278 · 6 years ago
Thank you for the amazing explanation.
@akashr9973 · 4 years ago
Thank you madam, very convincing explanation!
@laxmivyshnavidokuparthi4164 · 6 years ago
Superb explanation.
@jobaidajarin356 · 3 years ago
Thank you, ma'am. It helps a lot.
@bhartinarang2078 · 7 years ago
COMPLETE LINK - it means that while calculating the distance matrix, we take the maximum value, right? SINGLE LINK - while calculating the distance matrix, we take the minimum value?
@AnuradhaBhatia · 7 years ago
Yes. BEST OF LUCK.
@bhartinarang2078 · 7 years ago
Thanks :) means a lot, madam.
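The max/min distinction asked about above can be sketched in a few lines (the distance matrix here is made up for illustration):

```python
import numpy as np

# Made-up symmetric distance matrix for 4 points.
D = np.array([
    [0.00, 0.24, 0.22, 0.37],
    [0.24, 0.00, 0.15, 0.20],
    [0.22, 0.15, 0.00, 0.16],
    [0.37, 0.20, 0.16, 0.00],
])

# Suppose points 1 and 2 (0-indexed) merge. Distance from cluster {1,2} to point 0:
single_link = min(D[1, 0], D[2, 0])    # single link keeps the minimum
complete_link = max(D[1, 0], D[2, 0])  # complete link keeps the maximum
print(single_link, complete_link)      # 0.22 0.24
```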
@benjaminmusasizi3777 · 5 years ago
Thanks, ma'am. Very well explained!!
@ameliaachung · 5 years ago
Lifesaver!! Thank you :)
@tranminhtien172 · 5 years ago
Pretty useful video! Can you share this slide?
@uarangat · 6 years ago
Thanks, clear explanation.
@farahadilah4694 · 6 years ago
Hi, why is it that we always have to find the minimum value in the lower bound, but at the last stage (11:53) we just find the smallest value in the whole distance matrix?
@AnuradhaBhatia · 6 years ago
farah adilah Lower bound... so smallest.
@farahadilah4694 · 6 years ago
Oh, so no matter whether it is complete/single/average linkage, we always take the smallest value?
@sebastiantischler8410 · 3 years ago
What happens when you update your distance matrix and then there are two (or more) minimum values?
@bhaskarp1063 · 3 years ago
Quick question: how do we merge when the index of the minimum element is (1,0) or (0,1)?
@sumanthkumar4035 · 4 years ago
Why did we start with P3 and P6? Shouldn't we start with the pair which has the maximum distance between them?
@SamirAliyev771 · 5 years ago
Salam. Thanks a lot :). Excellent job.
@vmudivedu · 6 years ago
Thank you, ma'am. That was a clean video and helped me a lot in understanding complete link. I have a few questions: 1. How does the merge criterion influence the merge decision? 2. Why is complete-link clustering called non-local while the single-link criterion is called local?
@pratiksharma1655 · 5 years ago
Amazing... Cheers
@quicklook3908 · 4 years ago
Ma'am, I have a question: should we consider the least value in the whole matrix, or the least value from the lower bound of the distance matrix?
@annmaryjoseph684 · 3 years ago
Thank you very much
@kg3217 · 3 years ago
Why do we call it "complete" and "single" linkage? In both videos the difference was whether to take the minimum or the maximum distance after making the cluster; is there any other logical reason behind that naming?
@EvelynJenkins-yu5wi · 4 years ago
What should we do if there are two equal smallest elements?
@payelbanerjee9192 · 7 years ago
Dear ma'am, I would like to know about cases where, after computing the similarity matrix, we find two lowest distances. We can choose either of the distances to merge at that step, and this decision may affect the cluster output at the final stage. Here I am talking about the case when a distance threshold is applied. Say, for example, {1,2,3,4,9,8,7}: if we take a threshold of 1, then the clusters are {1,2},{3,4},{9},{8,7}. The clusters can also be {1,2},{3,4},{8,9},{7}. Any solution to this problem? Please reply. Thanks.
@AnuradhaBhatia · 7 years ago
Hello Madam, both clusterings can be formed with threshold 1. When implementing clustering for a real-world problem, other factors are also considered along with these.
@payelbanerjee9192 · 7 years ago
OK, so both of them can be the answers, am I right? And if any other factors are taken into consideration, then we have to choose a single one.
@AnuradhaBhatia · 7 years ago
Right.
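For what it's worth, the tie in the {1,2,3,4,9,8,7} example can be explored with SciPy (assuming scipy is installed); which of the two valid partitions you get depends on SciPy's internal tie-breaking, but every resulting cluster still respects the threshold:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# The 1-D data from the question above, as a column vector.
data = np.array([[1.0], [2.0], [3.0], [4.0], [9.0], [8.0], [7.0]])

# Complete-linkage dendrogram, cut at distance threshold 1.
Z = linkage(data, method="complete")
labels = fcluster(Z, t=1.0, criterion="distance")
print(labels)

# Whatever the tie-breaking, no cluster spans more than distance 1.
for c in set(labels):
    members = data[labels == c].ravel()
    assert members.max() - members.min() <= 1.0
```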
@pika3.14 · 4 years ago
Thanks a ton for a fantastic explanation, madam! When we first start the merging process, shouldn't we pick P6 and P5 to merge first, since they have the max value 0.39?
@MrChabonga · 4 years ago
The first cluster is determined by the most similar units. After that we define the distance from that cluster to the other data points through either single link (looking at the minimum distance) or complete link (looking at the maximum distance).
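The update step described above can be sketched in NumPy (the helper name and matrix values are made up for illustration, not taken from the video):

```python
import numpy as np

def merge_closest(D, labels, linkage="complete"):
    """One agglomerative step: merge the closest pair, update the matrix."""
    n = D.shape[0]
    iu = np.triu_indices(n, k=1)
    k = np.argmin(D[iu])                     # smallest off-diagonal distance
    i, j = iu[0][k], iu[1][k]                # indices of the pair to merge (i < j)
    combine = np.maximum if linkage == "complete" else np.minimum
    row = np.delete(combine(D[i], D[j]), j)  # distances for the merged cluster
    D = np.delete(np.delete(D, j, axis=0), j, axis=1)
    D[i, :] = row
    D[:, i] = row
    D[i, i] = 0.0
    new_labels = labels[:j] + labels[j + 1:]
    new_labels[i] = labels[i] + labels[j]
    return D, new_labels

D0 = np.array([
    [0.00, 0.24, 0.22, 0.37],
    [0.24, 0.00, 0.15, 0.20],
    [0.22, 0.15, 0.00, 0.16],
    [0.37, 0.20, 0.16, 0.00],
])
D1, names = merge_closest(D0, ["P1", "P2", "P3", "P4"])
print(names)  # P2 and P3 merge first, since 0.15 is the smallest distance
print(D1)
```

Running the function again on `D1` would produce the next merge, and so on until one cluster remains.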
@nononnomonohjghdgdshrsrhsjgd · 4 years ago
Very good explanations! Can you please show an example of how to use the correlation matrix as a distance matrix in k-means? You applied the Euclidean distance in k-means to cluster. How does the calculation of the clusters work when taking not the original dataset but the correlation matrix? How can we use the correlation matrix to build k-means clusters? Thank you!
@georgygursky4303 · 5 years ago
Thank you!
@ashtonuranium2994 · 5 years ago
Thanks so much...
@yerramillihemanth2998 · 2 years ago
Can you please explain what to do when the matrix has two identical low values (e.g., if P2-P1 is 0.12 and P3-P4 is 0.12)? In that case, which points should be considered?
@mahirkhan4124 · 6 years ago
Can the answer, or the final dendrogram, of both complete and average link be the same?
@AnuradhaBhatia · 6 years ago
mahir khan Yes, in a few cases.
@mahirkhan4124 · 6 years ago
Anuradha Bhatia, thank you.
@atulgupta-sl1zw · 7 years ago
Dear ma'am, if we are given a similarity matrix instead of a distance matrix, then what will be the approach? Regards, Atul
@zixiaozong2048 · 5 years ago
Helps a lot.
@NtinosParas · 5 years ago
Well done... thank you :D
@ruler5408 · 4 years ago
Awesome
@shubhamnayak9369 · 7 years ago
Thank you madam
@payelbanerjee9192 · 7 years ago
Ma'am, everywhere it is written that the space complexity of the naive hierarchical complete-linkage clustering algorithm is O(n), but as far as I know, if all the pairwise distances are stored for calculating the distance matrix, then the space complexity should be O(n^2). Will you please let me know why the space complexity is O(n)?
@AnuradhaBhatia · 7 years ago
As every distance is computed and used exactly once.
@payelbanerjee9192 · 7 years ago
Would you please clarify slightly?
@fauzidaniqbal2564 · 4 years ago
What if there is more than one smallest value? For example, both the values at (1,4) and (2,5) are 1.
@satyamas3886 · 6 years ago
thanks
@tamilarsang.s1426 · 7 years ago
Try to upload classification problems: Naive Bayes, Bayesian, and ID3. Your videos are very helpful to Mumbai University students. Try to solve them with the same method they follow. Thanks.
@AnuradhaBhatia · 7 years ago
Thanks. Sure, Sir.
@djzero669 · 5 years ago
Thanks! =D
@venkateshvelagapudi5240 · 7 years ago
Can anyone please share the code for hierarchical clustering?
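Not the video's own code, but a minimal SciPy sketch of complete-linkage hierarchical clustering (the example points are made up; swap in your own 2-D data):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Example 2-D points (illustrative only).
points = np.array([
    [0.40, 0.53], [0.22, 0.38], [0.35, 0.32],
    [0.26, 0.19], [0.08, 0.41], [0.45, 0.30],
])

# method="complete" -> complete link; use "single" for single link.
Z = linkage(points, method="complete", metric="euclidean")

# Cut the dendrogram into two flat clusters.
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

`dendrogram(Z)` from the same module draws the tree if matplotlib is available.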