Very clear explanation, nice vocal sound, straight to the point. I loved it! Thank you very much.
@tkorting5 жыл бұрын
Thanks a lot for your feedback Please like/share/subscribe Regards
Our teacher took an entire lecture to explain this but couldn't, it took you a couple of minutes and it's all clear. Thank you! :-)
@tkorting10 жыл бұрын
Dear friend, thanks for your valuable comments Please subscribe to my channel and share this video with your peers. Regards
@hpmc74265 жыл бұрын
I have understood more about k-NN and 1-NN in these few minutes than in hours of reading papers, theses, wikis, or endless webpages that explain nothing at all! If even a dummy like me understands it, it is explained very well! Most reading on ML does nothing except provide you with more abbreviations, acronyms, or terms, each of which is already a separate paper in itself, in which you will find even more abbreviations, acronyms, or terms. Thank you for explaining it in English!
@tkorting5 жыл бұрын
Thanks for this great feedback Please subscribe to my channel Regards
@Ian2000ize4 жыл бұрын
Thank you for the precise explanation that made sense of how K-NN works. I have taken an ML class and still couldn't quite understand it but in less than 5 minutes, I can now explain it to a kindergarten kid thanks to you.
@RachayitaGiri9 жыл бұрын
For the first time, I know what KNN is all about! Thanks to you!
@tkorting9 жыл бұрын
+Rachayita Giri thanks for your comment. Please subscribe to my channel and share this video with your peers. Regards
@mz52345 жыл бұрын
Good, concise definition overview. @tkorting @Rachayita Giri - Yes, I'm just starting to study some algorithms and have been reading some articles, and I didn't understand the point.
@tkorting5 жыл бұрын
@@mz5234 Thanks for your comment Please subscribe to my channel. Regards
@takosmos3 жыл бұрын
What a loss.. you know knn but you don't know me
@bconsortium25712 жыл бұрын
Same. Very good explanation.
@jennyguzman3462 жыл бұрын
You speak so clearly (important to me, as a non-native English speaker); this video is very pleasant to listen to. Nice explanation!
@ivelindimitrov17572 жыл бұрын
I have been looking to understand K-NN on so many websites and it took you 30 seconds to explain it better than any of them do.
@hombreazu18 жыл бұрын
Very clear explanation. Thanks. In particular, mentioning the potential ties was very helpful.
@tkorting8 жыл бұрын
+hombreazu1 Thanks for your feedback. Please subscribe to my channel and share this video with your peers. Regards
@macot799 жыл бұрын
Great explanation! I've just started learning ML and this is by far the best and easiest explanation of how the KNN algorithm works. Big thanks!
@tkorting9 жыл бұрын
Dear macot79, thanks for your valuable comments. Please subscribe to my channel and share this video with your peers. Regards
@jo12614 жыл бұрын
This is a very clear explanation. I've been reading a lot about KNN but this video clarified everything to me. Thanks!
@SumanDhondalay9 жыл бұрын
Awesome explanation!! So far this is the best explanation of KNN!! Thank you for sharing!
@tkorting9 жыл бұрын
Dear Suman, thanks for your comments. Please subscribe to my channel, and share this video with your peers. Regards
@filipegoncalves44779 жыл бұрын
Awesome explanation. Quick, simple, to the point. Exactly what I was needing to do my homework :) Thanks!
@tkorting9 жыл бұрын
Filipe Gonçalves Dear Filipe, thanks for your comments. Please subscribe to my channel and share this video with your peers. Regards
@edtrujillo46675 жыл бұрын
I have never posted a comment but wanted to thank you for the great explanation!
@tkorting5 жыл бұрын
Thanks for your comment, please like/share/subscribe. And post more comments ;) Regards
@MrRynRules5 жыл бұрын
Also, a really underrated channel!
@ComputerPhysicsLab8 жыл бұрын
The Voronoi partition is clearly explained and visually straightforward. Thanks for sharing such a good presentation.
@tkorting8 жыл бұрын
+ComputerPhysics Lab thanks for your feedback. Please subscribe to my channel and share this video with your peers. Regards
@catherinepang88685 жыл бұрын
simple and straightforward, I loved it! thank u so much.
@tkorting5 жыл бұрын
Thanks for this positive feedback Please subscribe to my channel Regards
@ravishankar8428 жыл бұрын
Your videos are clear and understandable by everyone. Thanks for putting your effort into this :)
@tkorting8 жыл бұрын
Thanks a lot for your feedback. Please like/share and subscribe. Regards
@huseyineken37575 жыл бұрын
Nice video, thanks! But I do have a question: in the k=1 case, why do we have to do the Voronoi partition? Instead, can't we choose the class of the new sample by the shortest distance to the points of the training dataset classes? I guess it is the same at the end of the day, but isn't it easier to calculate a sample's shortest distances than to partition the whole Euclidean space?
@tkorting5 жыл бұрын
Thanks for your comment. Please like/share/subscribe. Indeed, you don't need to compute the Voronoi partition explicitly; the cells are created naturally by the behaviour of the algorithm. Regards
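As an illustration of that reply, here is a minimal sketch (Python with NumPy, assumed here; not code from the video) of 1-NN classification that simply takes the label of the single closest training sample, with no explicit Voronoi construction — the Voronoi cells are implicit in the argmin:

```python
import numpy as np

def one_nn_classify(samples, labels, query):
    """Assign the label of the single closest training sample (1-NN)."""
    samples = np.asarray(samples, dtype=float)
    query = np.asarray(query, dtype=float)
    # Euclidean distance from the query to every training sample
    distances = np.linalg.norm(samples - query, axis=1)
    return labels[int(np.argmin(distances))]

# Hypothetical 2D samples from two classes, 'o' and 'x'
samples = [[1.0, 1.0], [1.5, 2.0], [5.0, 5.0], [6.0, 5.5]]
labels = ['o', 'o', 'x', 'x']
print(one_nn_classify(samples, labels, [1.2, 1.4]))  # -> 'o'
```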
@reachDeepNeuron8 жыл бұрын
What is this 'c' all about? Can you give some example, please? Is it something like: 'c' is an X-Men movie, and we try to identify which class or category this movie falls into, such as romantic or action movie? And the romantic and action movies have already been classified by the KNN model?
@meximon12223 жыл бұрын
Another dude who manages to explain in a short video of a few minutes what my prof couldn't in 2h of lecture.
@RocckeFella10 жыл бұрын
One of the best lectures I have ever seen on youtube so far... Please accept my sub.
@tkorting10 жыл бұрын
Dear friend, thanks for your valuable comments and for subscribing to my channel. Regards
@brnjnsvld8 жыл бұрын
I studied kNN in school (multiple years ago), and worked on a team project (in school) where we were trying to predict home prices based on several numerical predictors. The problem is I don't recall how we did it. I have saved the write-up, which says we used: the original listing price, # of bathrooms, whether the house is in zipcode A, B or C, and whether it was built after 1980. The write-up also says we had a "K of 3" and a mean prediction error of $16,460. Can you connect the dots for me here? How do you predict home prices using kNN? Would every price point be its own classification? Regression made more sense to me for this task.
@tkorting8 жыл бұрын
Dear friend, thanks for your question. Please like the video, subscribe to my channel and share with your peers. Comparing my presentation with your outline, we have a feature space of 4 dimensions:
1. listing price
2. number of bathrooms
3. zipcode (seems to be literal, but could be interpreted as 3 numbers, say, 1, 2 and 3)
4. built year (or also a binary variable, 0 and 1 for example)
Then you should have several instances for houses, for example:
House 1: (300, 3, 2, 0)
House 2: (400, 2, 2, 1)
(...)
House N: (340, 1, 2, 0)
Then suppose you want to know the listing price for a house with (1 bathroom, zipcode 1, built year 0). The algorithm should put all values for all instances in the 3D space of the remaining features, and compute the distance between this house and all other instances. You get the 3 (as you stated k = 3) closest instances and retrieve their listing prices. With 3 prices, you can estimate, using the average value, for example, the price for this house. Regards
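A hedged sketch of that regression use of kNN (Python/NumPy assumed; the feature values and prices below are invented for illustration): the predicted price is the mean of the prices of the k nearest houses in the feature space.

```python
import numpy as np

def knn_regress(features, targets, query, k=3):
    """Predict a numeric target (e.g., listing price) as the mean of the
    targets of the k nearest training instances in feature space."""
    features = np.asarray(features, dtype=float)
    targets = np.asarray(targets, dtype=float)
    distances = np.linalg.norm(features - np.asarray(query, dtype=float), axis=1)
    nearest = np.argsort(distances)[:k]      # indices of the k closest houses
    return targets[nearest].mean()

# Hypothetical houses: [number of bathrooms, zipcode code, built after 1980]
X = [[3, 2, 0],
     [2, 2, 1],
     [1, 2, 0],
     [2, 1, 1]]
prices = [300, 400, 340, 380]                # invented listing prices
print(knn_regress(X, prices, [1, 1, 0], k=3))
```

With held-out houses, the mean absolute difference between such predictions and the true prices would correspond to the prediction error mentioned in the write-up.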
@suileungmak93259 жыл бұрын
Best of the best, explained so clearly without complex math formulas!
@tkorting9 жыл бұрын
+SUI LEUNG Mak thanks for your feedback, and for subscribing to my channel. Best regards
@kamangate8 жыл бұрын
Very good, short explanation for beginners in the field of machine learning. Good job!
@tkorting8 жыл бұрын
Thanks for your message. Please subscribe to my channel and like/share the video. Regards
@roxannezalucky10 жыл бұрын
This was great! Thank you so much for posting and sharing your knowledge!
@tkorting10 жыл бұрын
Dear Roxanne, thanks for your feedback. Please subscribe to my channel and share this video with your peers. Regards
@XxelitebeautyxX6 жыл бұрын
As simple as that? I was studying KNN from Introduction to Statistical Learning and I found it so tough to understand. I am trying to learn Data Science; I have completed about 4 courses of Datacamp's path to Data Scientist with Python. I feel the path is less concept-oriented and more based on how to use Python for data science. I am totally confused about how to learn Data Science, and I have 20 semester-end holidays left that I want to use. I have tried the book Introduction to Statistical Learning, but it applies the methods using R and I am already learning Python (using Datacamp). I have tried a few statistics courses which were pure theory. I have tried Andrew Ng and everything went over my head. I am searching for a course that includes both the practical and the theoretical part.
@tkorting5 жыл бұрын
Thanks for your comment. Please like/share/subscribe. Yes, sometimes the theory in the books is quite difficult to understand, although the practice is often not so hard. Regards
@KamalSehairi7 жыл бұрын
Short, simple and straightforward Amazing!! Thanks a lot sir
@tkorting7 жыл бұрын
Thanks a lot!
@bobbyaipanga23728 жыл бұрын
Clear and simple explanation. Helped me a lot in supervised classification. Thank you!
@tkorting8 жыл бұрын
Thanks for your feedback. Please like/share the video and subscribe. Regards
@ademirgabardo68439 жыл бұрын
Very good video, easy to understand. Thank you!
@tkorting9 жыл бұрын
Dear friend, Thanks for your valuable comments and for subscribing to my channel. Regards
@vishalkatariya49969 жыл бұрын
Thank you for this video! It's very easy to understand.
@tkorting9 жыл бұрын
Thanks for your comments. Please subscribe to my channel and share this video with your peers. Regards
@DanielDaniel-nz9ms4 жыл бұрын
Thank you for your help! Beautifully done!
@tkorting3 жыл бұрын
Thanks for the feedback, please subscribe to my channel
@joaopedrodecarvalhomagalha71448 жыл бұрын
Very good explanation. Thanks for describing it in a simple way.
@tkorting8 жыл бұрын
+Joao Pedro de Carvalho Magalhaes Thanks a lot for your feedback. Please subscribe to my channel and share this video with your peers. Regards
@diegofernandocobacruz65088 жыл бұрын
In your example, when you find out that element 'C' belongs to class 'O', if you would like to classify another element 'D', is 'C' taken into account to predict the label for 'D', or are just the original samples used to classify it? Thanks.
@tkorting8 жыл бұрын
Thanks for your feedback, Diego. Please subscribe to my channel, like the video and share with your peers. Since element C was just classified, in the standard approach it should not be used in the classification of D; otherwise the algorithm would behave differently depending on the new objects presented for classification. That variant should be considered a learning algorithm based on kNN. Regards
@lewisalberg30817 жыл бұрын
the graphics make it easy to understand
@nyamjoon6 жыл бұрын
Thank You! You made it so easy. I now use your explanation to teach others.
@tkorting6 жыл бұрын
Thanks for your feedback. In the video description I inserted the link for the original presentation. Feel free to use it. Please subscribe to my channel. Regards
@niosen5 жыл бұрын
Short and simple, thanks a lot!
@tkorting5 жыл бұрын
Thanks for your feedback Please like and share the video and subscribe to my channel Regards
@rtazimi10 жыл бұрын
Really good explanation! Do you also have an implementation of this algorithm, so one can get an idea of how it can be implemented?
@tkorting8 жыл бұрын
Thanks for your feedback, please subscribe to my channel. I don't have an implementation of the algorithm, but I am sure you can find it on the internet. The idea is to have a set of samples with N dimensions. The new values to be classified will also have N dimensions. So the algorithm must compare the new values against all samples and find the closest patterns; the most frequent class among them will be the output classification. Regards
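Following that outline, a minimal from-scratch sketch (Python/NumPy assumed; not the author's code) that compares a new sample against all stored samples and takes a majority vote among the k closest:

```python
import numpy as np
from collections import Counter

def knn_classify(samples, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest samples."""
    samples = np.asarray(samples, dtype=float)
    distances = np.linalg.norm(samples - np.asarray(query, dtype=float), axis=1)
    nearest = np.argsort(distances)[:k]             # indices of the k closest samples
    votes = Counter(labels[i] for i in nearest)     # count classes among the neighbors
    return votes.most_common(1)[0][0]

samples = [[1, 1], [2, 1], [1, 2], [8, 8], [9, 8], [8, 9]]
labels = ['o', 'o', 'o', 'x', 'x', 'x']
print(knn_classify(samples, labels, [7.5, 8.2], k=3))  # -> 'x'
```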
@muhammadaltf9 жыл бұрын
Good lecture...Expecting more from you
@tkorting9 жыл бұрын
Althaf Muhammad Thanks for your feedback. Regards
@lordgucci92108 жыл бұрын
Great explanation - clear and simple - thanks much!
@tkorting8 жыл бұрын
Thanks for the feedback. Please like the video and subscribe. Regards
@junhuipark90879 жыл бұрын
Hi Thales, thank you for your tutorial. It was a very good introduction to KNN. This may not be the right place to ask technical questions, but would you mind giving me a direction on the problem I am having? I am using KNN to distinguish the numbers 0-9. My questions would be: 1. How many training samples are normally needed for each number? 2. I am using OpenCV to train on number images in the same font. For each item in the training set, is it OK to have the same font but different image sizes? And how different should each item be, for example in how the circle of one number is drawn? Many thanks.
@tkorting9 жыл бұрын
+Junhui Park Dear Junhui, thanks for your feedback. Please subscribe to my channel and share this video with your peers. The answer to your question depends on several factors. One of them is the size of the feature space. To use KNN with 10 patterns you must have a proper feature space, and also some minimum number of samples. I would recommend having the same number of samples for each number, and also trying different fonts for reference. The thing is that using KNN you will not have a training stage; instead, for every new number to be classified (a candidate), you must compute the distance between the candidate and all the samples, in order to find the nearest neighbors' class to assign to the candidate. With a lot of samples the processing can slow down considerably. Regards
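As a rough illustration of that workflow (assuming scikit-learn is available instead of OpenCV; this is not code from the video), a digit classifier with kNN on the small built-in 8x8 digits dataset could look like the sketch below — note there is no real training stage, `fit` simply stores the samples:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

digits = load_digits()                      # 8x8 grayscale digit images, flattened to 64 features
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3)   # "fit" only stores the samples
knn.fit(X_train, y_train)
print("accuracy:", knn.score(X_test, y_test))
```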
@christianabbey10632 жыл бұрын
Sweet and simple. Thanks man
@dancewithdjcreations24055 жыл бұрын
Can you give me the presentation link? The description link is not working.
@tkorting5 жыл бұрын
Please subscribe to my channel. I updated the link in the description. Regards
@kennjeee9 жыл бұрын
Hey Thales, I wonder what the x and xi represent in region 1?
@taylorasbury49809 жыл бұрын
+Kenji Fnu x is the point of interest which is to be classified, and xi is a point that is already known to belong to region i. If it is the mathematical notation for Ri that is confusing, then here is my attempt at putting that formula into words: Ri (region i) is defined as the set of all values of x where the distance between x and xi (where xi already belongs to region i) is strictly LESS than the distance between x and xj, for every j not equal to i.
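In symbols, that verbal definition could be written as (with d denoting the chosen distance, e.g. Euclidean):

```latex
R_i = \{\, x \;:\; d(x, x_i) < d(x, x_j) \;\; \forall\, j \neq i \,\}
```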
@kamangate8 жыл бұрын
Let's say you want to identify a place x among the six rooms in your house. So x is the room for which you don't know which one (xi) of the 6 rooms it corresponds to, and the number of features you extract from each room to form a feature vector determines its dimension.
@AngryJesus17 жыл бұрын
Aren't different metrics, like Manhattan distance, EMD, or Euclidean, helping with the drawbacks of the kNN algorithm? At least that's what I used in my bachelor's thesis on facial pattern recognition via LBP + kNN classification (+ k-means).
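For context, the metric only changes how "nearest" is measured; the voting step of kNN stays the same. A tiny sketch (Python/NumPy assumed) comparing Euclidean and Manhattan distances:

```python
import numpy as np

def euclidean(a, b):
    # straight-line distance
    return np.sqrt(np.sum((np.asarray(a, dtype=float) - np.asarray(b, dtype=float)) ** 2))

def manhattan(a, b):
    # sum of absolute coordinate differences (city-block distance)
    return np.sum(np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)))

print(euclidean([0, 0], [3, 4]))   # 5.0
print(manhattan([0, 0], [3, 4]))   # 7.0
```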
@anmolsmartkid7 жыл бұрын
Thanks for the great work. Could you make a video on selecting appropriate K values for different data sets, and on how the complexity of calculating the nearest neighbours can be minimised? I'd be highly obliged.
@tkorting6 жыл бұрын
Thanks for your suggestion, I will put on my wish list for next videos. Please like and share this video and subscribe to my channel. Regards
@wendingpeng26417 жыл бұрын
Short and good explanation, even better at 2x speed :)
@tkorting7 жыл бұрын
+Wending Peng thanks for your feedback. For more slow videos please subscribe to my channel. Regards
@Arshad.H.H8 жыл бұрын
Please, Mr. Thales Sehn Körting, I don't understand how the k-nearest neighbor smoother works, because kNN is different from kernel estimation with a bandwidth. How does kNN work in nonparametric regression?
@tkorting8 жыл бұрын
+arshad 2014 Thanks for your feedback. Please subscribe to my channel and share this video with your peers. I am not a specialist in nonparametric regression. However, the core of the method presented here is to find instances in the feature space which are similar to the presented examples (samples). Regards
@ioncimpeanu70588 жыл бұрын
So what about when k=1 and we have 2 classes, A and B? We want to identify a new element x as one of these classes and the distances are equal. In which class does the algorithm then put the element x?
@tkorting8 жыл бұрын
Hi, thanks for your question. Please like/share the video and subscribe. As I mentioned in the remarks of the video, some cases are not recommended for kNN. With k=1 you define a Voronoi space with your features. In the limiting case, whoever implements the algorithm should define a heuristic to classify the element. One easy way is to randomize the class (either A or B in your case). Regards
@idevankush6 жыл бұрын
If, in the end, it's to be randomized, why create Voronoi partitions? Why not directly choose "e" or "b" randomly?
@elmarqarayev64746 жыл бұрын
How do we choose an optimal k? Is there a specific method to choose k?
@tkorting6 жыл бұрын
Thanks for your question. Please like and share the video and subscribe to my channel. It is a good question, because an unsupervised algorithm can be considered "a little bit supervised", since it is up to the user to set this number of clusters. Sometimes you have a feeling about your data and how many groups you expect to find, so this is a clue to determine the value of k. However, sometimes you define a very high k and some clusters will point to the same place, so some methods eliminate them. I mean, if you define a high value of k, you will perceive it in your results. Regards
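As a practical complement to that reply (a sketch under the assumption that scikit-learn and its bundled iris data are available as a stand-in for your own dataset; the video does not prescribe this method), a common way to choose k for kNN classification is to try several candidate values with cross-validation and keep the one with the best validation score:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

best_k, best_score = None, -1.0
for k in range(1, 16, 2):                      # odd values of k to reduce ties
    score = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
    if score > best_score:
        best_k, best_score = k, score

print(best_k, round(best_score, 3))
```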
@georgeb86378 жыл бұрын
1:13 - k = # (# = how many neighbours to check before classifying); 1:29 - visually depicts how it works; 3:48 - Remarks (no examples - not clear to understand)
@tkorting8 жыл бұрын
Dear George, thanks for your feedback and also for the remarks on important points in the video. Please subscribe to my channel and share this video with your peers. About the Remarks at 3:48, could you clarify what was difficult to understand? Regards
@pierrelaurent10658 жыл бұрын
Simple. Concise. Thanks for sharing.
@tkorting8 жыл бұрын
thanks a lot for your comments. Please subscribe to my channel and share this video with your peers. Regards
@sudarshinityagi83209 жыл бұрын
Hey. The video is quite informative. However, I was wondering how each training vector defines a region in space when k=1.
@tkorting9 жыл бұрын
+Sudarshini Tyagi Thanks for your feedback. Please subscribe to my channel and share this video with your peers. When k = 1, a new sample will always be classified as the class of the most similar element, so it will be very susceptible to outliers, for example. Regards
@sudarshinityagi83209 жыл бұрын
+Thales Sehn Körting , I think I understood the concept using Voronoi diagram. Thanks a lot. :)
@urjeans289610 жыл бұрын
Thanks for your concise explanation
@tkorting10 жыл бұрын
Dear Yu-Jheng Fang, thanks for your valuable comments. Please subscribe to my channel and share this video with your peers. Best regards
@aravindkramesh2 жыл бұрын
Sir, could you please tell me how the initial classification is done?
@tkorting2 жыл бұрын
The initial classification is manual, the references are given by a domain expert. Please subscribe.
@leorivas2 жыл бұрын
Very clear, thank you so much!
@MahmoudMagdy17 жыл бұрын
Thanks so much for this video. Can I ask what software you are using for the presentation?
@tkorting7 жыл бұрын
Thanks for your feedback, please subscribe to my channel and like/share the video. In case you want to reuse my presentation (I used prezi), follow prezi.com/ukps8hzjizqw/? Regards
@superjzh6 жыл бұрын
Thank you for making this video. Could you please make another video explaining how to use KNN to solve a regression problem?
@n1ru4l5 жыл бұрын
Great video, but I have one unanswered question: given we have three classes A, B, C and k = 5, let's assume the nearest neighbours are A A B B C. How is it determined whether A or B should be returned as the result?
@tkorting5 жыл бұрын
Thanks for your question. Please subscribe to my channel. This is a good point, and it can occur frequently. In this case (a tie) the algorithm can apply some predefined heuristic, such as randomly choosing between pattern A and B (in your example), or even trying, only for this object, k = 6 or k = 7, for example. Regards
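A small sketch of the tie-breaking heuristic mentioned in that reply (assumed behaviour for illustration, not how any particular library resolves ties): count the votes, and if two or more classes tie, pick one of the tied classes at random.

```python
import random
from collections import Counter

def vote_with_tiebreak(neighbor_labels):
    """Majority vote; on a tie, choose randomly among the tied classes."""
    counts = Counter(neighbor_labels)
    ranked = counts.most_common()
    tied = [cls for cls, n in ranked if n == ranked[0][1]]   # all classes with the max count
    return random.choice(tied)

print(vote_with_tiebreak(['A', 'A', 'B', 'B', 'C']))   # randomly 'A' or 'B'
```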
@skinhat10 жыл бұрын
Great video. No using of jargon to explain jargon which makes it easy to understand.
@tkorting10 жыл бұрын
Dear Skinhat, thanks for your valuable comments. Please share this video with your peers and subscribe to my channel. Best regards
@bassamry3 жыл бұрын
Thanks for the vid! It would have been better if you had explained how the algorithm actually finds the neighbors.
@oozzar28415 жыл бұрын
In the kNN algorithm, which feature extraction is used? BoW or n-grams?
@tkorting5 жыл бұрын
Thanks for your comment, please like/share/subscribe. You can use any feature, as long as the feature represent your classes of interest. Regards
@ranjithanayak6 жыл бұрын
Great video! And I love your accent!
@tkorting5 жыл бұрын
Thanks for your comment. Please like/share/subscribe. Regards
@luizluz6633 жыл бұрын
Are you from Brazil?
@tkorting3 жыл бұрын
With this accent, right? Haha. Don't forget to subscribe to the channel. Cheers
@luizluz6633 жыл бұрын
Haha. Subscribed! Very good content.
@helloansuman9 жыл бұрын
Thanks a lot. Understood completely.
@tkorting9 жыл бұрын
+Ansuman Mahapatra thanks for your feedback. Please subscribe to my channel and share this video with your peers. Regards
@helloansuman9 жыл бұрын
Thales Sehn Körting Already done sir. Keep on uploading such nice videos.
@nguyentnhoang8 жыл бұрын
What do you do if your KNN has more than K points at equal distances?
@tkorting8 жыл бұрын
Thanks for your question, please subscribe and like the video. This is why we recommend an odd number for K, to avoid a tie. Regards
@jonihoppen9 жыл бұрын
Congratulations on the presentation; by your accent you are Brazilian, aren't you? I will keep following your videos, really enjoying the material. Just as feedback, you mentioned distances, so it would be interesting to explain how to calculate the distance and which types of distance measurements can be used :). Cheers from Florianópolis.
@tkorting9 жыл бұрын
+Joni Hoppen Thanks for your feedback and for your perception :) we are Brazilians! I created another video (How Region Growing Segmentation works) where I explain one example of distance (Euclidean). Maybe I can create a link between these videos to help people understand better. Thanks again -- a big hug
@harikrishnajiju21312 жыл бұрын
Clear and concise! Thanks!
@elenacodin18307 жыл бұрын
Really useful, thank you so much!
@tkorting7 жыл бұрын
Thanks for your feedback. Please subscribe to my channel and like/share the video. Regards
@nahiduzzamanrose9 жыл бұрын
Thank you for this video!
@tkorting9 жыл бұрын
Nahiduzzaman Rose Thanks for your comments. Please subscribe to my channel and share this video with your peers. Regards
@ZanoStorck6 жыл бұрын
Great explanation! Thanks a lot for it :)
@tkorting6 жыл бұрын
Thanks for your feedback Please like/share/subscribe. Regards
@msajia100010 жыл бұрын
Thank you so much, it was a great explanation
@tkorting10 жыл бұрын
Dear sajia, thanks for your valuable comments. Please subscribe to my channel and share this video with your peers. Regards
@wathiqn45839 жыл бұрын
What is the distance function used to find the nearest neighbors?
@tkorting9 жыл бұрын
+Wathiq N dear friend, thanks for your comments. You can use the distance function you want. It is common to use Euclidean distance. Best regards
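For reference, the Euclidean distance mentioned in that reply, between two feature vectors x and y in an n-dimensional feature space, is:

```latex
d(x, y) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}
```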
@tkorting9 жыл бұрын
+Wathiq N and please subscribe to my channel
@wathiqn45839 жыл бұрын
+Thales Sehn Körting Thanks for your reply. Please, can you send me a link with the detailed steps of this algorithm, in a form that is easy to program?
@tkorting9 жыл бұрын
I would suggest googling for R or Python implementations of this algorithm, or even using some data mining packages, such as WEKA. Regards
@stone13238 жыл бұрын
Clear and simple, thanks.
@tkorting8 жыл бұрын
Many thanks for your feedback. Please subscribe and like/share the video. Regards
@dar78235 жыл бұрын
so, which class will be determined for 'c'?
@tkorting5 жыл бұрын
Thanks for your question, please like/share/subscribe. Element 'c' will be assigned to class 'o', as shown at 1:40 in the video. Regards
@munshidap9865 жыл бұрын
Well explained. Thank you, sir.
@tkorting5 жыл бұрын
Thanks for your feedback Please subscribe to my channel Regards
@mshahzaib41957 жыл бұрын
To the point, excellent video
@tkorting7 жыл бұрын
+shah zaib thanks for your feedback Please like and share the video and subscribe to my channel. Regards
@satyamgupta5824 жыл бұрын
Good explanation, thanks a lot.
@tkorting4 жыл бұрын
Thanks for your feedback Please like/share/subscribe
@MrRynRules5 жыл бұрын
well explained, thank you!
@tkorting3 жыл бұрын
Thanks for the comment, please subscribe to my channel
@ntsakomculu3713 жыл бұрын
Best explanation for kNN
@tkorting3 жыл бұрын
Thanks for the feedback, please subscribe to my channel
@fforfaith5 жыл бұрын
Thanks but I have a question: what is meant by class?
@Shiva-zy7jq5 жыл бұрын
Class is the final output category in which the input data will be classified.
@Shiva-zy7jq5 жыл бұрын
In simpler terms, a class can be defined as the output category of your data.
@tkorting5 жыл бұрын
Thanks for helping the channel by answering this question Please subscribe to my channel Regards
@LucasAguiar-bg2dg2 жыл бұрын
Brazilian accent haha, it helped a lot
@tkorting2 жыл бұрын
But how did you notice? Haha, don't forget to subscribe. Cheers
@pass45472 жыл бұрын
It wasn't just me hahaha
@JohannesClinton8 жыл бұрын
Nice, sir, this is so helpful for my research :)
@tkorting8 жыл бұрын
+Johannes Clinton thanks for your feedback, please subscribe to my channel and share this video with your peers. Regards
@JohannesClinton8 жыл бұрын
ok sir
@guillemrico70068 жыл бұрын
Thanks, clear and perfect!
@tkorting8 жыл бұрын
Thanks for your feedback. Please like/share the video and subscribe. Regards
@Bruuuuucie8 жыл бұрын
Thank you! Great video
@tkorting8 жыл бұрын
Thanks for your feedback. Please like/share the video and subscribe. Regards
@pierluigibizzini8 жыл бұрын
very helpful, thanks a lot
@tkorting8 жыл бұрын
thanks for your feedback Please subscribe to my channel and share this video with your peers. Regards
@Frenchbulldogdad6 жыл бұрын
Are you Brazilian? I am too! Thanks for the video.
@tkorting6 жыл бұрын
Yes! Thanks for the feedback; please subscribe to my channel. Cheers
@LuizBHMG6 жыл бұрын
Haha, I suspected it too! ;-) Very good video, well-deserved success! Cheers
@valentineoragbakosi3785 жыл бұрын
Nice tutorial. Thanks
@tkorting5 жыл бұрын
Thanks for your comment. Please, subscribe to my channel! Regards
@caroltamez67985 жыл бұрын
How do you decide the number of k?
@tkorting5 жыл бұрын
Thanks for your comment, please like/share/subscribe. It depends on several factors. Some of them I discuss in the video. Regards
@jona43856 жыл бұрын
Thank you very much!
@tkorting5 жыл бұрын
Thanks for your feedback Please subscribe to my channel Regards
@butterCorn-vs8xl8 жыл бұрын
Great explanation!
@tkorting8 жыл бұрын
thanks a lot for your feedback Please subscribe to my channel and share this video with your peers. Regards
@marcoserena6696 жыл бұрын
Thank you very much for your explanation
@tkorting6 жыл бұрын
Thanks for your feedback Please like/share/subscribe. Regards
@thasin96718 жыл бұрын
Really good, keep it up!
@tkorting8 жыл бұрын
Many thanks for the feedback. Please like/share the video and subscribe. Regards
@keinestudiere-chillzone6332 жыл бұрын
Thanks a lot. By the way, I could guess you are a german by your accent. :) I am currently living in Germany.
@tkorting2 жыл бұрын
Thanks for your comment. Although my family is from Germany, I am from Brazil. I lived in Germany for 2 months and I liked it very much. Best regards
@taracai20489 жыл бұрын
Thank you so much!
@tkorting9 жыл бұрын
Dear tara cai, thanks for your comments. Please subscribe to my channel and share this video with your peers. Regards
@JogosEtudoMais4 жыл бұрын
Very well explained!! It helped with my college assignment.
@tkorting4 жыл бұрын
Thanks, Daniel. Subscribe to the channel. Cheers
@mikesmith13508 жыл бұрын
explained very well
@tkorting8 жыл бұрын
Thanks a lot for the feedback. Please like the video and subscribe. Regards
@aishwaryawagaskar84038 жыл бұрын
Good explanation in a short time 😃
@tkorting8 жыл бұрын
Thanks for your feedback, please like the video and subscribe. Regards
@mohammedkhalaf50487 жыл бұрын
thank u very much. it is really useful
@tkorting7 жыл бұрын
Thanks for your feedback. Please like/share the video and subscribe. Regards
@StxExodux8 жыл бұрын
How does this algorithm work with missing values? This is a question that could be on my test :/
@tkorting8 жыл бұрын
The thing about this algorithm is that it relies on existing values to make the measurements. I think each implementation should deal with missing values in a different way. Suppose you have 4 features, and 1 of them is unavailable. You could think of an algorithm to find the nearest neighbors in the remaining 3 features, or try to estimate the missing values based on the neighbors, etc. Regards
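One way to make the first idea in that reply concrete (an assumption about a possible implementation, not how any specific library handles missing data) is to compute the distance only over the features present in both vectors:

```python
import numpy as np

def distance_ignoring_missing(a, b):
    """Euclidean distance computed only over features that are present
    (non-NaN) in both vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    mask = ~np.isnan(a) & ~np.isnan(b)          # features available in both vectors
    if not mask.any():
        return np.inf                           # no shared features to compare
    return np.linalg.norm(a[mask] - b[mask])

print(distance_ignoring_missing([1.0, np.nan, 3.0], [2.0, 5.0, 3.0]))  # uses features 0 and 2 only
```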
@StxExodux8 жыл бұрын
Thanks!!
@fatimahalqadheeb5 жыл бұрын
Good explanation
@tkorting5 жыл бұрын
Thanks for your comment. Please like/share/subscribe. Regards