Approximate Nearest Neighbors : Data Science Concepts

  26,352 views

ritvikmath

A day ago

Comments: 60
@ScaredCrows4 3 years ago
Wow, you have no idea how much I needed this for my current work project. Thanks as always for a fantastic explanation.
@vineethgudela2033 11 months ago
I have implemented ANN on my own after watching your video. Thanks for the great explanation ritvik
@anonim5052 19 days ago
I am preparing for a Pinterest interview! Thank you! It was very helpful!
@zivleibowitz9846 A year ago
OMG. I wish all my lecturers explained things this clearly and intuitively. Thanks!
@rockapedra1130 A year ago
You have a very clear but not too wordy style. *SUBSCRIBED*
@csbanki 2 years ago
This is perfect! I'm so sick of all the fancy literary stuff from professors all over the world who can only communicate through differential equations. THIS is how it should be explained. Thank you, good sir!
@Septumsempra8818 3 years ago
Both formats are cool
@jiayangcheng 6 months ago
Thank you so much, sir; this explanation shows your exceptional ability to teach. So enlightening!
@hannahnelson4569 5 months ago
This is brilliant! Thank you so much for showing us this method!
@aZnPriDe707 2 years ago
Clear explanation and very resourceful!
@ctRonIsaac 2 years ago
The lesson was clear, and paper can be easier for you to control and work with, so this format is fine. Thank you for the lesson!
@Mci146 A year ago
Thank you so much for the simple and clear explanation with examples!
@ritvikmath A year ago
You're very welcome!
@seyedalirezaabbasi 11 months ago
This format is better. Thanks.
@Crimau12000 A year ago
Thanks for sharing such a detailed and thorough explanation!
@ritvikmath A year ago
My pleasure!
@randall.chamberlain A year ago
Mate you really know how to explain things. Thanks for your time and dedication.
@marcosricardooliveira3790 A year ago
I really like this format for this kind of explanation, like explaining how a technique works. Very good vid, thanks!
@ritvikmath A year ago
Glad you liked it!
@monalover3758 11 months ago
Very clear explanation! I think I got it in one pass! Pace is good. Thanks! (PS. the paper format is fine!)
@RetropunkAI A year ago
best explanation ever. thank you
@siddharthvij9087 4 months ago
Excellent Video
@ritvikmath 4 months ago
Glad you enjoyed it!
@doronbenchayim8526 3 years ago
THIS WAS AMAZING!!!!!!!!!!!!!!!
@vinnythep00h 4 months ago
Great explanation!
@ritvikmath 4 months ago
Glad it was helpful!
@oddtraveller 2 years ago
Greatly explained
@dinuthomas4531 A year ago
Very well explained!!
@ritvikmath A year ago
Glad you think so!
@PhilipMavrepis 3 years ago
Pretty good explanation, but you never showed what happens if the number of neighbors K you are searching for is bigger than the number of points in the specific region. For example, let's say you have a new point in region R4, which has 3 points, and you are searching for the 4-NN of that point. Thank you again for this video, really liked it.
@Han-ve8uh A year ago
This doesn't answer your question directly, but in a FAISS IVF index, if k is more than the number of items in a cell, it returns an id of -1 for the extra required neighbors; the solution is to increase the default nprobe=1 to probe more cells.
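
For anyone who wants to see that behavior concretely, here is a minimal sketch using FAISS's IVF index (assuming faiss and numpy are installed; the dataset size, nlist, and nprobe values are illustrative choices, not from the video):

```python
import numpy as np
import faiss  # pip install faiss-cpu

d = 8
rng = np.random.default_rng(0)
xb = rng.random((1000, d), dtype=np.float32)   # database vectors
xq = rng.random((1, d), dtype=np.float32)      # query vector

# IVF index: vectors are assigned to nlist Voronoi cells around trained centroids.
quantizer = faiss.IndexFlatL2(d)
index = faiss.IndexIVFFlat(quantizer, d, 100)  # nlist=100 -> roughly 10 points per cell
index.train(xb)
index.add(xb)

# With the default nprobe=1, only one cell is scanned; asking for k=20 neighbors
# can return -1 ids when that cell holds fewer than 20 points.
D, I = index.search(xq, 20)
print("nprobe=1 :", I[0])

# Probing more cells fills in the missing neighbors.
index.nprobe = 10
D, I = index.search(xq, 20)
print("nprobe=10:", I[0])
```
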
@KeithGalli 2 years ago
Thanks! Good vid :)
@zahrashekarchi6139 A year ago
well explained! thanks!
@alinajafi1528 2 years ago
Perfect explanation! Thanks :D
@nadiakacem24 A year ago
thank you very much, it was so helpful
@Shkedias 2 months ago
Thank you😊
@jamemamjame A year ago
thank you very much 🙏🏼
@ritvikmath A year ago
Of course!
@sarmale-cu-mamaliga 2 years ago
Really cool :O thank you
@mayapankhaj9124 7 months ago
Thank you so much for a beautiful lesson. Reminded me of my elementary school days and how teachers used to teach back then.
@qwerty22488 2 years ago
Thanks for this excellent video! Is there a popular library that helps to experiment with ANN on a local machine for a small set of data?
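
One popular option for small local experiments is Spotify's Annoy library, which is a tree-based ANN method similar in spirit to the splitting idea explained in the video. A minimal sketch, assuming annoy is installed and using a random toy dataset:

```python
# pip install annoy
import random
from annoy import AnnoyIndex

f = 10  # vector dimensionality
index = AnnoyIndex(f, 'euclidean')

# Add a small toy dataset of 1,000 random vectors.
for i in range(1000):
    index.add_item(i, [random.gauss(0, 1) for _ in range(f)])

index.build(10)  # build a small forest of 10 trees

# 5 approximate nearest neighbors of item 0, and of an arbitrary query vector.
print(index.get_nns_by_item(0, 5))
print(index.get_nns_by_vector([0.0] * f, 5, include_distances=True))
```

FAISS and hnswlib are other well-known choices, and scikit-learn's NearestNeighbors is fine for exact search on small data.
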
@RishiRajKoul A year ago
How would we determine whether a point is above or below a line using code?
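
One way to do it, assuming the splitting line is the perpendicular bisector of the two chosen points (an Annoy-style split; the function name here is made up for illustration): use the vector between the two points as the normal of the splitting hyperplane and check the sign of the dot product with the query relative to the midpoint. This is equivalent to asking which of the two points the query is closer to.

```python
import numpy as np

def side_of_split(query, p1, p2):
    """Return True if 'query' falls on the p2 side of the hyperplane that
    bisects p1 and p2, and False if it falls on the p1 side."""
    normal = p2 - p1                 # normal vector of the splitting hyperplane
    midpoint = (p1 + p2) / 2.0       # the hyperplane passes through the midpoint
    return float(np.dot(query - midpoint, normal)) > 0.0

# Example: a query closer to p2 lands on the p2 side.
p1, p2 = np.array([0.0, 0.0]), np.array([4.0, 0.0])
print(side_of_split(np.array([3.0, 1.0]), p1, p2))  # True
```
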
@haneulkim4902 2 years ago
Thanks for a great video! One question: at 9:23, for the new point, we check whether it is below or above the blue line. Is the way you recognize whether the point is above or below by comparing the distances between (point, 1) and (point, 9)?
@brockobama257 9 months ago
3:56 I thought that a k-d tree can search for a nearest neighbor in O(log n) and delete or add a point in O(log n), so k nearest neighbors could be considered O(k log n), which is less than O(n).
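
For reference, scipy's k-d tree shows that behavior in practice; a minimal sketch, assuming scipy and numpy and a random toy dataset:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
points = rng.random((100_000, 2))

tree = cKDTree(points)                      # building the tree costs roughly O(n log n)
dists, idxs = tree.query([0.5, 0.5], k=4)   # a k-NN query is about O(k log n) on average
print(idxs, dists)
```

The O(log n) query cost holds on average in low dimensions; in high dimensions k-d trees degrade toward scanning most of the data, which is part of the motivation for approximate methods like the one in the video.
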
@loupax A year ago
I now wonder if this is a sensible algorithm for collision detection
@kanchansarkar7706 2 years ago
Such a great explanation! I wonder, do you also have a similar video for HNSW? Thanks!
@ScottSummerill 3 years ago
I like it MUCH better. I found it sometimes overwhelming to be confronted with all the info and not yet have an explanation.
@JayRodge A year ago
I still don't understand how you classify the new point. Region-wise, or is there any other method?
@monalover3758 11 months ago
Here is what I think: each region has two points, so use a metric (e.g. distance) from the given new point to each of those two points and go with the closer one. The closeness can be Euclidean distance, cosine distance, or some other metric.
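
Putting that idea into code: once the query has been routed to a leaf region, one simple option (a sketch under the assumption that each candidate point carries a class label; the helper name is made up for illustration) is a majority vote among the k closest candidates in that region.

```python
import numpy as np
from collections import Counter

def classify_in_region(query, region_points, region_labels, k=3):
    """Majority vote among the k candidates in the query's leaf region
    that are closest to the query (Euclidean distance here)."""
    dists = np.linalg.norm(region_points - query, axis=1)
    nearest = np.argsort(dists)[:min(k, len(region_points))]
    votes = [region_labels[i] for i in nearest]
    return Counter(votes).most_common(1)[0][0]

# Tiny example: two candidates of class "a", one of class "b".
pts = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]])
labels = ["a", "a", "b"]
print(classify_in_region(np.array([0.5, 0.1]), pts, labels, k=2))  # "a"
```

With a forest of several trees, the candidates from each tree's leaf would typically be pooled before the vote.
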
@stevengusenius7333 2 years ago
Great video as always Ritvik. Am I correct that building the tree is an O(N) operation? That is, if I have only one new data point and haven't yet constructed the tree, will this still save any time over the exhaustive method? If not, then I presume building a forest would imply some break even point. Thanks.
@rockapedra1130 A year ago
Paper is better, I think. Moving the papers around is like zooming without moving the camera.
@djangoworldwide7925 A year ago
Such a nice recursive challenge. Anyone have an idea how to define a function to recursively solve this kind of algorithm, given a criterion of maximum points per region?
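
One way to sketch it, following the Annoy-style splitting idea from the video (the names and the random split rule here are illustrative, not the video's exact procedure): keep splitting with random hyperplanes until a region holds at most max_points points.

```python
import numpy as np

def build_tree(points, max_points=3, rng=None):
    """Recursively split 'points' with random hyperplanes until each leaf
    region holds at most 'max_points' points (an Annoy-style sketch)."""
    rng = rng or np.random.default_rng(0)
    if len(points) <= max_points:
        return {"leaf": points}

    # Pick two random points; split along the perpendicular bisector between them.
    i, j = rng.choice(len(points), size=2, replace=False)
    p1, p2 = points[i], points[j]
    normal, midpoint = p2 - p1, (p1 + p2) / 2.0
    right = (points - midpoint) @ normal > 0

    # Degenerate split (everything on one side): stop and keep a larger leaf.
    if right.all() or not right.any():
        return {"leaf": points}

    return {
        "split": (p1, p2),
        "left": build_tree(points[~right], max_points, rng),
        "right": build_tree(points[right], max_points, rng),
    }

tree = build_tree(np.random.default_rng(1).random((50, 2)))
```

Building a forest is then just a matter of calling this with different random seeds and keeping all the trees.
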
@borknagarpopinga4089 3 years ago
What's your qualification? Somehow I cannot find any information about your education etc. Awesome videos by the way, a lot easier to understand than what every professor tries to explain.
@atwork22 8 months ago
Looks like a sort of binary search.
@X_platform 3 years ago
Is ANNOY using Voronoi?
@raunaquepatra3966 3 years ago
"The lowest complexity for k-NN is O(n)" is not true!! Using a k-d tree, the complexity becomes O(log n).
@jasdeepsinghgrover2470 3 years ago
I was also thinking about the k-d tree and ball tree used in sklearn... Are you aware of any other methods?
@raunaquepatra3966 3 years ago
@@jasdeepsinghgrover2470 LSH
@jasdeepsinghgrover2470 3 years ago
Thanks for sharing... Will learn more about it as well
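
Since LSH was suggested above as another alternative, here is a minimal random-hyperplane LSH sketch (assuming only numpy; the number of hyperplanes and the function names are illustrative). Points whose projections onto a set of random hyperplanes share the same sign pattern land in the same bucket, and only the query's bucket is searched exactly:

```python
import numpy as np

def build_lsh(points, n_planes=8, seed=0):
    """Hash each point to a bucket keyed by the signs of its dot products
    with n_planes random hyperplanes through the origin."""
    rng = np.random.default_rng(seed)
    planes = rng.normal(size=(n_planes, points.shape[1]))
    keys = [tuple(map(bool, row)) for row in (points @ planes.T > 0)]
    buckets = {}
    for idx, key in enumerate(keys):
        buckets.setdefault(key, []).append(idx)
    return planes, buckets

def query_candidates(query, planes, buckets):
    """Return indices of points sharing the query's bucket; re-rank these exactly."""
    return buckets.get(tuple(map(bool, planes @ query > 0)), [])

pts = np.random.default_rng(1).random((1000, 16))
planes, buckets = build_lsh(pts)
print(query_candidates(pts[0], planes, buckets))  # includes index 0 itself
```

This sign-of-projection variant targets cosine similarity; Euclidean LSH typically uses quantized random projections instead.
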