Lecture 43 - Collaborative Filtering | Stanford University

223,947 views

Artificial Intelligence - All in One

1 day ago

Comments: 102
@23karthikb · 4 years ago
Finally, a good concise explanation of these concepts!
@PotatoMan1491 · 9 months ago
Prof is excellent at delivering the essential pieces; I find this very informative.
@pythoncodes4044 · 1 month ago
Amazing, enjoyed the lecture!
@tanmaythaker2905 · 10 months ago
Best explanation ever! Simple yet comprehensive.
@lukamandic5700 · 1 year ago
At Option 3 it is mentioned that cosine distance treats zero values of movies as negative. Actually, zero values have no impact on cosine distance. Cosine distance: 1 - (A · B) / (||A|| * ||B||), where (A · B) is the dot product of vectors A and B, and ||A||, ||B|| are their Euclidean norms. If a component of A or B is 0, it contributes nothing to the cosine similarity.
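To make the formula in the comment above concrete, here is a small NumPy sketch (my illustration, not from the lecture), showing that a zero component adds nothing to either the dot product or its own vector's norm:

```python
import numpy as np

def cosine_similarity(a, b):
    # cosine similarity = (A · B) / (||A|| * ||B||)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

a = np.array([4.0, 0.0, 5.0])  # 0 marks an unrated movie
b = np.array([5.0, 3.0, 4.0])

# a's zero component adds 0 to the dot product and 0 to ||a||^2,
# so by itself it pulls the similarity neither up nor down.
assert np.dot(a, b) == 4 * 5 + 0 * 3 + 5 * 4
assert np.isclose(np.linalg.norm(a), np.sqrt(16 + 0 + 25))
print(round(cosine_similarity(a, b), 4))
```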
@ahishchinnu1843 · 2 years ago
This is hands down the best video I have ever seen that instills the intuition for the concept. Absolutely brilliant!!!
@karthikrajeshwaran1997 · 7 months ago
Helped me totally grok! Thanks so much prof.
@kisome2423 · 11 months ago
Thank you! I learned a lot from you!
@jackshaak · 2 years ago
Thank you for putting this together in a fairly easy way to understand. You're a very good teacher.
@yakhoubsoumare6944 · 6 years ago
Sir, you are a genius!
@shubhamtalks9718 · 3 years ago
Finally, someone explained it properly.❤
@subratpati9166 · 7 years ago
Thanks a lot... Finally found a tutorial which is easy to learn from!! Great work.
@yuzhao5268 · 3 years ago
Great lecture - wondering what some deep-learning-based approaches to this problem look like today.
@shauvikpujari9522 · 3 years ago
Thank you sir, your explanation is awesome.
@sparshnagpal1509 · 4 years ago
Best description.
@Shivamagrawal50 · 3 years ago
Hi Sir, as you said, Pearson correlation similarity removes an individual user's harshness on the rating scale in user-user collaborative filtering, but why use the same method for item-item collaborative filtering?
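For readers wondering what "removing a user's harshness" looks like in practice, here is a minimal Pearson-correlation sketch (my illustration; the rating vectors are made up, not from the lecture):

```python
import numpy as np

def pearson_sim(u, v):
    """Pearson correlation over co-rated items (0 = unrated).
    Subtracting each user's mean removes individual rating-scale harshness."""
    both = (u > 0) & (v > 0)          # items both users rated
    if both.sum() < 2:
        return 0.0                    # not enough overlap to correlate
    uc = u[both] - u[both].mean()
    vc = v[both] - v[both].mean()
    denom = np.linalg.norm(uc) * np.linalg.norm(vc)
    return float(np.dot(uc, vc) / denom) if denom > 0 else 0.0

# A harsh rater and an easy rater with the same taste correlate strongly:
harsh = np.array([1.0, 2.0, 1.0, 3.0])
easy  = np.array([3.0, 4.0, 3.0, 5.0])
print(round(pearson_sim(harsh, easy), 6))  # identical preferences, shifted scale
```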
@josewesley2384 · 4 years ago
Thank you, it's the best explanation that I have seen so far. The idea is very simple; how did I not think of it?
@gourabnath9789 · 4 years ago
Awesome explanation! Thank you.
@SanjarAhmadov · 8 years ago
Very insightful. Thank you.
@manisthashrestha250 · 7 years ago
Thank you! That was great :D
@shreyaraut9270 · 7 years ago
Thank you... the explanation is simple and understandable.
@johng5295 · 4 years ago
Thanks a million. Awesome. Where have you been all these years?
@Skandawin78 · 5 years ago
I like this video, but I'm confused: are we trying to solve two problems here? 1. Fill in the missing ratings. 2. Find the items (IBCF) or users (UBCF) that are similar to each other. Do we use centering + cosine similarity and KNN to address problem 1, problem 2, or both? Do we use ratings to identify similar users/items, or do we first find similar items/users and then fill in the missing ratings?
@edwinchong37 · 2 years ago
Better than my Prof.!!!
@dmitrizakharov9068 · 8 years ago
Hi, thanks for the great explanation. Is there a reason to use centered cosine distance over normalized cosine distance? Would dividing by the standard deviation of the row after subtracting the mean not better account for different levels of variance within user ratings? Is there a downside to normalizing instead of centering?
@rc_matic · 6 years ago
What you're describing is standardization, not normalization. In general, that would be an idea worth trying. However, an issue with standardizing here MIGHT be the fact that a user's ratings are not normally distributed. Also, since the standard deviation depends on the sample/population size, you could get vastly different results comparing a user who has seen 10 movies and a user who has seen 1000 movies.
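The two transformations discussed in this thread differ by one division; a short sketch of both on a single user's observed ratings (my example values, not the lecture's):

```python
import numpy as np

ratings = np.array([4.0, 5.0, 3.0, 4.0])  # one user's observed ratings

# Centering (what the lecture uses): subtract the user's mean rating,
# so "tough" and "easy" raters end up on a comparable scale.
centered = ratings - ratings.mean()

# Standardizing (what the comment proposes): additionally divide by the
# standard deviation, to also account for per-user variance.
standardized = centered / ratings.std()

print(centered)       # mean is now 0
print(standardized)   # mean 0, standard deviation 1
```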
@abhinaygupta · 3 years ago
Great explanation.
@HazemAzim · 2 years ago
Excellent.. very well structured.
@Robin-zc2iw · 1 year ago
Very helpful, thanks man
@rpcroft · 5 years ago
Does one need to restrict the similarity set to positive values? I calculated the entire item-item matrix using Python and pandas and got predicted values outside the 1-to-5 range. For example, the inferred value for user 2 / film 2 was 20.07, and it was -1.27 for user 4 / film 1. Negative centered cosine similarities seem to throw the whole thing out of whack: the denominator in the 18:43 calculation can be small, magnifying values, and the result can even be negative.
@rayane4975 · 2 years ago
Great explanation!
@aramun7614 · 3 years ago
Very good explanation.
@connorphoenix1107 · 7 years ago
Thank you, this is exactly what I was looking for.
@junaidahmad-oe4oh · 7 years ago
Very informative.
@mohammadhegazy1285 · 1 year ago
Thank you very much sir.
@eti-inyeneibia6442 · 7 years ago
I couldn't get 1.00, 0.18, 0.41 with the cosine similarity formula... please explain.
@RahulPatel-hr4qe · 4 years ago
give me your email-id
@megabites3649 · 4 years ago
@@RahulPatel-hr4qe can you explain it to me? darkedarksecret@gmail.com
@philipjaykho114 · 4 years ago
@@RahulPatel-hr4qe please help me get those answers too, Sir. This is my email address: kphilipjay@gmail.com.. thanks a lot sir
@lishannavishka1783 · 3 years ago
I have a problem; I can't state it properly, but I will try to explain. Suppose there are two points A and B, A = [1,0], B = [0,1]. The cosine similarity between A and B is cos 90° = 0, right? Likewise, in this example, how did you find the cosine similarity between A and B, sir? And how can we graph points A and B?
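The questioner's example can be checked directly; a tiny NumPy sketch (mine, not from the lecture) confirming that orthogonal vectors have cosine similarity 0:

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A = [1,0] and B = [0,1] are orthogonal: the angle between them is 90°,
# and cos 90° = 0, so the dot product (and hence the similarity) is 0.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(cosine_similarity(a, b))  # 0.0
```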
@Stickman-shenanigans-d7k · 7 years ago
Wow. Now this is what I was looking for.
@K-mk6pc · 2 years ago
Sir never addressed user-user similarity metrics.
@ericarnaud5062 · 8 years ago
Amazing explanation
@qihanguan8105 · 3 years ago
Thank you! Super helpful!
@AhmedRaza-kp8io · 2 years ago
Centered cosine similarity???
@abdelrhmanwagdy3383 · 5 years ago
What if the ratings are all equal, so the centered vector's norm is zero? We can't divide by zero.
@hasanalikhan3543 · 2 years ago
How did they rate 5 movies overall? It's 6, not 5.
@Sudharsan-dd1qi · 1 year ago
Sir, I have a problem in collaborative filtering; can you help me solve it?
@whiteF0x9091 · 5 years ago
Excellent! Thank you.
@mr68clubshorts · 7 years ago
Great work, I don't know how to thank you, really.
@aydin3852 · 3 years ago
Thank you, now I get it.
@philipjaykho114 · 4 years ago
Hi Artificial Intelligence, I'm so confused because I tried to solve the similarity (those values in green font) but it seems so hard for me. Please help me. Thank you so much sir.
@celinexu6598 · 5 years ago
Tried the centered cosine; I have 2 questions here: 1. If we have an example like D (in the video), with all recorded ratings 0, the cosine distance to any other member A, B, C will be NaN, because there is a 0 in the denominator. Any treatment? 2. If I have 1000 customers, then I have to compute 499,500 numbers, one for each pair of customers. If I have 1 million customers, the computation is really large; I'm not sure a normal server can handle it. Any suggestions or shortcuts? Thank you!
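One possible treatment for question 1 above (NaN from an all-zero centered row), sketched in NumPy. Returning 0 for the degenerate case is my assumption ("no evidence of similarity"), not the lecture's prescription:

```python
import numpy as np

def centered_cosine(u, v, eps=1e-12):
    """Centered cosine similarity between two rating rows (0 = unrated)."""
    if not np.any(u > 0) or not np.any(v > 0):
        return 0.0                      # a user with no ratings at all
    # Center only the observed ratings; missing entries stay at 0.
    uc = np.where(u > 0, u - u[u > 0].mean(), 0.0)
    vc = np.where(v > 0, v - v[v > 0].mean(), 0.0)
    denom = np.linalg.norm(uc) * np.linalg.norm(vc)
    if denom < eps:
        return 0.0                      # e.g. a user who rated everything the same
    return float(np.dot(uc, vc) / denom)

# A user who rated every seen movie 3 produces an all-zero centered vector;
# the guard returns 0 instead of dividing by zero and yielding NaN.
u = np.array([3.0, 3.0, 3.0, 0.0])
v = np.array([5.0, 1.0, 0.0, 4.0])
print(centered_cosine(u, v))  # 0.0
```

For question 2, the usual shortcuts are sparse matrix representations and approximate nearest-neighbor methods such as locality-sensitive hashing, rather than computing all pairs exactly.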
@mamatizm817 · 7 years ago
I need an explanation of the affinity propagation algorithm with an example, please.
@nishthavarshney9548 · 3 years ago
How do you calculate the cosine value?
@ibrahimiltifat7045 · 3 years ago
Thank you
@alvaropatricioiribarren3791 · 4 years ago
Thank you!
@akshayjagtap7834 · 3 years ago
Where can I find the code for this?
@ramans11 · 8 years ago
Very nice..
@almasazimkhan7170 · 5 years ago
In the formula (option 2) for user-user collaborative filtering we have "Ryi", which is a rating. But what should we do if Ryi in the original dataset is 0? Should we ignore it?
@rhealisa9268 · 5 years ago
It doesn't matter. The formula uses the *sum* of the Ryi ratings, so an absent rating (in this case, considered zero) doesn't make a difference.
@aishi99 · 7 years ago
Very helpful. Thanks.
@foodie2841 · 6 years ago
Sir, I have a query: if I want to find the rating for movie 3 by user 4, what should I do?
@PumarkoHD · 7 years ago
When predicting the weighted avg at around 18:40, is it a coincidence that 0.41 and 0.59 add up to 1? Or should that always be the case?
@jrafaelvalle · 7 years ago
It's a coincidence. They represent the similarity between items 1 and 3 and items 1 and 6, sim(1, 3) and sim(1, 6). Consider k values different from 2, for example 1 or 3...
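The prediction step this thread discusses is a similarity-weighted average; a small sketch using the similarities 0.41 and 0.59 mentioned above (the neighbor ratings here are my own hypothetical values):

```python
import numpy as np

def predict_rating(sims, ratings):
    """Similarity-weighted average of neighbor ratings: sum(s*r) / sum(s)."""
    sims = np.asarray(sims, dtype=float)
    ratings = np.asarray(ratings, dtype=float)
    return float(np.dot(sims, ratings) / sims.sum())

# Dividing by sum(sims) normalizes the weights, so the formula works
# whether or not the similarities happen to add up to 1.
print(predict_rating([0.41, 0.59], [2.0, 3.0]))   # weights sum to 1.0
print(predict_rating([0.50, 0.30], [2.0, 3.0]))   # weights sum to 0.8
```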
@PrateekAgrawal · 7 years ago
How many fans of "Acid Rock" here? ;)
@maganaluis92 · 4 years ago
This is horrible
@mathsLaw · 3 years ago
How simple and how elegant; I guess this is the Stanford way. Thank you!
@abderahmanehm8117 · 3 years ago
Thank you
@prabhacar · 2 years ago
Your explanation of why item-based CF is better than user-based CF in the last 60 seconds was very intuitive and logical. Thanks for bringing so much clarity through easy-to-understand movie examples.
@albb762 · 7 years ago
Centered cosine similarity is a very bad algorithm for this problem, because a person who rated all the movies 5 stars will be similar to a person who rated all the movies 1 star.
@aashudwivedi · 5 years ago
This example is the case where one person is a tough rater and the other is an easy rater.
@himanshuladia9099 · 5 years ago
What if the ratings are all equal, so the centered vector's norm is zero? We can't divide by zero. @@aashudwivedi
@harshsandesara9470 · 2 years ago
I always wondered why we needed to center everything around 0 instead of just filling in the missing values with the average ratings... basically the same thing, but with the calculations done in the original frame instead of a shifted frame, and the problem you mentioned most likely wouldn't arise then.
@iiTE4L · 5 years ago
17:12 - in green, 0.41 is wrong. I get 0.55, and I double-checked it. Everything else is correct.
@megabites3649 · 4 years ago
How do you solve for 1.00, -0.18, 0.41, 0.10, -0.31 and 0.59?
@darkeddarksecret6279 · 4 years ago
Good day sir, how do you compute the cosine similarity between the rows, like 1.00, -0.18, 0.41, 0.10, -0.31 and 0.59? Please reply ASAP, I need it for my thesis project. This explanation helps me a lot, sir.
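For the recurring question above: those green values are centered cosine similarities between one rating row and every other row. A sketch of the computation on a made-up item-user matrix (my values, not the lecture's actual table, so the numbers will differ):

```python
import numpy as np

def centered_rows(R):
    """Subtract each row's mean over its observed (nonzero) entries;
    unobserved entries stay at 0 after centering."""
    Rc = np.zeros_like(R, dtype=float)
    for i, row in enumerate(R):
        rated = row > 0
        if rated.any():
            Rc[i, rated] = row[rated] - row[rated].mean()
    return Rc

def row_similarities(R, target=0):
    """Centered cosine similarity of every row against the target row."""
    Rc = centered_rows(R)
    norms = np.linalg.norm(Rc, axis=1)
    denom = norms * norms[target]
    # Guard against zero norms (all-zero centered rows) producing NaN.
    return Rc @ Rc[target] / np.where(denom > 0, denom, np.inf)

# Hypothetical matrix: rows = movies, columns = users, 0 = unrated.
R = np.array([[1, 0, 3, 0, 0, 5],
              [0, 0, 5, 4, 0, 0],
              [2, 4, 0, 1, 2, 0]], dtype=float)
print(np.round(row_similarities(R, target=0), 2))  # row 0 vs. each row
```

The first entry is always 1.00 because it is the target row compared with itself, which matches the 1.00 at the start of the green list.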
@jakman85 · 8 years ago
Excellent explanation of neighborhood-based CF. Any way you could explain factor-analysis-based CF? Thanks!
@abhisheksahu6032 · 6 years ago
Can anyone explain how we pick the neighborhood size?
@Krimson5pride · 4 years ago
Great explanation. So you know both baroque and acid rock? I will make sure this guy gets more exposure.
@curryfavours · 6 years ago
Very insightful comparison of item-item vs. user-user approaches towards the end.
@Vomvom56 · 3 years ago
You are just reading slides, not explaining well. Disappointed.
@marigam · 2 years ago
Thank you so much! I've heard it explained before but it was too difficult! This actually made sense! 💜
@dewinmoonl · 3 years ago
What an excellent video: clear, with examples, explaining the reasoning. Stanford is no joke :)
@codelover847 · 4 years ago
In item-item, shouldn't we be computing cosine similarities between columns? Because Si,j denotes the similarity of items (i.e. movies).
@lucaecari2928 · 1 year ago
Thank you, very nicely explained, both the intuition behind the concepts and the theory.
@danielclark739 · 2 years ago
I get what you're saying that 0 is not a great assumption for a rating if a user did not watch a movie, because they might have liked it if they had watched it. However, it ignores the notion that a user had the opportunity to watch and wasn't interested enough to do so, which is data that reflects their interest in a movie.
@Girisuable · 4 years ago
Awesome explanation. Very clear with the scenarios. Thank you so much.
@ramiroarizpe3135 · 4 years ago
My Sat Guru came to save me, thanks, really useful information.
@luis-benavides · 6 years ago
Great way to explain it! Thank you.
@akshaysrivastava4304 · 10 months ago
Great video, very informative, thanks professor!
@thasin9671 · 7 years ago
Very very very very... helpful.
@parthgupta6604 · 2 years ago
One of the best videos on collaborative filtering!
@ernestdesouza8888 · 7 years ago
Great video!!!!
@binhnguyenan6549 · 7 years ago
Thanks a lot!
@nginfrared · 8 years ago
Awesome explanation
@ArtificialIntelligenceAllinOne · 8 years ago
Thanks!! Keep Learning and Sharing :)
@saurabhsingh-nm8zj · 6 years ago
Excellent way of teaching.
@Flaviaestat · 7 years ago
Great explanation
@aymoon8080 · 7 years ago
Nice work