Lecture 43 - Collaborative Filtering | Stanford University

221,038 views

Artificial Intelligence - All in One

1 day ago

🔔 Stay Connected! Get the latest insights on Artificial Intelligence (AI) 🧠, Natural Language Processing (NLP) 📝, and Large Language Models (LLMs) 🤖. Follow ( / mtnayeem ) on Twitter 🐦 for real-time updates, news, and discussions in the field.
Check out the following interesting papers. Happy learning!
Paper Title: "On the Role of Reviewer Expertise in Temporal Review Helpfulness Prediction"
Paper: aclanthology.org/2023.finding...
Dataset: huggingface.co/datasets/tafse...
Paper Title: "Abstractive Unsupervised Multi-Document Summarization using Paraphrastic Sentence Fusion"
Paper: aclanthology.org/C18-1102/
Paper Title: "Extract with Order for Coherent Multi-Document Summarization"
Paper: aclanthology.org/W17-2407.pdf
Paper Title: "Paraphrastic Fusion for Abstractive Multi-Sentence Compression Generation"
Paper: dl.acm.org/doi/abs/10.1145/31...
Paper Title: "Neural Diverse Abstractive Sentence Compression Generation"
Paper: link.springer.com/chapter/10....

Comments: 101
@ahishchinnu1843 · 1 year ago
This is hands down the best video I have ever seen that instills the intuition for the concept. Absolutely brilliant!!!
@23karthikb · 4 years ago
Finally, a good concise explanation of these concepts!
@prabhacar · 2 years ago
Your explanation of why item-based CF is better than user-based CF in the last 60 seconds was very intuitive and logical. Thanks for bringing so much clarity through easy-to-understand movie examples.
@mathsLaw · 3 years ago
How simple and how elegant! I guess this is the Stanford way. Thank you!
@subratpati9166 · 7 years ago
Thanks a lot... finally found a tutorial that is easy to learn from! Great work.
@PotatoMan1491 · 5 months ago
The professor is excellent at delivering the essential pieces; I found this very informative.
@parthgupta6604 · 2 years ago
One of the best videos on collaborative filtering!
@yakhoubsoumare6944 · 6 years ago
Sir, you are a genius!
@marigam · 1 year ago
Thank you so much! I've heard others try to explain this before, but it was too difficult. This actually made sense! 💜
@tanmaythaker2905 · 7 months ago
Best explanation ever! Simple yet comprehensive.
@dewinmoonl · 3 years ago
What an excellent video: clear, with examples, and it explains the reasoning. Stanford is no joke :)
@shubhamtalks9718 · 3 years ago
Finally, someone explained it properly. ❤
@Girisuable · 4 years ago
Awesome explanation. Very clear with the scenarios. Thank you so much.
@curryfavours · 6 years ago
Very insightful comparison of item-item vs. user-user approaches towards the end.
@jackshaak · 2 years ago
Thank you for putting this together in a fairly easy-to-understand way. You're a very good teacher.
@SanjarAhmadov · 7 years ago
Very insightful. Thank you.
@manisthashrestha250 · 7 years ago
Thank you! That was great :D
@luis-benavides · 6 years ago
Great way to explain it! Thank you.
@jakman85 · 7 years ago
Excellent explanation of neighborhood-based CF. Any way you could explain factor-analysis-based CF? Thanks!
@akshaysrivastava4304 · 6 months ago
Great video, very informative. Thanks, professor!
@gourabnath9789 · 4 years ago
Awesome explanation! Thank you.
@connorphoenix1107 · 7 years ago
Thank you, this is exactly what I was looking for.
@lucaecari2928 · 1 year ago
Thank you, very nicely explained, both the intuition behind the concepts and the theory.
@johng5295 · 3 years ago
Thanks a million. Awesome. Where have you been all these years?
@karthikrajeshwaran1997 · 4 months ago
Helped me totally grok it! Thanks so much, prof.
@lukamandic5700 · 1 year ago
At Option 3 it is mentioned that cosine distance treats a movie's zero value as negative. Actually, zero values have no impact on cosine distance. Cosine distance = 1 - (A · B) / (||A|| * ||B||), where (A · B) is the dot product of vectors A and B, and ||A||, ||B|| are their Euclidean norms. If a component of A or B is 0, it contributes nothing to the cosine similarity.
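The formula in the comment above can be checked numerically. This is a small sketch with made-up ratings (the vectors `a` and `b` are assumptions for illustration, not the lecture's data):

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity = 1 - cosine distance."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical rating rows; 0 stands in for "not rated".
a = np.array([4.0, 0.0, 5.0, 1.0])
b = np.array([5.0, 5.0, 4.0, 0.0])

# Each zero component adds 0 to the dot product and 0 to the squared norm,
# so an individual 0 contributes nothing to the value itself.
print(round(cosine_sim(a, b), 4))  # 0.7597
```

Note that while a single 0 adds nothing to the dot product, imputing 0 for missing ratings still differs from ignoring them: on a 1-5 scale it effectively says "unrated = worse than the worst rating", which is the concern raised in the lecture.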
@yuzhao5268 · 3 years ago
Great lecture. Wondering what some deep-learning-based approaches to this problem look like today.
@shreyaraut9270 · 7 years ago
Thank you... the explanation is simple and understandable.
@saqisyed5299 · 7 years ago
Wow. Now this is what I was looking for.
@josewesley2384 · 4 years ago
Thank you, it's the best explanation I've seen so far. The idea is very simple; how did I not think of it?
@ernestdesouza8888 · 7 years ago
Great video!
@ericarnaud5062 · 7 years ago
Amazing explanation.
@mr68clubshorts · 7 years ago
Great work. I don't know how to thank you, really.
@saurabhsingh-nm8zj · 6 years ago
Excellent way of teaching.
@sparshnagpal1509 · 3 years ago
Best description.
@ramiroarizpe3135 · 3 years ago
My Sat Guru came to save me. Thanks, really useful information.
@shauvikpujari9522 · 3 years ago
Thank you sir, your explanation is awesome.
@Krimson5pride · 4 years ago
Great explanation. So you know both baroque and acid rock? I will make sure this guy gets more exposure.
@HazemAzim · 2 years ago
Excellent... very well structured.
@Shivamagrawal50 · 3 years ago
Hi Sir, as you said, Pearson's correlation removes individual users' harshness on the rating scale in user-user collaborative filtering, but why use the same method for item-item collaborative filtering?
@dmitrizakharov9068 · 7 years ago
Hi, thanks for the great explanation. Is there a reason to use centered cosine distance over normalized cosine distance? Would dividing by the standard deviation of the row after subtracting the mean not better account for different levels of variance within user ratings? Is there a downside to normalizing instead of centering?
@rc_matic · 6 years ago
What you're describing is standardization, not normalization. In general, that would be an idea worth trying. However, an issue with standardizing here MIGHT be the fact that a user's ratings are not normally distributed. Also, since the standard deviation relies on the sample/population size, you could get vastly different results comparing a user who has seen 10 movies and a user who has seen 1000 movies.
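One more wrinkle worth noting for this thread: cosine similarity is invariant to positive scaling of either vector, so dividing a user's whole centered row by that user's own standard deviation leaves every pairwise cosine value unchanged; standardization would only matter where the raw ratings re-enter the computation (e.g. in the weighted average). A minimal sketch with made-up centered rows:

```python
import numpy as np

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Two hypothetical centered user rows (mean already subtracted).
u = np.array([1.0, -0.5, -0.5, 0.0])
v = np.array([0.5, 0.5, -1.0, 0.0])

# Standardizing = dividing each row by that user's std (a positive scalar).
u_std = u / u.std()
v_std = v / v.std()

# cos(c*a, b) == cos(a, b) for any c > 0, so the similarities are identical:
assert np.isclose(cos(u, v), cos(u_std, v_std))
```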
@whiteF0x9091 · 5 years ago
Excellent! Thank you.
@qihanguan8105 · 3 years ago
Thank you! Super helpful!
@danielclark739 · 1 year ago
I get what you're saying, that 0 is not a great assumption for a rating when a user did not watch a movie, because they might have liked it had they watched it. However, it ignores the fact that the user had the opportunity to watch and wasn't interested enough to do so, which is data that reflects their interest in the movie.
@rpcroft · 5 years ago
Does one need to restrict the similarity set to positive values? I calculated the entire item-item matrix using Python and pandas and got values outside the 1-to-5 range: for example, the inferred value for user 2 / film 2 was 20.07, and it was -1.27 for user 4 / film 1. Negative centered-cosine similarities seem to throw the whole thing out of whack. It is also possible to get a small denominator in the calculation at 18:43, magnifying values, as well as negative results.
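A common guard for exactly this problem is to keep only positively similar neighbors: with non-negative weights the prediction is a convex combination of the neighbor ratings, so it cannot leave the neighbors' rating range. A sketch (the `predict` helper is hypothetical, and the neighbor ratings 2 and 3 are assumed for illustration; 0.41 and 0.59 are the similarities shown in the video):

```python
import numpy as np

def predict(sims, ratings, k=2):
    """Weighted-average prediction over the k most similar POSITIVE neighbors."""
    sims = np.asarray(sims, dtype=float)
    ratings = np.asarray(ratings, dtype=float)
    keep = sims > 0                      # drop negatively-similar neighbors
    sims, ratings = sims[keep], ratings[keep]
    top = np.argsort(sims)[-k:]          # indices of the k largest similarities
    return float(sims[top] @ ratings[top] / sims[top].sum())

# Similarities 0.41 and 0.59, assumed neighbor ratings 2 and 3:
print(predict([0.41, 0.59], [2, 3]))  # ≈ 2.59, safely inside [2, 3]
```

Because the weights are normalized by their sum and all positive, the result always lies between the minimum and maximum neighbor rating, which rules out values like 20.07 or -1.27.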
@junaidahmad-oe4oh · 7 years ago
Very informative.
@thasin9671 · 6 years ago
Very, very, very helpful.
@rayane4975 · 2 years ago
Great explanation!
@binhnguyenan6549 · 7 years ago
Thanks a lot!
@albb762 · 6 years ago
Centered cosine similarity is a very bad algorithm for this problem, because a person who rated all the movies 5 stars will be similar to a person who rated all the movies 1 star.
@aashudwivedi · 5 years ago
This example is the case where one person is a tough rater and the other is an easy rater.
@himanshuladia9099 · 5 years ago
What if the ratings are all equal, so the centered vector's norm is zero? Then we can't divide by zero. @@aashudwivedi
@harshsandesara9470 · 2 years ago
I always wondered why we need to center everything around 0 instead of just filling in the missing values with the average ratings... It's basically the same thing, but the calculations are done in the original frame instead of a shifted frame, and the problem you mentioned most likely wouldn't arise then.
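The edge case raised in this thread can be seen directly: a user with constant ratings centers to the all-zero vector, so centered cosine becomes 0/0, and implementations have to guard the denominator. Returning 0, i.e. "no evidence of similarity", is one common convention; this sketch assumes it:

```python
import numpy as np

def centered_cosine(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:          # at least one user centered to the zero vector
        return 0.0            # treat "no variance" as "no similarity signal"
    return float(a @ b / denom)

all_fives = np.array([5.0, 5.0, 5.0])
all_ones = np.array([1.0, 1.0, 1.0])

# Both rows center to [0, 0, 0]; without the guard this would be 0/0 (nan),
# not "maximally similar".
print(centered_cosine(all_fives, all_ones))  # 0.0
```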
@abhinaygupta · 3 years ago
Great explanation.
@Robin-zc2iw · 1 year ago
Very helpful, thanks man.
@Flaviaestat · 7 years ago
Great explanation.
@abhisheksahu6032 · 6 years ago
Can anyone explain how we pick the neighborhood size?
@alvaropatricioiribarren3791 · 4 years ago
Thank you!
@darkeddarksecret6279 · 4 years ago
Good day sir, how do you compute the cosine similarity between the rows, e.g. 1.00, -0.18, 0.41, 0.10, -0.31, and 0.59? Please reply ASAP; I need it for my thesis project. This explanation helped me a lot, sir.
@Skandawin78 · 4 years ago
I like this video, but I'm confused because we seem to be solving two problems here: 1. finding the missing ratings, and 2. finding the items (in IBCF) or users (in UBCF) that are similar to each other. Do we use centering + cosine similarity and kNN to address problem 1, problem 2, or both? Do we use ratings to identify similar users/items, or do we find similar items/users first and then fill in the missing ratings?
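Both steps can be seen in one place: the similarities are computed first (from the centered ratings), and are then used to fill in a missing rating. A minimal user-based sketch; the matrix, the choice of k, and the `predict` helper are all assumptions for illustration, not the lecture's data:

```python
import numpy as np

# Hypothetical users x items matrix; np.nan marks a missing rating.
R = np.array([
    [5, 4, np.nan, 1],
    [4, np.nan, 3, 1],
    [np.nan, 5, 4, 2],
    [1, 2, np.nan, 5],
], dtype=float)

means = np.nanmean(R, axis=1, keepdims=True)
C = np.nan_to_num(R - means)  # step 1: center each user; missing -> 0

def cos(a, b):
    d = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / d if d else 0.0

def predict(u, i, k=2):
    """Step 2: use the similarities to fill the missing rating R[u, i]."""
    sims = np.array([
        cos(C[u], C[v]) if v != u and not np.isnan(R[v, i]) else -np.inf
        for v in range(R.shape[0])
    ])
    top = np.argsort(sims)[-k:]       # k nearest neighbors that rated item i
    top = top[sims[top] > 0]          # keep only positively similar ones
    if top.size == 0:
        return float(means[u, 0])     # fall back to the user's own mean
    return float(sims[top] @ R[top, i] / sims[top].sum())

print(round(predict(0, 2), 2))  # ≈ 3.44 with this toy data
```

So the answer to the question above is "both, in that order": similarity search (kNN over centered-cosine) is the means, and filling the missing rating is the end.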
@aydin3852 · 3 years ago
Thank you, now I get it.
@codelover847 · 3 years ago
In item-item, shouldn't we be computing cosine similarities between the columns? Because s(i,j) denotes the similarity of items (i.e., movies).
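Right: if the matrix is laid out users x items, item-item similarity compares columns, which is the same thing as comparing rows of the transpose (slides often draw the matrix items x users, in which case "rows" is also correct). A one-line check with made-up numbers:

```python
import numpy as np

# Hypothetical users x items matrix.
R = np.array([[5., 3., 4.],
              [4., 2., 4.],
              [1., 5., 2.]])

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Similarity of items 0 and 1 = cosine of COLUMNS 0 and 1 of R,
# which equals the cosine of rows 0 and 1 of R.T:
assert np.isclose(cos(R[:, 0], R[:, 1]), cos(R.T[0], R.T[1]))
```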
@edwinchong37 · 2 years ago
Better than my prof!!!
@foodie2841 · 6 years ago
Sir, I have a query: if I want to find the rating for movie 3 by user 4, what should I do?
@nginfrared · 8 years ago
Awesome explanation.
@ArtificialIntelligenceAllinOne · 7 years ago
Thanks!! Keep learning and sharing :)
@aramun7614 · 2 years ago
Very good explanation.
@abdelrhmanwagdy3383 · 5 years ago
What if the ratings are all equal and the centered vector's norm is zero? We can't divide by zero.
@ramans11 · 7 years ago
Very nice.
@PumarkoHD · 7 years ago
When predicting the weighted average at around 18:40, is it a coincidence that 0.41 and 0.59 add up to 1, or should that always be the case?
@jrafaelvalle · 7 years ago
It's a coincidence. They are the similarities sim(1, 3) and sim(1, 6) between movie 1 and movies 3 and 6. Consider k values different from 2, for example 1 or 3...
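The sum-to-1 really is incidental: the formula divides by the sum of the similarities, so the effective weights sum to 1 for any positive similarity values. A tiny check:

```python
# With k = 2 neighbors the raw similarities happened to sum to 1:
sims = [0.41, 0.59]

# But the weighted average divides by sum(sims), so the normalized
# weights always sum to 1, whatever the raw values are:
for raw in ([0.41, 0.59], [0.3, 0.5, 0.9], [0.05, 0.2]):
    weights = [s / sum(raw) for s in raw]
    assert abs(sum(weights) - 1.0) < 1e-12
```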
@mamatizm817 · 7 years ago
I need an explanation of the affinity propagation algorithm with an example, please.
@mohammadhegazy1285 · 1 year ago
Thank you very much sir.
@celinexu6598 · 5 years ago
Tried the centered cosine; I have 2 questions. 1. If we have a user like D (in the video) whose ratings are all recorded as 0, the cosine distance to any other member A, B, or C will be NaN, because there is a 0 in the denominator. Any treatment? 2. If I have 1,000 customers, I have to compute 499,500 numbers, one for each pair of customers. If I have 1 million customers, the computation is really large; I'm not sure a normal server can handle it. Any suggestions or shortcuts? Thank you!
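On the second question, the pair count grows quadratically, C(n, 2) = n(n - 1)/2, which is why large systems avoid the brute-force all-pairs computation (e.g. via locality-sensitive hashing or approximate nearest-neighbor search):

```python
from math import comb

# Number of distinct customer pairs, C(n, 2) = n * (n - 1) / 2:
print(comb(1_000, 2))       # 499500 -- matches the count in the comment
print(comb(1_000_000, 2))   # 499999500000, roughly 5e11 pairs
```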
@aymoon8080 · 7 years ago
Nice work.
@eti-inyeneibia6442 · 6 years ago
I couldn't get 1.00, 0.18, 0.41 with the cosine similarity formula. Please explain.
@RahulPatel-hr4qe · 4 years ago
Give me your email ID.
@megabites3649 · 4 years ago
@@RahulPatel-hr4qe Can you explain it to me? darkedarksecret@gmail.com
@philipjaykho114 · 4 years ago
@@RahulPatel-hr4qe Please help me get those answers too, sir. This is my email address: kphilipjay@gmail.com. Thanks a lot, sir.
@almasazimkhan7170 · 5 years ago
In the formula (Option 2) for user-user collaborative filtering we have r_yi, which is a rating. But what should we do if r_yi in the original dataset is 0? Should we ignore it?
@rhealisa9268 · 4 years ago
It doesn't matter. The formula uses the *sum* of the r_yi ratings, so the absence of a rating (in this case, considered zero) doesn't make a difference.
@abderahmanehm8117 · 3 years ago
Thank you.
@philipjaykho114 · 4 years ago
Hi Artificial Intelligence, I'm confused: I tried to solve for the similarities (the values in green font) but it seems too hard for me. Please help me. Thank you so much, sir.
@aishi99 · 7 years ago
Very helpful, thanks.
@nishthavarshney9548 · 3 years ago
How do you calculate the cosine value?
@AhmedRaza-kp8io · 1 year ago
Centered cosine similarity???
@K-mk6pc · 2 years ago
Sir never addressed user-user similarity metrics.
@akshayjagtap7834 · 3 years ago
Where can I find code for this?
@lishannavishka1783 · 3 years ago
I have a problem; I can't state it properly, but I will try to explain. Say there are two points A and B, with A = [1, 0] and B = [0, 1]. The cosine similarity between A and B is cos 90° = 0, right? Likewise, in this example how did you find the cosine similarity between A and B, sir? How can we graph the points A and B?
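The orthogonal case in the question above checks out numerically; plotted in 2D, A lies on the x-axis and B on the y-axis, 90° apart:

```python
import numpy as np

A = np.array([1.0, 0.0])  # a vector along the x-axis
B = np.array([0.0, 1.0])  # a vector along the y-axis

# cos(90°) = 0: the dot product of orthogonal vectors is zero,
# so their cosine similarity is zero as well.
cos_ab = A @ B / (np.linalg.norm(A) * np.linalg.norm(B))
print(cos_ab)  # 0.0
```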
@iiTE4L · 5 years ago
At 17:12, the 0.41 in green is wrong. I get 0.55, and I double-checked it. Everything else is correct.
@megabites3649 · 4 years ago
How do you solve for 1.00, -0.18, 0.41, 0.10, -0.31, and 0.59?
@PrateekAgrawal · 7 years ago
How many fans of "Acid Rock" here? ;)
@Sudharsan-dd1qi · 1 year ago
Sir, I have a problem with collaborative filtering; can you help me solve it?
@hasanalikhan3543 · 1 year ago
How did they rate 5 movies overall? It's 6, not 5.
@maganaluis92 · 4 years ago
This is horrible.
@voiceofwomen2420 · 3 years ago
You are just reading the slides, not explaining well. Disappointed.
@ibrahimiltifat7045 · 3 years ago
Thank you.