LSA

15,259 views

Francisco Iacobelli

Comments: 13
@shreysoni4107 3 years ago
Very easy explanation! Thank you for making it simple.
@sujits3458 4 years ago
Very good explanation. I was able to understand the concept so easily, thank you :-)
@dharmendrathakur1487 3 years ago
Thank you very much, Francisco. You made it so simple :-)
@pratikanand2200 5 years ago
Very good description. Can you upload a link to the slides?
@kkhanh4y 3 years ago
Rows represent documents and columns represent words, don't they? But here rows represent words and columns represent documents.
@fiacobelli 3 years ago
You can use either orientation to judge word similarity or document similarity; in the end it is the same matrix. All that changes is which vectors you use, depending on your end goal.
@kkhanh4y 3 years ago
@fiacobelli OK, thank you!
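The point in that reply can be sketched in a few lines of numpy. This is a hypothetical toy term-document matrix (not from the video): word similarity compares rows, document similarity compares columns of the same matrix.

```python
import numpy as np

# Toy term-document matrix: rows = words, columns = documents d1..d3.
# Transpose it if you prefer rows = documents; the data is the same.
words = ["cat", "dog", "pet"]
A = np.array([
    [2.0, 0.0, 1.0],   # counts of "cat" in d1..d3
    [0.0, 2.0, 1.0],   # counts of "dog"
    [1.0, 1.0, 2.0],   # counts of "pet"
])

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Word similarity: compare ROWS ("cat" vs "pet").
word_sim = cosine(A[0], A[2])
# Document similarity: compare COLUMNS (d1 vs d2).
doc_sim = cosine(A[:, 0], A[:, 1])
```

Here `word_sim` and `doc_sim` come from the same matrix; only the choice of row vectors versus column vectors changes with the goal.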
@pratikanand2200 5 years ago
Taking the meaning of a word from the passages it appears in is unsatisfying, since those passages contain other words too; a simple average would assign the same meaning to every word in the passage. But LSA describes it properly.
@2441139knakmg 4 years ago
Best
@mohitpahuja6894 3 years ago
Wrong explanation: you haven't actually reduced dimensionality. What you have found is an approximation of the data. What you need to do is project the data onto the top two eigenvectors to get lower-dimensional data.
@fiacobelli 3 years ago
Correct, it is an approximation matrix. You can play with dimensionality as well, but this is the seminal concept behind LSA.
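The distinction in this exchange can be sketched with a truncated SVD in numpy. This is a minimal illustration with a made-up matrix, not the example from the video: the rank-k reconstruction keeps the original shape (an approximation), while multiplying the truncated factors separately gives genuinely lower-dimensional word and document vectors (a projection).

```python
import numpy as np

# Made-up 6x4 term-document matrix for illustration.
A = np.array([
    [1.0, 0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [1.0, 1.0, 1.0, 0.0],
    [0.0, 1.0, 0.0, 1.0],
])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2  # number of latent dimensions to keep
Sk = np.diag(s[:k])

# Rank-k APPROXIMATION: same 6x4 shape as A, but at most rank k.
A_approx = U[:, :k] @ Sk @ Vt[:k, :]

# PROJECTION to k dimensions: shapes actually shrink.
words_k = U[:, :k] @ Sk   # each word as a k-dim vector (6 x k)
docs_k = Sk @ Vt[:k, :]   # each document as a k-dim vector (k x 4)
```

Both views come from the same truncated factors: comparing rows of `words_k` (or columns of `docs_k`) gives the same similarities as comparing the corresponding rows or columns of `A_approx`, just computed in k dimensions.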
@harryshuman9637 3 years ago
COMP 596 is wack