Full podcast episode: kzbin.info/www/bejne/m17KqKmjnd6IbaM
Lex Fridman podcast channel: kzbin.info
Guest bio: Aravind Srinivas is CEO of Perplexity, a company that aims to revolutionize how we humans find answers to questions on the Internet.
@mackiej (5 months ago)
Great video. I wonder why there's no BM25Vectorizer in scikit-learn when there's a TfidfVectorizer.
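For context: scikit-learn does ship TfidfVectorizer but no BM25 equivalent; third-party packages such as rank_bm25 cover it, and the Okapi BM25 formula is also easy to build on top of CountVectorizer. Below is a minimal sketch under that assumption; the corpus, query, and the k1/b defaults are purely illustrative, and the query is tokenized with a plain split rather than CountVectorizer's own tokenizer.

```python
# Rough Okapi BM25 scoring on top of scikit-learn's CountVectorizer.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

def bm25_scores(corpus, query, k1=1.5, b=0.75):
    """Score each document in `corpus` against `query` with Okapi BM25."""
    vectorizer = CountVectorizer()
    tf = vectorizer.fit_transform(corpus).toarray()       # term counts, shape (N, V)
    doc_len = tf.sum(axis=1)                               # |D| per document
    avgdl = doc_len.mean()                                 # average document length
    n_docs = tf.shape[0]
    df = (tf > 0).sum(axis=0)                              # document frequency n(q)
    idf = np.log((n_docs - df + 0.5) / (df + 0.5) + 1.0)   # BM25 idf

    # Naive query tokenization for the sketch; real code should reuse the vectorizer's analyzer.
    q_idx = [vectorizer.vocabulary_[w] for w in query.lower().split()
             if w in vectorizer.vocabulary_]
    scores = np.zeros(n_docs)
    for j in q_idx:
        f = tf[:, j]
        scores += idf[j] * f * (k1 + 1) / (f + k1 * (1 - b + b * doc_len / avgdl))
    return scores

docs = ["the cat sat on the mat", "dogs and cats living together", "the quick brown fox"]
print(bm25_scores(docs, "cat on mat"))
```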
@SeenAndCheese (5 months ago)
haha, I hope Lex's video edited out what Aravind said about what not to say. That would be so metal. Meta-metal!
@duongkstn (5 months ago)
From an existential-crisis channel to an LLM-teaching channel :)😅
@manonamission2000 (5 months ago)
When LLMs begin having 1,000,000,000-token windows, the RAG pattern might be obsolete.
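For readers unfamiliar with the term, here is a minimal sketch of the retrieve-then-generate (RAG) pattern the comment refers to: retrieve the top-k relevant passages and stuff them into the prompt instead of relying on a huge context window. TF-IDF retrieval stands in for whatever embedding search a real system would use, and the corpus, query, and prompt wording are made up for illustration.

```python
# Toy retrieval-augmented generation: retrieve top-k passages, build a prompt.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Perplexity is an AI-powered answer engine.",
    "Transformers use self-attention over the input tokens.",
    "BM25 is a classic lexical ranking function.",
]
query = "How does Perplexity answer questions?"

vectorizer = TfidfVectorizer()
doc_vecs = vectorizer.fit_transform(corpus)
query_vec = vectorizer.transform([query])

# Rank documents by cosine similarity and keep the top-k as context.
k = 2
sims = cosine_similarity(query_vec, doc_vecs).ravel()
top_k = sims.argsort()[::-1][:k]
context = "\n".join(corpus[i] for i in top_k)

# The retrieved context is prepended to the question before calling an LLM.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```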
@paultparker (5 months ago)
Indeed! I believe Perplexity is already doing this in normal searches. The difficulty with very large context windows is that the processing power required scales with the square of the context window size. My impression was that this is fundamental to how attention works in the transformer architecture, but it seems to me like there might be ways to optimize it with further research.
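A rough numpy sketch of where that quadratic cost comes from: standard scaled dot-product attention forms an n × n score matrix over the n tokens in the context, so compute and memory grow with n². The toy inputs are illustrative only, and learned query/key/value projections are omitted.

```python
# Illustration of the n x n score matrix behind self-attention's O(n^2) cost.
import numpy as np

def self_attention(x):
    """Single-head scaled dot-product attention (no learned projections)."""
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)                          # (n, n): quadratic in context length
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ x                                     # (n, d)

x = np.random.rand(8, 4)                                   # 8 tokens, 4-dim embeddings
print(self_attention(x).shape)                             # (8, 4)

for n in (1_000, 10_000, 100_000):
    print(f"context {n:>7}: score matrix has {n * n:,} entries")
```

The "ways to optimize it" the comment alludes to are an active research area: sparse and linear-attention approximations reduce the asymptotic cost, while IO-aware kernels such as FlashAttention keep exact attention but cut its memory traffic.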
@djpuplex (5 months ago)
This is basically what AI is and will be for a while.
@thegrandbizzare (5 months ago)
Obviously this was recorded before they were caught scraping Wired and Forbes articles protected by robots.txt, using unlisted IP addresses... lol