
Build Your Own ChatGPT with LangChain & Ollama (No OpenAI API Key Needed!)

3,942 views

Maple Arcade

A day ago

Want to build your own ChatGPT-like chatbot without relying on OpenAI's API? This video shows you how to do just that using LangChain and Ollama! LangChain is a powerful framework for building AI-powered applications, and Ollama is a cutting-edge platform for running large language models locally that can generate human-quality text.
Join us as we dive into the world of AI and build this chatbot using TypeScript/JavaScript; a minimal code sketch of the core idea appears after the links below.
By the end of this video, you'll have your very own AI assistant ready to answer your questions and complete your requests!
Github Repo: github.com/Kou...
Quickstart Guide to Ollama: • Ollama.ai: A Developer...
Stream LLM Responses in NextJS chat app: • Build a Next.js App wi...
Make Ollama WebAPI Public: • Host a AI Server
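As an illustration of the approach (not necessarily the exact code from the video or the repo), here is a minimal sketch of chatting with a local Ollama model through LangChain in TypeScript. It assumes the @langchain/ollama and @langchain/core packages are installed and that a model such as llama3 has been pulled locally:

import { ChatOllama } from "@langchain/ollama";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

// Installing Ollama exposes a local web API on port 11434 by default.
const model = new ChatOllama({
  baseUrl: "http://localhost:11434",
  model: "llama3", // assumption: any model you have pulled with `ollama pull`
});

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{question}"],
]);

// Prompt -> model -> plain-string output, composed with pipe().
const chain = prompt.pipe(model).pipe(new StringOutputParser());

const answer = await chain.invoke({ question: "What is LangChain?" });
console.log(answer);

Swapping chain.invoke for chain.stream yields token-by-token chunks, which is the basis for the Next.js streaming setup linked above.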

Comments: 10
@shreyas.sihasane 19 days ago
Great work, please bring more AI projects
@Aman_yadav1419 4 months ago
Amazing keep uploading ❤
@haashirnawaz9687 11 days ago
StreamingTextResponse from the ai library is not working anymore. What do I use instead to make it work?
@MrRobots100 4 months ago
A few questions:
1) Will it work locally on the device it is installed on, for example in React Native, or do you need to set up a Flask API to get responses?
2) If I have a MongoDB or Firebase DB with data and I want to fine-tune the model to give me responses based on that data, can I do it?
3) Scalability costs: going from, let's say, 1,000 users to 100,000 users, how can I keep costs low?
@maplearcadecode 4 months ago
1) No, you don't need to set up a Flask API for this. Installing Ollama will spin up a web API. Do check out my Ollama quick start guide and you'll have your question answered.
2) Of course, you can load fine-tuned custom models on Ollama. You can check out this tutorial here (kzbin.info/www/bejne/aXPRn6aHeNKagtEsi=P4yZYxFuGtyKmmU6).
3) If you are thinking about sharing the app, then as a first step check out ngrok and their pricing to make the locally hosted app public. Secondly, the shared app will be limited by the throughput of the LLM itself, which will be influenced by the local hardware you run it on. You might want to consider running Ollama in a hosted environment inside a Docker container.
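For reference, a minimal sketch (the model name and prompt are only placeholders) of calling the web API that a local Ollama install exposes on its default port 11434:

// One-shot request; set stream: true to receive newline-delimited JSON chunks.
const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",                // placeholder: a model pulled locally
    prompt: "Why is the sky blue?",
    stream: false,
  }),
});
const data = await res.json();
console.log(data.response);         // the generated text

Pointing ngrok at port 11434 forwards this same endpoint publicly, which is what the ngrok suggestion above refers to.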
@deepanshjha6353 4 months ago
Could you please provide guidance on creating an Advanced RAG model that incorporates Hugging Face LLM models and PDFs within Next.js using TypeScript?
@maplearcadecode 4 months ago
Sure I'll make a video soon
@Pancelet 4 months ago
@maplearcadecode faster please
@hasanulhaquebanna 1 month ago
How can we train this model, or how can we build our own AI model to fit our needs?
@ash_b9875 1 month ago
Thank you very much