Easiest Local Function Calling using Ollama and Llama 3.1

2,799 views

Prompt Engineer

A day ago

In this video, we use Ollama to try out function calling with a local LLM, Llama 3.1.
Get a detailed understanding of what function calling is and how you can code it yourself on your own PC; a short code sketch follows below.
This is also an example of local function calling with Ollama, which shows the power of local, open-source LLMs.
Github Repo: github.com/Pro...
#ollama #llama3.1 #functioncalling #llms #localllms
CHANNEL LINKS:
🕵️‍♀️ Join my Patreon for keeping up with the updates: / promptengineer975
☕ Buy me a coffee: ko-fi.com/prom...
📞 Book a call with me ($125) on Calendly: calendly.com/p...
💀 GitHub Profile: github.com/Pro...
🔖 Twitter Profile: / prompt48

Comments: 11
Function Calling with Ollama, Llama 3.1, Streamlit and RapidAPI
24:51
Prompt Engineer
1.1K views
How does function calling with tools really work?
10:09
Matt Williams
11K views
Python RAG Tutorial (with Local LLMs): AI For Your PDFs
21:33
pixegami
227K views
Why is everyone LYING?
7:56
NeetCodeIO
289K views
EASILY Train Llama 3.1 and Upload to Ollama.com
14:51
Mervin Praison
31K views
Function Calling in Ollama vs OpenAI
8:49
Matt Williams
33K views
LLAMA-3.1 🦙: EASIEST WAY To FINE-TUNE ON YOUR DATA 🙌
15:08
Prompt Engineering
27K views
Local Agentic RAG with LLaMa 3.1 - Use LangGraph to perform private RAG
18:01