LLM Function Calling - AI Tools Deep Dive

12,309 views

Adam Lucek

A day ago

Comments: 35
@IdPreferNot1 6 months ago
The BEST walkthrough and sample code on tool calling on YouTube!
@therealsergio 6 months ago
Adam, your content is very helpful and thoughtfully put together. It's clear a lot of hours go into its preparation.
@terryliu3635 2 months ago
The best walk through on this topic ever!! Thank you!!!
@Alexfnplays 24 days ago
crystal clear and right to the point!!!
@sunitjoshi3573 5 months ago
Nice video! I was just thinking about function calling…and your video showed up! Thanks 😊
@AdamLucek 5 months ago
Perfect!
@PragyaKaushik-q8f 2 months ago
This is amazing content! Quite concise and interesting! Thank you so much!!
@JustinHennessy 2 months ago
Amazing run down on tools. Thanks so much for sharing.
@BagusSulistyo 21 days ago
Hi Adam, thanks for the video, awesome!!
@Horizont. 6 months ago
Great video. Dude's been speaking nonstop for 30min straight. Now, I need a 2 hour break.
@amk2298 6 months ago
😂😂😂😂
@Karthik-ln7eg 6 months ago
Great video. Very clear explanation.
@AdamLucek 6 months ago
Thanks!
@JimHeil 6 months ago
Awesome video! Thank you!
@AdamLucek 6 months ago
Thanks!!!
@sirishkumar-m5z 5 months ago
An intriguing exploration into LLM function calling! Investigating further AI tools may improve your comprehension even more.
@billblair3155 2 months ago
U Da Man 😎
@coolmcdude 6 months ago
My favorite subject
@chrisjhandy A month ago
awesome video!
@rowcolumn7499 10 days ago
Thank you, Adam, for your effort in making this video. May I ask: in 2025, what are the best tools, in your opinion, if I want to develop an app that can do RAG and external tool calling? LlamaIndex/LangGraph/LangChain? Thanks in advance!
@skibidi-skibidi-bap 3 months ago
Wait, so does LangChain create the whole JSON for you using the Tool decorator?
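For readers with the same question: yes, LangChain's @tool decorator derives the tool's JSON schema from the function's name, type hints, and docstring. A minimal stdlib sketch of the idea (this mimics what such a decorator does conceptually; it is not LangChain's actual implementation, and get_weather is a made-up example):

```python
import json
from typing import get_type_hints

# Map Python annotations to JSON Schema types (simplified).
PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}

def function_to_schema(fn):
    """Build an OpenAI-style tool schema from a function's signature."""
    hints = get_type_hints(fn)
    hints.pop("return", None)  # the return annotation is not a parameter
    params = {
        name: {"type": PY_TO_JSON.get(tp, "string")}
        for name, tp in hints.items()
    }
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": params,
                "required": list(params),
            },
        },
    }

def get_weather(city: str, fahrenheit: bool) -> str:
    """Get the current weather for a city."""
    return f"Weather for {city}"

print(json.dumps(function_to_schema(get_weather), indent=2))
```

LangChain additionally handles defaults, nested Pydantic models, and argument descriptions, but the core trick is the same: the schema is generated for you from the decorated function.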
@aishwaryaallada0925 5 months ago
Awesome video! Is the code shared anywhere? 😊
@AdamLucek 5 months ago
Yes! In description and direct link here: github.com/ALucek/tool-calling-guide
@youtubemensch 5 months ago
Is this possible with an LLM running on your own server, i.e. without an AI API? Like a Llama model. Is this possible with a specific model?
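For readers with the same question: yes, this works with self-hosted models too. Servers such as Ollama and vLLM expose an OpenAI-compatible chat completions endpoint, and models like Llama 3.1 support tool calling, so the same tools payload format applies. A sketch of the request body (the model name and endpoint URL below are placeholder assumptions, not from the video):

```python
import json

# OpenAI-style tool-calling request; the same JSON shape is accepted by
# OpenAI-compatible local servers (model name and URL are placeholders).
payload = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

# POST this body to e.g. http://localhost:11434/v1/chat/completions,
# or point the OpenAI client's base_url at the local server.
body = json.dumps(payload)
```

Whether the model actually emits tool calls depends on the model: it must have been trained for function calling, so check the specific model's documentation.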
@siddheshwarpandhare1698 6 months ago
Hi Adam, all the content is well explained. I need to disable parallel calling, but I am not sure where to put parallel_function_tool:false. Can you help me in this case?
@AdamLucek 6 months ago
Sure, that's placed here with OpenAI's API:

response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=first_tools,
    tool_choice="auto",
    parallel_tool_calls=False  # disable parallel function calling
)

or with LangChain during the bind_tools stage:

llm_tools = llm.bind_tools(tools, parallel_tool_calls=False)
@siddheshwarpandhare1698 6 months ago
@@AdamLucek I'm getting this error: Completions.create() got an unexpected keyword argument 'parallel_tool_calls'
@AdamLucek 6 months ago
I was getting that too, then I updated my OpenAI library with pip and it fixed it! Make sure you restart your kernel afterwards if you're in a notebook environment.
@siddheshwarpandhare1698 6 months ago
Thank you @@AdamLucek
@DaleIsWigging 5 months ago
The tutorial doesn't say how to insert your API key. To do this, replace

client = OpenAI()

with:

import os
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# OpenAI API configuration
client = OpenAI()
client.api_key = os.getenv("OPENAI_API_KEY")

Then you can use a .env file with:

OPENAI_API_KEY="sk-proj..."
@AdamLucek 5 months ago
Also possible to do:

import os
os.environ["OPENAI_API_KEY"] = "your-api-key-here"

And it will automatically pull the environment variable when creating the client etc. Thanks for pointing it out!
@gani2an1 6 months ago
Where can we see the functions that are available with each model?
@maxpalmer8660 6 months ago
Kinda hate this dude's voice ngl. Anyone else with me?
@AdamLucek 6 months ago
I been thinking this…
@skibidi-skibidi-bap 3 months ago
No. I am not with you.