Instructor: The Best Way to Get Typed Data from Ollama

2,676 views

Ian Wootten

A day ago

Comments
@jxnlco 3 months ago
Thanks for featuring instructor!
@IanWootten 3 months ago
Hey Jason, thanks for creating it!
@MeinDeutschkurs 5 months ago
Great! But try the Gemma model. It sounds strange, but it's really good at picking out content.
@whatsbetter8457 5 months ago
Hey Ian, there's a more Ollama-native library for the same use case called ollama-instructor. It was inspired by Jason Liu's instructor.
@IanWootten 5 months ago
That's a very recent library and doesn't seem to offer most of instructor's features. I'm not clear on why you'd use it over instructor itself.
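For context on what "typed data" means here: instructor's core idea is that you declare a Pydantic model and the library validates the LLM's response against it (retrying on failure). A minimal sketch of that guarantee is below — the Pydantic validation step is real and runnable, while the instructor + Ollama wiring is shown commented out because it needs a running Ollama server; the model name `llama3.1` and the sample JSON are illustrative assumptions.

```python
from pydantic import BaseModel

# The schema you want the LLM's answer to conform to.
class Person(BaseModel):
    name: str
    age: int

# What instructor ultimately guarantees: the raw model output parses into
# the declared schema, or a validation error (and retry) is raised.
raw = '{"name": "Ada", "age": 36}'  # stand-in for an LLM response
person = Person.model_validate_json(raw)
print(person.name, person.age)

# The instructor + Ollama wiring (requires a local Ollama server; Ollama
# exposes an OpenAI-compatible endpoint, which instructor can patch):
#
# import instructor
# from openai import OpenAI
#
# client = instructor.from_openai(
#     OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
#     mode=instructor.Mode.JSON,
# )
# person = client.chat.completions.create(
#     model="llama3.1",  # assumed model name
#     response_model=Person,
#     messages=[{"role": "user", "content": "Extract: Ada is 36 years old."}],
# )
```

The validated `person` is a plain Pydantic object, so downstream code gets type-checked fields instead of raw JSON strings.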
@aznnarabdelmotalib7303 3 months ago
Hello, perhaps using the ollama-instructor library guarantees your data stays local, which I think is a major point. Is the OpenAI library safe to use? No data leaks?
@davidtindell950 5 months ago
Thank you.
@IanWootten 5 months ago
You're very welcome.
@Burgerhs 5 months ago
Sir, this is off topic, but how can I enable USB support in GNOME Boxes on the Steam Deck?