Each new generative UI example from you guys uses a different pattern, from manually intercepting response types to the latest `streamUI` + tools. Does the team feel the present pattern is mature, or are they unhappy with it and planning to redo it next month?
@fagnersales532 · 4 months ago
Nice observation
@carloslfu · 4 months ago
Yeah, but this is why it's experimental. Use at your own risk!
@shuding · 4 months ago
Hey, AI SDK maintainer here! I'm not sure what you're referring to regarding "intercepting response types", but `streamUI` is pretty stable. It's the same as the previous experimental `render` function, but with a name consistent with other APIs like `streamText` and `streamObject`. Also worth mentioning that the Generative UI APIs are designed to be general enough to fit into any UI patterns and AI pipelines, which means there isn't only one way to do Generative UI. For example, you can just use `streamUI` + tools to handle LLM + UI, or combine low-level utilities like `createStreamableUI`/`createStreamableValue` with your existing pipeline for flexibility. Happy to answer any questions!
@wshewm · 4 months ago
@@shuding What are your recommendations for maintaining type safety in the application, especially surrounding the server actions and use of the `useActions` hook? Unless I'm missing something, it seems like the required use of the hook defeats one of the major benefits of server actions: end-to-end type safety. Is there maybe a lower-level approach that I could take to bypass the hook?
@itszachhan · 4 months ago
@@shuding Can you use libraries like shadcn and NextUI for the streamed components? Last time I tried, it wasn't working.
@MilindMishra · 4 months ago
Thanks for the walkthrough! Looking forward to building generative UI stuff :)
@whovivekshukla · 4 months ago
I tried out their streaming UI a few months ago! That was pretty dope!
@syndg · 4 months ago
Was just about to send you this video for our reference. Glad to see you already watched it!
@TomAsh519 · 4 months ago
Could you please do a tutorial on using the AI SDK with the LangChain adapter, e.g. for simple RAG? From your documentation it is not clear how to implement it properly.
@jervx8292 · 7 days ago
You need vectors.
@TheJuava · 4 months ago
Great overview! A video incorporating RAG with the Vercel AI SDK would be awesome!
@nicoalbanese10 · 3 months ago
Thanks for the suggestion - this is on our list!
@0xOmzi · 3 months ago
Hyped! 🥳
@yaseerokino · 4 months ago
Vercel always coming through for developers
@michaelo4u · 4 months ago
Thanks! Great job simplifying it.
@ahmadbilalfarooqi · 4 months ago
Great explanation of Vercel's AI SDK. It's really helpful and makes building applications easier with its predefined functions and methods.
@TomAsh519 · 4 months ago
Is it just me, or do this tutorial and the repo not work? I got `Error: useUIState must be used inside an <AI> provider.` To resolve it, import `AI` from "./action" in the root layout and wrap the children with it:

import { AI } from "./action";

export default function RootLayout({
  children,
}: Readonly<{ children: React.ReactNode }>) {
  return <AI>{children}</AI>;
}
@ShaneCrenshaw · 3 months ago
If you are calling `getAIState` or `useUIState`, it has to happen inside of `<AI>`. So you could create a new component and do something like `<AI><Chat /></AI>`, and inside of `Chat`, that's where you would use `useUIState`.
@wshewm · 4 months ago
This is cool and all, but it seems that around every corner in this SDK (especially with the RSC stuff) all the types are just `any`. In my opinion, you can't really call your library "The AI Framework for TypeScript" and then not have strong types. This is especially annoying because in my eyes it defeats one of the major benefits of server actions: end-to-end type safety. Is there a way to bypass some of the abstractions, like the `useActions` hook?
@0xOmzi · 4 months ago
Thanks for this great video! Looking forward to trying my hand at the latest release! Kudos to the Vercel AI team too!
@usagisan79 · 4 months ago
I love the AI SDK and Kirimase both!
@asimalqasmi7316 · 4 months ago
Let me recall the kirimase dream I had
@BhagawanNityanandaOfGaneshpuri · 4 months ago
Thank you guys!
@jrmumm · 3 months ago
How does it stream a structured object? Doesn't the stream come back as JSON? How can it parse it if it's not fully complete?
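The trick behind the question above is partial JSON repair: on every streamed chunk, the incomplete JSON is patched into valid JSON by closing whatever strings, objects, and arrays are still open, and then parsed into a partial object. A minimal sketch of the idea (this is illustrative, not the SDK's actual parser):

```typescript
// Naive partial-JSON repair: track open strings/braces/brackets in the
// chunk received so far and append the missing closers so that
// JSON.parse succeeds on the incomplete prefix.
function completePartialJson(chunk: string): string {
  const closers: string[] = [];
  let inString = false;
  let escaped = false;
  for (const ch of chunk) {
    if (escaped) { escaped = false; continue; }
    if (ch === "\\") { escaped = true; continue; }
    if (ch === '"') { inString = !inString; continue; }
    if (inString) continue;
    if (ch === "{") closers.push("}");
    else if (ch === "[") closers.push("]");
    else if (ch === "}" || ch === "]") closers.pop();
  }
  let out = chunk;
  if (inString) out += '"'; // close a string cut off mid-token
  return out + closers.reverse().join("");
}

// A mid-stream chunk becomes parseable:
const repaired = completePartialJson('{"joke": "Why did the chick');
console.log(JSON.parse(repaired)); // { joke: 'Why did the chick' }
```

Repairing and re-parsing on each chunk is what lets the SDK surface incrementally filled objects while the model is still generating.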
@fezkhanna6900 · 2 months ago
Fantastic!
@daniloitj · 3 months ago
Very good explanation. If possible, could you show how to connect `streamUI` with Assistants? I also had a lot of difficulty separating the tools into other files.
@郭伽政 · 2 months ago
For example, an AI has a getWeather tool, and the conversation goes like this:

user: hello
ai: Hello! How can I assist you today?
user: How's the weather today?
tool: getWeather("local")
tool_result: {"weather":"sunny","maxTemperature":35,"minTemperature":0}
ai: The weather is good today, but the temperature difference is a bit large, so please keep warm.

I hope that on the client page, users can see, in sequence: that the AI wants to use the getWeather tool, the result of the getWeather call, and the AI's answer based on that result. How can I achieve this?
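One way to get the sequence asked for above is to append an entry to the UI state at each stage (the tool call, its result, and the final answer) and render the entries in order. A hedged sketch; the `Entry` shape below is illustrative and not the SDK's message type:

```typescript
// Illustrative UI-state entries: plain text messages plus explicit
// tool-call and tool-result stages, rendered in arrival order.
type Entry =
  | { kind: "text"; role: "user" | "assistant"; content: string }
  | { kind: "tool-call"; name: string; args: string }
  | { kind: "tool-result"; name: string; result: unknown };

function renderEntry(e: Entry): string {
  switch (e.kind) {
    case "text":
      return `${e.role}: ${e.content}`;
    case "tool-call":
      return `calling ${e.name}(${e.args})...`;
    case "tool-result":
      return `${e.name} -> ${JSON.stringify(e.result)}`;
  }
}

// The conversation from the comment renders as four lines in sequence:
const entries: Entry[] = [
  { kind: "text", role: "user", content: "How's the weather today?" },
  { kind: "tool-call", name: "getWeather", args: '"local"' },
  { kind: "tool-result", name: "getWeather", result: { weather: "sunny" } },
  { kind: "text", role: "assistant", content: "The weather is good today." },
];
console.log(entries.map(renderEntry).join("\n"));
```

In a React app each `Entry` variant would map to a component instead of a string, but the ordering idea is the same: the tool call and its result are first-class items in the UI state, not hidden inside the assistant message.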
@simpingsyndrome · 4 months ago
Will it work for React Native? I want to build my AI chatbot mobile app.
@jeanysergeimezarodriguez8629 · 4 months ago
I have the same question hehe
@jrmumm · 3 months ago
When you stream an object, you get a partial. How do you get the final/full response (not a partial)?
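In general, the final value is just the last partial once the stream ends, so a small helper can wait for it. (If I remember the API correctly, `streamObject`'s result also exposes an `object` promise that resolves to the final validated value; check the docs before relying on this sketch.)

```typescript
// Drain an async stream of partials and resolve the last (i.e. complete)
// value once the stream finishes.
async function lastOf<T>(stream: AsyncIterable<T>): Promise<T> {
  let last: T | undefined;
  let seen = false;
  for await (const value of stream) {
    last = value;
    seen = true;
  }
  if (!seen) throw new Error("empty stream");
  return last as T;
}
```

Usage would be `const final = await lastOf(partialObjectStream);` after you have finished rendering the partials, assuming the partial stream is the SDK's usual async iterable.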
@xberna8156 · 4 months ago
I could not find an example in the docs where the model can use tools and return RSC while also being able to stream a response when no tool is used. All the examples I could find use `generateText()` or `streamUI()`, so the text response is not a stream. Should I use a combination of `streamText()` + tools + `createStreamableUI()` to stream text and have tools that can return RSC?
@hakarsheikh7853 · 4 months ago
Great question, I'm interested to know too.
@nicoalbanese10 · 3 months ago
Hey! With `streamUI`, if no tool is used, the text response is streamed via the component returned from the `text` function. Is that what you're looking to do?
@xberna8156 · 3 months ago
@@nicoalbanese10 Yes, that's correct. I would like to stream the text token by token when the model does not use a tool. Can I achieve that using streamUI?
@raphauy · 4 months ago
Awesome!
@Samuelsward96 · 3 months ago
Hey, I'm struggling a bit with `useChat` for multiple conversations. How can I keep multiple conversations active at once? Any tips?
@rhyscampbell4178 · 3 months ago
NICO!!!!!
@mymorningjacket_ · 4 months ago
Anyone have a great example on how to get the user's actual location here?
@nicoalbanese10 · 3 months ago
You can run any asynchronous JavaScript code within a tool's `execute` function. So you would first want to find the exact location based on the search query (e.g. OpenStreetMap). Then pass that to a weather API (e.g. Open-Meteo) and return the resulting temperature 😊
@mymorningjacket_ · 3 months ago
@@nicoalbanese10 thanks for the suggestion, nico!
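The two-step lookup Nico describes (geocode the query, then call a weather API) can be sketched as a small pipeline inside a tool's `execute` function. The function names and injected fetchers below are hypothetical stand-ins for the real HTTP calls:

```typescript
// Hypothetical pipeline: resolve a free-text location query to
// coordinates, then fetch the temperature for those coordinates. The
// fetchers are injected so the flow is testable without hitting
// OpenStreetMap/Open-Meteo for real.
type Geocode = (query: string) => Promise<{ lat: number; lon: number }>;
type Weather = (lat: number, lon: number) => Promise<number>;

async function temperatureFor(
  query: string,
  geocode: Geocode,
  weather: Weather
): Promise<number> {
  // Step 1: turn "Berlin" (or whatever the model extracted) into coordinates.
  const { lat, lon } = await geocode(query);
  // Step 2: look up the current temperature at those coordinates.
  return weather(lat, lon);
}
```

In production the two injected functions would `fetch` the geocoding and weather endpoints; inside a tool's `execute` you would simply `return temperatureFor(args.location, geocode, weather)`.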
@andru5054 · 3 months ago
Hey, awesome demo, thanks. We're using LlamaIndex in Python for our LLM backend that uses RAG. I want to use tools that pass React components to the frontend. How would I accomplish this? Thank you
@ShaneCrenshaw · 3 months ago
Is there an example with streamUI and error handling for things like finishReason and usage?
@GAllium14 · 3 months ago
Vercel AI SDK + RAG tutorial
@remiib18 · 4 months ago
Is it still not possible to use both the regular tools to fetch data and the tools to return components?
@yarapolana · 4 months ago
I added $10 to test OpenAI and the AI SDK, and 100% of my calls returned "unknown error" with $8 used?! What in the world, it wasn't like this before. Loads of retries in the background (retries should be opt-in from the start).
@RoccoGhielmini · 4 months ago
I can't concentrate because they keep saying "next".
@nicoalbanese10 · 3 months ago
Sorry about that, will work on it for the next one!
@journeyofc6200 · 4 months ago
dope!!
3 months ago
Does someone know how to improve the responses when using the OpenAI API? It seems like the results in the ChatGPT web app are a lot better; the API returns very similar responses. In this case, asking "tell me a joke" gets the same answer over and over again. Another great API, btw.
@Jake-bh1hm · 4 months ago
Is this compatible with SvelteKit 5?
@mustofa_id · 4 months ago
Yes
@maharshiguin7813 · 4 months ago
The new version of the `ai` package has too many abstractions.
@wshewm · 4 months ago
Agreed. It destroys type safety, especially with RSC and the `useActions` hook.
@LutherDePapier · 4 months ago
Where's Lee?
@leerob · 4 months ago
I'm here :)
@ShouryanNikam · 4 months ago
How do you make the code animations?
@dawidwraga · 4 months ago
I'm just gonna leave a comment here so I'm notified if someone responds 👀
@mishal_legit · 4 months ago
.
@ahmoin · 4 months ago
/free
@rude_people_die_young · 4 months ago
Great use of Zod ❤
@zivtamary · 4 months ago
+1
@andreschou9560 · 4 months ago
sick
@smitty7326 · 3 months ago
God, I wish I had known about Vercel AI like 6 months ago lol
@imDeeva · 2 months ago
No need to add an OpenAI API key?
@FelixWaigner · 4 months ago
When will I be able to use this with LangChain?
@JeomonGeorge · 3 months ago
Can I use the Vercel AI SDK in Vue.js?
@aneesmanzoor7340 · 4 months ago
How do you create these coding videos with animations like this? Kindly create a video on that as well, @vercel.
@nenadbanjeglav2081 · 4 months ago
Can I use this in Next.js out of the box?
@rutvijdoshi9664 · 3 months ago
Can we fine-tune this model?
@JustOmmShah · 4 months ago
Where do you keep the API Key?
@malaytiwari3207 · 3 months ago
How do I pass the API key?
@Nick__X · a month ago
Just add it to the env vars; the lib will do the rest.
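Concretely, for the OpenAI provider the SDK reads the key from the environment, so an entry in `.env.local` (Next.js) is usually all that's needed; `OPENAI_API_KEY` is the provider's default variable name:

```
# .env.local (never commit this file)
OPENAI_API_KEY=sk-...
```

Other providers look for their own variables (e.g. a Google or Anthropic key), so check the provider docs for the exact name.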
@yogeshrathod953 · 4 months ago
Can it work with Ollama?
@AtomicPixels · 4 months ago
Ugh, I hate using GPT. Why isn't anyone doing Gemini, especially since it's in free beta?
@CarlottaRiganti · 4 months ago
But how do you npm?
@The_kids-c2j · 4 months ago
I made 8 subscribers 😊
@andriisukhariev · 4 months ago
train without stops
@karamanabdullah · 4 months ago
Is it free?
@mohammednasser2159 · 4 months ago
It's just a library, so using it is free. However, using the models, like Gemini and GPT-4o, requires an API key, which will almost always cost money.
@karamanabdullah · 4 months ago
@@mohammednasser2159 thank you
@leerob · 4 months ago
@@mohammednasser2159 thank you! Yes exactly.
@sanukjoseph · 4 months ago
🤩❣...
@jasonjefferson6596 · 3 months ago
Is this a real human or synthetic?
@slavalu74 · 3 months ago
Looks like a bio robot to me
@slavalu74 · 3 months ago
As always with Next, over-engineered and overcomplicated. We just need 4 functions: streamUI, receiveUI, streamText, receiveText. Everything else is much easier to do without your helper functions.
@Archibong.samuel · 2 months ago
I wish I could be part of the Next.js team. This is insane 😩 Why am I just seeing this? Thanks, @team_nextjs