Jeremy, thank you for your contribution in bringing AI tooling into Rust. Right now I’m busy working on a project that isn’t related to AI, and it constantly feels like I’m missing out on something more important, but thanks to your videos and open-source projects I know exactly where my entry point into the AI field is! Amazing work, incredible flexibility in Rust!!! Thank you ❤
@ruturajzadbuke9646 · 6 months ago
Amazing! I was planning to write mine the coming weekend. Now I can use this one, at least as a starting point. Thanks Jeremy!
@ShaunPrince · 6 months ago
This is the innovation that we need for AI apps... The playing around and experimenting in Python has hit a plateau; time to build some real apps now!
@yaanno · 6 months ago
Fantastic! Thank you for yet another great series! I've been trying to learn Rust by building a client/consumer app that uses AI tools to connect travel APIs with user queries by voice. No success yet (just learning stuff), but applying your library will be superb. Merci :)
@JeremyChone · 6 months ago
Thanks, voice might come later in the context of the chat API. If you need text to speech, the async-openai crate has great support for OpenAI TTS.
@norminemralino2260 · 6 months ago
Another great tutorial, Jeremy. I was able to follow along relatively easily. One small suggestion: maybe use the dotenv crate in your example. Anyway, great job. I'd love to see a RAG example using Rust; I don't think I've seen one. I normally see RAG implementations in Python.
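For reference, a minimal sketch of that suggestion using the maintained dotenvy fork (not from the video; the env var name is just an example):

```rust
fn main() {
    // Load variables from a local .env file into the process environment (no-op if the file is missing).
    dotenvy::dotenv().ok();

    // OPENAI_API_KEY is only an example name; use whatever key your provider expects.
    match std::env::var("OPENAI_API_KEY") {
        Ok(key) => println!("API key loaded ({} chars)", key.len()),
        Err(_) => eprintln!("OPENAI_API_KEY is not set"),
    }
}
```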
@JeremyChone · 6 months ago
Thanks, in the crate's examples/ there is an AuthResolver example that allows providing a custom env var name or even a resolver function. Yes, later I will be showing RAG in Rust. It might use SQLite for the content and lancedb/vectordb for the embeddings, and it will actually show that the embeddings/vector search can be optional when doing RAG.
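For illustration, a rough sketch of such a resolver; the type and function names here are assumed from the crate's examples/ directory and may differ between versions:

```rust
use genai::resolver::{AuthData, AuthResolver};
use genai::Client;

fn main() {
    // Sketch only (assumed API): resolve the key from a custom env var instead of the default one.
    let auth_resolver = AuthResolver::from_resolver_fn(|_model_iden: genai::ModelIden| {
        // AuthData::from_env points the client at the given env var name.
        Ok(Some(AuthData::from_env("MY_PROVIDER_API_KEY")))
    });

    let _client = Client::builder().with_auth_resolver(auth_resolver).build();
}
```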
@JFaleiroJR · 6 months ago
Lower level than this, but I wonder what is in the making for a specialized machine language accelerator, friendlier than melior. I am considering putting a project together in that space.
@ItaloMaiaTM · 5 months ago
Another nice video, Jeremy! Would you have any material to suggest on how to use genAI to solve objective problems?
@JeremyChone · 5 months ago
Thanks, I am going to create more content with GenAI, and AwesomeApp will be a desktop app using it. In the enterprise/productivity app context, I created a high-level video, "UI to HI", that outlines some of the opportunities. Function calling, which can translate a user request into a function name/params to be called by the application, can be very powerful. I will try to create some concrete examples of this.
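To make the function-calling idea concrete, here is a provider-agnostic illustration of dispatching a parsed call to application code; the tool names and argument shapes are invented for the example:

```rust
use serde::Deserialize;

// Invented shapes for the example: the function name/params a model could return,
// mapped onto application actions.
#[derive(Deserialize, Debug)]
#[serde(tag = "name", content = "arguments")]
enum AppCall {
    #[serde(rename = "create_task")]
    CreateTask { title: String, due: Option<String> },
    #[serde(rename = "list_tasks")]
    ListTasks { filter: Option<String> },
}

fn dispatch(call_json: &str) -> serde_json::Result<()> {
    match serde_json::from_str::<AppCall>(call_json)? {
        AppCall::CreateTask { title, due } => println!("create task {title:?}, due {due:?}"),
        AppCall::ListTasks { filter } => println!("list tasks filtered by {filter:?}"),
    }
    Ok(())
}

fn main() -> serde_json::Result<()> {
    // Example payload, in the shape a provider's function-call response might be normalized to.
    dispatch(r#"{"name":"create_task","arguments":{"title":"Book flight","due":"2025-07-01"}}"#)
}
```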
@user-zz6fk8bc8u · 6 months ago
I think some image things would be great, especially Stable Diffusion. I would love to use Rust for that instead of Python or some GUIs.
@JeremyChone · 6 months ago
Yes, I am planning to add images as far as they are supported by the chat APIs.
@froop2393 · 6 months ago
No! Super cool 😊 I really like it!
@TheMsksk · 6 months ago
Love it. Thanks!
@ИванРагозин-я8я · 6 months ago
Would you please add Groq?
@JeremyChone · 6 months ago
Oh, yes, good one. I knew I had one important one left to do, but somehow I could not remember the name. Thanks, I will add Groq. Their API should be similar.
@JeremyChone · 6 months ago
Groq has been added to the main branch. I am pretty amazed by their pricing, so cheap! Definitely a must-have for GenAI.
@automatalearninglab · 6 months ago
Yessss, super neat! :)
@ZiRo815 · 6 months ago
What’s the thinking behind modelling the models as strings instead of an enum?
@JeremyChone · 6 months ago
To avoid having a fixed list of model names that needs to be updated. This is especially important with Ollama, which allows the installation of a wide variety of models. Eventually, I might have a Model(Arc<str>) type, but this will just be a wrapper around the string to make repeating the model name in the stream events a little more efficient (not present for now).
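For illustration, such a wrapper could be a simple newtype over Arc<str> (hypothetical sketch, not the crate's actual type):

```rust
use std::sync::Arc;

// Hypothetical sketch: a cheap-to-clone model name that still behaves like a plain string.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct Model(Arc<str>);

impl From<&str> for Model {
    fn from(name: &str) -> Self {
        Model(Arc::from(name))
    }
}

impl Model {
    pub fn as_str(&self) -> &str {
        &self.0
    }
}

fn main() {
    let model = Model::from("gpt-4o-mini");
    // Cloning only bumps the Arc refcount; the string data is shared across stream events.
    let for_event = model.clone();
    println!("{} / {}", model.as_str(), for_event.as_str());
}
```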
@BrazenNL · 6 months ago
Any chance on Copilot?
@JeremyChone · 6 months ago
I might have missed something, but I think Copilot is one level above the generative AI provider. I believe they use the OpenAI models, probably wrapped in an Azure service. So, adding the "endpoint variants" such as Azure OpenAI endpoints, AWS Bedrock endpoints (for the open models and Anthropic/Cohere), and Google Vertex AI is part of the plan. The adapter infrastructure is now in place; most of the differences should now be around the auth schemes.
@BrazenNL · 6 months ago
@JeremyChone Sounds like a plan! Looking forward to your thought process in the next video. Thanks!
@meka4996 · 6 months ago
Amazing! Thanks
@cunningham.s_law · 6 months ago
Not sure how I feel about the implicit key reading from the env. Great crate, though.
@JeremyChone · 6 months ago
You can provide your own with a resolver per adapter kind.
@parabolicpanorama · 6 months ago
Cool, but what if you want to roll your own model and not just ping an API for a response? You could do all of this in any language...