Really the absolute BEST AI presentations and development around! Thanks!
@HyperUpscale 6 months ago
You are from another planet, Mervin... Always a few steps ahead in the future 🤯🤯🤯...
@asqu 6 months ago
This is the Phidata team's project: kzbin.info/www/bejne/j369nmqCmayEppo
@Ahmed-Sabrie 6 months ago
If I were to score this spectacular video on a scale from 1 to 10, I would give you 20! Well done and many thanks.
@michaelmarkoulides7068 5 months ago
Holy shit, Mervin, I'm impressed. You've just shown how, with a few Python libraries and some code, you can hook into the OS file system and create agents with specialised knowledge. If I'm not mistaken, this could run on any embedded hardware with a Linux OS and a 5G or WiFi connection, because you're using an OpenAI API call. Which means you could extend AI to edge computing right now, to all the millions of connected edge devices out there. Kudos man 👏👏👏 I'm going to try this on some hardware I have at work.
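(Editor's note: for anyone wondering what "hooking into the OS file system" boils down to in practice, here is a minimal sketch of the general pattern, using plain OpenAI tool calling rather than Mervin's actual phidata code. The model name and the list_files helper are illustrative assumptions.)

```python
# Minimal sketch: exposing a local file-system function to an LLM via tool calling.
# Assumes the openai Python package (v1+) and an OPENAI_API_KEY in the environment;
# the model name and the list_files helper are illustrative, not from the video.
import json
import os

from openai import OpenAI

client = OpenAI()

def list_files(path: str = ".") -> str:
    """Return a JSON list of entries in a directory on the local machine."""
    return json.dumps(sorted(os.listdir(path)))

tools = [{
    "type": "function",
    "function": {
        "name": "list_files",
        "description": "List files in a directory on the local machine",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}]

messages = [{"role": "user", "content": "What files are in the current directory?"}]
response = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)

# If the model decided to call the tool, run it locally and send the result back.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)
    result = list_files(**args)
    messages += [message, {"role": "tool", "tool_call_id": call.id, "content": result}]
    final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    print(final.choices[0].message.content)
```

As far as I understand, phidata's Assistant wraps this same request/tool/response loop behind its tools argument, so in the video the plumbing above is handled by the framework.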
@moses5407 6 months ago
Would it be possible to add approved chat comments to the local knowledge base? Or is that automatic via the Postgres storage? Also, can PraisonAI's automatic multi-agent creation be added, versus manually/programmatically defining all of the agents/tools in the LLM OS?
@2771237 6 months ago
Awesome, thanks for sharing. One question: can we use Llama 3 instead of GPT-4?
@moses5407 6 months ago
Would love to see a remotely accessible server added to this setup, to act something like Open Interpreter and the 01 Light.
@mehditaslimifar2521 5 months ago
Can the LLM OS be built with VectorShift?
@kamalkamals 5 months ago
Is there a technical reason to choose Phidata instead of LangChain?
@mehditaslimifar2521 5 months ago
Nice, thank you :-) Can you please make a video on running the LLM OS on AWS?
@SDN-Acad 6 months ago
Amazing video. Thank you!
@Techonsapevole 6 months ago
super cool, I hope the next CPUs will run Llama 3 70B fast
@onlineinformation5320 6 months ago
Is there any way I can use these assistants created by Phidata in multi-agent frameworks like CrewAI or AutoGen?
@luxipo9934 6 months ago
This is impressive
@timothywcrane 6 months ago
I'd love to make something similar; however, I would want the foundational LLM to actually be a local SLIM and call out to larger models only if needed. I would also want to make it local-GPU agnostic/optional. Modular GPUs can be added on-prem or called from providers. My idea isn't in any way superior on the face of it, just an iteration/extrapolation of a similar idea. Thanks.
@JohnSmith762A11B 6 months ago
It's funny that in the film Her, the fictional OS1 appears to take up the whole screen. It does later show documents and other UI elements, so I wonder if it's fair to call it an OS. I will say Apple and Microsoft need to get ahead of this and start allowing LLMs to control their desktops. My guess is both are working feverishly on this. In a few years you won't need to know how an email app like Outlook even works to send and receive email, or to create and update spreadsheets; your AI OS will handle that for you. Just tell it what you want in there.
@benh8199 6 months ago
Out of all the AI tools and frameworks you’ve used, which one(s) do you find to be the most useful and have the most promise moving forward?
@mehditaslimifar2521 6 months ago
very nice video, thanks Mervin
@josephtilly258 6 months ago
Hello, exporting my OpenAI API key isn't working in the terminal. Any tips on how to set it?
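(Editor's note: a quick, hedged troubleshooting sketch for the key issue above. The usual shell fix is export OPENAI_API_KEY=sk-... with no spaces around the equals sign, run in the same terminal session that launches the app. If that still doesn't stick, you can set or verify the key from Python itself; the key value below is a placeholder and python-dotenv is an optional extra.)

```python
# Minimal sketch for checking that the OpenAI key is actually visible to Python.
import os

try:
    from dotenv import load_dotenv  # optional: pip install python-dotenv
    load_dotenv()                   # picks up OPENAI_API_KEY from a .env file next to the script
except ImportError:
    pass

# Fallback for quick local tests only: set the key directly in code (placeholder value).
os.environ.setdefault("OPENAI_API_KEY", "sk-your-key-here")

# Verify the running process can see it before creating any client.
print("Key visible to Python:", bool(os.environ.get("OPENAI_API_KEY")))
```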
@spencerfunk6697 6 months ago
I'm having problems using LM Studio or Groq with this.
@FRX_SONGS 23 days ago
How did you build that AI?
@christopheboucher127 6 months ago
Hi Mervin, what about the Postgres memory? Is it long-term memory, something like AutoGen's teachable agent or MemGPT? Thanks again for your amazing content!
@phidata 6 months ago
Postgres right now is storing chat history -- but ChatGPT-like personalized memory is in the works :)
@christopheboucher127 6 months ago
@@phidata Great! Amazing! I don't know if it will be possible, but I dream of a long-term memory system in a SQL-like database, with auto-creation of tables for topics that the agent fills in when it finds relevant info worth keeping (for example preferences, the user's backstory, the company, personal data, ideas and thoughts, etc.). This kind of memory would be very helpful for all kinds of assistants, from office work to psychotherapist, coach, and so on, and maybe a runtime to reorganise the whole database when needed... And if all of that can be managed per user session, it would be the perfect framework for a new kind of agentic system, if you see what I mean... Unfortunately I don't have enough coding skill to build that or to help build it. Thanks a lot @phidata for... Phidata ;) Very great job and a very great gift to the world!
@phidata 6 months ago
@@christopheboucher127 This is truly amazing! I'm coding the personalized memory piece right now, and your message was like the AI gods speaking to me, showing me what to build. Thank you! I cannot express how much I appreciate this guidance, thank you.
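(Editor's note: the topic-keyed long-term memory described in this exchange can be prototyped in a few lines. The sketch below uses SQLite as a stand-in for Postgres, and the table and function names are invented for illustration; it is not phidata's actual memory schema.)

```python
# Rough sketch of a topic-keyed long-term memory store an agent could write to.
# SQLite stands in for Postgres; table and column names are made up for illustration.
import sqlite3

conn = sqlite3.connect("agent_memory.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS memories (
           topic   TEXT,
           fact    TEXT,
           created TIMESTAMP DEFAULT CURRENT_TIMESTAMP
       )"""
)

def remember(topic: str, fact: str) -> None:
    """Called by the agent whenever it decides something is worth keeping."""
    conn.execute("INSERT INTO memories (topic, fact) VALUES (?, ?)", (topic, fact))
    conn.commit()

def recall(topic: str) -> list[str]:
    """Fetch everything the agent has stored under a topic."""
    rows = conn.execute("SELECT fact FROM memories WHERE topic = ?", (topic,)).fetchall()
    return [fact for (fact,) in rows]

remember("user_preferences", "Prefers answers in French")
print(recall("user_preferences"))
```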
@AI-Wire 6 months ago
How does PhiData relate to CrewAI and PraisonAI? Would we use them all separately and independently? Or do they work together somehow? If they are independent, which do you recommend and why?
@denisblack9897 6 months ago
1. Start with CrewAI
2. Find out it was a waste of time
3. Move on with your life 😅
No need to blow your mind with more complex stuff to see that this stuff provides zero value. CrewAI is perfect, and a little bit illegal with how simple it is.
@AI-Wire 6 months ago
@@denisblack9897 What do you mean by "CrewAI is perfect and a little bit illegal with how simple it is"?
@JohnSmith762A11B 6 months ago
@@denisblack9897 I'm confused... are you saying CrewAI is too simple to do real work? Or are you saying it's amazing?
@henrychien9177 6 months ago
Is OpenAI the only API, or can I use a local LLM that mimics the OpenAI API? Will it work?
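(Editor's note: a hedged sketch of the short answer. Any server that exposes an OpenAI-compatible API, such as Ollama or LM Studio, can be used by overriding the client's base_url. The snippet assumes Ollama is running locally with the llama3 model already pulled, serving on its default port 11434.)

```python
# Minimal sketch: pointing the OpenAI client at a local OpenAI-compatible server.
# Assumes Ollama is running locally ("ollama pull llama3" done beforehand).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # any non-empty string; Ollama ignores it
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```

The same base_url override is generally how Groq and LM Studio are wired in as well (each exposes its own OpenAI-compatible endpoint), which may also help with the LM Studio/Groq issue mentioned earlier in the thread.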
@OptaIgin 6 months ago
Legit question: why not just use embedding models like Nomic, for example? Chatting with my LLM, I learned that these vector "memories" create neural connection cells/nodes or whatever, and it connects to these vector memories, meaning its knowledge and memories sort of expand...
@MeinDeutschkurs 5 months ago
I rewatched this. Now I think that the term "from scratch" is misleading. "From scratch" should start with "open a new Python file".
@nrpragmatic 6 months ago
Can it run doom tho?
@BhavBurri 7 days ago
Can we create an AI OS?
@RICHARDSON143 6 months ago
❤❤❤
@MeinDeutschkurs 6 months ago
🎉🎉👏👏👏
@benh8199 6 months ago
Phidata > Praison AI?
@enesgul2970 6 months ago
Your own video overlay is too big. We can't see the code.
@АлексГладун-э5с 6 months ago
You could have shared a link to the original video: kzbin.info/www/bejne/bJiVfH-srK2Norc instead of recording a clone yourself.
@tomasbusse2410 6 months ago
Hm 😢
@phidata 6 months ago
TBH, I think Mervin explained it better than me :)
@helix8847 6 months ago
If you haven't noticed, he does that a lot for most of his videos.
@phidata 6 months ago
@@helix8847 I'm actually a big fan of that, because he explains it much better than me :) I hope he continues to do that. Mervin has a way of communicating complex information, and I learn a lot about my own work when he makes a video.
@redbaron3555 6 months ago
Let’s rename an agent as OS and pretend it is novel…🤷🏻♂️🙈
@ninolindenberg4444 5 months ago
Here is how you create an AI OS from SCRATCH => Python… 😂😂😂😂😂😂 What's next? Building AI rockets from SCRATCH with Python? 😂😂😂😂😂😂 Please stop the nonsense and start educating people with real stuff. Yes, Python is installed by default in the OS, but it's not used to program the OS.
@Badg0r 6 months ago
This is never going to work, since the LLM has to run on an OS as well. This is an OS on top of an OS. And it always needs lots of power.
@fieldcommandermarshall 6 months ago
depends on what you mean by ‘this’ 🤓
@nathanpallavicini6687 6 months ago
Must have never heard of virtualisation.
@michaelmarkoulides7068 5 months ago
Yes, it's true LLMs have an OS underneath, in the same way an ATM or a parking meter has an app running on top of an operating system like Windows Embedded or embedded Linux, but I think you're missing the point. LLMs are black boxes; they have no contact with things outside their domain. The LLM OS is a concept, and Mervin has demonstrated how you can implement that concept, giving it hooks into your OS to access files and specialised agents. In this video Mervin uses an OpenAI API key, which means you are making an API call to OpenAI's servers. So you could run this as-is on a barebones Linux system with a WiFi or 5G connection, on a Raspberry Pi or a higher-end BeagleBone Black. If you were going to replace the OpenAI LLM component with an open-source LLM like Grok or Llama, then you are correct: you would need a lot more memory and compute power, not to mention a GPU.