
AI Knowing My Entire Codebase Resulted in a 20x Productivity Increase

15,043 views

Mervin Praison

A day ago

Comments: 184
@AlexX-xtimes (a month ago)
Great Job. It is exactly what I was working on. Next step: Implement RAG to handle more complex applications
@MervinPraison (a month ago)
If you can ingest your whole code base into AI, you don’t need RAG
@daniiielsan (a month ago)
@@MervinPraison The 2M token limit is still limiting. What happens with the vast majority of repositories that have more than 100M tokens?
@MervinPraison (a month ago)
Agreed, if it's higher than 2M tokens then RAG is required. I will think of a way to implement it. RAG also has the benefit of keeping the cost low.
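For readers wondering how to make that call in practice, here is a minimal sketch (my own illustration, not part of PraisonAI) that estimates a repository's token count, using tiktoken as a stand-in tokenizer and the 2M-token figure discussed above as the cutoff:

import os
import tiktoken

def estimate_repo_tokens(root: str, exts=(".py", ".js", ".ts", ".md")) -> int:
    # cl100k_base is only a proxy; real counts vary by model and tokenizer
    enc = tiktoken.get_encoding("cl100k_base")
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(exts):
                with open(os.path.join(dirpath, name), errors="ignore") as f:
                    total += len(enc.encode(f.read()))
    return total

tokens = estimate_repo_tokens(".")
print(f"~{tokens:,} tokens:", "full-context ingestion is feasible" if tokens < 2_000_000 else "consider RAG")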
@Alf-Dee (a month ago)
I am exploring ways to bypass the token limits by structuring the code architecture in a modular way from the beginning. For example, I sometimes aggregate just public method signatures, with one-liner method explanations, so I don't waste tokens. This way the LLM has enough context about all the classes it needs to know about. Like treating my own code like an API. (For context, I am a Unity C# XR/gamedev.) What do you think? Is this method already used in some tool?
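A rough Python equivalent of that idea, as a sketch (the commenter works in C#, where Roslyn or reflection would play this role): keep only signatures plus the first docstring line, so the model sees the "API surface" of each module instead of its full body.

import ast

def summarize_module(path: str) -> str:
    tree = ast.parse(open(path).read())
    lines = []
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            doc = (ast.get_docstring(node) or "").splitlines()[:1]
            lines.append(f"class {node.name}:  # {doc[0] if doc else ''}")
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            doc = (ast.get_docstring(node) or "").splitlines()[:1]
            args = ", ".join(a.arg for a in node.args.args)
            lines.append(f"def {node.name}({args}):  # {doc[0] if doc else ''}")
    return "\n".join(lines)

print(summarize_module("example.py"))  # "example.py" is a placeholder path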
@Soniboy84 (a month ago)
​@@MervinPraison Large context window is still not bulletproof. LLMs tend to lose detail in the middle of the context window, which can be pretty detrimental in a codebase if it misses something crucial. It's good for simple CRUD stuff, but isn't good enough yet for production ready big applications.
@MarcusNeufeldt (a month ago)
This is really the way to go, more content in that direction pls 🙏🙏
@judeboka6094 (a month ago)
This is really great. It will save so much time. It would be nice to have another video focused on using local models. Great job 👍
@Leto2ndAtreides (a month ago)
My initial feedback from trying it (with a larger project of 900K tokens):
1. It may help to have multiple profiles in the settings file, like iOS-only folders, Android-only folders, docs folders (as per my own case)... Preferably the UI would have an option to switch between these profiles.
2. The settings file should have an allow_only option, as opposed to just an ignore option; that way, we could just whitelist a few directories.
3. A dropdown or autocomplete for most of the current OpenAI, Gemini, Anthropic, etc. models would simplify selection... You may as well let the user enter their own preference, but for common cases a dropdown would definitely be better.
Anyway, cool project... Very promising idea.
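A hypothetical settings layout for points 1 and 2, purely as an illustration (this is not the current PraisonAI schema; the key names are assumptions): profiles that each carry an allow_only whitelist, plus the existing ignore list.

code:
  active_profile: ios
  profiles:
    ios:
      allow_only:          # whitelist instead of an ignore list
        - "Sources/iOS/"
        - "docs/"
    android:
      allow_only:
        - "Sources/Android/"
  ignore_files:
    - "*.pyc"
    - ".git"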
@MervinPraison (a month ago)
Thank you very much for this feedback, much appreciated. I will look into each of these points and try to get them included.
@JayS.-mm3qr (a month ago)
How did you get the program to work? The requirements give me all kinds of conflicts. Cannot get it to run.
@ETdoFresh (a month ago)
I am also working on something similar! For this kind of application I always thought maybe a graph of functions, variables, and classes would be a good data structure to pass in as context, but harder to implement in practice. Keep up the great work!
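For what it's worth, a bare-bones sketch of that graph idea in Python (my illustration, not ETdoFresh's implementation): record which functions call which, so only the relevant subgraph needs to be sent as context.

import ast
from collections import defaultdict

def call_graph(source: str) -> dict:
    tree = ast.parse(source)
    graph = defaultdict(set)
    for func in [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]:
        for node in ast.walk(func):
            # only resolves simple name calls; attributes and imports need more work
            if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                graph[func.name].add(node.func.id)
    return dict(graph)

print(call_graph("def a():\n    b()\n\ndef b():\n    pass\n"))  # {'a': {'b'}}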
@chetanreddy6128 (a month ago)
A great service to the community, man, it's just amazing. Now developers can save tons of time and be more productive at the same time!
@d.d.z. (a month ago)
Wow Mervin. Absolutely outstanding.
@stonedoubt (a month ago)
Mervin! You are a beast!
@alexnimo83 (a month ago)
Amazing, and even better than some of the paid options... Adding an agentic framework for more complex tasks would be a nice addition...
@Leto2ndAtreides (a month ago)
For around 40K tokens, a Gemini 1.5 Flash API call would cost > $0.02... Which is fine.
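The arithmetic behind that estimate, with the per-token price left as an input (look up the current figure on Google's pricing page; the value below is only a placeholder, not the real rate):

def call_cost(tokens: int, usd_per_million_input_tokens: float) -> float:
    return tokens / 1_000_000 * usd_per_million_input_tokens

print(f"${call_cost(40_000, 0.50):.3f}")  # 0.50 is an illustrative placeholder price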
@ChicagoJ351 (18 days ago)
I just glanced through the video. I see the "knowing my entire code base" part, but don't see the "20x productivity increase" part. The more I watch AI coding videos the more it seems it's mostly beginner coders. I could be wrong, but that is what it seems.
@redbaron3555 (a month ago)
Is it able to improve complex code as well as Aider?
@Ahmed-Sabrie (a month ago)
Very well done, mate! really astonishing!
@malikrumi1206 (a month ago)
I like the concept, but I do have some questions. 1) You did a token count before choosing the model. Doesn’t this count vary widely, depending on the model and its tokenizer? Don’t some models allow for the use of different tokenizers? 2) In one of the comments, you said ingesting the entire codebase meant not needing RAG. If your AI needs are *only* within your codebase, sure, that might be right. But if you did all this work with Python 3.11, what are you going to do when Python 3.12 comes out?
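On question 1: yes, counts are tokenizer-specific. A quick way to see the spread, using tiktoken's encodings as stand-ins (Gemini and Claude counts would need their own tokenizers or a count-tokens API call):

import tiktoken

text = open("app.py", errors="ignore").read()  # placeholder file from a codebase
for name in ("cl100k_base", "o200k_base", "p50k_base"):
    enc = tiktoken.get_encoding(name)
    print(name, len(enc.encode(text)))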
@adamchan4403 (a month ago)
Super interesting, and I have been waiting a long time for this. Please share more on this topic and a real-usage tutorial.
@andrewsilber (a month ago)
Looks like a great start! I think integrating RAG and maybe graphRAG would make it even more useful. Also, it would be good if it could read all the git PR descriptions so that it can have more concept of which files need to be modified to implement certain things. For example, if I have a game codebase and I want to add a new weapon, that might involve a number of different systems: inventory, UI, gameplay mechanics, level design, etc. It would be great if a newcomer to the game dev team could use that rather than spend a huge amount of time ramping up on a big complex codebase to accomplish their JIRA tasks.
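A small sketch of that PR-history idea (assuming merge commits carry the PR description; pulling real PR bodies would need the GitHub API): collect merge messages plus the files each one touched, and feed that alongside the code so the model learns which files change together.

import subprocess

def merge_history(repo: str = ".", limit: int = 20) -> str:
    # merge-commit subject/body plus the file list touched by each merge
    result = subprocess.run(
        ["git", "-C", repo, "log", "--merges", f"-{limit}",
         "--pretty=format:## %s%n%b", "--name-only"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

print(merge_history()[:2000])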
@figs3284 (a month ago)
Looks good man. I'll give it a try tonight.
@Augmented_AI (a month ago)
Please do a video on long term memory.
@sahajamitrawat (a month ago)
Thanks for sharing. Looks promising. I also use chainlit UI for my personal projects :-)
@andyloren4826 (5 days ago)
Why does it not support Dart files? I cannot see any of them. You said how we can exclude files, but not how to include them. Do you know how to include Dart files?
@florentromanet5439 (a month ago)
Awesome 😮
@dDesirie (a month ago)
Great job building this all by yourself! I'm currently using the Cursor IDE and the Sourcegraph Cody extension. They both support many models as well as codebase indexing for context. I wonder what the difference between these is?
@drmarinucci (a month ago)
Thanks!
@MervinPraison (a month ago)
Thank you
@henryinskip3085 (a month ago)
How does this compare with Cursor?
@paulmiller591 (a month ago)
This is cool. Please do more about this.
@souvickdas5564 (18 days ago)
How do I use the Command R+ model?
@Techonsapevole (a month ago)
Impressive. Next step: automatically fix GitHub issues.
@JayS.-mm3qr (a month ago)
This sounds great, but oh my god, I have never had such problems with dependencies. Couldn't get it to work in Colab or locally. Tried using Poetry to install the requirements. Specifically, right now the requirements can't resolve a conflict between mkdocs-material and mkdocs-jupyter. It is unsolvable. Been trying to get this to run for DAYS. Please, god, give me an answer to resolve conflicts.
@avencadigital3527 (a month ago)
Hey Mervin! Thanks for all of it! You're amazing! For some reason I'm not able to set the Gemini API key using EXPORT or even SET. Is there a way to define my key directly in the code? Thanks! (P.S. I'm running it on Windows)
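One workaround to try, as an illustration rather than an official PraisonAI option: set the variable from Python and launch the tool in that same environment, which sidesteps the EXPORT/SET differences on Windows. The variable name below is an assumption; use whichever key name the docs specify.

import os
import subprocess

env = dict(os.environ, GEMINI_API_KEY="your-key-here")  # key name assumed; check the docs
subprocess.run(["praisonai", "code"], env=env)  # launch the tool with the key already set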
@vasvalstan (a month ago)
How is this different from Cursor? Has anyone tried both?
@Nice-rb9vd (a month ago)
Hi Mervin! Thanks very much for this! Does this also work with Context Caching for Gemini Pro 1.5 and Flash? It is very cheap and made for this type of thing. Thanks again!!!
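I can't speak for whether PraisonAI wires this up, but here is roughly how the google-generativeai SDK exposes context caching as I understand it (treat the exact class and argument names as assumptions and check the current docs): the codebase is uploaded once, cached for a TTL, and subsequent prompts reference the cache instead of re-sending everything.

import datetime
import google.generativeai as genai
from google.generativeai import caching

genai.configure(api_key="YOUR_KEY")
codebase_text = open("codebase_dump.txt").read()  # placeholder: the concatenated repo

cache = caching.CachedContent.create(
    model="models/gemini-1.5-flash-001",
    system_instruction="You are a coding assistant for this repository.",
    contents=[codebase_text],
    ttl=datetime.timedelta(minutes=60),  # cached tokens are billed at a lower rate than re-sending
)
model = genai.GenerativeModel.from_cached_content(cached_content=cache)
print(model.generate_content("Where is the login handler defined?").text)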
@mikew2883 (a month ago)
Pretty awesome! 👍
@iredtm4812 (29 days ago)
Are you using Chainlit to build that project?
@vivanshreyas5857 (a month ago)
This is amazing!!!!!!!
@3stdv93 (a month ago)
Thanks for sharing 🙏
@PhillipRashaad (a month ago)
This is really cool!! Does it actually edit the files for you?
@tijendersingh5363 (a month ago)
You built this
@NobleVisionINC (a month ago)
Does the code ask you to update the scripts? Do you still have to cut/paste code to make changes?
@MervinPraison (a month ago)
That feature is not implemented yet; it will probably be the next upcoming feature.
@dohyunee (a month ago)
Great content, thank you.
@SantoshBhorMD (a month ago)
Nice work. Can you add an option to include folders instead of only excluding them? Also, it would be nice to just point and click in the UI to exclude or include folders/files.
@MervinPraison (a month ago)
Sure. Next I will work on including folders.
@ChopLabalagun (a month ago)
Found a solution, but I think we need the ability to update the prompt, as it always does the same thing, especially with Ollama.
@MervinPraison (a month ago)
Did you log in? The username and password are both admin.
@ChopLabalagun (a month ago)
@@MervinPraison I had to set environment variables in order to be able to log in. I am on Linux, and I believe we need the ability to update the prompt, as every answer was focused on explaining the whole codebase instead of just one file.
@aldoyh (a month ago)
Oh, "that's exactly what we need!" Thanks! Will it work the same way for a Laravel codebase?
@MervinPraison (a month ago)
Yes it should
@farexBaby-ur8ns (a month ago)
Thanks for this. Questions: what about the privacy of my data? Will my codebase and what it does be retained by the LLM? And the CPU and memory of my PC will only be a problem if I go with Ollama, right?
@lawrencium_Lr103 (a month ago)
Siiik,,, well done
@indrakumar5365 (a month ago)
Can't we fine-tune any code model on the entire codebase to achieve similar results?
@aariz2469 (a month ago)
Aider or Praison? Which do you like, and why?
@brulsmurf (a month ago)
Only 20x? My brother I achieved 40x increase in productivity with Ai knowing my codebase.
@drmarinucci (a month ago)
Thanks, this is exactly what I need. Would it be possible to extend it to Claude 3.5 Sonnet? Cheers
@MervinPraison (a month ago)
Yes, you can use Claude also. It supports 100+ LLMs.
@jargolauda2584 (a month ago)
Why did you try 3.5 Turbo and not 4?
@subins2917 (a month ago)
Hey, will this work on Windows?
@aldoyh (a month ago)
I was trying it for a few days; now when I log in and then submit a prompt, it goes back to the login page?! Any advice?
@MervinPraison (a month ago)
Please use admin as both the username and password.
@aldoyh (a month ago)
@@MervinPraison Worked like a charm! The tree list isn't complete now, though? I am using ollama/mistral.
@redbaron3555 (a month ago)
What do you do when the LLM tries to fix a file and messes it up? Often it forgets parts of the code. Is there an option to go back?
@MarcusNeufeldt (a month ago)
@@redbaron3555 I do regular backups before tackling bigger changes exactly because of that
@MervinPraison (a month ago)
Version control each change using git. Give the AI the ability to revert a change if it goes wrong.
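A sketch of that workflow (my illustration of the suggestion, not a built-in PraisonAI feature): snapshot the repo before each AI edit so a bad change can be rolled back with one command.

import subprocess

def git(*args: str) -> str:
    return subprocess.run(["git", *args], capture_output=True, text=True, check=True).stdout

def checkpoint(message: str) -> str:
    git("add", "-A")
    git("commit", "--allow-empty", "-m", f"checkpoint: {message}")
    return git("rev-parse", "HEAD").strip()

def roll_back(commit: str) -> None:
    git("reset", "--hard", commit)  # discards the AI's edits entirely

before = checkpoint("before AI refactor")
# ... let the assistant modify files ...
# if the result is broken:
roll_back(before)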
@xXWillyxWonkaXx (a month ago)
How would you compare this to something like DeepSeek Coder or Qwen? Curious.
@MervinPraison (a month ago)
DeepSeek Coder and Qwen can be used with this, but they might not have as large a context length as Gemini. Google Gemini shines at this.
@themax2go (a month ago)
not local, not private... not pushing my work through to the cloud
@danshimony (a month ago)
I need help installing this on Arch Linux with the GUI.
@anglikai9517 (a month ago)
9:33 Run Praison Code on Praison Code to improve itself, so it has unlimited context regardless of the LLM.
@viyye (a month ago)
What languages does it work on?
@sigma_z (a month ago)
I am assuming that it would depend on the LLM you're using. But I could be wrong. I'm going to try this AI today. Damn amazing if you ask me. Well done to the author. 🎉
@viyye (a month ago)
@@sigma_z I'm trying it all now!! It is amazing.
@MervinPraison (a month ago)
Any language, as long as the LLM supports it.
@viyye (a month ago)
@@MervinPraison thank you, this is such a great tool
@CHNLTV (a month ago)
Mervin, is it possible to have an include yaml as opposed to exclude? I want to use this across different codebases and would like to just use a directory as my include to evaluate and work on.
@MervinPraison (a month ago)
Great suggestion. I will add this to my features list. Thanks
@AbdulBasit-ff6tq (a month ago)
Rather than passing the whole context at the same time, wouldn't something like GraphRAG be a better option?
@MervinPraison (a month ago)
RAG came into play only to work around limited context length. If we have a higher context length with high accuracy and low cost, there is no need for RAG or any of its strategies.
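And when a repo genuinely does not fit, even a crude retrieval step helps. A dependency-free sketch (keyword overlap standing in for embeddings) of picking the most relevant files to send instead of the whole codebase:

import os

def score(query: str, text: str) -> int:
    return sum(text.lower().count(w) for w in set(query.lower().split()))

def top_files(query: str, root: str = ".", k: int = 5):
    ranked = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            if name.endswith(".py"):
                path = os.path.join(dirpath, name)
                text = open(path, errors="ignore").read()
                ranked.append((score(query, text), path))
    ranked.sort(reverse=True)
    return [path for s, path in ranked[:k] if s > 0]

print(top_files("login redirect session cookie"))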
@takshitmathur2761 (a month ago)
amazing
@VLM234 (a month ago)
Hi guys, I am facing an error: when I give a prompt to the PraisonAI chat box, it redirects to the login page, and after logging in it starts from the beginning. I tried to explore solutions but didn't find anything. Does anyone have any idea?
@MervinPraison (a month ago)
Did you try the default username and password: admin and admin
@Z223I (19 days ago)
@MervinPraison I did the export OPENAI_API_KEY="..." with my real key but nothing is being returned in the browser except your logo. Suggestions?
@MervinPraison (19 days ago)
Did you try using the default username and password, admin and admin?
@Z223I (19 days ago)
@@MervinPraison Yes I did. My guess was the key was wrong. But that matches. I believe it is returning an empty string. Other thoughts? You are doing an awesome job!
@vikaskyatannawar8417 (a month ago)
Does it only work with a Python repository?
@MervinPraison (a month ago)
Now it can work with any repo. I have fixed that issue.
@Chatec (a month ago)
I am using a Mac. When I try to run the 'praisonai code' command inside the project directory I get the error below. Note that pip didn't work when I installed, so I followed a ChatGPT process and successfully installed using 'pipx'. The package works well, including opening in the browser and displaying the file structure in the console, but when I try to chat I get this error: ValueError: the greenlet library is required to use this function. No module named 'greenlet'
@MervinPraison (a month ago)
Thanks for letting me know about this issue. This issue is now fixed with the latest version. Please upgrade to the latest version using pip install -U "praisonai[code]"
@Chatec (a month ago)
@@MervinPraison I have this error, please guide:

➜ mervin_20x git:(main) ✗ praisonai code
WARNING: SDK is disabled.
WARNING: SQLAlchemyDataLayer storage client is not initialized and elements will not be persisted!
WARNING: Translation file for en-GB not found. Using default translation en-US.
Processed 28/28 files
Context gathered successfully. Total number of tokens (estimated): 1302
18:05:20 - LiteLLM:ERROR: ollama.py:423 - LiteLLM.ollama.py::ollama_async_streaming(): Exception occured - All connection attempts failed
Traceback (most recent call last):
  File ".../httpx/_transports/default.py", line 373, in handle_async_request
  (httpx/httpcore connection frames)
httpcore.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File ".../chainlit/utils.py", line 44, in wrapper
  File ".../praisonai/ui/code.py", line 255, in main
  File ".../litellm/llms/ollama.py", line 374, in ollama_async_streaming
  (httpx client frames)
httpx.ConnectError: All connection attempts failed
@shrn680 (a month ago)
Great work! I have an issue, though: every time I try my first prompt on my local codebase, PraisonAI logs me out! Any ideas? Tested on Safari and Chrome.
@MervinPraison (a month ago)
I am trying to find the root cause of this issue. Can you please let me know: 1. Are you using Windows, Mac, or Linux? 2. Are you installing it locally or in the cloud?
@MervinPraison (a month ago)
Did you try the default username and password: admin and admin
@florentromanet5439 (a month ago)
@MervinPraison What if changing the settings.yaml file does not change the token count? Mine stays at 128,000 even though settings.yaml has been changed.
@MervinPraison (a month ago)
The default maximum token limit is 128,000. You can modify that easily; here is how: docs.praison.ai/ui/code/
@luisruiz9205 (a month ago)
Then there's no way to make it work for Windows?
@MervinPraison (a month ago)
I will be adding support for Windows soon.
@shawnkratos1347 (a month ago)
got it working. do not make your own login. it will log you in but every time you try to run code it will crash and kick you out. when i logged in as admin/admin it worked. are you going to put this on github so everyone has access to the core files?
@MervinPraison (a month ago)
Did you follow this, and is it working now as per the document? docs.praison.ai/ui/code/
@shawnkratos1347 (a month ago)
@@MervinPraison Yes. The first time I set it up I used my own email and password; it would just crash and ask me to log in again. After changing to admin/admin it worked. I'm using Ollama right now, and the models keep giving me recommendations on the entire codebase, not the files I specify. I want to set up Claude 3.5 but don't see instructions for doing so for PraisonAI Code, only instructions for OpenAI, Groq, and Ollama. How do I configure Anthropic?
@shawnkratos1347 (a month ago)
@@MervinPraison P.S. It did work on GPT-3.5, but it's very limited, and I want to use the Claude API.
@shawnkratos1347 (a month ago)
@@MervinPraison I got it working: export CLAUDE_API_KEY=XXXXXXXX, then setting my model to claude-3-5-sonnet-20240620.
@shawnkratos1347 (a month ago)
@@MervinPraison I think I got it working for Claude: export ANTHROPIC_API_KEY=XXXXX and setting the model to claude-3-5-sonnet-20240620.
@DWSP101 (a month ago)
I wouldn't want my AI to know a coding language; I would want my AI to know psychology, human behavior, sociology, and all of the knowledge base I have on the human condition and disorders, kind of like the DSM-5 but with more of a personal flavor of myself. I wish there was a way to learn how to directly download an AI model onto a local computer of mine; as long as I provide memory it should work just fine, but I really don't know how to do that.
@benoitcorvol7482 (a month ago)
Hello, and thanks for your video; as usual it's awesome. I'm trying to run it, but it asks me for credentials (password and email); when I enter some and start to prompt, it sends me back to the login page. I use a recent MacBook M3. Anyone know how to fix that? Thanks again from France :)
@MervinPraison (a month ago)
Did you try the default username and password? Username: admin, Password: admin
@benoitcorvol7482 (a month ago)
@@MervinPraison It works, thanks! :)
@robertstoica4003 (a month ago)
Sure, let's just throw out an arbitrary "20x developer" now, because the usual 10x is not hype enough anymore.
@webskillz (a month ago)
This is great, but doesn't the Claude project feature combined with artifacts do the same thing?
@MervinPraison (a month ago)
No. Claude Artifacts doesn't know your full existing codebase, which is on your computer.
@webskillz (a month ago)
@@MervinPraison ok thanks for clarifying
@solomonegwu6017 (a month ago)
If you upload your entire codebase to the Claude project feature, Claude will have access to and be able to understand that full codebase within the project context.
@MyrLin8 (a month ago)
That's about what I'm seeing as well.
@guntarion (a month ago)
I got the list of files and folders shown, but the token count is 0. Why is that?
@MervinPraison (a month ago)
Please try now, after upgrading to the latest version. Probably you were using a non-Python project, but that bug is fixed now. Run pip install -U "praisonai[code]" to upgrade.
@user-me7xe2ux5m (a month ago)
Excellent work. I only have one issue: I followed your instructions. Now every time I enter a prompt, I am redirected to the login page and then nothing happens. How can I circumvent this?
@MervinPraison (a month ago)
Are you using Windows?
@user-me7xe2ux5m (a month ago)
@@MervinPraison I am using a MacBook running the latest version of macOS. From the log of external LLM (Claude 3.5), I can observe that the LLM query has been made, but I don't get to see the response in the UI because the login redirect is interjected. Is there any configuration I need to adjust? Any help is highly appreciated.
@florentromanet5439 (a month ago)
@@user-me7xe2ux5m Really interested in that as well. Please follow up if you find a solution.
@benoitcorvol7482 (a month ago)
@@user-me7xe2ux5m Same problem on a Mac M3 as well. Did you find a way to fix it?
@MervinPraison (a month ago)
Sure, I will do my testing and get back to you all soon. Meanwhile, could you also please test after creating a virtual environment using Conda, pyenv, or venv?
@jbrockman2003 (a month ago)
Can this be set up using Codestral?
@MervinPraison (a month ago)
Yes, you can use Codestral.
@Chatec (a month ago)
Can it be installed globally, or only in a virtual environment?
@MervinPraison (a month ago)
Globally
@Chatec (a month ago)
@@MervinPraison Thank you, Mervin. You've always been my AI guide, even though I am a Software Engineer.
@darkreader01 (a month ago)
This is exactly what I needed. I have tried it, but after entering the prompt the UI goes back to the login page and I get an error message: "500 Internal Server Error". I have tried Gemini 1.5 Pro and Gemini 1.5 Flash, and I have set the Gemini API key. I am also getting a warning saying SDK is disabled; I don't know if it has something to do with the error. How can I fix this?
@MervinPraison (a month ago)
Are you using Windows?
@darkreader01 (a month ago)
@@MervinPraison No, I am using Linux Mint.
@benoitcorvol7482 (a month ago)
I actually have the same problem. Did you find out how to fix it?
@darkreader01 (a month ago)
@@benoitcorvol7482 No, I haven't found any fix yet
@MervinPraison (a month ago)
Did it bring up the UI? If so, try using admin and admin as the username and password.
@ReviewSmartTech (a month ago)
I'm on Windows using WSL, and it's kind of hung... Is this config supported?
@MervinPraison (a month ago)
Sorry about that. Soon I will add a doc on how to set this up on Windows.
@ejkitchen (a month ago)
@@MervinPraison I think he means running in WSL Ubuntu/Linux on Windows. I have tried both and no luck with either.
@speedyq8 (a month ago)
Do not reinvent the wheel. Use Cursor.
@MervinPraison (a month ago)
Cursor is good, but I am not as convinced by it as I am by this.
@florentromanet5439 (a month ago)
@MervinPraison That's really cool! I managed to install this on a small server on my network. I have some tokens on the Claude API; can we use that as well (meaning alongside OPENAI, GEMINI and GROK)? For the community, the settings shown at 6:00:

code:
  ignore_files:
    - ".*"
    - "*.pyc"
    - "pycache"
    - ".git"
    - ".gitignore"
    - ".vscode"
    - ".idea"
    - ".DS_Store"
    - "*.lock"
    - ".pyc"
    - ".env"
@MervinPraison (a month ago)
You can use Claude
@chrisdsilva7114 (28 days ago)
@MervinPraison I have been facing issues: when I query my codebase, it crashes back to the login page again.
@MervinPraison (28 days ago)
Please try using admin and admin as the username and password
@Atom_Cypher (a month ago)
Good video 👍 I need some small help with installation. I'm getting an error at installation time on Mac. Can you please help, @mervin?

pip3 install "praisonai[code]"
Collecting praisonai[code]
  Using cached praisonAI-0.0.5-py3-none-any.whl.metadata (747 bytes)
WARNING: praisonai 0.0.5 does not provide the extra 'code'
@SaurabhBhatt-vx8bq (a month ago)
@MervinPraison I was trying to follow the same thing, but it opens the Chainlit login page. When I log in there, it successfully structures my folder, but it doesn't generate any content and redirects to the login page after some time. Any idea on this?
@MervinPraison (a month ago)
Are you using Windows?
@SaurabhBhatt-vx8bq (a month ago)
@@MervinPraison No, it's Ubuntu.
@MervinPraison (a month ago)
Try using admin and admin as the username and password.
@SaurabhBhatt-vx8bq (a month ago)
@@MervinPraison Thanks for this advice, it worked! However, its responses are not accurate (for example, a file name I ask about exists in my codebase and shows up in the context, but it is unable to recognize the file, saying the file doesn't exist). But for some files it gives the correct answer.
@MervinPraison (a month ago)
@@SaurabhBhatt-vx8bq Also, it depends on the model you are using. The better the model, the better the response.