Every one of these apps scrolls past the beginning of the answer, forcing me to scroll back up. And you've just blindly copied that.
@FrontPage9824 күн бұрын
I wish I could disable this in all AI apps, it's a terrible thing!
@KiKaraage24 күн бұрын
I hope this can be improved. At least if there are multiple code blocks, the navigation arrows could jump to the next/previous code block instead of just to the very top
@nero370024 күн бұрын
At least chatGPT is slow enough so you can read along before it scrolls out of view.
@KattKingston24 күн бұрын
I've worked on two chat AI projects. The goal is to help users quickly assess whether an answer is accurate or not. A glance at the response should be enough to tell if it's wrong. You can't please everyone, and it wouldn't make sense for Theo to design the app in a way that's completely different from the standard. That would likely feel off-putting... These AI chat apps also put delays in the response to make it feel more natural and easier to read while it's being streamed in. They could make them instant if they wanted, but it's all about what most users want. Also, one thing I did at my last job was add an instant stop to scrolling if we detected the user trying to scroll up, but like I mentioned, you can't please everyone.
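A rough sketch of the scroll-interrupt idea described above, assuming a plain scrollable DOM container for the message list; the element id and the 40px threshold are made up for illustration, not taken from any of the apps discussed here:

```ts
// Sketch: auto-scroll a chat log while streaming, but stop the moment the user scrolls up.
// Assumes a scrollable container with id "chat-log"; the threshold value is arbitrary.
const log = document.getElementById("chat-log") as HTMLElement;
let pinnedToBottom = true;

log.addEventListener("scroll", () => {
  // If the user is within ~40px of the bottom, keep following the stream;
  // otherwise they scrolled up on purpose, so stop auto-scrolling.
  const distanceFromBottom = log.scrollHeight - log.scrollTop - log.clientHeight;
  pinnedToBottom = distanceFromBottom < 40;
});

// Call this whenever a new streamed chunk is appended to the log.
export function onChunkAppended(): void {
  if (pinnedToBottom) {
    log.scrollTop = log.scrollHeight;
  }
}
```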
@KeyT3ch23 күн бұрын
This needs to have more upvotes.
@dogoku24 күн бұрын
Why not call it chat thing?
@_y7ya24 күн бұрын
Video has been ruined for me after reading your comment. What a missed opportunity! ahhhh
@erentr716724 күн бұрын
lol yeah
@stercorarius24 күн бұрын
He said the domain was taken
@asciimage24 күн бұрын
Already taken
@bossmusa907523 күн бұрын
he explained why on that stream but i legit just forgor the explanation
@bankmanager24 күн бұрын
You have a LOT of competition with much more mature and feature-rich open source alternatives, but good luck. I personally wouldn't pay for something like this, especially without the ability to use our own API keys or local models. Get shipping!
@alir8zana63524 күн бұрын
Are there any open source local first options for chatting with LLMs? This project is not open source and I want to understand how these local first experiences work
@bankmanager24 күн бұрын
@@alir8zana635 Yes, there are many! oobabooga and LocalAI are both awesome, or there are projects like Continue, which is a local AI VS Code extension.
@bankmanager24 күн бұрын
@@alir8zana635 Oh, also Jan if you are on Mac/Linux too.
@bankmanager24 күн бұрын
@@alir8zana635 Sorry for the spam, but also LM Studio isn't open source but has a lot of open source tooling around it and is very feature-rich and well designed. llama.cpp or koboldcpp are inference engines you can run on the CLI or with their own inbuilt UIs too. There is a rich ecosystem!
@elencantadordegatos24 күн бұрын
@@alir8zana635 There are, Lobe Chat for example
@younessexd374824 күн бұрын
It does a really bad job at handling big prompts. Edit: it got fixed!!!!
@ScottMaday24 күн бұрын
Theo's addiction to speed is like that friend who keeps modding their car to eke out every last horsepower. At the end of the day though, it's still a Honda Civic under the hood.
@jasdhfkajsdfhk23 күн бұрын
i mean ive seen some crazy fast civics tbf
@schtormm23 күн бұрын
ehh it's not as janky as a civic with a laptop
@tigerxplso24 күн бұрын
It's already down... Can't get responses, either one sentence or nothing is returned.
@FrontPage9824 күн бұрын
just refresh the page, solve the captcha, and it will work.
@Joseph_ebuka23 күн бұрын
They got DDoSed
@thomasle10024 күн бұрын
I don't think speed is a pain point for users (compared to existing solutions)
@comradepeter8724 күн бұрын
nah mate you didn't try it then. It actually feels distinctly fast. In fact this made me realise a lot of the latency that I attributed to LLMs was just sh*tty React code.
@aqua-bery24 күн бұрын
@@comradepeter87 I tried it on my phone. It feels exactly the same as the chatgpt mobile app. Really doesn't matter. The only speed difference is really that he didn't add animations to the messages showing.
@Kampouse24 күн бұрын
@@aqua-bery meh, I feel it's just a way of riding the AI hype, but I don't blame him
@germandavid252023 күн бұрын
sometimes I start typing in ChatGPT and 1 second later it will delete what I was writing because it finished loading, it's really annoying
@adeidara995523 күн бұрын
Yeah, speed is nice and all, but I am not switching over to a bare-bones editor with fewer features just for how fast their React code is.
@carltongordon24 күн бұрын
While the app is crazy fast.... the chat hasn't given me a single response in almost an hour now
@t3dotgg24 күн бұрын
We got ddos'd! Should be fine now, super sorry about that
@kstash356424 күн бұрын
@@t3dotgg Another genuine/newbie question: I thought CloudFlare prevents that nowadays? Or is that a paid thing?
@carltongordon24 күн бұрын
@@t3dotgg Thanks Theo!
@maciekdeveloper24 күн бұрын
@@t3dotgg Maybe this wouldn't have happened if you had hired DevOps instead of cloud
@fatjay940224 күн бұрын
I don't want to sound like a dick... but I don't care if it's the "fastest", I care about the information it gives me :).. I am OK with waiting 1 min for the best one.
@vitricecascales378724 күн бұрын
Actually the only problem could be the system prompt, but the AI behind it should work the same since it's a call to their API
@joelv449524 күн бұрын
Ya “really fast” usually means it’s a lower quality model.
@KattKingston24 күн бұрын
@@joelv4495 We actually add delays to the streaming response in these chat apps to make it easier for users to read while it's being streamed in. I'm sure we could stream this data in almost instantly if we disabled that, but it's too jarring.
@dylanjonesSD24 күн бұрын
He's using the same model; it's faster because the web app is fast. Usually it's slow because the chat apps intentionally slow down the output so it always comes in smoothly and at about reading pace. This uses the same models, but the web app is faster.
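For anyone curious what that intentional pacing might look like, here is a minimal sketch of buffering streamed tokens and releasing them at roughly reading speed. The queue shape and the 30ms interval are assumptions for illustration, not how any particular chat app actually implements it:

```ts
// Sketch: buffer tokens as they arrive from the API and flush them at a fixed cadence,
// so text appears at roughly reading pace instead of in large uneven bursts.
const queue: string[] = [];

export function onTokenFromApi(token: string): void {
  queue.push(token); // network side: push tokens as fast as they arrive
}

// UI side: drain one token every 30ms (an arbitrary "reading pace" value).
export function startPacedRender(render: (text: string) => void): () => void {
  const timer = setInterval(() => {
    const next = queue.shift();
    if (next !== undefined) render(next);
  }, 30);
  return () => clearInterval(timer); // stop pacing, e.g. to dump the rest instantly
}
```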
@buildwithharshit24 күн бұрын
@@joelv4495 gemini 🤣
@TheAriznPremium24 күн бұрын
Seems like "latex" is not being rendered, but it's fast af. e.g. "The result of ( 89 \times 9 ) is ( 801 )."
@t3dotgg24 күн бұрын
Latex support is high on our list!
@laztheripper24 күн бұрын
We really needed another AI chat bot. Very innovative, thank you.
@JoRyGu24 күн бұрын
If the purpose of this was sort of a tech demo where the source was shared it would be really cool, but there are much higher quality products out there for the same price. Making speed the differentiating value proposition when something like perplexity provides smarter answers and only takes 1 second longer was certainly a choice.
@mantas982723 күн бұрын
@@JoRyGu 'but there are much higher quality products out there for the same price'... for $8 a month? You sure?
@laztheripper23 күн бұрын
@@mantas9827 He is correct. There's better for free. You can find faster, lower-reasoning models out there where you can just download the weights and run them on your own machine. Or you can use the ChatGPT free tier on lower settings, which is roughly similar in reasoning to the model he used.
@rodgetech23 күн бұрын
@@mantas9827 Yes!
@laztheripper23 күн бұрын
@@mantas9827 Yes, many pretrained models are available online to use, for free. Most of the big LLMs are free; you just pay for premium / more advanced functionality. Funnily enough, the model Theo uses isn't much better than the free tier of ChatGPT, and the answers are just as quick. You can compare this model to the more advanced ones like o1 and say o1 is slower, but that leaves out the fact that o1 produces much better and more accurate results.
@gamersruin23 күн бұрын
"You've reached the message limit. This often happens because your IP address is being used by others. Sign in to reset your message limit." I haven't sent a prompt yet...
@anttihilja23 күн бұрын
vpn?
@gamersruin22 күн бұрын
@@anttihilja nope. I had to clear my site cache to get it to work, even though it was my first time using the app. It's impressive performance, though.
@marchypolite24 күн бұрын
It's fast, but tbh I never really had a problem with the speed of AI chatbots; it's more the accuracy of the response I'm interested in
@Moksulini24 күн бұрын
"free"
@thinkbuddyai24 күн бұрын
it's great to have a maker like Theo join the AI chat market like us! we haven't seen a lot of proper products in a while, and i'm confident you'll go places - wishing you lots of success. the market is huge and there's plenty of room for all of us 🙌
@FrontPage9824 күн бұрын
what a cool self-promotion, you even supported the guy. btw i checked out your product too, it looks really nice. it's great to see such ethical people in the saas market. we were tired of seeing every new alternative saying 'X is dead, we're the kings now' or ruthlessly criticizing their competitors. a friendlier future is possible with such gestures
@olgucanylmaz205724 күн бұрын
@@FrontPage98 could not agree more. loved it!
@ИванРагозин-я8я24 күн бұрын
LOL
@pikavecordis505623 күн бұрын
@@FrontPage98 Self promotion on top of self promotion. Craaaazyyy
@yigidovic23 күн бұрын
hahaha cool
@noahfexpayton23 күн бұрын
The messages left aren't consistent. It just told me 3 left but then said 7 left.
@neofox252622 күн бұрын
Sounds like a bug you shouldn't report 😅
@krzysztofHD24 күн бұрын
Already broken... no response at all
@jakobhaywood22824 күн бұрын
it worked for me just now fine, fast indeed
@krzysztofHD24 күн бұрын
@@jakobhaywood228 works now... looks good...
@WereCatf24 күн бұрын
Oh god, I was hoping this was a joke, but it seems Theo has just fallen off the deep end.
@JoRyGu24 күн бұрын
Are you really surprised the guy that spends a significant percentage of his content shilling for various developer products is now trying to add an AI hype train revenue stream to his portfolio?
@WereCatf23 күн бұрын
@@JoRyGu It's less about being surprised and more about being...disappointed? Disgusted? Both?
@arotobo23 күн бұрын
@@JoRyGu it's not even a good hype train tho. At least allow API keys or add o1 and Gemini to the model selection.
@rli0923 күн бұрын
this is probably the worst copypasta I've seen in a while
@subhranshudas886223 күн бұрын
next theo video - T3 web AI IDE.
@Chris-cx6wl23 күн бұрын
6:40 bruh, none of that was slow at all. Did you really build this to fix a 200 ms spinner on screen? I wish I had that type of free time
@catalyst925922 күн бұрын
It is not about the interface, try getting a larger response back from the model
@Dank22 күн бұрын
@@catalyst9259 8:55
@luizgrocco23 күн бұрын
I got rate limited from the very first query I made, what might be going on?
@MightyArts23 күн бұрын
Isn't this just a wrapper application around ChatGPT...? :)
@AD-wg8ik21 күн бұрын
No, it's ChatGPT but faster
@Maxjoker9820 күн бұрын
@@AD-wg8ik No, I'm fairly certain that it just uses the ChatGPT APIs internally, since you can't use their models otherwise. The only thing making this "faster" is that it avoids the ChatGPT website's bad UI latency. I'm fairly certain the "lower bound" on response time is still the ChatGPT API.
@strawberrybeat755124 күн бұрын
uhm.. sending messages doesn't work on a mobile device.
@jon186723 күн бұрын
The thing I'm not understanding is that it seems like the limiting factor of AI stuff wouldn't be anything that local-first would solve. For the most part, the bottleneck in the UI feeling snappy is almost always the time it takes for the LLM to do what it does, and it also seems like connecting to that LLM over a websocket or whatever would be unnoticeably different in speed as far as updating the UI goes. I guess you could probably switch between chats really fast with local-first, but am I missing something here? Side note: I've only got 7 messages left on my free trial, which isn't enough for me to earnestly compare the actual speed at which the model runs to another platform.
@realOo7-v6r24 күн бұрын
I'm trying it from South Africa and it's not working; it's just eating the prompts and not giving any response
@realOo7-v6r14 сағат бұрын
Update: works now
@betweenbrackets23 күн бұрын
All this hype for a ChatGPT wrapper 😂 this is a weekend project for most, and you're charging people for it? At least provide an option to input our own API key instead of milking your junior dev audience with a glossed-over "product". Theo with yet another cash grab
@rodgetech23 күн бұрын
Easy to say
@mantas982723 күн бұрын
do you expect him to give you access to unlimited claude / gpt api calls for free?
@betweenbrackets23 күн бұрын
@@mantas9827 no? Hence the ability to input our own keys, where we can use it without him charging a margin on the calls? That would've been the right thing to do for his community
@卛20 күн бұрын
@@mantas9827 your reading comprehension is exactly why theo is releasing such a product. he can exploit people like you
@letsgoo488124 күн бұрын
bullshit , its slow as hell
@tzuilee58824 күн бұрын
do we really need another chat bot? doubt
@J3dotgg24 күн бұрын
It's not another chat bot, did you even watch the video?
@FrontPage9824 күн бұрын
I think the 'bring your own API key' model would still work! But I'm not sure about T3's all-inclusive model.
@elencantadordegatos24 күн бұрын
We don't
@hexxt_24 күн бұрын
on the front page, it trims the titles way too short and shows nothing when you hover. Also, there are no login or signup links on it, so you have to click on one of the chats first
@certainlyJesse24 күн бұрын
"ill leave it at a cost" hmm I dont belive you.
@thatanimeweirdo24 күн бұрын
Just tried this out, first answer was super fast, any new chat or following prompt never finished "loading". I was able to watch this full video, ask chatgpt and copilot before it finished, it's still going.
@ginocote22 күн бұрын
That's exactly what I thought: when the execution happens on the front end, the longer the discussion or response gets, the more my computer freezes and becomes practically unusable during the extended conversation.
@callofbrokendreams24 күн бұрын
The website is responsive, but the replies are really slow for me. I just asked for hello world in Rust and it's been a minute with nothing, but for a super random question (the capital city of Liberia) it answered while still searching for the answer to hello world in Rust. It still hasn't replied to that. I don't know if it's just me or something I messed up. Edit: maybe it's because my first question was about what its response to a restricted chat prompt would be, which got me flagged; later, after re-asking in a new chat to write a Rust program to print hello world, I got a message saying my chat was flagged and to please start a new chat, and then it finally started working. Edit 2: is the server under load? The reply time is super inconsistent; half the time it was working, then in the middle of a prompt it stopped, and then it was just a loading screen for my prompt again.
@Kampouse24 күн бұрын
Lmao I built the same thing in Qwik during Christmas and our designs are the same 😂😂😂
@carterstach603424 күн бұрын
Can you share the source?
@Kampouse24 күн бұрын
cant post link here
@Kampouse24 күн бұрын
@@carterstach6034 same username on the git site with "justchat"
@Kampouse24 күн бұрын
ugh
@Kampouse24 күн бұрын
@@carterstach6034 just search for me on the site, I have the same username and the repo name is pretty self-describing
@wisdomelue24 күн бұрын
didn’t name this ChatThing???, disappointed in theo
@FrontPage9824 күн бұрын
chaThing (the T is shared haha)
@Chrizzmeistah24 күн бұрын
why can't i login with the ai tools that i already have paid for?
@handle_unavailable39724 күн бұрын
gotta make money
@rasibn24 күн бұрын
They are self hosting their AI on azure I believe.
@FrontPage9824 күн бұрын
it is not possible; the only way is bringing your own API key, and ChatGPT Plus/Pro does not include an API key, you have to pay for it separately. There's no OAuth2 or anything where I can use my own ChatGPT subscription
@t3dotgg24 күн бұрын
Sorry for initial instability and high error rates! We got DDOS'd super hard and it maxed out our limits on the LLM hosts. Throwing a captcha and browser check in so they can't keep it up 🙃
@KattKingston24 күн бұрын
I've worked on two large AI chat apps. We faced constant attacks from users cycling through thousands of IPs to exploit the free service. Many also abused the paid services to resell access since it offered unlimited usage. To address this, we quietly downgraded suspected accounts from paid to free and displayed a message asking them to contact us. Good luck managing that kind of challenge. I know we were saving nearly $100k every month from these attackers after doing this.
@kledmohd42302 күн бұрын
@@KattKingston What are the chat apps you guys built?
@mahmoudmaher413824 күн бұрын
Application error 😅😅😅
@QuintenCoret24 күн бұрын
> title says it's free > shills 8 dollar pro tier within the first minute
@cipher0124 күн бұрын
I mean why not
@modernkennnern24 күн бұрын
Also, no matter the question, you always get "Our systems have flagged your message or the generated response. Please try again with a different prompt." as a response. You got that response fast though, so he's not lying :|
@insomnicc121524 күн бұрын
it is free, you just pay if you need it a bunch. He has to make it paid since it costs API credits
@z_096824 күн бұрын
Both are true... That Vercel bill has to be paid.
@greenstonegecko24 күн бұрын
You can use 50 prompts per day for free. Unfortunately we don't live in an infinite money glitch. For future reference, you should never think something is free, even if it says so.
@iangriggs23 күн бұрын
Excited for this app, the local-first approach and the features you are planning to build. Thanks Theo.
@mikee.22 күн бұрын
Of course it's fast, it's a 100-line GPT wrapper? It's not really much faster than my self-hosted LibreChat instance, which has waaaayy more features and runs on an Oracle free-tier server...
@Haphazardhero23 күн бұрын
I really appreciate the goal here to push the limit of what can be done to improve the UX of these popular apps. Somehow this has become a bit of a lost art, and I'm really happy every time I see this level of detail in providing the best experience possible. I'm also really happy to see it built with React Router. I'm sure that team is happy too.
@Ximaz-13 күн бұрын
I think I'm biased, as I prefer a slow LLM chat experience. The newest GPT is slow to answer, whereas the older ones are quite a bit faster. I believe I associated slowness with quality: the slower the better, thinking the LLM is ACTUALLY analyzing my question and looking for an answer. Now I know I'm wrong, thanks to your video.
@dazecm22 күн бұрын
Interesting question at 7:11 and explanation from Theo. Adding to what Theo said, you can also differentiate your product in business areas ancillary to product features, such as how well you treat your customers, be that via customer support quality, pricing practices, and general pro-consumer behaviours. It won't invariably be instant death for a competing product, because you aren't solely competing on feature set and performance.
@NewtGQ24 күн бұрын
I don't understand the criticism. "If X company solves this problem, isn't this product dead?" Yes, but they probably won't. "If Mac ran DirectX games on bare metal, wouldn't Windows be dead?" Like, yeah, but they won't, so no.
@vertopolkaLF24 күн бұрын
@@mehdi-idham he could have had this comment ready since the livestream and just been waiting for the video to go live. Also, 2x watch speed exists, so it's realistic to have this question after 5 mins
@soroushjm101124 күн бұрын
Windows is better 😌
@asi6942023 күн бұрын
Do you really think the companies behind ChatGPT or Claude won't solve the speed problem and make it faster 😂😂😂
@NewtGQ23 күн бұрын
@@asi69420 No, they don't care. They want to minimize costs and maximize features. It's like NVIDIA Founders Edition cards. They aren't the best cards for performance and cooling, because NVIDIA spends most of their time actually designing the chip inside the card instead of the housing.
@NewtGQКүн бұрын
@ they won’t prioritize it, no
@CaptainLungi22 күн бұрын
wow, making the frontend experience butter-smooth and fast, let's go
@robertpalm85922 күн бұрын
Amazing tool 🤩 Now Theo's video transcripts can be summarized even faster, no need to watch them in full. ... Kidding but not kidding.
@mdrafatsiddiqui24 күн бұрын
trying from india. not that fast
@danialbka779021 күн бұрын
the different model selection and speed is cool
@mgetommy21 күн бұрын
branching chat would be a legit improvement on the experience. have thought about this a lot
@juanandresmezzera930423 күн бұрын
Amazingly fast! It would be a nice addition if the system were self-aware. The only query about itself that it was able to answer was `what are you?`. I tried, for example, `what is your free quota?` and it went directly to a general response. I understand this is not the point of the app, but I'm thinking out loud here. Definitely going to keep using it. Thank you for the experience you built
@alexholker130923 күн бұрын
I also asked this type of question and got a response, but I asked about "T3 Chat" and not "you".
@benheidemann383623 күн бұрын
Hey Theo, this looks really cool. I agree a desktop app (and mobile app like Claude) would be great, as would branching chats. Been wanting both of these for a long time! In addition, I’d love the ability to select one or more files from my PC as context for the chat. Ideally also I’d love to select specific lines from files.
@puremajik24 күн бұрын
Hi, I love the app! Great speed! I'd like to know why you went with Dexie compared to the other best-in-class options: Replicache, Zero Alpha, ElectricSQL, PowerSync, Readyset. You asked for suggestions in your video, so here are my 2 suggestions:
Suggestion 1: Create a feature that analyzes past threads with the AI, which would identify:
- Initial missing information that was later crucial
- Common clarification patterns
- Task-specific requirements
- The difference between necessary exploration iterations and avoidable clarification iterations
Then generate recommendations for:
- User prompting strategies (how to frame requests better)
- System prompt modifications (what default requirements/expectations should be built in)
And create task-specific prompt templates that:
- Pre-emptively request commonly needed information
- Set appropriate expectations for iteration
- Include task-specific context requirements
- Account for different interaction needs (creative vs analytical tasks)
Suggestion 2: A community prompt library with prompts contributed by users (system or user prompts) that can be upvoted or downvoted for specific tasks.
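For readers wondering what the Dexie side of a local-first chat could look like, here is a minimal sketch of storing messages in IndexedDB via Dexie so reads never wait on the network. The table names and fields are invented for illustration and say nothing about how T3 Chat is actually structured:

```ts
import Dexie, { type Table } from "dexie";

interface Message {
  id?: number;          // auto-incremented by Dexie
  threadId: string;
  role: "user" | "assistant";
  content: string;
  createdAt: number;    // epoch ms
}

class ChatDB extends Dexie {
  messages!: Table<Message, number>;

  constructor() {
    super("chat-db");
    this.version(1).stores({
      // '++id' = auto-increment primary key; 'threadId' and 'createdAt' are indexed
      messages: "++id, threadId, createdAt",
    });
  }
}

export const db = new ChatDB();

// Reads hit IndexedDB directly, so the UI can render a thread instantly and
// reconcile with the server sync layer in the background.
export async function loadThread(threadId: string): Promise<Message[]> {
  return db.messages.where("threadId").equals(threadId).sortBy("createdAt");
}
```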
@KiKaraage24 күн бұрын
Great suggestions, deserved to go higher
@daniellchukwu23 күн бұрын
Wow. Words can't describe how helpful this comment has been for me. Been deep diving into all the alternative options u mentioned and i am absolutely blown away. Thank you and God bless 🙏
@jordy44223 күн бұрын
Theo turning into a philanthropist. Can’t believe we got dark mode for free
@kshitijk1423 күн бұрын
Pushing the others, good shit fr
@spreen_co24 күн бұрын
you did not build your own chatbot (as the title implies) you built a web UI for existing chatbots
@hansiboy534824 күн бұрын
I hope 2025 is the year of SPA and offline first / sync layer
@ИванРагозин-я8я24 күн бұрын
we need more GPT wrappers
@amine723 күн бұрын
I'm sorry, but literally no one cares about a loading spinner or the speed of text chunks if you're eventually getting the best results. The claude part is stupid at best. Also your app is currently broken.
@stoef24 күн бұрын
I would love to have markdown-esque prompting similar to Claude. Having a code block shown in monospace, and maybe even syntax highlighted, makes it way easier to parse your own inputs imo.
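A minimal sketch of that idea, assuming the `marked` and `highlight.js` packages (this is one illustrative way to render a prompt preview, not how Claude or T3 Chat do it, and a real app should sanitize the HTML before inserting it):

```ts
// Sketch: render the user's own prompt as markdown so fenced code blocks show up
// in monospace with syntax highlighting.
import { marked } from "marked";
import hljs from "highlight.js";

export function renderPromptPreview(source: string, container: HTMLElement): void {
  // marked.parse is synchronous unless the async option is enabled
  container.innerHTML = marked.parse(source) as string;

  // Highlight every fenced code block produced by the markdown renderer
  container.querySelectorAll<HTMLElement>("pre code").forEach((block) => {
    hljs.highlightElement(block);
  });
}
```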
@dogoku24 күн бұрын
Just make it a PWA, no point making it a separate app
@KiKaraage24 күн бұрын
It probably will be anyway
@Leto2ndAtreides24 күн бұрын
May as well add the experimental Google models... 'cus they're free. Also, it's not even been a day and most of the models are in the Pro tier. lol
@Nil-js4bf21 күн бұрын
The optimizations here are probably out of my league, but I would love to hear more technical details about what kind of challenges there are and how you solved them. Of course, given that it's a paid product, I don't expect a super deep explanation.
@andybourgeoisinfo23 күн бұрын
User functions would be very nice. On the desktop app, an "open Raycast deeplink" function would let us automate almost anything. Make the desktop app fully automatable so we can create crazy workflows
@kstash356424 күн бұрын
Genuine question: why use Next at all if you want to do routing a different way?
@t3dotgg24 күн бұрын
Server bundling, static gen for static pages, simple deployment, streaming baked-in, server actions and the ecosystem of next-specific tools that are super handy
@kstash356424 күн бұрын
@@t3dotgg gotcha. Thank you ❤
@henryvaneyk376923 күн бұрын
I asked Copilot to build me a Deno 2 REST API using Fastify and Zod as a starting point for a CRUD service. The result was a disaster that did not even build. At this point I am very disappointed with AI. I think most SWE jobs are safe for the foreseeable future.
@boredbytrash23 күн бұрын
I work with LLMs every day and you’re absolutely right. It needs too much handholding to be any competition to developers, at this moment at least
@neofox252622 күн бұрын
Yeah with new frameworks it just has no idea
@CyberTechBits23 күн бұрын
SPA & PWA are the way! They feel soooo much better and are so much faster! If you build it right they will come!
@dustindustir52117 күн бұрын
Features I need are reruns and branching in the chat history. I need to be able to edit my own prompts up in the chat history. I like the way Google AI Studio does it.
@datalabwork17 күн бұрын
What about file uploads, such as PDFs, like in ChatGPT Pro?
@SanderCokart24 күн бұрын
I'd love LLMs, and more specifically local ones in IDEs or editors, to be able to stay up to date with the docs of your framework of choice and with modern features and patterns. Like v0 for Next.js, but one that's able to understand your code base.
@bourgtai23 күн бұрын
Chat: This is nothing but UX 😒 Apple (in its prime): No shit, sherlock.
@randyproctor392320 күн бұрын
Oh man I saw Dexie in one of your VS Code tabs in a recent video and I was curious!
@Justgoodvids18 күн бұрын
This looks great.
Can you add a non-AI chat mode so I can store notes?
Can you add the list of apps you've worked on to the T3 website?
The default chats blank out when you revisit the chat website
@Eva-km5ng20 күн бұрын
You will need to remove the navigate-on-mousedown behavior if you want to properly support mobile. Press and hold is how you right-click on links on mobile
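A hedged sketch of the trade-off being described: react early on pointerdown (e.g. to warm a route) but only navigate on the completed click, so a mobile long-press still opens the link context menu. The component is a generic illustration, not T3 Chat's actual code, and the `react-router` import path may differ depending on the router version:

```tsx
import type { ReactNode } from "react";
import { useNavigate } from "react-router";

// Sketch: keep the perceived speed of reacting on pointerdown while deferring
// navigation to the click event, so press-and-hold on mobile is not hijacked.
export function FastLink({ to, children }: { to: string; children: ReactNode }) {
  const navigate = useNavigate();
  return (
    <a
      href={to}
      onPointerDown={() => {
        // start cheap preparatory work early (placeholder: prefetch/warm a cache)
      }}
      onClick={(e) => {
        e.preventDefault(); // navigate only on a completed click/tap
        navigate(to);
      }}
    >
      {children}
    </a>
  );
}
```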
@Tymon000024 күн бұрын
I just said hi and got: "Our model provider flagged this prompt ☹ please start a new chat with a different prompt."
@mikoajpaterek847523 күн бұрын
Really great job, would be great to contribute to it 😮😊
@no_the_other_ariksquad24 күн бұрын
You got me on laravel bro
@sohambhattacharjee831424 күн бұрын
at this point, theo can rewrite next because he of all people knows where it sucks
@additionaddict552422 күн бұрын
oh please go into details on the sync, really love those kind of things
@additionaddict552422 күн бұрын
also what the heck, theo shilling local first???
@EhabSamir-nn7le23 күн бұрын
I'm trying it now and it's amazing
@eloitorrents243922 күн бұрын
why not build an extension that improves chatgpt or claude UX?
@sumitpurohit884922 күн бұрын
Instead of making an app, you should've made a Lucia-style guide to making an AI app faster and its experience better. I think you can still do this right now, and this app will serve as a demo of that guide.
@hexxt_24 күн бұрын
surprisingly fast, good job
@LorenzSascha24 күн бұрын
I'm sorry. Theo said in the video that the Pro tier includes 500 messages/week, but the site claims Pro offers "unlimited" use of any model. So, what is actually included?
@t3dotgg24 күн бұрын
Site has been updated
@FrontPage9824 күн бұрын
can I send 100K tokens in 500 messages for the most expensive model? Any token limits?
@audiecaceres-723 күн бұрын
Im using this from now on!
@georgekrax24 күн бұрын
Add memory storage between conversations and the ability to organize into folders and list favourite chats, and I will buy it!
@bankmanager24 күн бұрын
LM Studio might be a good free alternative for you in the meantime.
@neofox252622 күн бұрын
@@bankmanager What about AnythingLLM?
@brandon649023 күн бұрын
Product 101: solve actual problems
@guxit22 күн бұрын
6:44, that loading state is what you call "super slow to navigate"? It was on screen for literally 13 frames. Theo has extremely high standards. While I wholeheartedly disagree, this is no doubt a cool project from a tech perspective!
@solomonakinbiyi23 күн бұрын
and before we go any further **here's a video from our sponsor**. it didn't happen but it played in my head, lol
@elwan-l124 күн бұрын
The first thing i tried was writing "cheesburger", and it got flagged immediately
@samithseu24 күн бұрын
incredibly fast, well done sir :)
@ustav_o23 күн бұрын
if you figure out a way to do branching, I'm sure every other app in existence will also do it, because it sounds amazing
@HeyNoah23 күн бұрын
This is awesome! Will "Projects" be like folders basically? Wish I could have folders to organize chats at least 2 levels deep.
@joshwhaley365923 күн бұрын
I'd be interested in hearing more about how your dexie code ends up. I made my own sync approach with websockets and API Gateway syncing to DynamoDB and had such a frustrating time getting things running smoothly.
@PedroTechnologies23 күн бұрын
Insane app Theo 🔥 Congrats!!
@sierragutenberg24 күн бұрын
Hey, why don't you make it a PWA?
@craxkerjack605724 күн бұрын
Bug: using a non-registered account, I asked for a long response, then got IP limited. I went to register (auth was slow, from Indonesia), but it didn't continue to the last chat; after clicking, it didn't answer my last question, nor was there an option to retry, so I had to retype it again. Maybe you need a "found a bug" tab to make development faster. Enjoyed it though 🎉
@peca12332122 күн бұрын
Can you please add folder functionality for chats?