Impressive, very nice. Let's see Paul Allen's optimization.
@joshtriedcoding 1 year ago
dude I really appreciate you commenting so often. Mhmm very nice, it has such an impressive THICCNESS to it
@BeyondLegendary 1 year ago
Your compliment was sufficient, Josh.
@semyaza555 1 year ago
@@BeyondLegendary Lmfao
@joshtriedcoding 1 year ago
@@BeyondLegendary hahaha
@DontFollowZim 1 year ago
Edge servers tend to be weaker virtual machines. Their advantage is proximity, but the weaker hardware can become a significant factor if your requests are taking 200ms+. It sounds like there's either a decent amount of compute happening, which would make the weaker machines more noticeable, or a decent amount of network I/O, which could be related to what you were showing with the distance between the servers and the DB.
@joshtriedcoding 1 year ago
oh yeah there's definitely some compute, this is nowhere near an empty API route. Just the relative differences between edge and non-edge were a bit surprising to me
@filipkovac767 1 year ago
nearly by half -> nearly 40% -> in the end 34%. Just tell us the truth even if it doesn't sound so flashy 📸
@joshtriedcoding 1 year ago
fair point
@chiblitheone 1 year ago
Maybe the extra delay is because Edge Functions run on Cloudflare while regular Serverless Functions run on AWS. And with both Upstash and PlanetScale running primarily on AWS, the connection inside AWS might be faster.
@kavindesivalli 1 year ago
Oh damn... another day, another new thing I'm learning from you 👍💪
@joshtriedcoding 1 year ago
cheers Kavin!
@MikeNugget 1 year ago
Edge adds an additional layer: extra routing, plus the provider's stack and network specifics. Providers also do analytics, collect telemetry, and a bunch of other things that can slow down the request.
@mateja176 1 year ago
Conceptually, it acts like a DB transaction. Alternatively, in some cases it would be possible to conditionally merge the operations.
@TheTmLev 1 year ago
What you're calling "blocking" requests are not actually blocking, since you use async/await. The correct term is "sequential".
@joshtriedcoding 1 year ago
blocking means the client usually waits and doesn't do anything else until it receives the server response; doesn't that block the process/thread reading the response? Either way, sequential sounds like a good term to describe this
@TheTmLev 1 year ago
@@joshtriedcoding `await` doesn't block the thread, quite the opposite: it allows the thread to process other Promises in the meantime.
@11r3start11 1 year ago
@@joshtriedcoding blocking in multithreading means something that blocks a thread and makes it unusable entirely. await waits for the result but doesn't block, which is the main difference. I'd avoid "blocking" terminology in this case, as none of the threads were blocked.
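The distinction being argued here can be demonstrated in a few lines of plain Node/TypeScript. This is a hypothetical sketch with simulated 100ms "requests"; the point is that `await` yields the thread, and that awaiting sequentially adds latencies while `Promise.all` overlaps them:

```typescript
// Simulated network call: resolves after `ms` milliseconds without
// blocking the thread -- the event loop keeps running in the meantime.
const fakeRequest = (ms: number): Promise<void> =>
  new Promise((resolve) => setTimeout(resolve, ms));

// Sequential: the second request starts only after the first resolves,
// so the latencies add up.
async function sequential(): Promise<number> {
  const start = Date.now();
  await fakeRequest(100);
  await fakeRequest(100);
  return Date.now() - start; // roughly 200ms
}

// Concurrent: both requests are in flight at once; Promise.all
// resolves when the slower one finishes.
async function concurrent(): Promise<number> {
  const start = Date.now();
  await Promise.all([fakeRequest(100), fakeRequest(100)]);
  return Date.now() - start; // roughly 100ms
}

async function main() {
  console.log("sequential:", await sequential(), "ms");
  console.log("concurrent:", await concurrent(), "ms");
}

main();
```

Neither version blocks the thread, which supports the "sequential, not blocking" framing: while the timers are pending, the event loop is free to run other callbacks.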
@sjain07 1 year ago
You can configure the Vercel edge locations; then they will definitely be faster than serverless
@OryginTech 1 year ago
I'm confused, isn't the downside of this that you'd be making unnecessary calls if the 2nd func only needs to be executed conditionally? Yes, the route takes less time now, but if you have a very expensive func, you'd be running it constantly.
@joshtriedcoding 1 year ago
if the condition isn't met, the command will not be added to the pipeline and will not be executed
@OryginTech 1 year ago
@@joshtriedcoding maybe I'm missing something, but according to your diagram, doesn't it still need to wait for the first command to return before running the second? So how is it different from awaiting the command?
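The answer hinges on where the condition is evaluated. A pipeline queues commands client-side without awaiting anything; only `exec()` touches the network, so all queued commands go out in one round trip. Below is a toy, in-memory model of that behavior (the `Pipeline` class is made up for illustration; real clients such as `@upstash/redis` expose a similar `pipeline()...exec()` API):

```typescript
// Toy model of a Redis pipeline: a Map stands in for the server.
// Queuing a command is local and instant; exec() is the single
// "round trip" that runs everything back to back.
type Store = Map<string, number>;

class Pipeline {
  private queue: Array<(store: Store) => number> = [];

  constructor(private store: Store) {}

  incr(key: string): this {
    // Nothing is awaited here -- the command is just queued locally.
    this.queue.push((store) => {
      const next = (store.get(key) ?? 0) + 1;
      store.set(key, next);
      return next;
    });
    return this;
  }

  // One "network hop": every queued command executes, results in order.
  exec(): number[] {
    const results = this.queue.map((cmd) => cmd(this.store));
    this.queue = [];
    return results;
  }
}

const store: Store = new Map();
const pipeline = new Pipeline(store);

pipeline.incr("views");
const trackUniques = true; // condition known BEFORE the round trip
if (trackUniques) pipeline.incr("unique-views");

console.log(pipeline.exec()); // [1, 1]
```

The caveat raised in this thread is real, though: this only works when the condition depends on local state you already have. If the decision to run command 2 depends on the result of command 1, you're back to two sequential round trips.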
@akhilscamp1905 1 year ago
Any such feature in axios? I guess we could only opt for SWR for such kinds of optimisations
@shivanshubisht 1 year ago
use Vercel's regional edge, which will only use edge workers near your database region
@outroddet 1 year ago
Hey, is programming your daily job, or what do you do for a living?
@arnhazra 1 year ago
Hey Josh, I am using MongoDB with Next.js and it's super slow, 15-20 seconds per request. The same API takes only 500ms in Express or NestJS
@joshtriedcoding 1 year ago
oh wooow, it should not take 15-20 seconds
@eliaswennerlund7581 1 year ago
I don't know the specifics of your code, but something I encountered when using MongoDB and Next.js was that a new connection to the database was initialized on every incoming request. I fixed this by caching the connection. I also believe the type of runtime may cause the same thing to happen, since some runtimes don't support long-lived connections.
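The connection-caching fix described above boils down to memoizing the connect promise in module scope so every handler invocation reuses one connection. Here is a self-contained sketch of the pattern; `connectToDb` is a stand-in for a real call like `mongoose.connect()`:

```typescript
// Stand-in for an expensive connect (TCP + auth handshake). The counter
// lets us observe how many times it actually runs.
let connectionCount = 0;

async function connectToDb(): Promise<{ id: number }> {
  connectionCount += 1;
  return { id: connectionCount };
}

// Cache the *promise*, not the resolved value, so concurrent callers
// that arrive before the first connect finishes still share it.
let cached: Promise<{ id: number }> | null = null;

function getDb(): Promise<{ id: number }> {
  if (!cached) cached = connectToDb();
  return cached;
}

async function main() {
  const [a, b, c] = await Promise.all([getDb(), getDb(), getDb()]);
  console.log(connectionCount);   // 1 -- all three callers share it
  console.log(a === b && b === c); // true
}

main();
```

In a Next.js dev setup the cache is often stashed on `globalThis` as well, so it also survives hot-module reloads, which is a common variant of this same pattern.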
@miguderp 1 year ago
Check your functions' location; I believe by default it's set to Washington. You can find that under Settings > Functions on your Vercel project page
@PwrXenon 1 year ago
Standard Next.js user
@arnhazra 1 year ago
@@miguderp No, I have set it to the nearest region, and I'm also talking about running it locally.
@Gerrilicious 1 year ago
Very informative video, thanks for that!
@obinnaee868 5 months ago
How do I do this in Spring Boot?
@TheIpicon 1 year ago
actually, Theo has a video answering my question on stream about this exact topic (when edge was just introduced). He said that when he recommends using the edge, he's talking about the RUNTIME, not the location, because of the same issue you figured out yourself. You can configure Vercel's edge to only run in a specific region (which you'll want next to your DB), but still use the good, fast edge runtime. Here's the video I'm referencing: kzbin.info/www/bejne/i4HSkIuXncqYZ8k (I'm so hyped about it because it was the first time he noticed me on stream 😆)
@TheIpicon 1 year ago
great job btw figuring it out on your own
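For reference, the "edge runtime pinned near the DB" setup described in this thread can be expressed in the Next.js App Router via route segment config exports. This is a minimal sketch; the file path and region ID are examples, and you'd substitute the region your database actually lives in:

```typescript
// app/api/example/route.ts (hypothetical route)
export const runtime = "edge";          // use the fast edge runtime...
export const preferredRegion = "fra1";  // ...but pin it near your DB (example region)

export async function GET(): Promise<Response> {
  return Response.json({ ok: true });
}
```

This keeps the cold-start and runtime benefits of edge while avoiding the long function-to-database hops measured in the video.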
@koustavmaity-fh3gx 1 year ago
can you make a complete next-auth tutorial video, from basic to advanced level?
@babayaga6172 1 year ago
Nice. Can you please make a video on how to handle caching and cache invalidation in a large relational database, and how to set up keys with Prisma and Redis?
@dogfrogfog 1 year ago
why did you decide to use Redis for this project?
@hafidselbi2497 1 year ago
great question 👍
@joshtriedcoding 1 year ago
cause it's fast
@11r3start11 1 year ago
for this scenario it seems scalable, fast, and popular. But I'd say it's quite a misuse, and something event-driven and/or actor-based would be more suitable)
@joshtriedcoding 1 year ago
@@11r3start11 The built-in TTL is super handy, it's fast because it's in-memory, and beyond key-value pairs and some simple hashes there are no complex relations. Not sure what you mean by misuse
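The built-in TTL mentioned here is what Redis gives you with `SET key value EX seconds` (or a later `EXPIRE`): entries vanish on their own, with no cleanup job. A toy in-memory version, with an injectable clock so the behavior is easy to see, might look like this (the `TtlCache` class is hypothetical, purely for illustration):

```typescript
// Toy TTL cache sketching the expiry behavior Redis provides natively.
class TtlCache<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>();

  set(key: string, value: V, ttlMs: number, now: number = Date.now()): void {
    this.entries.set(key, { value, expiresAt: now + ttlMs });
  }

  get(key: string, now: number = Date.now()): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (now >= entry.expiresAt) {
      this.entries.delete(key); // lazy eviction on read
      return undefined;
    }
    return entry.value;
  }
}

const cache = new TtlCache<string>();
const t0 = 0;
cache.set("session", "abc", 1000, t0);
console.log(cache.get("session", t0 + 500));  // "abc" -- still fresh
console.log(cache.get("session", t0 + 1500)); // undefined -- expired
```

Redis does this server-side (with active expiry sweeps in addition to lazy eviction), which is one reason it's a natural fit for sessions, rate limits, and short-lived counters.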
@Chris-zt4ol 1 year ago
Imagine Prisma had that
@benji9325 1 year ago
But 200+ms is still slow tho..
@null_spacex 1 year ago
What's your point?
@joshtriedcoding 1 year ago
depends on the calculations you do in the API route
@developer_hadi 1 year ago
How can I do that in Mongoose? 🤓
@breakinggood-r2v 1 year ago
is this a course or are you building your own website/project?
@wasd3108 1 year ago
wait, you're gonna tell me that parallel requests will be faster than sequential ones? NO WAAAAAAAAAAAAY
@breakinggood-r2v 1 year ago
You look like Foden, the football player
@luckypius132 1 year ago
@joshtriedcoding 1 year ago
first it was Kevin De Bruyne and now this
@berniko4954 1 year ago
I'm too lazy to watch the full video, but I want to increase API speed
@CallousCoder 1 year ago
Stop using a baby language and use Rust or C++ and you'll get a 100-200% speed increase. We systems developers are like: 250ms? pffff, kill off the API, guys, this adds too much overhead.
@joshtriedcoding 1 year ago
🤡
@CallousCoder 1 year ago
@@joshtriedcoding Yeah, it always amuses me when I hear JavaScript and Python devs talk about performance, when their initial choice of those languages for a backend should be at the very least eyebrow-raising. And they are ugly, large, bloated languages. I like small, lean, mean languages; they are also more robust
@StingSting844 1 year ago
This could have been a short. You just batched requests together using a Redis pipeline. It's not a trick but a common practice in all products. Disappointed with the clickbait 😞