A great example showing that choosing the right technology won't automatically make a website fast. You still have to write good code.
@Z4KIUS14 күн бұрын
good code in bad tech is often faster than bad code in good tech
@JohnSmith-op7ls14 күн бұрын
@@Z4KIUS Trivial performance gains like this rarely matter to begin with. Spend your time addressing issues that cost real money or adding features that make it. Chasing tiny page load speeds is just mindless busywork.
@Z4KIUS14 күн бұрын
@@JohnSmith-op7ls good feature beats minuscule speed improvements, but big speed regressions at some point beat any features
@JohnSmith-op7ls14 күн бұрын
@ But this isn’t about addressing relevant performance issues, it’s about pointlessly squeezing out a bit more, in a contrived demo, just for the sake of it.
@ulrich-tonmoy14 күн бұрын
Why not make these features part of the framework instead?
@alexmortensen690114 күн бұрын
Interestingly enough, the edge McMaster has is not that their website is so insanely fast, it's that everything you order will be delivered to your company in a couple of hours. So if you think the page loading is fast, check out their delivery, lol
@drooplug14 күн бұрын
Better than amazon!
@thewhitefalcon853913 күн бұрын
Their other edge is that they have every conceivable product. They are all around a premium quality service with high prices to match. When you need something specific, fast, to exact specifications and perfect every time, you use this company. When price matters more, you try your luck on Ali.
@drooplug13 күн бұрын
@thewhitefalcon8539 I'll say they have a massive selection, but I often do not find what I am looking for there.
@Ginto_O9 күн бұрын
A couple of hours for delivery is quite slow for Russia. The delivery here is usually 15 to 30 minutes
@allenklingsporn69938 күн бұрын
@@Ginto_O You clearly don't live in a rural area of Russia. McMaster delivers ANYWHERE in the Continental US that fast.
@elephunk689813 күн бұрын
Worked at McMaster for a few years. This kind of glosses over how we’re also able to perfectly sort/filter/and serve up data on over a half million different part numbers. There’s a looooot of stuff going on in the backend for this
@t3dotgg13 күн бұрын
It’s very very impressive stuff, especially for how long it’s existed and worked. I wish more of that info was public so I could have talked in depth about it 🙃
@drooplug14 күн бұрын
I like how theo thinks McMaster's competitive edge is their website and not that they knock on your door with your parts 3 minutes after you complete the order. 😄
@tsunami8703 күн бұрын
I live right next to a warehouse so for me it's more like 1 minute 😂
@mbainrot15 күн бұрын
The craziest shit with McMasterCarr though... is that it's even fast for Australians. And we can't even buy shit from them without shenanigans
@bugged121215 күн бұрын
No one cares about Australia, it's irrelevant to world affairs. Shoo.
@DanielCouper-vf5zh14 күн бұрын
I've used this as my go-to pat response to "can you give me an example of good web design/UX/UI" in interviews for years; it's great that it's getting attention now 🎉
@allenklingsporn69938 күн бұрын
McMaster-Carr, shouldering the weight of America's industrial might since 1901.
@rikschaaf14 күн бұрын
13:45 prefetching is great! When I started experimenting with HTMX, I immediately turned that on there as well (it supports both on mouse down and on hover, depending on your preferences). Great to see that next.js also supports it.
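A minimal, framework-free sketch of the hover prefetching described above, using a `mouseover` listener and an injected `<link rel="prefetch">`. This is an assumption-laden illustration, not how Next.js's `<Link>` or HTMX's preload extension is actually implemented; you could swap `"mouseover"` for `"mousedown"` to trade earliness for confidence.

```ts
// Hover-prefetch sketch: when the pointer moves over an internal link,
// add a <link rel="prefetch"> so the browser fetches the page in the
// background; a later click is then served from the HTTP cache.
const prefetched = new Set<string>();

function prefetch(url: string): void {
  if (prefetched.has(url)) return; // only prefetch each URL once
  prefetched.add(url);
  const link = document.createElement("link");
  link.rel = "prefetch";
  link.href = url;
  document.head.appendChild(link);
}

document.addEventListener("mouseover", (event) => {
  const anchor = (event.target as Element | null)?.closest("a[href]");
  if (!(anchor instanceof HTMLAnchorElement)) return;
  if (anchor.origin !== location.origin) return; // same-origin pages only
  prefetch(anchor.href);
});
```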
@alexmortensen690114 күн бұрын
As an engineer, McMaster is the greatest website known to man
@AndreiLiubinskiКүн бұрын
3:57 >> that's pretty nuts. Yes, those are pretty nuts. And bolts
@ChaseFreedomMusician14 күн бұрын
So one of the things you seemed to miss was that this is a classic .NET 4.5 ASP.NET website, so the tech for this is about 15 years old. All that JavaScript at 4:45 is auto-generated. The code behind the page is much simpler.
@brileecart11 күн бұрын
As a purchase manager that orders from McMaster CONSTANTLY, it's wild to me every time their website gets talked about. Worlds colliding or something lol
@spageen15 күн бұрын
McMaster-Carr Speedrun (100%, glitchless)
@hqcart114 күн бұрын
jquery baby, oh yaah, yo heard me right
@eddie_dane15 күн бұрын
25:32 Good sir, everything here is magical if you think back to the days of vanilla JS and jQuery, but I get your point.
@MikkoRantalainen14 күн бұрын
The real magic: accept 2h delay for every change and you can cache *everything* for 2h.
@PraiseYeezus14 күн бұрын
the realer magic: 300ms delay for every change and caching things only after they're requested
@xanderplayz344614 күн бұрын
@@PraiseYeezusrealest magic: cache everything in the browser indexedb, and store a hash, so when the hash sent from the server to the client is different, the client downloads everything over again
@xiaoluwang736710 күн бұрын
@@PraiseYeezus is this actually how McMaster works??
@PraiseYeezus10 күн бұрын
@@xiaoluwang7367 no that's how Vercel's infra works
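A rough sketch of the two caching strategies discussed in this thread (cache everything for a fixed window vs. cache on request and refresh in the background), expressed as CDN-style Cache-Control headers on a plain Node server. The routes and numbers are illustrative assumptions; this is not a claim about McMaster's or Vercel's actual setup.

```ts
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.url?.startsWith("/catalog/")) {
    // "Accept a 2h delay for every change": any shared cache may serve this
    // response for up to 2 hours without going back to the origin.
    res.setHeader("Cache-Control", "public, s-maxage=7200");
  } else {
    // "Cache only after it's requested, tolerate a small delay": serve the
    // cached copy immediately and refresh it in the background once stale.
    res.setHeader("Cache-Control", "public, s-maxage=1, stale-while-revalidate=300");
  }
  res.end("<html><!-- rendered page goes here --></html>");
});

server.listen(3000);
```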
@juanenriquesegebre88736 күн бұрын
Love how at 3:57 he goes off topic to compliment how pretty the nuts on this website are.
@m4lwElrohir15 күн бұрын
except for image flickering, pretty smooth UX
@JenuelGanawed14 күн бұрын
This is really good to implement in an ecommerce website... it makes shopping online really, really fast.
@joshblevinswebengineer14 күн бұрын
This is an interesting intersection between web development and ux. The site has amazing ux and software engineering.
@nemopeti14 күн бұрын
What about server/CDN and network costs for this amount of prefetching? How does it work on mobile clients, where there is no hover event?
@shirkit14 күн бұрын
There's no free lunch; the original project does the same. If you're worried about that, you can choose not to preload the images and prefetch only the HTML content. For my company, the cost of the extra traffic is easily covered by the improved user experience. On mobile, you can track the viewport and prefetch once an item has been visible for a certain amount of time, or use some other metric (you'd need to research your particular use case), or skip the images and prefetch only the HTML for everything. The trade-offs are always there.
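A sketch of the mobile-friendly variant described above: instead of hover, prefetch a link once it has stayed in the viewport for a short while. The 300ms dwell time, the `prefetch()` helper, and the internal-link selector are arbitrary choices for illustration.

```ts
const timers = new Map<Element, number>();

function prefetch(url: string): void {
  const link = document.createElement("link");
  link.rel = "prefetch";
  link.href = url;
  document.head.appendChild(link);
}

const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    const anchor = entry.target as HTMLAnchorElement;
    if (entry.isIntersecting) {
      // Only prefetch if the link stays visible for ~300ms (skip fast scrolls).
      timers.set(anchor, window.setTimeout(() => prefetch(anchor.href), 300));
    } else {
      const timer = timers.get(anchor);
      if (timer !== undefined) window.clearTimeout(timer);
      timers.delete(anchor);
    }
  }
});

document
  .querySelectorAll<HTMLAnchorElement>("a[href^='/']")
  .forEach((a) => observer.observe(a));
```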
@ibnurasikh14 күн бұрын
It's an eCommerce site, so network and bandwidth costs are very very low compared to the revenue generated from sales. However, load speed is crucial. I've seen a 30% drop in CTR/visitors when my website's page load time is slow.
@theexploderofworlds38559 күн бұрын
Used to work in a machine shop and would pick up hardware from one of their warehouses regularly. Great customer service and hardware, great company.
@henriquematias198614 күн бұрын
We all did things like this back in the '90s/'00s and it worked like a charm, no frameworks, no jQuery
@PraiseYeezus14 күн бұрын
which site did you build that performs this well?
@henriquematias198614 күн бұрын
@@PraiseYeezus Brazilian Channel 5 (Rede Globo) covering the Olympics in 2004 is a good example; we had super tight requirements on the size of the CSS and imagery. Basically, back in the day, at the end of the '90s / beginning of the '00s, you had to make websites that performed well because broadband wasn't so widespread, especially in South America. So it was expected that designers would know how to compress images and videos as much as possible. Often, internet banners had incredibly low size limits, so everyone back then would squeeze as many KB as possible out of every single file. Nowadays, a lot of "designers" and "developers" will put images online without even trying to compress them or make them the correct size before slapping them online.
@henriquematias198614 күн бұрын
@@PraiseYeezus for some reason my comment keeps being deleted, so I will rewrite it briefly: I built the main Brazilian website (by our main TV channel, which was "the official channel") covering the Athens Olympics in the early '00s, and many other websites. At the time broadband wasn't so popular, and everyone on the team, designers, developers and project managers, was well aware of file sizes and compression types; most projects had strict rules for file sizes and page load times. The XMLHttpRequest API was standard, and so were different if conditions for different popular browsers; jQuery wasn't there yet.
@randomuser664383 күн бұрын
No jQuery before HTML5 and ES6 sounds like an awfully bad decision
@SidTheITGuy15 күн бұрын
The depths that you go to are, honestly, unreal. I can only imagine what it takes to put these videos out. Kudos to you, my man!
@mohitkumar-jv2bx15 күн бұрын
Sid, I agree, but he conveniently missed a few key points as he is a React/Next shill. Key points Theo is missing: 1) The real site is most likely making DB calls, hitting caches (think Redis/Memcached), etc. on the backend, whereas this "fake" site is most likely only mocking data. 2) Theo conveniently missed pointing out the pricing. At scale, Vercel will suck all the money out of your pocket, whereas for the real site they'd likely just need to spin up a new EC2 instance.
@t3dotgg15 күн бұрын
@mohitkumar-jv2bx as I mentioned in my reply above, the only one "conveniently missing" points here is you. 1) All of these examples use real databases. The DB for NextFaster has millions of entries. It's a fair comparison. 2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic.
@AshesWake-sf7uw9 күн бұрын
@@t3dotgg just $20 for this many requests 💀. I mean, I get it, most of these are favicons/really small requests which don't take a lot of bandwidth, but the number of requests a single user generates on this site is just absurd. So that low price is indeed shocking.
@KvapuJanjalia12 күн бұрын
Looks like they are using good ol' ASP.NET Web Parts technology, which is considered dead nowadays.
@krisbude960710 күн бұрын
Just about every YouTube streamer has covered this already
@m-ok-637914 күн бұрын
95% of React (or any other JS framework) developers can't build a website as fast as the McMaster site.
@bluekeybo7 күн бұрын
"Choosing a specific technology won't necessarily make your website faster"...Nextjs makes all optimizations default..."Choosing Nextjs will definitely make your website faster"
@dgoenka114 күн бұрын
I noticed a small but significant tweak that probably helps a lot: B&W images. They probably get a lot of savings from compression, on top of the fact that the images here are all small. The result: the browser is done loading and rendering the images quicker.
@lev1ato14 күн бұрын
I have learned a lot from this video, more videos like this would be awesome
@hannespi288614 күн бұрын
Too!
@emilemil114 күн бұрын
Tbh I don't think it feels that fast, especially for a page that is all text aside from some tiny black and white images. Getting a simple page like that to behave like the NextFaster example isn't that difficult, preloading, caching, and not doing full page reloads will get you most of the way there. The reason most websites are slower is because A. they're loading much more data, and B. their focus is on adding features and developing quickly, not trying to get page loads down to milliseconds.
14 күн бұрын
Man... I absolutely love your honesty when doing ads! Seriously!
@bluegamer421014 күн бұрын
These videos are always fun to watch but I'd really like it if you were to put chapters in a video.
@jepqmw15 күн бұрын
The same thing is implemented in SoundCloud: when you hover over a track it loads the buffer, and once you click, the already-loaded buffer starts playing.
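A sketch of that hover-to-buffer idea (not SoundCloud's actual code): start buffering a track on hover, then play the already-buffered element on click. The `data-track-url` attribute is a made-up convention for this example.

```ts
const players = new Map<string, HTMLAudioElement>();

function bufferTrack(url: string): HTMLAudioElement {
  let audio = players.get(url);
  if (!audio) {
    audio = new Audio(url);
    audio.preload = "auto"; // hint the browser to start downloading/buffering now
    audio.load();
    players.set(url, audio);
  }
  return audio;
}

document.querySelectorAll<HTMLElement>("[data-track-url]").forEach((el) => {
  const url = el.dataset.trackUrl!; // present because of the selector above
  el.addEventListener("mouseenter", () => bufferTrack(url));
  el.addEventListener("click", () => void bufferTrack(url).play());
});
```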
@HamdiRizal14 күн бұрын
If your ceo/manager asks you to rank higher on Pagespeed Insights, show them this video.
@hunter247315 күн бұрын
The images on McMaster are actually sprites
@tom_marsden14 күн бұрын
Great point. With sprites you are fetching far fewer images and then just using offsets.
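A tiny sketch of the offset trick: one sprite sheet is downloaded once and each thumbnail is shown by shifting the background. The 64px cell size, the column count, and the sprite URL are made-up values, not McMaster's actual layout.

```ts
const CELL = 64;     // assumed size of each thumbnail cell, in px
const COLUMNS = 10;  // assumed number of thumbnails per row in the sheet

function applySprite(el: HTMLElement, index: number): void {
  const x = (index % COLUMNS) * CELL;
  const y = Math.floor(index / COLUMNS) * CELL;
  el.style.width = `${CELL}px`;
  el.style.height = `${CELL}px`;
  el.style.backgroundImage = "url(/img/part-thumbnails.png)"; // hypothetical sheet
  el.style.backgroundPosition = `-${x}px -${y}px`; // shift so only this cell shows
}

// e.g. show the 43rd thumbnail in a (hypothetical) placeholder element:
// applySprite(document.querySelector<HTMLElement>("#thumb-42")!, 42);
```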
@filipturczynowicz-suszycki772814 күн бұрын
Great breakdown Theo!!
@hqcart114 күн бұрын
First, the comparison between McMaster and NextFaster is not fair: McMaster actually queries the database for each product, while NextFaster downloads 10MB on the first page. This is not going to work if you have a bigger database. McMaster tech: 1. jQuery 2. Styled Components. This proves that the slowness problems all the newcomer frameworks want to fix weren't there originally; bad coding and added dependencies are what we don't need.
@t3dotgg14 күн бұрын
Did you watch the video? McMaster loads more data than NextFaster. Next also queries a database with literally millions of items in it.
@hqcart114 күн бұрын
@@t3dotgg even if it does, it's not that simple. What kind of enhancements are on the database? Is it in memory? How big is it? Is it redundant? Knowing that NextFaster is all about speed, I'm 100% sure they did some hacks to make it look that good, but in the real world, hello darkness my old friend...
@t3dotgg14 күн бұрын
@@hqcart1 Why don’t you take a look? It’s all open source and they’re very transparent about how it works. The database is Neon, which is a serverless ready Postgres provider. They provide most of what you’d hire a db admin for (backups, pooling, sharding etc)
@MrTonyFromEarth12 күн бұрын
People in the comments seriously overestimate how slow database queries are. In reality accessing a database is nothing compared to, say, network latency.
@AbouAnia14 күн бұрын
Back when websites were built by code veterans optimizing for 1ms
@RealOscarMay14 күн бұрын
The website also looks pretty good
@maazmunir921314 күн бұрын
This was a good video, learnt a lot, thanks!
@gr33nDestiny14 күн бұрын
Thanks for this, it's awesome
@radiozradioz241914 күн бұрын
Can Theo just appreciate a good website without dunking on it and shilling NextJS? He doesn't need to be so defensive all the time.
@chrisalupului14 күн бұрын
Appreciate you Theo, thanks for the video! 😄👍 Does fast mean more opportunities for vulnerabilities or less? Just curious your input on it.
@t3dotgg14 күн бұрын
Fast usually means simple. Simple usually means less surface area. Less surface area usually means less room for exploits. There's no hard rules here, but generally speaking, simpler = better
@ThePaisan14 күн бұрын
Wes Bos made a video on the same thing 2 weeks back, then Codedamn hopped on the same thing, and a dozen others.
@Sammysapphira6 күн бұрын
Prefetching is something I'm shocked isn't more common. It used to be on lots of websites but then disappeared.
@GuiChaguri12 күн бұрын
I wonder how this project would perform in a self-hosted environment. We all know Vercel does a bunch of special optimizations for Next hosted in their cloud. I'm guessing it would still run pretty fast, but some of these optimizations won't work out of the box, or won't work at all.
@BenoitStPierre8 күн бұрын
I'm really interested to hear why you're coming around to mousedown for interactions. I'm still in the mouseup camp but I haven't dug into it much and would love to hear what the arguments are! Future video?
@elmax574814 күн бұрын
I'm currently convincing my principal engineers to rewrite our whole website because I saw the next-faster page the other day… wish me luck.
@cherubin7th14 күн бұрын
The way NextFaster's images flicker in makes me feel bad.
@zahash104512 күн бұрын
I'm sure it feels amazing to use this site on your fiber-optic internet connection
@t3dotgg12 күн бұрын
I’m not on fiber sadly :( I also tried it with a very slow vpn and it still felt great!
@taaest-xek14 күн бұрын
Ah yes, be prepared for a lot of bandwidth cost, especially when using the AWS wrapper Vercel
@nightshade42714 күн бұрын
htmx prefetch can do similar hover prefetching pretty easily, including the images of the prefetched page
@alehkhantsevich11314 күн бұрын
From Europe NextFaster doesn't feel fast. I would say McMaster-Carr feels much faster from here.
@ashrafal14 күн бұрын
Sponsor? I feel like Vercel (Next.js) is a long-term sponsor of the channel.
@robwhitaker853414 күн бұрын
Google's PageSpeed tool has nothing to do with site speed for the user, and everything to do with first page load. Optimizing for first page load and optimizing for general site speed are two different kettles of fish. Google has to assume the user is loading the site for the first time.
@nihardongara30258 күн бұрын
That’s how the old days worked
@JLarky3 сағат бұрын
Some of those optimizations are already in Next (2 weeks later)
@danglad554614 күн бұрын
Super useful video!
@90vackoo14 күн бұрын
Thanks for finally doing this
@shgysk8zer015 күн бұрын
You just gave me an idea to promote some things I work on because... I write things that are both minimal and fast. I'm sure I could attain that speed, and with lower load size.
@shgysk8zer014 күн бұрын
Putting it to use on my local server for an 11ty site, I took navigation after the initial load down to ~25ms. It mostly only took 4 lines of setup, but I had to refactor some things to deal with adding event listeners on page load. It added < 6kb to my bundle size, before compression. I could probably get it down to like 4ms and even reduce the bundle size, while making it easier to maintain, but that'd basically mean a complete rewrite of things.
@amanx135 күн бұрын
The Rollercoaster Tycoon of HTML
@aghileslounis14 күн бұрын
Euhh... is it only me, or are you comparing a personal project with 0 users to McMaster? I'm confused. First of all, this Next.js example is completely static; McMaster is NOT. It's fully dynamic, as the server does a bunch of checks about the product's availability and much more. If you change something on the server, it's reflected immediately on McMaster. In this Next.js example it will not be; it statically generates the pages. The Next.js example is more of a blog. It can NEVER EVER be a marketplace. You'll build 1,000,000 pages? Rebuild them every X amount of time? It's crazy to think that you can, just like that, build something better, and insult people's intelligence. It's NOT faster AT ALL. You're comparing a BLOG to a fully functional, huge MARKETPLACE.
@whydoyouneedmyname_14 күн бұрын
It's impressive surely, but it even talks about the optimizations it doesn't do compared to the real thing. It's like saying one of those YouTube/Netflix design clones is faster than the real thing.
@aghileslounis14 күн бұрын
@@whydoyouneedmyname_ It's not impressive, I'm so sorry. It's just building the pages and prefetching. McMaster is 1000x more complex than that to achieve that speed in a real-world situation. You could never, ever achieve the speed of McMaster in reality using only these techniques; they are not enough, nor realistic for a real marketplace.
@anonymousfigure3714 күн бұрын
The Next.js example is not "completely static". Your claim to know about McMaster's backend caching logic is dubious (and provably incorrect; other videos detailing McMaster have covered this) because you don't even seem to know what this fully open source thing is doing even though the code is literally in the video you're commenting on. "x1000 times more complex" is goofy as hell too.
@aghileslounis14 күн бұрын
@@anonymousfigure37 I may have exaggerated, I can concede you that, no problem, but I understand what it's doing and what McMaster is doing, and I can tell McMaster is on a completely different level. The code in this video was bad in the sense that it can't work on a real marketplace unless you change it to support all the McMaster features, which will make it way slower... even worse: if you keep it like that, it will crash instantly! The site wouldn't even load.
@anonymousfigure3714 күн бұрын
@@aghileslounis I think the biggest additional complexity the McMaster site has in terms of functionality is the whole left navigation experience, which certainly complicates the caching story. In fact if you get specific enough with the left nav filters, you start to experience slowdowns because of cache misses. I can't think of anything that McMaster is doing that would be difficult to implement performantly in Next.js. I mentioned the left nav interface; what features specifically do you have in mind?
@olavisau15 күн бұрын
Um... loading a lot of JS is not always fine. At that point it only works quickly with high internet speeds, which is not something everybody has across the world. If your target customers are in the US / EU / Australia and other areas where internet bandwidth is fast, then sure, you can send more data to avoid more requests, but if your target customers are in every country, or in Africa / LATAM, then you really have to think about every byte sent to the customer.
@connectable7 күн бұрын
Cheers Wes Bos
@shanghaikid198413 күн бұрын
Designers are the enemy of web performance.
@BorisBarroso14 күн бұрын
SvelteKit can do that if you use the default behavior that loads a link on hover. Prefetching images is cool.
@d34d10ck14 күн бұрын
I don't think it's default behavior. You do have to explicitly set data-sveltekit-preload-data="hover" in either the anchor or body tag, don't you?
@BorisBarroso13 күн бұрын
@ OK, newer versions of SvelteKit require this. I haven't generated a new project in some time. Anyway, it's dead simple to make it load content on hover.
@mohitkumar-jv2bx15 күн бұрын
A few key points Theo is missing: 1) The real site is most likely making DB calls, hitting caches (think Redis/Memcached), etc. on the backend, whereas this "fake" site is most likely only mocking data. 2) Theo conveniently missed pointing out the pricing. At scale, Vercel will suck all the money out of your pocket, whereas for the real site they'd likely just need to spin up a new EC2 instance.
@t3dotgg15 күн бұрын
1) All of these examples use real databases. The DB has millions of entries. It's a fair comparison. 2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic
@sad_man_no_talent15 күн бұрын
Hey blind, at 12:40 Theo shows the Next one uses DB calls
@maazmunir921314 күн бұрын
@@t3dotgg I mean realistically the site would have had that much traffic for a few days only no?
@aymenbachiri-yh2hd15 күн бұрын
This is awesome
@pixiedev15 күн бұрын
5 years ago I made an SPA website for my college using just Django and vanilla JS, and it was fast as f 😅. I made a router: for the first request it downloads the full page, then for any click it downloads only the part of the page that changed, and I attach/replace that page fragment and the head scripts without changing the layout. /first-page (full page with layout), /next-page?req=spa (only the changed content, not the full layout)
@jibreelkeddo703015 күн бұрын
HTMX sounds like it’s up your alley
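A rough reconstruction of the router @pixiedev describes above: intercept link clicks, refetch the same URL with ?req=spa so the server returns only the changed fragment, and swap it into the layout. The #content container id is an assumption; only the ?req=spa convention comes from the comment.

```ts
async function navigate(url: string, push = true): Promise<void> {
  const spaUrl = `${url}${url.includes("?") ? "&" : "?"}req=spa`;
  const res = await fetch(spaUrl);
  const html = await res.text();
  const container = document.querySelector("#content");
  if (container) container.innerHTML = html; // replace only the page body, keep the layout
  if (push) history.pushState({}, "", url);
}

document.addEventListener("click", (event) => {
  const anchor = (event.target as Element | null)?.closest("a[href^='/']");
  if (!(anchor instanceof HTMLAnchorElement)) return;
  event.preventDefault();
  void navigate(anchor.href);
});

// Handle back/forward without a full reload.
window.addEventListener("popstate", () =>
  void navigate(location.pathname + location.search, false),
);
```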
@m1265215 күн бұрын
2:10 Isn't preloading just because someone hovers a bit wasteful? I'd want to see stats on preloads vs. clicks first.
@doc852715 күн бұрын
Yes, it will be a waste if you don't click; that's the trade-off of choosing prefetching. Your traffic and billing can skyrocket if you are not careful. They can afford the prefetch to provide a better UX for their clients. Hence, there are lots of times you don't want to prefetch.
@m1265215 күн бұрын
@ A waste is a waste; to the environment it's a big deal. What's the carbon footprint of those trillions of unnecessary preloads combined, I wonder?
@tinnick15 күн бұрын
To be honest, hovering doesn't exist on mobile devices, which is where the concern about a wasteful network bill is mostly relevant, so I think it's a good trade-off for desktop devices. Yeah, yeah, hover might technically exist on mobile too, but if you disable it there, the trade-off only applies on desktop.
@tinnick15 күн бұрын
@@m12652 Really 😅. Humans are quite wasteful too, if you're going to that length about environmental concerns. Should we remove all toilets in the world because it's inconvenient, every time someone takes a dump, to recycle it as manure? I don't think so, and I hope humanity is not heading that way. I think it would be in humanity's best interest not to sacrifice convenience, but to make up for the things we've been a little wasteful of by other means.
@m1265215 күн бұрын
@ very educated and totally relevant... you must be one of those people that thinks israel is a country 🙄
@noext700114 күн бұрын
It's good until you want SEO; Google doesn't see the tag
@thelethalmoo14 күн бұрын
I wish they could do a fast version of jiiiraaa
@saiphaneeshk.h.548214 күн бұрын
Will it be as fast as it is now if the 2-hour cache invalidation is removed? Or is that playing a major role in the time reduction?
@nullvoid354512 күн бұрын
I'm not very familiar with JS, so I don't know if he showed this in the video, but I wonder what exactly this 2-hour cache invalidation timeout affects. If things like stock and price can't update on every load, or even update live, then I get the reasons for suspecting the cache is misrepresenting the comparison, but I lack the immediate skills to check without outpacing my own interest. But, like, images only updating every 2 hours? Sure, why not?
@nightshade42714 күн бұрын
Does nextMaster pregenerate all product pages and everything? I wonder how long that takes to build. I don't think it's a fair comparison to the original site, since I don't think they are pregenerating all product pages.
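For reference, full pregeneration in the Next.js App Router would look roughly like the sketch below, using generateStaticParams. Whether NextFaster actually builds every page ahead of time, or generates and caches pages on first request, is exactly the question raised above; the file path and the getAllProductSlugs/getProduct helpers here are hypothetical.

```tsx
// app/products/[slug]/page.tsx — hedged sketch, not NextFaster's actual code.
import { getAllProductSlugs, getProduct } from "@/lib/db"; // hypothetical data helpers

export async function generateStaticParams() {
  // One static page per product; with ~1M products this build would be very
  // long, which is why projects often render on demand and cache instead.
  const slugs = await getAllProductSlugs();
  return slugs.map((slug) => ({ slug }));
}

export default async function ProductPage({ params }: { params: { slug: string } }) {
  const product = await getProduct(params.slug);
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```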
@NishinG-m1n14 күн бұрын
My marketing team needs to know when images were loaded, for some reason. I need to set unoptimized on the Next Image tag, because when images are optimized by Next.js the URL has some params for getting the optimized image. And then they ask why the image loading feels slow :(
@nullvoid354512 күн бұрын
If you assume no malicious actors, then maybe the clients could keep track of page loads and dump them to the server in batches later on?
@ListenSomething15 күн бұрын
I am just really curious why we can't use an SPA version of that with a RESTful API instead of Next.js, especially if we're going to fetch all the endpoints in advance. I feel like we always reinvent the same wheel again and again. I remember my website fetching the HTML with sync AJAX in 2013 at exactly the same speed. Surely it wasn't as complicated to build as in Next.js with dozens of optimizations. IMHO, there are many ways to build a website that loads fast, and surely 99% of them are easier than implementing it in Next.js. Sorry, I just don't understand. Maybe I am not nerd enough to get the point.
@redditrepo47314 күн бұрын
While you are objectively correct in saying that SPA + REST is superior, the fact is that Next has a significant footprint in the industry and as a result there will be content made around it
@t3dotgg14 күн бұрын
Show me one faster website built in a similar way with simpler tech. If it doesn’t have millions of pages, it doesn’t count.
@BCRooke114 күн бұрын
And if you anticipate using mobile, having a REST API would be a big win
@akirapink8 күн бұрын
this is why a lot of web technologies feel pointless to- _OH_
@KlimYadrintsev14 күн бұрын
Can someone please explain this to me: if it's 1 million products, that means 1 million photos, which, if you are using Vercel image optimization, is around 5000 dollars. Who out of these enthusiasts paid that much? The only reason I don't use Vercel Image is that my side project makes no money, and it's not worth spending 5 dollars per 1000 images.
@Pandazaar14 күн бұрын
You do understand that if a legit shop has a million products, it's probably way too profitable to bother about $5k
@KlimYadrintsev14 күн бұрын
@ The average profit margin in e-commerce is 30%; $5k is not a small amount
@UnknownPerson-wg1hw11 күн бұрын
@@KlimYadrintsev uh.. yes it is
@salmenbejaoui169614 күн бұрын
How much load can prefetching all links generate on the server? What about server and bandwidth costs?
@YevheniiViazovskyi14 күн бұрын
I'm kinda scared of that request bombardment: I clicked like 5 links and got 1.5k requests
@eyeswiredopendesign7 күн бұрын
3:57 ...those are pretty nuts 🔩
@LaserFur14 күн бұрын
When the McMaster-Carr website was first created, it was not fast. Back then it was faster to pull out the huge book than to access the website.
@ac130kz14 күн бұрын
A better idea is to preload cached pages with blurhashes and lazily load the images afterwards. It's even faster and uses fewer resources (bandwidth, CPU).
@saiv4614 күн бұрын
You don't need blurhashes with progressive JPEG.
@ac130kz14 күн бұрын
@@saiv46 not all images are jpegs
@nullvoid354512 күн бұрын
@@ac130kz but they can be.
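A sketch of the placeholder-then-lazy-load idea from the thread above: each image starts with a tiny inlined placeholder (which could be a decoded blurhash) and upgrades to the full image when it nears the viewport. The data-full attribute and the 200px root margin are assumptions for this example.

```ts
const io = new IntersectionObserver(
  (entries, observer) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      const full = img.dataset.full;
      if (full) img.src = full; // swap the blurry placeholder for the real image
      observer.unobserve(img);
    }
  },
  { rootMargin: "200px" }, // start loading slightly before the image scrolls in
);

document
  .querySelectorAll<HTMLImageElement>("img[data-full]")
  .forEach((img) => io.observe(img));
```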
@Chikowski10114 күн бұрын
this video is sponsored by nuts and bolts !
@ehm-wg8pd15 күн бұрын
This is also an example that the fastest website doesn't really matter that much. We care because we look at the numbers, but what does that mean to the website's consumers? Sometimes more functionality could be more helpful than micro-optimizing stuff.
@xXxRK0xXx15 күн бұрын
All of this prefetching, is it intensive on a server (assuming a live production environment)? Seems like the answer would be no?
@UnknownPerson-wg1hw14 күн бұрын
A business doesn't care if they can deliver products fast
@FusionHyperion14 күн бұрын
the problem is always between the chair and the screen
@tom_marsden14 күн бұрын
PEBKAC
@rajofearth14 күн бұрын
I used Brave Browser's Leo AI:
0:00 - Introduction to the video and the McMaster website
1:40 - Analyzing the network requests and performance of the McMaster website
5:00 - Introducing the "Next Faster" website as a comparison
7:00 - Analyzing the performance and optimizations of the Next Faster website
12:00 - Diving into the code of the Next Faster website
16:00 - Discussing the custom Link component and image prefetching
20:00 - Comparing the performance of McMaster vs Next Faster with throttling
23:00 - Discussion of potential improvements to Next.js to incorporate the optimizations used in Next Faster
26:00 - Conclusion and final thoughts
@cnikolov14 күн бұрын
I think NextFaster missed compressing the images with Brotli.
@Miguelmigs2414 күн бұрын
Couldn't help noticing you're using a terminal called "Ghostty"; what is that?
@goncalonorberto96014 күн бұрын
Faster than my Figma prototype
@deatho0ne58713 күн бұрын
11:20 I am not a fan of loading tons of data before a user gets to a page. Yes, it is nice for the user experience, but it is not nice for user download rates or company server costs. I did see the stopLoading on mouse-out, though, which is nice.
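A sketch of that cancel-on-mouse-out behavior using fetch with an AbortController, so a prefetch started on hover is cancelled if the cursor leaves before it finishes. This is an illustration of the pattern, not NextFaster's actual implementation; the fetched response is only used to warm the HTTP cache here.

```ts
const inflight = new Map<HTMLAnchorElement, AbortController>();

document.addEventListener("mouseover", (event) => {
  const a = (event.target as Element | null)?.closest("a[href^='/']");
  if (!(a instanceof HTMLAnchorElement) || inflight.has(a)) return;
  const controller = new AbortController();
  inflight.set(a, controller);
  fetch(a.href, { signal: controller.signal })
    .catch(() => { /* aborted or failed; nothing to do */ })
    .finally(() => inflight.delete(a)); // a later re-hover hits the HTTP cache anyway
});

document.addEventListener("mouseout", (event) => {
  const a = (event.target as Element | null)?.closest("a[href^='/']");
  if (a instanceof HTMLAnchorElement) inflight.get(a)?.abort(); // stop loading on mouse-out
});
```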
@samfelton500915 күн бұрын
Htmx preload extension ftw
@burhanbudak604114 күн бұрын
What would be cool is if it's fast on GPRS
@AashutoshRathi13 күн бұрын
"instant-fuckin-taneously"
@peroconino14 күн бұрын
So, why did they do all of that? Wouldn't it be better to just use Next.js's built-in prefetch?
@dddddeeeevvvvvv15 күн бұрын
Perks of watching Theo 🎉
@filippobrigati121912 күн бұрын
What font are you using in VS Code?
@SaurabhGuptaYT12 күн бұрын
On its face, prefetching looks good, but in most cases the number of avoidable requests and the extra load it puts on the server are not worth it, unless it's just serving static data that can be cached easily. If any DB requests or computation happen in the backend for those requests, then it's just a waste of resources.
@whentheyD12 күн бұрын
Most websites that do not have UGC would benefit from this. Correct me if I'm wrong.
@gc19596 күн бұрын
The HTTP response indicates that McMasterCarr uses the Akamai CDN for caching.