PROOF JavaScript is a Multi-Threaded language

  261,685 views

Beyond Fireship

10 months ago

Learn the basics of parallelism and concurrency in JavaScript by experimenting with Node.js Worker Threads and browser Web Workers.
#javascript #programming #computerscience
Upgrade to Fireship PRO fireship.io/pro
Node.js Worker Threads nodejs.org/api/worker_threads...
Check out @codewithryan • Node.js is a serious thing now… (2023)

Comments: 529
@daedalus5070
@daedalus5070 10 months ago
I could feel my brain trying to stop me writing what I knew was an infinite loop but I did it anyway. I trusted you Jeff!
@ko-Daegu
@ko-Daegu 10 months ago
0:31 concurrency incorporates parallelism; what you mean is asynchronism
@lucassilvas1
@lucassilvas1 10 months ago
@@ko-Daegu who are you talking to, schizo?
@hypergraphic
@hypergraphic 10 months ago
me too!
@universaltoons
@universaltoons 10 months ago
@@ko-Daegu can I have some of what you're having
@lengors7327
@lengors7327 10 months ago
@@ko-Daegu you really thought you were being smart with that remark, didn't you? Only problem is that you're wrong
@AlecThilenius
@AlecThilenius 10 months ago
Fun nerd trivia:
- A single CPU core runs multiple instructions concurrently; the core just guarantees that it will appear AS IF the instructions were run serially within the context of a single thread. This is achieved primarily via instruction pipelining.
- A single CPU core often executes instructions totally out of order; this is unimaginatively named "Out Of Order (OOO) execution".
- A single core can also execute instructions simultaneously from two DIFFERENT threads, only guaranteeing that each thread will appear AS IF it ran serially, all on the same shared hardware, all in the same core. This is called Hyperthreading.
And we haven't even gotten to multi-core yet lol. I love your content Jeff, the ending was gold!
@ragggs
@ragggs 10 months ago
in the spectre and meltdown era, we like to say “guarantees”
@RicardoSilvaTripcall
@RicardoSilvaTripcall 10 months ago
But in a hyperthreaded system, tasks don't just appear to be executed serially, they actually are executed serially ... the only difference is that the system is going to coordinate the execution of other tasks/threads while waiting for the previous one, which is probably blocked waiting for an I/O response ... If you have a 16-core processor with 32 logical processors, it doesn't mean it can execute 32 threads simultaneously ...
@ragggs
@ragggs 10 months ago
@@RicardoSilvaTripcall hyperthreads are in many cases parallel by most meaningful definitions, due to interleaved pipelined operations on the CPU and the observability problem of variable-length operations. For an arbitrary pair of operations on two hyperthreads, without specifying what the operations are and the exact CPU and microcode patch level, you cannot say which operation completes first, even if you know the order in which they started.
@AlecThilenius
@AlecThilenius 10 months ago
@@ragggs Lol! Maybe guarantee* (unless you're Intel)
@AlecThilenius
@AlecThilenius 10 months ago
@@RicardoSilvaTripcall Uhhhh. No. Sorry.
@WolfPhoenix0
@WolfPhoenix0 10 months ago
That chef analogy about concurrency and parallelism was genius. Makes it SO much easier to understand the differences.
@RedlinePostal
@RedlinePostal 10 months ago
Also, when we say "one core," that means "one core at a *time*" -- computer kernels are concurrent by default, and a program's code will actually be constantly shifting to different CPUs as the kernel manages a queue of things for the processor to do. Not too unlike the asynchronous system that JavaScript has: the kernel will break each program you're running into executable chunks, and has a way to manage which programs and code get more priority.
@orbyfied
@orbyfied 10 months ago
wouldn't that be kind of inefficient though? it wouldn't be able to take full advantage of the CPU cache, so I hope it does it as rarely as possible
@invinciblemode
@invinciblemode 10 months ago
@@orbyfied uhh, different CPU cores use the same L2-L3 cache. L1 cache is per core, but it's small and meant for minor optimisations.
@orbyfied
@orbyfied 10 months ago
L1 is the fastest, so having data available there is pretty significant. It's also grown much in size, to the point that it can basically cache all the memory a longer-running task will need now. If L1 were so insignificant it wouldn't cause these data desync issues across threads
@LettersAndNumbers300
@LettersAndNumbers300 10 months ago
Then…why do I only see one core active when running simple Python code…?
@jesusmods1
@jesusmods1 10 months ago
@@orbyfied it could be more inefficient if only one process took a whole CPU core for itself during its entire lifetime. Probably the process isn't switched between cores, but it is being swapped in and out with others on the same core for the sake of concurrency. Also take into account the hit rate that a cache may have.
@JThompson_VI
@JThompson_VI 10 months ago
Moments like 0:52, the short memorable description of callback functions, is what makes you a great teacher. Thanks man!
@kisaragi-hiu
@kisaragi-hiu 10 months ago
Keep in mind the JS world also calls any higher order function "callback" (like the function you'd pass to Array.map), whereas elsewhere afaik it only refers to the function you pass to something non-blocking.
@curlyfryactual
@curlyfryactual 10 months ago
@@kisaragi-hiu a fact that caused me much grief coming into JS from the systems level.
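For illustration, the two senses of "callback" drawn above, side by side (plain Node/browser JS, no libraries assumed):

```javascript
// "Callback" in the broad JS sense: any function passed to another function.
// Synchronous higher-order callback (what Array.map takes):
const doubled = [1, 2, 3].map((x) => x * 2); // runs immediately, blocking
console.log(doubled); // [2, 4, 6]

// Asynchronous callback (what other ecosystems usually mean by the word):
setTimeout(() => console.log('runs later, once the call stack is empty'), 0);
console.log('this line prints before the timeout callback');
```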
@ra2enjoyer708
@ra2enjoyer708 10 months ago
It's a pretty good overview on how much more of a clusterfuck the code becomes once you add workers to it. And it didn't even get to the juice of doing fs/database/stream calls within workers and error handling for all of that.
@dan_le_brown
@dan_le_brown 10 months ago
"Clusterfuck", I had the same word in mind 😭😂
@angryman9333
@angryman9333 10 months ago
just use Promises, it'll process all your asynchronous functions concurrently (very similar to parallel)
@SirusStarTV
@SirusStarTV 10 months ago
@@angryman9333 A Promise will run user-written functions in a main-thread-blocking manner. An async function is just syntactic sugar for easier creation of promises. Without browser asynchronous APIs or web workers it doesn't run code in parallel.
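That point is easy to verify: the executor you pass to `new Promise` runs synchronously on the main thread, so wrapping CPU-bound work in a Promise doesn't unblock anything — only the `.then` callback is deferred. A small demonstration (the `order` array is just for observation):

```javascript
// A Promise does not create a thread: the executor runs synchronously,
// before the line after `new Promise(...)` executes.
const order = [];

const p = new Promise((resolve) => {
  order.push('executor');        // heavy CPU work here would still block
  resolve('done');
});

order.push('after constructor'); // proves the executor already ran
p.then(() => order.push('then callback')); // only *this* part is deferred

console.log(order); // ['executor', 'after constructor']
```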
@platinumsun4632
@platinumsun4632 10 months ago
@@angryman9333 a what?
@elhaambasheerch7058
@elhaambasheerch7058 10 months ago
Love to see jeff going in depth on this channel, would love more videos like this one.
@beyondfireship
@beyondfireship 10 months ago
That's why I made this channel. I've got a long list of ideas.
@morezco
@morezco 10 months ago
@@beyondfireship wonderful. Keep it up
@srejonkhan
@srejonkhan 10 months ago
6:13 To see how all of your cores are being utilized, change the graph from 'Overall utilization' to 'Logical processors': right-click the graph -> Change graph to -> Logical processors.
@yss64
@yss64 10 months ago
Thanks for shouting out code with ryan! That channel is criminally underrated
@BRBS360
@BRBS360 10 months ago
I'd like to see a video on JavaScript generators and maybe even coroutines.
@StiekemeHenk
@StiekemeHenk 10 months ago
For sure, this is a really cool thing and I'm not sure how to actually use it.
@ko-Daegu
@ko-Daegu 10 months ago
generics maybe? garbage collector in more detail? benchmarking against pythonic code, just to get people triggered?
@7heMech
@7heMech 10 months ago
Really cool, I actually saw the other video about nodejs taking it up a notch when it came out.
@dzhaniivanov5837
@dzhaniivanov5837 10 months ago
i watched a similar video earlier this year, but your way of delivering content is amazing, keep going
@H-Root
@H-Root 10 months ago
I am stuck step programmer 😂😂
@rizkiaprita
@rizkiaprita 10 months ago
rule #34 is calling
@nullbeyondo
@nullbeyondo 10 months ago
break;
@catswolo421
@catswolo421 6 months ago
I would watch out or you'll get multi-threaded
@user-fed-yum
@user-fed-yum 10 months ago
That ending was possibly one of your best pranks ever, a new high watermark. Congratulations 😂
@EdgeGaming
@EdgeGaming 10 months ago
Lots of comments about memorable descriptions, shoutout to the thread summary at 3:30. Your conciseness is excellent.
@nomadshiba
@nomadshiba 10 months ago
when talking about multi-threading, data-oriented design always helps
@kiprasmel
@kiprasmel 10 months ago
the `threads` package makes working with threads much more convenient. it also works well w/ typescript.
@AntonisTzorvas
@AntonisTzorvas 10 months ago
aside from the outstanding quality, this ending was hilarious! keep it up, your content is TOP 🙇🚀
@deneguil-1618
@deneguil-1618 10 months ago
just a heads up about your CPU: the 12900K doesn't have 8 physical cores, it has 16, 8 performance and 8 efficiency cores. The performance cores have hyperthreading enabled but the efficiency cores don't, so you have 24 threads in total
@daleryanaldover6545
@daleryanaldover6545 10 months ago
😮
@allesarfint
@allesarfint 10 months ago
Oh yeah, right. So that's why the CPU didn't go to 100% after using 8 cores.
@godnyx117
@godnyx117 10 months ago
You forgot the: 🤓
@adityaanuragi6916
@adityaanuragi6916 10 months ago
But at 6:57 his CPU did go to 100% with 16
@wertrager
@wertrager 10 months ago
because hyperthreading is shit
@ahmad-murery
@ahmad-murery 10 months ago
It would be nice if you right-clicked on the CPU graph and chose *Change graph to > Logical processors*, so we can see each thread separately. Thanks!
@crackwitz
@crackwitz 10 months ago
less useful than you might think. The operating system's scheduler may bounce a thread around on any number of cores. It doesn't make anything faster, it just spreads the utilization around.
@ahmad-murery
@ahmad-murery 10 months ago
@@crackwitz Do you mean that we will not see each core graph plotting one thread?
@wjlee7003
@wjlee7003 10 months ago
Although it's called concurrency, a scheduler can still only work on one task at a time. It will delegate a certain amount of time to each task and switch between them (context switching). The switch is just fast enough to make it seem truly "concurrent". If a task takes longer than the delegated time, the scheduler will still switch away and come back to it to finish.
@Ihavetoreturnsomevideotapes
@Ihavetoreturnsomevideotapes 10 months ago
ayo, was learning the event loop and had a bit of confusion about performance b/w single and multithreading, and Jeff just posted the video at the right time.
@MrPman1999
@MrPman1999 7 days ago
best comic relief at the end ever, love you Jeff
@AlexEscalante
@AlexEscalante 10 months ago
¡Wow! Just yesterday I was watching some videos about worker threads because I will use them to speed up the UI in my current development 😄
@Bell_420
@Bell_420 4 months ago
the cook analogy was great and i now understand
@maxijonson
@maxijonson 10 months ago
My brain: don't run it
8 years of programming: don't run it
the worker thread registering my inputs to the console as I type it: don't run it
Jeff: run it.
**RUNS IT**
@boris---
@boris--- 10 months ago
Task Manager --> Performance tab --> CPU --> Right click on graph --> Change graph to --> Logical Processors
@robertjif6337
@robertjif6337 10 months ago
Thanks, now I know what script I should include in my svgs
@adaliszk
@adaliszk 10 months ago
You can also pass initial data without needing to message the thread to start working; however, I feel like that one is better used for initialization, like connecting to a database.
@frankdearr2772
@frankdearr2772 5 months ago
great topic, thanks 👍
@nuvotion-live
@nuvotion-live 10 months ago
Little known fact, you can also do DOM-related operations on another thread. You have to serve it from a separate origin, use the Origin-Agent-Cluster header, and load the script in an <iframe>. But you can still communicate with it using postMessage, and avoid thread blocking with large binary transfers using chunking. This is great for stuff that involves video elements and cameras. I use it to move canvas animations (that include video textures) off the UI thread, and for calculating motion vectors of webcams.
@knoopx
@knoopx 10 months ago
that looks handy! thanks for sharing
@among-us-99999
@among-us-99999 10 months ago
that might just help with a few of my projects
@matheusvictor9629
@matheusvictor9629 10 months ago
do you have any examples on github?
@nuvotion-live
@nuvotion-live 10 months ago
@@matheusvictor9629 yes
@andrewmcgrail2276
@andrewmcgrail2276 10 months ago
Sounds very interesting! I have a project where I think this would be useful.
@wusluf
@wusluf 10 months ago
Adding more cores might still provide gains in a VM scenario, depending on the hypervisor. As long as your VM isn't provisioned with all physical cores, the hypervisor is at liberty to utilize more cores, even up to all physical cores, for a short amount of time, resulting in increased performance for bursting tasks
@junama
@junama 10 months ago
Good video! Next time, right-click to change the CPU graph so we can see each thread's graph. Hope it helps!
@LedimLPMore
@LedimLPMore 10 months ago
Wow, didn't know that. Thanks!
@timschannel247
@timschannel247 1 month ago
Yes Yes Yes, and exactly extra Yes! Thank you Bro for this contribution! You are speaking out of my brain! Best Regards!
@nullternative
@nullternative 10 months ago
I just recently experimented with the Offscreen Canvas handling rendering on a separate worker thread. Pretty cool.
@JeremyThille
@JeremyThille 9 months ago
Niiiice we have the exact same machine! (And thanks for the video!)
@DumbledoreMcCracken
@DumbledoreMcCracken 10 months ago
each value should be a random value, and you should sum them at the end to ensure the compiler/interpreter does not optimize all the work away because it detected that you never used the values
@ra2enjoyer708
@ra2enjoyer708 9 months ago
Pretty sure the compiler won't be able to optimize away side effects like this, since the worker and the main thread only interact indirectly through events on a message channel.
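A sketch of the benchmarking hygiene being suggested here: feed the loop random inputs and keep a data dependency on its result so the work can't be proven dead. `busyWork` is an illustrative name, not something from the video:

```javascript
// Benchmarking hygiene: keep a data dependency on the loop's results so an
// optimizing JIT can't prove the work is dead and delete it.
function busyWork(iterations) {
  let acc = 0;
  for (let i = 0; i < iterations; i++) {
    acc += Math.random(); // random input: not constant-foldable
  }
  return acc; // returning (and later printing) the sum keeps the loop alive
}

const total = busyWork(1_000_000);
console.log('checksum:', total); // the exact value doesn't matter
```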
@HenokWehibe
@HenokWehibe 9 months ago
Just brilliant
@subratarudra2745
@subratarudra2745 8 months ago
Amazing🔥
@Quamsi
@Quamsi 10 months ago
I have had hours long lectures in college level programming classes on the differences between concurrency and parallelism and the first 3 minutes of this video did a better job of explaining it. Shout outs to my bois running the us education system for wasting my money and my time 💀
@maskettaman1488
@maskettaman1488 10 months ago
It's probably not their fault you failed to understand something so simple. Literally 1 minute on google would have cleared up any misunderstanding you had
@TechBuddy_
@TechBuddy_ 10 months ago
@@maskettaman1488 if you have to pay to study and then sell yourself to a tech corp to learn, it's not that great of a system and it should not exist IMHO
@Quamsi
@Quamsi 10 months ago
@maskettaman1488 lmao I'm not saying I misunderstood it, I'm saying fireship is much more concise and still gets all the relevant information across compared to college, despite the fact that I don't have to pay fireship anything
@lionbryce10101
@lionbryce10101 10 months ago
Woulda been cool if you set it to show core usage on taskmgr
@olharAgudo
@olharAgudo 10 months ago
Awesome video ending
@vforsh
@vforsh 10 months ago
Wow, Love this tick - tock snippet
@zeta_meow_meow
@zeta_meow_meow 4 months ago
seeing my cpu throttle and core usage rise in realtime was impressive :)
@VileEnd
@VileEnd 10 months ago
Love it, we are already doing that with our Lambdas - cause why not use the vCores when you got them 😍
@Bossslime
@Bossslime 10 months ago
I remember when I first learned workers, I didn't realize I could use a separate js file, so I wrote all of my code in a string. It was just a giant string that I coded with no IDE help. That was fun.
@TeaBroski
@TeaBroski 10 months ago
It's like you read my client's requirement and came into support
@markopolo2224
@markopolo2224 10 months ago
man i been wanting something about workers for so long
@abhijay_hm
@abhijay_hm 10 months ago
with the amount of time I've spent on this video because of the while loop, even the algorithm knows who my favourite youtuber is
@HedleyLuna
@HedleyLuna 10 months ago
I did use this back in 2018. I don't know how much it improved, but error handling was painful. Also, when you call postMessage(), v8 will serialize your message, meaning big payloads will kill any advantage you want. And also, remember that functions are not serializable. On the UI, I completely killed my ThreeJS app in production when I tried to offload some of its work to other threads :D Apart from that, you should NEVER share data between threads, that's an anti-pattern.
@dave6012
@dave6012 10 months ago
Mr jeff will you do one on creating a websocket server in node js?
@jimbowers1298
@jimbowers1298 10 months ago
UNLIMITED VIEW TIMES!! AWESOME!! What a great video!
@MegaMech
@MegaMech 10 months ago
A single x86 core can actually run more than one command at a time. And the n64 can run 1.5 commands at a time when it uses a branch delay slot.
@CC-1.
@CC-1. 10 months ago
0:15 I already knew this and am already using a Blob to create a new Worker, and I use at most 4 to 8, one for each core
@Rebel101
@Rebel101 10 months ago
Epic! It's FLAT!
@Xe054
@Xe054 10 months ago
Fireship, the "S" sounds in your video sound really harsh. Consider using a de-esser plugin or a regular compressor plugin and your stuff will sound fantastic. Cheers.
@4541047
@4541047 10 months ago
You are a youtube genius man
@BoloH.
@BoloH. 10 months ago
I once made a volume rendering thingie with Three.JS and it really, REALLY benefited from Web Workers, especially interpolation between Z slices.
@xinaesthetic
@xinaesthetic 10 months ago
Hang on… wouldn’t a volume-renderer in three.js be doing things like interpolation between z-slices in the fragment shader? Could certainly see workers being useful for some data processing (although texture data still needs to be pushed to the gpu in the main thread). Care to elucidate? Was it maybe interpolating XYZ over time, like with fMRI data or something? That would certainly benefit…
@gr.4380
@gr.4380 10 months ago
love how you tell us to leave a comment if it's locked like we can even do that
@dovanminhan
@dovanminhan 10 months ago
Hi from Vietnam, where the kitchen image was taken.
@NithinJune
@NithinJune 10 months ago
i wish you showed the CPU usage on each logical processor on task manager instead of the overview
@dan-cj1rr
@dan-cj1rr 10 months ago
No clue if this could be an interesting video, but teach us how to deploy to different environments (e.g. testing, production); as a junior I never know what this implies. Also show us tools to handle it. Thanks :)
@RajitRoy_NR
@RajitRoy_NR 10 months ago
What are some of the useful libraries which help or use workers? Like Partytown or Comlink
@TonyAlcast
@TonyAlcast 10 months ago
I'm still amazed at how you find such accurate images as the one at 0:32 🤔
@NoFailer
@NoFailer 10 months ago
I executed the while-loop on the orange youtube and I couldn't change the volume.... Thanks.
@PlayWithNiz
@PlayWithNiz 10 months ago
I'm thinking out loud here, but have a genuine question - could you use workers combined with something like husky to do all pre-commit/push/etc checks at once? For example, I may have a large unit/integration test suite, followed by a large e2e test suite, along with code quality checks and so on... All of which are run in sequence, potentially taking upwards of a few minutes to complete. Could workers be used to run these jobs together at once?
@ra2enjoyer708
@ra2enjoyer708 9 months ago
E2E will bottleneck regardless, because of the quadrillion OS APIs it has to interact with on start, the majority of which are synchronous.
@jack171380
@jack171380 10 months ago
I wonder if there are things like mutex locks to help with the synchronisation of shared resources?
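There are: `SharedArrayBuffer` plus the `Atomics` API. `Atomics.add`/`Atomics.compareExchange` give atomic read-modify-write, and `Atomics.wait`/`Atomics.notify` can build a real mutex on top. A minimal single-file sketch (in real code the buffer would be posted to a worker, which can mutate it without any copying):

```javascript
// Shared memory between JS threads: SharedArrayBuffer + Atomics.
const shared = new SharedArrayBuffer(4); // 4 bytes = one Int32 slot
const counter = new Int32Array(shared);  // the view both threads would share

// Atomic read-modify-write: safe even if many workers do this concurrently,
// unlike a plain `counter[0]++`, which is a non-atomic read-then-write.
Atomics.add(counter, 0, 1);
Atomics.add(counter, 0, 1);

console.log(Atomics.load(counter, 0)); // 2
```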
@tinahalder8416
@tinahalder8416 10 months ago
In python, handling Race Condition is easy, Use Queue, and Lock 😊
@dan_le_brown
@dan_le_brown 10 months ago
I achieved something similar in TS, but rather than locking the queue, I ensured that the jobs that could cause a race condition had a predictable unique ID. By predictable, I mean a transaction reference/nonce...
@techtutorial9050
@techtutorial9050 10 months ago
Well, multiprocessing is much more mature than worker threads, since multiprocessing has been the primary method for concurrency in Python, but for JS it's always been async.
@RegalWK
@RegalWK 10 months ago
Every async thing you do goes to the microtask or task queue, and every single one of them is executed on a different thread; once it's done it goes back via the message queue to the microtask or task queue. When the call stack is empty, the event loop takes from the microtask queue into the call stack first, then from the task queue. Web workers, once done, also go back to the main thread's call stack
@RegalWK
@RegalWK 10 months ago
JS is still one thread
@alexanderpedenko6669
@alexanderpedenko6669 10 months ago
Where did you find this info? As far as I know each thread has its own event loop, where micro- and macrotasks execute
@RegalWK
@RegalWK 10 months ago
@@alexanderpedenko6669 when you make some async operation like promises or timers (setTimeout/setInterval), the JS engine (V8 or whatever is in Node) notices that it's an async operation and delegates it to the proper web/Node API, which is written in C++. There the code executes, and once it's done those APIs return the result of that operation to some queue, and later the event loop moves it to the main thread
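The queue ordering being debated in this thread is observable from userland: synchronous code runs first, then the microtask queue (promise callbacks), then the macrotask queue (timers), all on the one main thread:

```javascript
// Event loop queue ordering: sync code -> microtasks -> macrotasks.
const log = [];

setTimeout(() => log.push('macrotask (setTimeout)'), 0);
Promise.resolve().then(() => log.push('microtask (promise .then)'));
log.push('synchronous');

setTimeout(() => {
  console.log(log);
  // ['synchronous', 'microtask (promise .then)', 'macrotask (setTimeout)']
}, 10);
```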
@Malthael134
@Malthael134 10 months ago
Just in time for my new browser game🎉
@wlockuz4467
@wlockuz4467 10 months ago
I thought worker threads were virtual threads. You learn something new every day!
@shimadabr
@shimadabr 10 months ago
Aren't they? My understanding is that they are threads managed by the runtime, which in turn is responsible for allocating the appropriate number of real threads on the OS.
@riendlyf
@riendlyf 10 months ago
Can you cover native threads vs green threads?
@JeremyAndersonBoise
@JeremyAndersonBoise 10 months ago
Spawning workers in Node is not new, but support for web workers in browsers is comparatively new. Good shit man.
@user-kt1qj2ok7e
@user-kt1qj2ok7e 10 months ago
That's hilarious
@DamonMedekMusic
@DamonMedekMusic 10 months ago
I've used the web worker API to filter multiple arrays at once and it's okay, but it is very unintuitive to use and could definitely be improved upon. Ideally for multiple DOM manipulations at once too, not just data processing.
@Steel0079
@Steel0079 10 months ago
Now use the web worker where webpack is involved XD
@thecoolnewsguy
@thecoolnewsguy 10 months ago
@@Steel0079 vite is the future
@Greediium
@Greediium 3 months ago
IM STILL STUCK OVER HERE, HELP!?!?!?!? MY PC WONT SHUTDOWN, ITS BEEN 5 MONTH'S... keep up the great work, love your vid's!
@sobeeeeer
@sobeeeeer 10 months ago
mindblowing intro
@victorpinasarnault9135
@victorpinasarnault9135 10 months ago
I saw this video of Code with Ryan.
@eformance
@eformance 10 months ago
Hyperthreading generally gives a 30% bump in performance, your test demonstrated that handily.
@debarkamondal6406
@debarkamondal6406 10 months ago
Dude he is hilarious
@jessejayphotography
@jessejayphotography 10 months ago
Elixir is faster than I thought and getting faster with the new JIT compiler improvements.
@redhawk3385
@redhawk3385 10 months ago
To do this magically in C/C++ use OpenMP; in Rust use Rayon.
@DranKof
@DranKof 10 months ago
I tried the while loop thing and somehow my computer became sentient. Y'all should try that out.
@s0up1e
@s0up1e 10 months ago
So weird, this was an interview question yesterday.
@soulofjack7294
@soulofjack7294 10 months ago
thanks for hanging
@ninjaasmoke
@ninjaasmoke 10 months ago
people watching on phone: “that level of genjutsu doesn’t work on me”
@joebgallegos
@joebgallegos 10 months ago
I recently did a little side project where I needed to use a worker in a web app. The gist of the project is given a winning lottery number, how many “quick picks” or random tickets would it take to finally hit.
@dotnetapp
@dotnetapp 10 months ago
I've made my own little helper function to make workers much more ergonomic. It takes a callback and returns a promise which you can await when finished; it makes race conditions much easier to handle. Built the same with Signals in the frontend, so I made a computed which is always multithreaded and automatically spawns a thread when a signal gets a new value (no need for postMessage anymore), and the same with rxjs.
@MysteryMixerMan
@MysteryMixerMan 10 months ago
Are there any major baseline “costs” that need to be considered when spinning up workers? (start time, considerable ram etc)? Thanks for the content.
@majormayer7133
@majormayer7133 10 months ago
Spinning up workers will most likely lead to additional context switches, which comes with a significant overhead if the work is not large enough
@ra2enjoyer708
@ra2enjoyer708 9 months ago
Well, the main cost is that all interactions between the worker and the main thread have to go through a serialization/deserialization routine. Ergo all values must be serializable too.
@mateuszabramek7015
@mateuszabramek7015 10 months ago
Exactly. I don't know why I keep hearing otherwise
@Sowagware
@Sowagware 10 months ago
6:30 Do you want to run() the jobs or double the workers and give it to the next run()?
@coolingjam
@coolingjam 10 months ago
The one time I look up something, fireship uploads a video about it lol
@akashrajpurohit97
@akashrajpurohit97 10 months ago
6:42 bro really doubled it and gave it to the next thread
@SinanWP
@SinanWP 10 months ago
7:40 I knew the joke coming from mile away nice one 😂😂😂😂
@higurashinerd
@higurashinerd 10 months ago
Nice CPU uptime
@Kareszrk
@Kareszrk 10 months ago
Jeff thank you! I implemented a worker thread in my project and its performance is unbelievable! An incredibly big for loop and data generation took 30 minutes to finish. With a worker thread it has been reduced to just 1 second!!! Awesome!! And it was easy to implement thanks to your video and explanation!
@AnwarulIslamYT
@AnwarulIslamYT 10 months ago
JavaScript is referred to as a high-level, single-threaded, garbage-collected, interpreted || JIT-compiled, prototype-based, multi-paradigm, dynamic language with a non-blocking event loop
@LedimLPMore
@LedimLPMore 10 months ago
And you can still program with multiple threads... 😂
@mrcjm
@mrcjm 10 months ago
Ending is the moment you are glad you watched it on a mobile device
@jugurtha292
@jugurtha292 10 months ago
At 3:11, how is Swift that slow? It's static and compiled and does not use garbage collection.