New GPUs are Bad??... "F**k it, I'll Do it Myself."

391,299 views

Vex

1 day ago

Comments: 971
@sanishnaik2040
@sanishnaik2040 6 ай бұрын
Im from nvidia. This guy has to be stopped.
@AbbasDalal1000
@AbbasDalal1000 6 ай бұрын
Im from this guy nvidia has to be stopped
@KianFloppa
@KianFloppa 6 ай бұрын
amd better
@TheMostWanedManforRockport
@TheMostWanedManforRockport 6 ай бұрын
Nice and AMD has to be stopped
@danieltoth9742
@danieltoth9742 6 ай бұрын
I'm from stop. Nvidia and AMD needs to be this guy'd.
@udaysingh9_11
@udaysingh9_11 6 ай бұрын
I'm in this guy, he wants me not to stop.
@ClamChowder95
@ClamChowder95 6 ай бұрын
I think he is underselling his work. This could be a stepping stone for future open source GPUs. He should be incredibly proud of what he did all by himself.
@k_kubes
@k_kubes 5 ай бұрын
This feels awfully similar to how Linux started: someone toying around with a concept, not expecting the project to become that big of a deal even after going open source, and ending up exploding in popularity.
@jasonstephens6109
@jasonstephens6109 5 ай бұрын
Yeah, this guy is laying the groundwork for something with huge potential. Even if an open source GPU never competes with the big boys, it could pave the way for niche features that become mainstream. There's also the potential for custom instruction sets that give new life to defunct GPUs.
@jnharton
@jnharton 5 ай бұрын
@@k_kubes Linux was more than just "toying around", but definitely a personal project that was never originally intended to go public, let alone become a mainstream OS. I actually saw some files once that were supposedly part/all of Linux (maybe just the kernel) prior to version 1.0.
@harrydean9723
@harrydean9723 5 ай бұрын
this should be a thing
@xavierrodriguez2463
@xavierrodriguez2463 5 ай бұрын
Open source GPU to go with RISC-V
@AnimeGIFfy
@AnimeGIFfy 6 ай бұрын
open source software + open source hardware. this needs to happen if you want anything good to happen in your life.
@DengueBurger
@DengueBurger 5 ай бұрын
At-cost open-source GPUs when
@DengueBurger
@DengueBurger 5 ай бұрын
The Linux of GPUs
@TheDoomsdayzoner
@TheDoomsdayzoner 5 ай бұрын
That's how we evolved as a species: when "secret" sciences like algebra and geometry became available to everyone, and when "secret" tech like household appliances became available to everyone.
@wwrye929
@wwrye929 5 ай бұрын
Making the hardware would be pricey, and that would make it harder for the game makers.
@AnimeGIFfy
@AnimeGIFfy 5 ай бұрын
@@wwrye929 im not talking about everyone making their own hardware from scratch
@POLARTTYRTM
@POLARTTYRTM 6 ай бұрын
The guy who made the gpu is the real "he's good with computers" guy.
@thevoidteb1285
@thevoidteb1285 5 ай бұрын
Did you listen to him in this video? Hes a novice.
@Whatisitlol
@Whatisitlol 5 ай бұрын
Why the heck am I experiencing déjà vu? This comment, this reply above me... I'm going crazy.
@POLARTTYRTM
@POLARTTYRTM 5 ай бұрын
@@Whatisitlol Normal, we all have it from time to time.
@0x1EGEN
@0x1EGEN 5 ай бұрын
@@thevoidteb1285 Being a novice with HDL programming and writing windows kernel drivers does not mean he's not good with computers. That's like saying I'm not good with music because I never played the Pikasso guitar...
@yoshi596
@yoshi596 5 ай бұрын
@@thevoidteb1285 Oh is that so? Then go ahead and do it better than him. Notify me when you upload a video about your custom made GPU, I'll wait.
@SIedgeHammer83
@SIedgeHammer83 6 ай бұрын
Voodoo and Kyro GPUs need to make a comeback.
@bryndal36
@bryndal36 6 ай бұрын
Imagine this gpu being able to use a glidewrapper.
@xgamer25125
@xgamer25125 6 ай бұрын
Wasn't 3DFX (maker of the Voodoos) bought and absorbed into Nvidia..?
@RAgames-mc3kp
@RAgames-mc3kp 6 ай бұрын
Yes i think ​@@xgamer25125
@gh975223
@gh975223 6 ай бұрын
@@xgamer25125 it was
@itsnot1673
@itsnot1673 6 ай бұрын
And matrox maybe even trident
@Karti200
@Karti200 6 ай бұрын
I saw the news day one when it came out, and it just pissed me off how many out-of-touch people there were about it… Like… people literally roasted the creator because "it is too weak"... like c'mon, what the heck is wrong with some people. This is literally an openware/shareware version of a GPU made by a community; this is an amazing milestone if you ask me.
@elysian3623
@elysian3623 6 ай бұрын
Let's be real, consumers themselves have no idea what stuff is, what is good for them, or how stuff works. They just consume, and they're easily fooled into consuming stuff they don't need as well; 90% of Nvidia cards have probably never been used for a CUDA workload, but they were sold based on their dominance in certain tasks. I still live in hope that AMD makes their software stack fully open, somebody comes along with a working prototype of something absolutely game-changing, and they work together to actually use their combined technology to advance GPUs. Currently the stagnation in GPU performance comes from die-shrink limits and from ramming things like AI acceleration into cards that really just need to be affordable and play games well.
@Dingbat1967
@Dingbat1967 6 ай бұрын
The average lambda person is an idiot. That's pretty much why. The intertubes just made it more obvious.
@TechBuild
@TechBuild 6 ай бұрын
People who have some idea how GPUs work and how they differ from CPUs will easily understand the work this person has done. Making a GPU yourself that does 3D rendering, even a basic one, this well is a phenomenal task. In the CPU space there are lots of architectures available to build upon, but not in the GPU space.
@tsl_finesse_6025
@tsl_finesse_6025 6 ай бұрын
People don't understand: Nvidia has around 30k employees, while he did one full project entirely by himself 😬😬. Bro is fire and a genius 💪🏾
@rextrowbridge8386
@rextrowbridge8386 6 ай бұрын
Because they are ignorant of how hard it is to make a gpu from the ground up.
@grgn5412
@grgn5412 6 ай бұрын
This is WAY better than you may think. By simply converting his FPGA design into an ASIC, you get a 10x performance increase. ASICs are expensive (from one to a few million dollars to make the mask which allows mass production), and FPGAs have long been used to prototype them: the languages to program the first or design the second are the same (VHDL or Verilog), so this conversion is very common in the industry (this happened for crypto mining, for instance).
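To put a rough number on that "10x" claim, here is a minimal back-of-the-envelope sketch in Python; the clock figures are assumptions for a typical FPGA fabric versus a hardened ASIC, not the FuryGPU's real specs:

```python
# Hypothetical fill-rate comparison: same RTL design, FPGA clock vs. ASIC clock.
# Numbers are illustrative assumptions, not measured FuryGPU figures.

def peak_fill_rate_mpix(clock_mhz: float, pixels_per_clock: int = 1) -> float:
    """Peak pixels written per second, in megapixels."""
    return clock_mhz * pixels_per_clock

fpga_mhz = 250    # typical upper bound for complex logic in FPGA fabric
asic_mhz = 2500   # plausible clock once the same design is hardened into an ASIC

print(peak_fill_rate_mpix(fpga_mhz))   # 250.0  Mpix/s on the FPGA
print(peak_fill_rate_mpix(asic_mhz))   # 2500.0 Mpix/s as an ASIC -> roughly the 10x jump
```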
@SUCHMISH
@SUCHMISH 5 ай бұрын
I looked the chips up, and the good news is that you can buy them in bulk for a low price used... Only problem is that they are used... But I feel like this idea has some merit to it!!
@aymenninja8120
@aymenninja8120 5 ай бұрын
I saw a video of a guy who made his own ASIC, and he didn't sound super rich. I think the technology for making ASICs is getting more affordable.
@Endgame901
@Endgame901 5 ай бұрын
@@aymenninja8120 He didn't really make his own ASIC, he basically got in on a group buy for space on the silicon for an ASIC. Still pretty dope, but not quite the same.
@aymenninja8120
@aymenninja8120 5 ай бұрын
@@Endgame901 And that group is not a big corporation or something. My point is that it's now possible to make ASICs for clients other than big companies, if I got things right.
@Endgame901
@Endgame901 5 ай бұрын
@@aymenninja8120 you're not wrong, per se, but an ASIC like tinytapeout isn't really in the same scope as this, even if you purchased every single "block" of silicon space. The type of chip you'd need _would_ cost Big Company money.
@crumbman
@crumbman 6 ай бұрын
Really interesting video. As an electrical engineer working with FPGAs, I can assure you it takes a heck of a lot of (probably Verilog) code to get this thing to work as it's supposed to. The biggest issue with doing this on an FPGA is that they run at really low clock speeds (typically ~100 to max ~250 MHz). So you can't really get speed just by increasing the clock (like NV and AMD have been doing more aggressively recently). Props to this man
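For a sense of what those clock speeds mean in practice, here is a small sketch of the pixel-budget math; the resolutions and clocks are assumed round numbers, not the actual FuryGPU targets:

```python
# How many FPGA clock cycles are available per pixel per frame?
# Illustrative assumption: a single pixel pipeline, no parallelism.

def cycles_per_pixel(clock_hz: float, width: int, height: int, fps: int) -> float:
    pixels_per_second = width * height * fps
    return clock_hz / pixels_per_second

print(cycles_per_pixel(100e6, 1280, 720, 60))   # ~1.8 cycles/pixel at 100 MHz, 720p60
print(cycles_per_pixel(250e6, 1920, 1080, 60))  # ~2.0 cycles/pixel at 250 MHz, 1080p60
# With only a couple of cycles per pixel, the design has to work on many pixels
# in parallel (or drop resolution/FPS) instead of leaning on raw clock speed.
```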
@kesslerdupont6023
@kesslerdupont6023 6 ай бұрын
Are consumer FPGAs big enough to scale to entire architectures or are they typically cut-down and more of an evaluation tool?
@noth606
@noth606 6 ай бұрын
@@kesslerdupont6023 Well, you wouldn't be able to make a full-size flagship FPGA-based GPU that competes with the big boys, if that's what you mean. What I think this is, without looking deep into it, is a rendering pipeline mostly from scratch; my guess is that it's OpenGL-based given the choice of Quake to test it, so it's unlikely to support anything approaching DirectX 10 type stuff in terms of GPU functions, because that is at least one if not multiple orders of magnitude more complex. So it's definitely impressive work, but I don't think nVidia or AMD are shaking in their boots. I'd guesstimate, based on what I know off the top of my head, that you'd need probably a few FPGAs, like 3-6, maybe more, to build up a full Dx10/11 type unit with enough ROPs, shaders etc. to do something useful, including all the jank you have to have around it like memory management, things to handle texture/geometry/shader-code RAM, plus output handling. It kind of depends on how 'strict' of a model you aim for, really, because to a point you can choose to do a lot on the host system, or not. The more you do on the host in code, the less dedicated hardware/FPGA space you need. It could be that this is a Dx10+ model project just not completed far enough to currently run more than basic stuff equivalent to OpenGL. I hope so.
@myne00
@myne00 6 ай бұрын
@kesslerdupont6023 they are absolutely used by Intel and amd to test design changes. They probably have really big ones that can do an entire (single) core. Nvidia probably does use them too, but it would most likely be a very cut down gpu. They would absolutely perform horrible, but it's about comparing different hardware approaches and validation. Eg if fpu design x does 100flops and design z does 102 flops you have your architectural comparison. Then you run through a bunch of tests to validate the results. Don't want a chip that gives incorrect answers. Fpgas are used in the real world in applications like telecommunications signal processing where a new technique could be released every year or so. I'm not aware of any other real world applications aside from the "MiSTer" which is mostly used to emulate old game consoles.
@kesslerdupont6023
@kesslerdupont6023 6 ай бұрын
@@noth606 Thanks for the helpful comment. I don't know much about DirectX but maybe I should look more into it.
@kesslerdupont6023
@kesslerdupont6023 6 ай бұрын
@@myne00 thanks for the info. During validation is it normal to have to fuzz the chip across all possible combinations of input/output or are there shortcuts to validation?
@POLARTTYRTM
@POLARTTYRTM 6 ай бұрын
I've had people telling me that the guy's GPU is not impressive at all because writing drivers, APIs, and all that, including the card, is the "bare minimum" they would expect from any software engineer apprentice or professional. Yet I don't see new cards coming out every week made by software engineers.
@taxa1569
@taxa1569 6 ай бұрын
DO NOT believe them. Doing from scratch something that massive companies have been iterating on over and over for the past 20 years is like saying the person who added the sauce to a meal that's been in preparation for 3 hours is now the chef, and that these 'chefs' can ALL do what this guy did. Except he prepared the whole meal and THEY just put on the sauce. The bare minimum would in fact be coming into the GPU development space and adding on to the well-prepared, already existing meal.
@POLARTTYRTM
@POLARTTYRTM 6 ай бұрын
@@taxa1569 They gave every excuse possible, that the guy just messed with an FPGA and sourced the other parts... I was thinking: then do it yourself, I want to see if it's that easy. Anyone can, if they are given the tools, right?
@aliendroid8174
@aliendroid8174 3 ай бұрын
Lol most software engineers don't know anything besides react
@ryanc3011
@ryanc3011 3 ай бұрын
@@taxa1569 I changed the oil on my car but I could build a whole car from scratch, pfft easy
@monad_tcp
@monad_tcp Ай бұрын
Writing the API for a graphics card is not the bare minimum, it's the thing that does the work of making the game run. Most of what you are paying for in a GPU is not the hardware but the driver. The driver is the expensive part because of all the testing that goes into ensuring games actually work. Ever wonder why GPU drivers are that big? Really, 350 MB of binary executables, what's in there? They carry lots of copies of the actual driver optimized for specific games; that's why Nvidia has market dominance, it is the driver. Ever wonder why they don't go open source with their drivers? (No, what they open-sourced on Linux was the compute driver, not the 3D graphics driver. Even if they do open-source a 3D graphics driver, it won't be a DirectX one, that's for sure; it will be the slow, poor, buggy OpenGL one, for the few games that run on Linux.)
@ThreaT650
@ThreaT650 6 ай бұрын
Respect to putting me on the FuryGPU, this is dope! That thing is performing around an old Radeon 8500 or something! Impressive!
@liutaurasleonavicius2680
@liutaurasleonavicius2680 6 ай бұрын
There was also one person who literally combined an Nvidia and an AMD GPU and was able to use DLSS and AFMF together. This must be dark magic.
@シミズルリ
@シミズルリ 6 ай бұрын
No magic, you literally just put 2 cards into 2 slots on your motherboard and they work🤣
@callyral
@callyral 5 ай бұрын
​@@シミズルリIt just working makes it seem more like magic
@shiro3146
@shiro3146 5 ай бұрын
@@シミズルリ i dont think it was as easy as that bruh
@sirseven3
@sirseven3 5 ай бұрын
​@@shiro3146it totally can work just like that. Just as you can have 2 NIC's operating at the same time. The main thing to worry about is the drivers for the card. You won't be able to pool the memory space as NVLINK but you do get an additional processor. Utilizing profile inspector you technically could get it to work as NVLINK but it takes manual config. I've ran different ram types with xmp and sometimes I got bluescreens but it was semi stable with a solid overclocking of said RAM
@granatengeorg
@granatengeorg 5 ай бұрын
I do the same in blender, rendering on an rx while using optix denoising on my gtx, all together in realtime in the viewport. Was also quite surprised that it just worked lol.
@toblobs
@toblobs 6 ай бұрын
I can only imagine what Dylan could do with Nvidia's budget with his talent
@kaptenhiu5623
@kaptenhiu5623 6 ай бұрын
Becomes a trillionaire and dominate the AI market like NVIDIA does rn?
@blanketmobbypants
@blanketmobbypants 6 ай бұрын
But I would definitely buy it
@josephdias5859
@josephdias5859 6 ай бұрын
or just enough of a budget to produce cards to play all the 90s games and early 2000s games
@ThreaT650
@ThreaT650 6 ай бұрын
And a team of engineers, YUP!
@ThreaT650
@ThreaT650 6 ай бұрын
Seems to have his head screwed on straight too. Would be great at the consumer level.
@baget6593
@baget6593 5 ай бұрын
I dont care if fury gpu succeeds, i just want nvidia to lose
@theturtlerguy1236
@theturtlerguy1236 3 ай бұрын
Real
@Rudraiya
@Rudraiya 2 ай бұрын
Dude, Nvidia built their GPUs from scratch too, and their hard work paid off.
@monad_tcp
@monad_tcp Ай бұрын
@@Rudraiya They didn't build it from scratch, they made it out of parts in the inventory of ST Microelectronics, and their 3rd-generation product, which made their name, was made out of parts for a Sega Dreamcast GPU that didn't work out. They were working for other companies, making the parts that went into the GPU, before they were able to actually create the thing. They didn't make it all alone out of nowhere. Sure, it was hard work and it paid off, but it had to start humble, like this guy with his fixed-pipeline GPU that's already better than the NV1 or even the Riva TNT. That makes it even more amazing if you consider that he didn't start out of ST Microelectronics' portfolio, but really from scratch.
@oglothenerd
@oglothenerd 5 ай бұрын
Someday we will have open source GPU instruction sets. I know it. The community always wins.
@fatemanasrin7579
@fatemanasrin7579 5 ай бұрын
And they'll be spoiled 9-year-old white kids who will buy these thinking they're smart, then break them or set them on fire..
@xeschire706
@xeschire706 5 ай бұрын
Or we can just take either RISC-V, a custom and extended version of the 6502 that supports a 64-bit architecture, Nyuzi, or even the MIAOW ISA, and modify and optimize them for efficient graphics processing, for use in our custom, open source GPUs instead, which I think would be a far better route in my opinion.
@oglothenerd
@oglothenerd 5 ай бұрын
@@xeschire706 I like that idea.
@lineandaction
@lineandaction 6 ай бұрын
We need an open source GPU
@namebutworse
@namebutworse 5 ай бұрын
I'm getting flashbacks the moment I hear "source" (I am a Team Fortress 2 player)
@thatonegoogleuser4144
@thatonegoogleuser4144 4 ай бұрын
​@@namebutworse I can hear that sound bro
@sklungofunk
@sklungofunk 3 ай бұрын
I feel you there
@dracoborne2648
@dracoborne2648 2 ай бұрын
Communist
@jungle2460
@jungle2460 6 ай бұрын
If that SD card actually turns out to be the VRAM, that's genius. I'd love to be able to swap SD cards to upgrade VRAM
@HamguyBacon
@HamguyBacon 6 ай бұрын
SD cards are not fast enough to be VRAM, that's just the BIOS.
@barela3018
@barela3018 5 ай бұрын
Way too slow. RAM and SSDs are made for different purposes; that's why there's a loading time before games: the SSD loads data into RAM, and then, when the GPU needs it, information is pulled from the program that's already loaded.
@aliendroid8174
@aliendroid8174 3 ай бұрын
SD cards are very slow even for storage drives. VRAM can do around 100 GB per second, while fast SD cards do around 100 MB per second, so it's about 3 orders of magnitude of difference. And beyond that, there's all sorts of stuff like latency, endurance, and random read/write performance, all of which SD cards are terrible at.
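A quick sketch of what that gap means for graphics work, using assumed round bandwidth numbers (not measurements from any specific card):

```python
# Time to move one uncompressed 1920x1080 RGBA framebuffer at different bandwidths.

FRAME_BYTES = 1920 * 1080 * 4   # ~8.3 MB per frame

def ms_per_frame(bandwidth_bytes_per_s: float) -> float:
    return FRAME_BYTES / bandwidth_bytes_per_s * 1000

print(ms_per_frame(100e9))  # GDDR-class ~100 GB/s -> ~0.08 ms per frame copy
print(ms_per_frame(100e6))  # fast SD card ~100 MB/s -> ~83 ms, i.e. ~12 FPS
                            # just to move the framebuffer, before any rendering
```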
@HamguyBacon
@HamguyBacon 3 ай бұрын
@@aliendroid8174 newer sd cards can hold 128TB and do 1gb per second.
@CombatMedic1O
@CombatMedic1O Ай бұрын
Basically, we as humans have no open source GPU tech. Sad day for us. Imagine how much faster we could progress if we did. Having basically 3 companies in the entire world owning this important information is wild and almost monopolistic. Patent laws take way too long to expire in terms of computer years; they should have to reveal source code after 10 years.
@tech6294
@tech6294 6 ай бұрын
8:57 The VRAM GDDR chip is to the lower left of the fan cooler. That mini storage card might be BIOS? Not sure lol. Great video! ;)
@jashaswimalyaacharjee9585
@jashaswimalyaacharjee9585 6 ай бұрын
Yup, I guess he is hot loading the vBIOS via SD Card.
@dbarrie
@dbarrie 5 ай бұрын
That’s the DisplayPort splitter, which takes the DP signal from the FPGA and splits it into DP/HDMI to supply the outputs. All of the (slow, DDR3!) RAM on the device is part of the Kria SoM, underneath the fan! SD card is there to update the firmware when I’m not running with the card hooked up to the dev machine!
@greyhope-loveridge6126
@greyhope-loveridge6126 5 ай бұрын
@@dbarrie That's a really good way of updating the firmware on the fly - I'm amazed you got this running! DDR3 isn't actually the worst memory you could've used, and maybe you could use some older, broken GPUs and de-solder some GDDR6 or GDDR5 to transplant while you get stuff working too? I am fascinated to see how far this comes.
@Vifnis
@Vifnis 5 ай бұрын
@@greyhope-loveridge6126 I doubt this would even work since GPUs aren't out-of-the-box FPGAs... might need to check JEDEC standards first to even see if that's possible, and iirc they only started with GDDR4 and up, wasn't everything before that 666MHz DDR3 VRAM?
@proudyy
@proudyy 5 ай бұрын
@@greyhope-loveridge6126 Definitely, he has to keep going. The amount of potential in this is crazy. And even though he said in this video that the goal never was competition, he still can get a competitor in the future :P Or even found a company which competes in the future, whatever...
@vishalkumar-dr8wq
@vishalkumar-dr8wq 6 ай бұрын
It's amazing what he was able to achieve. I used FPGAs during my time in undergraduate study, and what he achieved takes an amazing amount of skill and work.
@jnharton
@jnharton 5 ай бұрын
True, but don't discount his 20-30 years of writing code for software rendering. Not only does that mean he has a considerable foundation in understanding what the hardware needed to be capable of, it also meant he could write his own kernel driver and an interface API to make his GPU usable under a modern Windows OS!
@cooldudep
@cooldudep 5 ай бұрын
​@@jnharton what part of op's comment discounts the guy's decades of work?
@mrnorthz9373
@mrnorthz9373 5 ай бұрын
@@cooldudep I don't think he means the comment discounted his skill; he means it for anyone who may think this is a miracle or something unexpected from a guy of this caliber.
@oktc68
@oktc68 6 ай бұрын
This is the most interesting PC oriented video I've seen for ages. Nice1 Vex, nice change of pace.
@mustafa._.9371
@mustafa._.9371 2 ай бұрын
0:56 You vs. the guy she tells you not to worry about.
@CipherDiaz
@CipherDiaz 5 ай бұрын
An FPGA is *NOT* what's in a Raspberry Pi. They are quite expensive. Also, programming these FPGAs is nowhere near similar to C/C++ or anything like that; you are basically writing the logic for how electricity flows between components. An FPGA also has a clock rate, and the higher the rate, the more these things cost. So for him to get, say, 150 fps, he would most likely need a fast onboard clock, and most likely more gates to work with, since the only way to truly optimize anything on an FPGA is by heavy use of tables. Which might not apply to a GPU, since it's basically moving a ton of data around memory as quickly as possible. But yeah, awesome project!
@StuffIThink
@StuffIThink 6 ай бұрын
I've been watching him try to make this thing work forever; super cool to see someone else giving him some exposure.
@maxcarter5922
@maxcarter5922 6 ай бұрын
What a great contribution! Crowdsource this guy?
@arenzricodexd4409
@arenzricodexd4409 6 ай бұрын
To play Quake at 60FPS?
@mrnorthz9373
@mrnorthz9373 5 ай бұрын
​@@arenzricodexd4409quake at 60 fps today, cyberpunk at 60 fps tomorrow.
@scudsturm1
@scudsturm1 5 ай бұрын
@@arenzricodexd4409 dont complain if u cant build a gpu yourself and write the driver yourself
@arenzricodexd4409
@arenzricodexd4409 5 ай бұрын
@@scudsturm1 Nah, this guy does it for fun. Crowdsource? That's an attempt to take away his passion for this.
@PeterPauls
@PeterPauls 6 ай бұрын
My first GPU was a 3Dfx Voodoo 3 3000 and 3Dfx made the first GPU available for the masses (AFAIK) and they disappeared around 2000-2002.
@SJ-co6nk
@SJ-co6nk 2 ай бұрын
Technically 3dfx never made a GPU. The first GPU was the GeForce; it contained hardware T&L, which is what set it apart from just a 3D accelerator card. Trying to figure out if this card actually has hardware T&L or not.
@tropixi5336
@tropixi5336 6 ай бұрын
"YoU DiDnT mEnTiOn InTeL"
@AryanBajpai_108
@AryanBajpai_108 6 ай бұрын
He did 😂
@Prince-ox5im
@Prince-ox5im 6 ай бұрын
​@@neon_archhe's mocking someone's comment not saying it
@ranjitmandal1612
@ranjitmandal1612 6 ай бұрын
😂
@tropixi5336
@tropixi5336 6 ай бұрын
@@AryanBajpai_108 im talking about the start where he said "NVidia and amd are top contenders" ....
@urnoob5528
@urnoob5528 5 ай бұрын
@@ranjitmandal1612 smh
@BOZ_11
@BOZ_11 6 ай бұрын
Fury?? ATI Technologies 'bout to make a complaint
@core36
@core36 6 ай бұрын
I don’t think ATI is going to make any complaints anytime soon
@BOZ_11
@BOZ_11 6 ай бұрын
@@core36 so i see sarcasm isn't your strongest suit
@urnoob5528
@urnoob5528 5 ай бұрын
@@BOZ_11 tell that to urself
@MarioSantoro-ig5qh
@MarioSantoro-ig5qh 5 ай бұрын
Unless he starts selling them they probably wont do anything.
@MarioSantoro-ig5qh
@MarioSantoro-ig5qh 5 ай бұрын
Unless he starts selling them and he makes a ton of money off them. Its unlikely they will do anything.
@Drunken_Hamster
@Drunken_Hamster 6 ай бұрын
The future where I can piece together and upgrade my GPU like I can the rest of my system would be lit. NGL I'd love for the GPU scene to instead have motherboard chip slots similar to the CPU socket, with their own section for memory, special compute units (to improve ray tracing or AI separately from rasterized processing), and output pathways so you never have to worry about finding a card with the types and quantities of outputs that you want. It'd also make cooling simpler and likely more compact, kinda like how it is for CPUs with semi-universal setups that only require certain amounts of height as the sole variable. And it'd DEFINITELY make liquid cooling more accessible, not that I want to do that as much as I once used to.
@miguelcollado5438
@miguelcollado5438 6 ай бұрын
Real3D, Mellanox, and Realtek made decent GPUs in the 90s as well... but they were all eventually absorbed by the same 3 major brands in the 2000s... Dylan Barrie deserves our community's full support for his work.
@straightshooter3693
@straightshooter3693 5 ай бұрын
INDEED !
@CrowandTalbot
@CrowandTalbot 5 ай бұрын
Wasn't Quake the game that used to prove or break computer builds back in the day? And his GPU handles it gorgeously? That's enough for me to know he's onto something.
@dienand_gaming
@dienand_gaming 2 ай бұрын
Love how he straight up says writing the driver was the hardest part 😂
@UltraVegito-1995
@UltraVegito-1995 6 ай бұрын
*If only Moore Threads became a successful AI GPU maker in China, causing them to no longer need Nvidia or AMD....*
@arthurwintersight7868
@arthurwintersight7868 6 ай бұрын
I just want to see more actual competition, to drive down prices.
@zerocal76
@zerocal76 6 ай бұрын
😅😅 You must know very little about China to make a comment like that. The last thing anyone in the world wants is a government like China's becoming completely tech-independent, especially in the hardware acceleration & AI space!
@arthurwintersight7868
@arthurwintersight7868 6 ай бұрын
@@zerocal76 - China is highly likely to implode under their own pressure at some point. Especially if their shoddy construction work at the Three Gorges ends up being as bad as people think. In the meantime they can drive down GPU and NAND prices.
@arenzricodexd4409
@arenzricodexd4409 6 ай бұрын
Still does not make them ignore AI.
@hey01e5
@hey01e5 6 ай бұрын
unfortunately, if moore threads became competitive they'd get sanctioned for "national security reasons", leaving us westerners stuck with nvidia and AMD who will just price gouge the GPUs
@dustindalke7979
@dustindalke7979 27 күн бұрын
He died next week by self inflicted high projectile wounds to the back of the noggin
@AleksanderFimreite
@AleksanderFimreite 5 ай бұрын
I would assume the rates displayed around 5:30 indicate what speed the internal game updates (ticks) are running at. One render seems to take around 25-45 ms to draw, plus another 5-10 ms to clear the data for the next render. That indicates a total range of 30-55 ms per update. The formula to calculate the rate per second is (1000 ms / x), which comes out to roughly a 33-18 FPS range. That seems consistent with how choppy the enemies move around; camera motion is much smoother than their movements. Despite this, I'm also impressed by the efforts of individuals trying to tackle such a daring project.
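A small sketch of that conversion for anyone who wants to plug in their own numbers (the millisecond values are just the ranges read off the on-screen counters, so treat them as approximate):

```python
# Convert frame time in milliseconds to frames per second.

def fps_from_frametime(ms: float) -> float:
    return 1000.0 / ms

for total_ms in (30, 55):   # draw time plus clear time, best and worst case
    print(f"{total_ms} ms/frame -> {fps_from_frametime(total_ms):.1f} FPS")
# 30 ms/frame -> 33.3 FPS
# 55 ms/frame -> 18.2 FPS
```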
@Desaved
@Desaved 6 ай бұрын
We're at the point where GPUs are more expensive than the entire rest of the computer!
@rainbye4291
@rainbye4291 6 ай бұрын
My man just did the unthinkable. Great effort for making a gpu this good ALONE.
@cybernit3
@cybernit3 6 ай бұрын
The biggest hurdle to making a GPU is that you need lots of money to make the GPU chip if it's an ASIC, though later on the ASIC would be cheaper than using an FPGA. I wish they could make high-performance FPGAs that are cheap, not so expensive. I have to give this FuryGPU guy some credit for making it; this could lead to something decent in the future or inspire future GPU designers. Also, there is the Vampire Amiga project, which made an extension of the AGA Amiga graphics chipset.
@Tyrian3k
@Tyrian3k 6 ай бұрын
It simply can't be as cheap as a chip that is tailor made for the specific desired purpose. It's like wanting a van that can perform like an F1 car without it ending up costing more than the F1 car.
@pauloisip3458
@pauloisip3458 6 ай бұрын
I can see this guy becoming successful unless nvidia makes a move on the guy
@AssassinIsAfk
@AssassinIsAfk 5 ай бұрын
One of two things will happen: 1) Nvidia goes Nintendo/Sony and sends a cease and desist, or 2) they offer him a job fixing their budget cards, or AMD/Intel offer him a job.
@NicCrimson
@NicCrimson 4 ай бұрын
@@AssassinIsAfk A cease and desist for what?
@AssassinIsAfk
@AssassinIsAfk 4 ай бұрын
@@NicCrimson I don't think you understand the joke
@sturmim
@sturmim 6 ай бұрын
He could braze some old GDDR6 chips from broken or old GPUs. Would like to see that.
@kesslerdupont6023
@kesslerdupont6023 6 ай бұрын
It may be good enough to just put some DDR5 on there depending on what speed the GPU is currently using.
@jcoyplays
@jcoyplays 5 ай бұрын
He could've used DDR3 and been on par/overkill. (800-2133 MT/s, or about 400-1066 MHz, which would match/exceed the FPGA clock speed)
@kesslerdupont6023
@kesslerdupont6023 5 ай бұрын
@@jcoyplays yeah that is true
@Hemeltijd
@Hemeltijd 6 ай бұрын
This is so informative and cool. If you find any more topics like this, can you make more videos on them?
@vextakes
@vextakes 6 ай бұрын
Fs
@mtcoiner7994
@mtcoiner7994 Ай бұрын
This type of stuff is very interesting. It always blows my mind when I see people upgrading GPU memory modules. Makes you wonder why the manufacturers wouldn't have maxed out the memory potential in the first place.
@gamma2816
@gamma2816 6 ай бұрын
Future prediction:
1. Hardware will become open source but lackluster, and you can now build individual pieces of your PC just like you could the PC itself before.
2. At first the trend will be for people in the know and won't affect the market.
3. Some content creator, say Linus, will build an insane GPU and CPU and people copy it.
4. Now the open source hardware is so adept that it's actually a market threat for Nvidia and AMD.
5. Building becomes more streamlined like PC building, and you can now buy parts that just click in place, building your own chips like Legos, again much like PC building.
6. Mainstream GPUs and CPUs cannot compete; much like mods for games, the vanilla experience just can't compete with what Johnny cooked up in his mom's basement.
7. For Nvidia and AMD to compete they now have to adapt, so either they, some up-and-comer, OR Intel will start making licensed parts for projects like this and promise "XYZ if you buy XYZ from X company and not Y or Z!", and it will be up to individuals to check which combination performs best, just adding another customisation choice for PC gamers.
8. A huge wave of insane complaints online about performance, because optimising for this many combinations of parts is impossible for developers.
9. Developers are forced to focus on specific gear, so they make games optimised for specific systems; this way the top company will buy performance from devs, making them optimise for their parts.
10. Now open source is once again dead, because "Why buy your own gear with bad performance when you can buy a full Nvidia board with optimisation lol!"
11. Back to square one as people are forced to buy full boards or brand items, aka a premade GPU, aka what we have today.
12. 🤷‍♂️🤣
@powercore9277
@powercore9277 5 ай бұрын
Really doubt it, because have you seen the machines that Intel uses to make their CPUs? They need to use machines made by one specific Dutch brand, ASML, and those machines are not cheap.
@gamma2816
@gamma2816 5 ай бұрын
@@powercore9277 Well I agree, probably absolutely not in the near future, but given that tech evolves and a lot of "too expensive for civil use" things in the past are now cheaply available, well relatively anyway. There was a time when cars were not an everyman's thing but here we are, so near future, absolutely not, but who knows later down the line. 😝 But then again we are at a little stop in evolution tech wise as transistors have reached their size limit in how small they can be if I understood it right. But if quantum tech is invented to a civilian usage then maybe we'll continue, but I know nothing about this, it's just what's currently suggested. But hey, if their theories are true and quantum computers reach civilians in gaming then ping will be a thing of the past as all will have instantaneous server reach SUPPOSEDLY.
@sergemarlon
@sergemarlon 5 ай бұрын
I didn't see you mention AI. You pointed out the step in which human developers will fail and it seems like you don't think that's the perfect role for AI to fill.
@gamma2816
@gamma2816 5 ай бұрын
@@sergemarlon Very true! I guess I just hope not. 😅 I love AI tech but it freaks me out; it's like staring the thing that will end you in the face. AI is great, but terrifying, so I don't want to think about it too much. Guess it's much like the nuke: great in power and as a scientific marvel that ended wars, but terrifying when you think about it for too long. 😅 But you're right, AI could probably handle it; it's just that an AI built to build more machines that we don't understand, hence why we need it to do it, is a little uncomfortable for me. 😝
@jamesspencer1997
@jamesspencer1997 Ай бұрын
When you talk about MODS, are you talking about paid DLC vs. what some fan creates? I honestly have always been amazed how fans have breathed new life into even old games with the mods they have made. We're talking mainstream game studios with budgets in the millions making a DLC, and still Johnny made something far more spectacular, and he would be happy if someone bought him a cup of coffee or some Hot Pockets. I think it has to do with love.
@tukebox-mf9bo
@tukebox-mf9bo 4 ай бұрын
We learn how to program FPGA boards like the MAX 10 in school in Vienna, Austria. It's such a complicated process, props to the guy.
@flakes369
@flakes369 6 ай бұрын
TempleOS energy
@zawadlttv
@zawadlttv 5 ай бұрын
The SD card probably holds the programming of the chip; probably the easiest way to update it.
@ohnoitsaninja
@ohnoitsaninja 6 ай бұрын
It's not hard to make a graphics card if you abandon the current software library. It's very hard to make a graphics card that's compatible and performant on every version of OpenGL, DirectX, and Vulkan that has ever come out.
@rtchau4566
@rtchau4566 2 ай бұрын
"You might hear that the audio is kinda crackly, I don't think it's running perfectly..." I was there, Gandalf. I was there 28 years ago when Quake first came out. That's what it sounds like, more or less.
@TriPBOOMER
@TriPBOOMER 6 ай бұрын
The 4060 is not the same as the 3060: it has fewer of ALL the cores and processors, less memory bandwidth, and less RAM (8 GB vs. 12 GB). Statistically the 3060 is a better card in every way, and with a little OC, which the 3060 is more than happy to give, the generational gains get lost. Other than frame gen and DLSS 3, the 3060 is a better card.
@darthpotwet2668
@darthpotwet2668 6 ай бұрын
Fg and dlss3 are the same thing?
@RogueSamuri7676
@RogueSamuri7676 6 ай бұрын
Well, considering you can't get a 16 GB 3060 but you can get a 16 GB 4060, that does make a difference. Not only that, but the Asus ProArt 4060 Ti 16 GB is a little faster than the 3060, though I don't see much of a difference in the speed, nor between DLSS 2 and DLSS 3; not much of a difference there either.
@TriPBOOMER
@TriPBOOMER 6 ай бұрын
​@@RogueSamuri7676 Yes 3060 has only got 12 gb, which is plenty for the resolutions its aimed at I have never ran out and never seen it swapping with the system ram in 1080p and some 1440p, both resolutions are fine with 12 gb & the 3060 isn't trying to be a 4k card so 4k is irrelevant, infact its not really a 1440p card but it can do a bit, my card runs an OC of 20% on the die and just over a 1ghz on the Vram, its keeps up with the 4060, beating it in few 1080p spots in the bench numbers and game runs I can compare to online, oh and cards claim to fame, my 3060 out benched a pro w6800 in Cinebench 2024 Gpu render, and 3rd fastest UK 3060 in port royal 🙌😎🤣and yes the 4060 Ti is faster than a 3060 but in the same way a 3060 Ti is faster than a 3060 lol
@TriPBOOMER
@TriPBOOMER 6 ай бұрын
@@darthpotwet2668 yes they are my bad, didn't mean the & or to type them that way around eg. dlss3 frame gen, lol brain fart while my kids were pecking my head lol
@adebolaadeola
@adebolaadeola Ай бұрын
the sound is fine - that was the sound back then
@mloclam6917
@mloclam6917 6 ай бұрын
But will it run Crysis
@imafirenmehlazer1
@imafirenmehlazer1 6 ай бұрын
"Only gamers will get that one"-jensen swoosh
@guarand6329
@guarand6329 5 ай бұрын
I bet if he took the transistor design and converted that to dedicated silicon vs the fpga, it would run faster. Pretty cool that he created a gpu design, also wrote the driver, and it's working.
@Alexruja3227
@Alexruja3227 6 ай бұрын
Literally became the nr 1 viewer
@Ele20002
@Ele20002 5 ай бұрын
I love projects like this. Making a driver fully compatible with windows, and actually creating all the required ports to connect via PCIe is insanely impressive. Using an existing graphics API would be even more impressive, but modern APIs are so complex these days it'd be a hell of a task for one person, so I can understand skipping that step. It'd really be great to get more GPU designs into the open though. GPU architecture isn't really shared in that much detail - everyone does their own thing, so you can only take inspiration from the higher level concepts. There's a good reason to hide GPU ISAs behind the driver though - a lot of optimisations can be enabled by the compiler and features integrated into each new version that'd otherwise need convincing developers to add support into their game for. Breaking into the GPU space in performance is also difficult because so many optimisations are made specifically targeted at a certain GPU, forcing newcomers to support hardware acceleration of that feature as well to not fall behind in benchmarks, even if there's another way to achieve the same effect that's more efficient on their hardware.
@benjamingavrilis71
@benjamingavrilis71 6 ай бұрын
I'm gay
@cachalotreal
@cachalotreal 6 ай бұрын
Hello vro❤
@I_hate_you_Forza_motorsport
@I_hate_you_Forza_motorsport 6 ай бұрын
I feel so sorry for Nvidia
@dickyadhadyanto4986
@dickyadhadyanto4986 6 ай бұрын
damn
@AryanBajpai_108
@AryanBajpai_108 6 ай бұрын
Lol
@1NH4rM0ny
@1NH4rM0ny 6 ай бұрын
kek zukzuk
@tony_T_
@tony_T_ 3 ай бұрын
Imagine a GPU that's developed by the community. There are so many geniuses just out there who I bet would love to work on this. This could actually go so far.
@iRubisco
@iRubisco Ай бұрын
Honestly, the possibility of an open source GPU and the information we need to accomplish it is worthy of any award! Amazing video ☺️
@jasont80
@jasont80 5 ай бұрын
He's basically using a single-board computer to render graphics in software. It will never be close to a modern GPU, but this level of tinkering is amazing. Love it!
@PANTHEROP001
@PANTHEROP001 2 ай бұрын
No one: Well, I can make a better one, just tell me the ingredients. Me: SAND!
@sklungofunk
@sklungofunk 3 ай бұрын
Well, it's clearly not meant to be competitive as a GPU, but the fact that a whole GPU (and maybe other PC parts in the future) can be open source feels like a revolution to me. I mean, once you factor in the costs of production and of running a business with shipping costs etc., like if you were to buy something like this on Amazon, we could cut out the royalty costs since this would be open source. And if people like this hero were to build other open source PC parts in the future, we could build a whole PC without royalty add-ons, and that could let us buy far less costly hardware (that doesn't cost less because of sketchy ways the owners came to own it) to build far cheaper PCs, clearly at the cost of performance, for DIY home servers or NASes, because NASes and home servers usually don't need much performance. Maybe a couple of parts could come from more performant sellers to raise the trustworthiness of the machine a little, but this whole concept seems like a deal to me.
@PhoenixKeebs
@PhoenixKeebs 6 ай бұрын
Unlike the other companies, Intel actually has a chance, since they have billions of dollars to spend on their GPUs and a large team to pump out drivers. I think by Celestial they will be able to keep up with the other 2 companies. For now we should "just let them cook."
@rBennich
@rBennich Ай бұрын
This would be so cool to use as a direct gpu "emulator" in conjunction with virtual pc's for retro gaming. Like 86box, where you choose your retro parts at will, and with this card, it would reprogram the FPGA on the go, while you have it outputted to a secondary CRT monitor, for instance, and use a shortcut command to switch keyboard and mouse focus between the host and virtual pc. One virtual GPU to rule them all.
@reinaweis
@reinaweis Ай бұрын
If I had to guess, the barrel jack is to power the card when not in a system, and the sd card is to load on new firmware. This is in line with other FPGA's.
@jameshadaway8621
@jameshadaway8621 6 ай бұрын
Great video. I remember the hunger for 3dfx cards in the 90s, and I always wanted to work in IT; it's good people can build their own cards as a hobby.
@Mio96O-O
@Mio96O-O 5 ай бұрын
It would probably be cheaper and more fun if there were like 15 companies that could make their own CPUs and GPUs, instead of 2 or 3 companies monopolizing the market.
@bananaman8693
@bananaman8693 6 ай бұрын
This man will become head designer at NVIDIA, bro, trust me.
@sanketsbrush
@sanketsbrush 5 ай бұрын
I always wanted to make my own GPU with my own bare hands, but I don't have any knowledge of how.
@gravecode
@gravecode 5 ай бұрын
I'm praying an open-source GPU community develops. Low-key, the world needs one now more than ever.
@dennisestenson7820
@dennisestenson7820 5 ай бұрын
About 10-15 years ago I worked on a product that used an FPGA to generate video output. It's impressive, but definitely not unheard of.
@FlockersDesign
@FlockersDesign 6 ай бұрын
If his frametime is around 38 ms, this means it's running below 30 FPS. 60 FPS is a frametime of around 12 ms. And yes, before someone asks, this is my job; I've been an environment/lighting artist in the game industry for 12 years.
@diaman_d
@diaman_d 6 ай бұрын
16.6 ms to be precise
@BloodravenRivers
@BloodravenRivers Ай бұрын
I am astounded that he managed to do that. It means the bar is only gonna get higher. It may not be earth-shattering, but every earth-shattering development usually starts with the pioneers, the pathfinders. He's laid the groundwork for future enthusiasts and small devs to make, maybe not earth-shattering, but market-busting developments.
@ccramit
@ccramit 3 ай бұрын
I watched someone create a rudimentary GPU from a bread board. Blew my mind. Yeah, there were only like 64 pixels, but it was freaking cool.
@Gamesational1
@Gamesational1 3 ай бұрын
Cool! Now we need Open source PhotoLithography devices.
@R3TR0J4N
@R3TR0J4N 6 ай бұрын
Unironically, the China market has been a blessing for budget builders, notably the e-sports-aimed GPUs with the "SP" branding, especially when the brand they represent is a sister brand like Palit (Taiwan) and Inno3D.
@gamekiller9053
@gamekiller9053 5 ай бұрын
That open-source gpu is not meant to be anything but a toy
@imSkrap
@imSkrap 19 күн бұрын
Crazy, one thing that’s bothered me so much about GPU’s is how disgustingly HUGE they are getting… like isn’t the modern idea of innovation to make stuff better in smaller packages?
@kaimanic1406
@kaimanic1406 6 ай бұрын
I can't even imagine building my own GPU. This guy is amazing!
@LuGaKi
@LuGaKi 4 ай бұрын
"imagine having a pre-build GPU"
@histerical90
@histerical90 6 ай бұрын
You know the problem with that survey? It also counts people with Steam Decks, ROG Allys, and other APUs. I think that percentage is mostly from there, while for Nvidia those are proper full GPUs.
@marisbarkans9251
@marisbarkans9251 4 ай бұрын
Now I finally understand how people heard and saw me when I was younger. You CAN make a computer just on test boards with wires and ICs. You can make a GPU from the same. You can re-solder memory, CPUs, GPUs and make Frankenstein cards. You can power-mod GPUs. You can LN2 overclock. You can do all sorts of things, but it's all for fun and learning. Even upgraded cooling solutions are usually useless unless your card has very bad cooling; the most useful thing I found was modding laptop heatsinks, if you didn't care about the looks or noise. All of this has already been done. I would say it would be more interesting to combine GPUs and write drivers to get them to work than to make your own. In the end you don't really make your own, because you take a universal chip anyway, so why not use old AMD or Nvidia cards to swap GPU dies and RAM, etc.?
@migueldoesstuff6994
@migueldoesstuff6994 Ай бұрын
the fact he didn't talk about gta 5 in the intro is insane
@noth606
@noth606 6 ай бұрын
The Raspberry Pi has zero to do with FPGAs, just a relatively important point: Pis use off-the-shelf chips, slap them on a custom board with some RAM and shizzle, connectors etc., and call it a day. The FuryGPU is a significantly more involved project than that. I could throw together a Raspberry Pi sort of equivalent thing in a couple of weeks for the hardware, plus a month or two for adapting the codebase to the board, assuming the chosen CPU is reasonably supported. It would take me years to fart out something approaching the FuryGPU if I'm lucky and have loads of resources. I am not bullshitting; I have done sort of similar things to the Raspberry Pi, but for a different purpose and lower power. I ended up not proceeding beyond making a first series of fully fabbed and tested boards in the end, but that didn't have anything to do with the hardware or software; it had to do with me splitting up with my then GF, having an argument with my business partner at the same time, having to move, and within less than a year finding a new GF, deciding to get married, and getting married. New wife soon got pregnant and priorities shifted from fugging around with tech to different things. 10 yrs ago.
@MyouKyuubi
@MyouKyuubi 6 ай бұрын
the GT 1030 is an absolute gigachad of a card, no joke... It's the best graphics card you can get that can get by purely off of passive cooling (Radiator with no fan)! So it's brilliant for like tiny, almost pocket-sized "portable stationary" PC builds. :)
@vasoconvict
@vasoconvict 5 ай бұрын
It absolutely is a joke. Mini PCs can run off integrated graphics that beat it into the ground, and Intel and some Nvidia card makers are making tiny GPUs that are still quiet, since passive cooling is only a good idea if you have something extremely underpowered, lots of dust, or you can't bear 20 dB of noise. It's about time the GT 1030 dies.
@MyouKyuubi
@MyouKyuubi 5 ай бұрын
@@vasoconvict Integrated graphics can't run entirely off of passive cooling though... They need fans blowing air on them. :P
@vasoconvict
@vasoconvict 5 ай бұрын
@@MyouKyuubi Passive heatsinks made out of copper..
@MyouKyuubi
@MyouKyuubi 5 ай бұрын
@@vasoconvict Nah, dude, integrated graphics uses the CPU... there's absolutely no way you can use passively cooled integrated graphics without overheating the CPU playing something like Half-Life: Source. :P You're gonna need a MASSIVE heatsink at the very least for passive cooling to work, at which point we're no longer in the pocket-sized scale of computers. Get real, bro.
@SalveMonesvol
@SalveMonesvol 5 ай бұрын
This line of work could be amazing to emulate old consoles.
@Yoda_Gaming1738
@Yoda_Gaming1738 5 ай бұрын
I think people would gladly fund a Kickstarter for him so he can open a company and get his project going. I imagine hiring around 5-10 people would greatly speed up the process.
@kobesuitt4449
@kobesuitt4449 2 ай бұрын
What's crazy is, once you learn something and understand it, you can replicate it faster or even add new things to it. Which in turn means the next one he makes is going to be better than this one. And honestly I can't wait to see it.
@Skullkid16945
@Skullkid16945 5 ай бұрын
I hope there is an open source hardware boom soon, or at the least more big names getting involved in finding ways to make things like this more available for the open source community as a whole.
@grandmasterautistwizard4291
@grandmasterautistwizard4291 5 ай бұрын
Shout out to this guy. The moment it's viable for the everyday fella, the opensource community is gonna go fucking crazy with it.
@Ninetails94
@Ninetails94 6 ай бұрын
It wouldn't be too hard to make the GPU itself; the self-written code is the hardest part. So the fact that some dude made a GPU from scratch is pretty neat. Hope someday we get custom GPUs that outperform the major players.
@hfric
@hfric 2 ай бұрын
Can't wait for Dylan 2.0 and the news: bro just dropped a GPU patch... it can do ray tracing...
@stop7556
@stop7556 5 ай бұрын
One of the biggest hurdles to introducing new hardware is definitely making competitive drivers.
@caboose6411
@caboose6411 5 ай бұрын
To be fair, we are starting to hit a roadblock we can't get past: electron quantum tunneling, because we can't make transistors smaller and pack them closer together. Maybe someday we will move past that, but as of now it's just not happening.
@theodorepollock1273
@theodorepollock1273 5 ай бұрын
GPUs with a VGA output aren't too hard to make; it's even possible on a breadboard. It's just that they suck for performance. Anyone willing to undertake a project that requires BGA soldering and writing your own kernel driver? Now that, that's a badass dev, and I'm 99% sure he's underappreciated at his job.
@Revoku
@Revoku 5 ай бұрын
The CPU/GPU on a Raspberry Pi is an ARM CPU plus whatever internal GPU, with fixed instructions/pathways for both. An FPGA is a chip where you can program the gates/instructions/pathways yourself; you run code that changes the configuration of the chip.
@vinvin4884
@vinvin4884 2 ай бұрын
This is super cool. I have a question though. This "GPU," was created almost entirely through software/programming. I wonder what the actual limit of the silicon is? I wonder at what point it becomes physically impossible to improve its performance further?
@TheCustomFHD
@TheCustomFHD 5 ай бұрын
I "know" someone that has been reversing WDDM, the video driver stack since Win Vista. That work would help this gpu probably a bit.
@qianqifborinaga6875
@qianqifborinaga6875 5 ай бұрын
The *"IM TIRED OF CRITISIZING BS, ILL BANKCRUPT YOUR COMPANY BY UNDERSTANDING AND MAKING MY OWN GPU"* move
@emiljagnic
@emiljagnic 5 ай бұрын
Awesome, thank you for reporting about this!
@Powerman293
@Powerman293 6 ай бұрын
I could see this project eventually turn into the PC equivalent of those FPGA clone consoles but for 90s GPUs. A very cool demonstration of tech that fills a niche market but ultimately is not threatening to the big players.
@aggressivefox454
@aggressivefox454 4 ай бұрын
For how customizable and “freeing” the PC market is, with a wide variety of interchangeable parts, operating systems, etc., I would have expected there to be less of a monopoly on GPUs. I always got the impression they were just mini computers, so I kind of thought you might be able to build them yourself fairly easily (granted, not as easily as a PC). Open source and custom-built GPUs would be awesome to see though. I'd love to have a multi-GPU setup that I can build myself.
@Sunowaddle
@Sunowaddle 2 ай бұрын
Insane, never knew how extremely impressive GPU's were before this video, now I have a newfound respect for these things haha