Some bookmarks:
- GeForce 50 Series graphics cards: www.nvidia.com/en-us/geforce/graphics-cards/50-series/
- GB200 NVL72 next-gen data center GPU: www.nvidia.com/en-us/data-center/gb200-nvl72/
- Grace CPU: www.nvidia.com/en-us/data-center/grace-cpu/
- Nemotron model families: huggingface.co/nvidia , blogs.nvidia.com/blog/nemotron-model-families/ , nvidianews.nvidia.com/news/nvidia-launches-ai-foundation-models-for-rtx-ai-pcs
- Cosmos: www.nvidia.com/en-us/ai/cosmos/ , github.com/NVIDIA/Cosmos , huggingface.co/collections/nvidia/cosmos-6751e884dc10e013a0a0d8e6
- Thor Blackwell driving platform: nvidianews.nvidia.com/news/nvidia-drive-hyperion-platform-achieves-critical-automotive-safety-and-cybersecurity-milestones-for-av-development , www.nvidia.com/en-us/self-driving-cars/in-vehicle-computing/
- AI supercomputer, Project DIGITS: www.nvidia.com/en-us/project-digits/
@YoungBloodyBlud 6 days ago
Never stop making content, Benji! I stay up to date on the AI world watching your content! Thank you for all you do for the AI community 🙌
@TheFutureThinker 6 days ago
I appreciate that!
@Dante02d12 5 days ago
I saw a comment saying "with Nvidia, marketing is always a hilarious joke", and I have to agree here. The 5070 mobile, the performance of an RTX 4090! ...Except it only has 8GB of VRAM, so anything that uses VRAM (which means literally everything AI-related or graphics-related) will be heavily restricted. The 4090 has 24GB of VRAM, btw. I WANT a 5070 laptop. But I also want loads of VRAM.
@TheFutureThinker 5 days ago
Yes, totally.
@FusionDeveloper 5 days ago
I'd like to see AMD do a keynote and pull a PlayStation-vs-Xbox move. Just walk up and say "64GB VRAM, works with all Nvidia-based AI models, $1000." Crowd cheers and they walk off.
@LouisGedo 6 days ago
Insane tech! 😲 😱
@TheFutureThinker 6 days ago
Yes😉
@RenzoAlba 5 days ago
@@LouisGedo 👋👋👋
@LouisGedo 5 days ago
@RenzoAlba 👋
@mickelodiansurname9578 5 days ago
How dumb would you feel if you had just bought two 4090s today!
@dkamhaji 5 days ago
Hey Benji! So do they mention how much VRAM each GPU will have?
@TheFutureThinker 5 days ago
Oh sorry, I forgot to include some links I found. I will post them now. It is mentioned there.
@insurancecasino5790 6 days ago
They need to let us link GPUs again. It won't be long before they get hacked along with CPU power. Why buy a bigger GPU then?
@TheFutureThinker 6 days ago
I see NVLink is available for the 50 series. But the 4090... they stopped it.
@insurancecasino5790 5 days ago
@@TheFutureThinker Were they losing money? I had friends who would 2x their VRAM. But I don't know the limits of hacking them. I have to study how they work. I like Nvidia, but they scam a lot, IMO.
@TheFutureThinker 5 days ago
In a few more years, when quantum chips become available to the public, VRAM will be legacy.
@insurancecasino5790 5 days ago
@@TheFutureThinker Blender has actually been doing it, but it's not AI. It's tedious work. But an open-source model built for Blender could probably do it. It would just be assigned the tedious work, with as little as 2GB of VRAM. Any really smart developer would be a fool to make one and give up his paywall. I would not. Gimp made a lot of folks rich, with paywalls. It was made complicated. LOL. It's designed for memory, not logic. Logic is basically based on what people already know.
@mickelodiansurname9578 5 days ago
@@TheFutureThinker Really? Now that's a selling point Huang missed here!
@FusionDeveloper 6 days ago
More VRAM or it's trash. The 5090 or 5090 Ti needs to have a minimum of 32GB of VRAM. If it has 16, it's a joke.
@TheFutureThinker 6 days ago
Exactly. If it's 16GB, then just keep using the 4080 or 4070.
@NoFaithNoPain 5 days ago
Apparently the 5090 has 32GB.
@mickelodiansurname9578 5 days ago
@@NoFaithNoPain Even that's looking meek for AI these days, and 48 will be the new 32.
@FusionDeveloper 5 days ago
@@mickelodiansurname9578 Good to see someone else is disappointed and understands how restricting the VRAM amount is for AI. It is what is holding everyone back from loading models that produce amazing outputs. I have 11GB of VRAM and it is so insanely limiting that I feel I need to get a card with more VRAM, so "upgrading" to a 5070 with 12GB of VRAM, "as fast as a 4090" (which has 24GB of VRAM), is not for AI people.
Official specs from Nvidia:
- 5090 (32GB VRAM) $2,000
- 5080 (16GB VRAM) $1,000
- 5070 Ti (16GB VRAM) $750
- 5070 (12GB VRAM) $550 (joke card by Nvidia for trolling, worthless for AI in 2025)
How it should be, if they want to push AI to its potential instead of throttling it back (below are not real specs):
- 5090 (64GB VRAM)
- 5080 (48GB VRAM)
- 5070 Ti (32GB VRAM)
- 5070 (24GB VRAM)
Instead, we get FASTER VRAM INSTEAD OF MORE VRAM. I had a feeling they would do this. The only positive thing: at least the 5090 will have slightly more VRAM than the 4090.
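The VRAM ceiling the thread is complaining about is easy to quantify: the memory needed just to hold a model's weights scales linearly with parameter count and bytes per parameter. A minimal back-of-envelope sketch (the bytes-per-parameter figures are standard for common precisions; the 13B model size is only an illustrative assumption, and real usage adds activations, KV cache, and framework overhead on top):

```python
# Rough VRAM estimate for holding model weights at various precisions.
# This counts weights only; activations and runtime overhead are extra.

BYTES_PER_PARAM = {"fp32": 4.0, "fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_vram_gb(num_params_billions: float, precision: str) -> float:
    """GB of VRAM needed just to store the weights."""
    total_bytes = num_params_billions * 1e9 * BYTES_PER_PARAM[precision]
    return total_bytes / (1024 ** 3)

# Example: a hypothetical 13B-parameter model.
for prec in ("fp16", "int8", "int4"):
    print(f"13B @ {prec}: {weights_vram_gb(13, prec):.1f} GB")
```

By this estimate a 13B model in fp16 needs roughly 24GB for weights alone, which is why cards in the 12-16GB range force heavy quantization or smaller models.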
@TheFutureThinker 5 days ago
None of the cards in that list interest me.
@FusionDeveloper 4 days ago
@@TheFutureThinker I think getting a 50-series card is only for people who always buy the latest card and for people with very old cards. Someone who saved up for a 2080, 3090, or 4090 as an upgrade from a 10 series or older shouldn't save up for a 50 series.