Videos you probably want to watch:
✅ 3D Deep Learning 👉🏼 kzbin.info/aero/PL3OV2Akk7XpBPBEw1jekpxDJYIRPEHbUi
✅ Creating a 3D model of a Photo using Python and Numpy-STL 👉🏼 kzbin.info/aero/PL3OV2Akk7XpDs8uSg6iegIKlmY8XxSrNr
✅ 3D Deep Learning tutorial with Nvidia Kaolin and PyTorch 👉🏼 kzbin.info/www/bejne/eKi9aKZtedaVfNU
✅ 3D Deep Learning tutorial with Pytorch3d 👉 kzbin.info/www/bejne/rGTLhXdvopeng5o
✅ Nvidia GANverse3D - 2D to 3D Model 👉🏼 kzbin.info/www/bejne/qGaoZYeeidqemZY
✅ 2D Photo to 3D Model by Google Phorhum 👉🏼 kzbin.info/www/bejne/qILSkn19q6-MgsU
✅ Blender Python API 👉🏼 kzbin.info/aero/PL3OV2Akk7XpBlVnaa6vqEWKcbg_sBTlPX
✅ Python Problems for Beginners 👉🏼 kzbin.info/aero/PL3OV2Akk7XpC-fsHuJ3RLpnzOXmED7oAI
✅ How I created my 3D avatar with color from a 2D photo with PIFu 👉🏼 kzbin.info/www/bejne/pWfHc5qwjZajfKc
@diegoauyon9317 3 years ago
Cool. I was just struggling with what I should buy to build a PC for machine learning and image work.
@CODEMENTAL 3 years ago
Glad I could help!
@IdiotDeveloper 2 years ago
Finally, I have also built my deep learning PC, using a Ryzen 9 5900X and an RTX 3090.
@CODEMENTAL 2 years ago
Nice!
@GumRamm 1 year ago
Good video, still very relevant 7 months later! The only thing I would maybe change about the specs: If you’re planning on working with really big models (3B params and upwards) I would upgrade the RAM to 64GB in case you need to do CPU offloading and/or gradient checkpointing.
@CODEMENTAL 1 year ago
You are right. I am myself now considering upgrading to a 4090, and I think the point about 64 GB of memory is quite relevant.
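The 64 GB suggestion can be sanity-checked with a back-of-the-envelope estimate. This is a rough sketch, not a framework-specific calculation: it assumes fp32 weights and Adam-style optimizer states (two extra fp32 copies per parameter), and the 3B-parameter figure comes from the comment above; real requirements vary with precision, framework, and activation memory.

```python
# Rough estimate of host RAM needed to offload a model's weights and
# optimizer states to the CPU. Assumes fp32 (4 bytes/param) and Adam's
# two extra state tensors per parameter; these are illustrative assumptions.

def offload_ram_gb(n_params: float, bytes_per_param: int = 4,
                   optimizer_copies: int = 2) -> float:
    """Approximate RAM (in GiB) for weights plus optimizer states."""
    total_bytes = n_params * bytes_per_param * (1 + optimizer_copies)
    return total_bytes / 1024**3

# A 3B-parameter model: weights alone are about 11 GiB in fp32, and
# Adam's momentum/variance states roughly triple that.
print(round(offload_ram_gb(3e9), 1))  # ~33.5 GiB, before activations
```

With checkpointed activations, gradients, and the OS on top of that ~33 GiB, 32 GB of RAM clearly won't cut it, while 64 GB leaves headroom.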
@PossessedbyPhoenix 1 year ago
Would the new RTX 4090 be a notable improvement over the 3090 for deep learning?
@DogzRGodz 3 years ago
This is amazing
@PopperEtudeFromHell 2 years ago
I believe Asus now has a BIOS update for X570 Tuf Gaming via USB/EZ Flash to work with Ryzen 5000 series.
@CODEMENTAL 2 years ago
Nice. Thanks for the info
@notarealhandle123 3 years ago
Not great advice. If you are a deep learning professional who doesn't work with images (myself, for example; I only model financial data), then the 3080 Ti is absolutely the best choice. Not only does it offer the same performance as the RTX 3090, but it does not suffer from the overheating, excessive noise, and huge dimensions of the RTX 3090, which is plagued by its dual-sided memory design. There is a lot of deep learning research out there that does not involve any image recognition and thus does not require that much memory.
@CODEMENTAL 3 years ago
You are right to a certain extent; it depends on the use case. Chances are you don't even need a GPU for your work, as most numerical optimisation problems aren't suitable for a GPU. However, if you are doing computations with large matrices, GPU memory can help. That is why image problems benefit from more GPU memory. The 3080 Ti should be a great graphics card either way :)
@notarealhandle123 3 years ago
@@CODEMENTAL I use TensorFlow for all my work, so I do need GPU for that.
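The point about image workloads and matrix size can be made concrete with a small footprint calculation. This is an illustrative sketch: the batch size, image resolution, and feature count below are hypothetical, and real memory use also includes activations, gradients, and framework overhead.

```python
# Why image workloads eat GPU memory: a dense fp32 tensor's footprint
# is the product of all its dimensions times 4 bytes per element.

def tensor_mb(*dims: int, bytes_per_el: int = 4) -> float:
    """Memory footprint of a dense fp32 tensor, in MiB."""
    n = 1
    for d in dims:
        n *= d
    return n * bytes_per_el / 1024**2

# A batch of 64 RGB images at 512x512 is already 192 MiB per tensor...
print(round(tensor_mb(64, 3, 512, 512), 1))
# ...while a batch of 64 rows of 1000 tabular/financial features is tiny.
print(round(tensor_mb(64, 1000), 3))
```

Every intermediate layer output in a convolutional net is a tensor of roughly the first shape, which is why vision models fill 24 GB of VRAM long before tabular models do.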
@aziron8955 3 years ago
I chose a Ryzen 9 3900X instead of the 5900X because it's a lot cheaper. Will I lose a lot of performance? (I have an RTX 3090.)
@CODEMENTAL 3 years ago
I think you will be OK. If I were in the same situation and had to compromise somewhere, I would probably start with the CPU.
@aziron8955 3 years ago
@@CODEMENTAL Many thanks. I bought this setup and it works like a charm.
@maloukemallouke9735 1 year ago
Why don't you use AMD graphics cards instead? (Cheaper than Nvidia, with similar performance.)
@AnimelodyPiano 1 year ago
The video is for machine learning… you need CUDA support for most ML tasks, which only Nvidia offers.
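A quick way to verify this on your own machine is to ask the framework whether it can see a CUDA device. The sketch below assumes PyTorch and degrades gracefully if it isn't installed; an AMD card will only show up here under a ROCm build of the framework.

```python
# Check whether the installed framework can actually use the GPU.
# Assumes PyTorch; if it is missing, we just report that instead of failing.
try:
    import torch
    cuda_ok = torch.cuda.is_available()
    device_name = torch.cuda.get_device_name(0) if cuda_ok else "no CUDA device"
except ImportError:
    cuda_ok = None
    device_name = "PyTorch not installed"

print(cuda_ok, device_name)
```

If this prints `False` with an Nvidia card installed, the usual culprit is a driver/CUDA-toolkit mismatch rather than the hardware.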
@emillaiho2430 3 years ago
Do you know if there is any noticeable difference between a 7000 MB/s SSD and something like 3000 MB/s?
@CODEMENTAL 3 years ago
It depends on the context. It should make things a bit faster, but you probably won't notice it much. 7000 MB/s is only possible for sequential reads and writes, which is not really what most normal operations do.
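The sequential-versus-random distinction can be sketched with stdlib Python. This is only an illustration of the two access patterns, not a real SSD benchmark: the 16 MiB file size is hypothetical, and a meaningful benchmark would need files far larger than the OS page cache (or direct I/O) to bypass caching.

```python
# Illustrative sequential vs. random read patterns on a temp file.
# Not a real drive benchmark: the file is small and the OS page cache
# will serve most reads after the first pass.
import os
import random
import tempfile
import time

CHUNK = 1024 * 1024          # 1 MiB per read
N_CHUNKS = 16                # 16 MiB test file (illustrative size)

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(CHUNK * N_CHUNKS))
    path = f.name

def timed_read(offsets):
    """Read CHUNK bytes at each offset; return (bytes_read, seconds)."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as fh:
        for off in offsets:
            fh.seek(off)
            total += len(fh.read(CHUNK))
    return total, time.perf_counter() - start

seq = [i * CHUNK for i in range(N_CHUNKS)]   # in-order offsets
rnd = seq[:]
random.shuffle(rnd)                          # same offsets, shuffled

seq_bytes, seq_t = timed_read(seq)
rnd_bytes, rnd_t = timed_read(rnd)
os.remove(path)
print(f"sequential: {seq_t:.4f}s, random: {rnd_t:.4f}s")
```

On a real drive with a cold cache, the random pattern is the one that dominates everyday workloads (loading many small files, shuffled dataset reads), and there the gap between a 3000 MB/s and a 7000 MB/s SSD largely disappears.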
@botlifegamer7026 2 years ago
I have a 5800X, 8x RTX 3090s, and 32 GB of RAM. Is this any good, and what do I need to set it up for deep learning?