A Portable GPU That Fits In The Palm Of Your Hand! Pocket AI RTX A500

1,150,194 views

ETA PRIME

1 day ago

Comments: 1,300
@jt3000o • 1 year ago
If more companies came out with stuff like this at a more affordable price, it would be a game changer.
@andrewhill4751 • 1 year ago
I mean... if they keep making them smaller, wouldn't they just put them in the devices and skip having to carry a separate GPU?
@TheTastefulThickness • 1 year ago
@andrewhill4751 They aren't gonna get smaller than this. They need too much cooling.
@dazewaker262 • 1 year ago
@andrewhill4751 Well, you can buy a laptop with a good CPU and an average integrated GPU when you can't afford one with a good GPU, and later go for this. There are many scenarios like this, and you didn't even bother to think of one before writing your comment.
@jt3000o • 1 year ago
@TheTastefulThickness Oh, they will get smaller, especially once they use a cooling system like AirJet Pro.
@tschuuuls486 • 1 year ago
This hasn't happened because designing a PCB that can handle 40 Gbit/s is hard and needs extremely expensive test equipment and expensive PCBs to pull it off. It's completely different from a USB 3.0 gadget that you can design in half an hour with free software and get manufactured for a few bucks. And for the MSIs/Asuses/ASRocks of this world it's a market that's just too niche; they can make way more money on more proven projects.
@aloneinthedev • 1 year ago
4GB of VRAM for AI training or generation? What a joke lol. VRAM matters far more than raw performance in AI right now.
@worldsendace • 1 year ago
True. And then it costs $450 on top. 😅 Nah, hard pass.
@BillLambert • 1 year ago
The A500 is a cut-down mobile 3050... I guess it's for tablets or ultralights. Still, I can't fathom why anyone would look at a 3050 and say "too much".
@vitinhx794 • 9 months ago
Hard pass on that.
@mycelia_ow • 8 months ago
@BillLambert I expect stuff like this from Jensen; he's an expert at producing bottom-barrel hardware and marking it up.
@Abdi_Akhmet • 4 months ago
@aloneinthedev 4GB is enough for you.
@kingeling • 1 year ago
Not only does it have just 4GB of memory, it also thermal throttles a lot due to being cooled passively. It's no surprise that it doesn't perform as well as an A500 in a laptop, which would be equivalent to a GTX 1660 or an RX 590.
@DeletedContent • 1 year ago
Tbh for the price I would buy a ThinkPad P50, and it would have 4GB of VRAM. Although scalpers keep selling them for too much, smh. Especially the 4GB M1000M and M2000M ones.
@himekomurata72 • 1 year ago
Dang, it's faster than my current PC 💀
@NomadicMeow • 1 year ago
Not to defend the thermal throttling issue, but it does have a fan on the backside, per the pictures of the product on the website.
@nikkelitous • 1 year ago
It's literally the same chip as a 3050, just sold as a "workstation" card. But agreed, this is a joke of a product.
@kingeling • 1 year ago
@nikkelitous It's cut down, though. Not quite the same.
@veritassyfer1185 • 1 year ago
If it came with at least 8GB of VRAM I would consider this. 4GB is too small IMO. It is an excellent concept that has potential.
@pinkipromise • 9 months ago
The GPD G1 is 8GB.
@cyberplonk • 9 months ago
Wut are u talking abt?
@jamesbrendan5170 • 9 months ago
@pinkipromise They're talking about the GPU's VRAM, and high VRAM is very important for training language models and the like (which 4GB isn't enough for).
@jamesbrendan5170 • 9 months ago
@cyberplonk About how 4GB of VRAM is not really enough for AI stuff.
@cyberplonk • 9 months ago
@jamesbrendan5170 Ah, ok.
@ichemnutcracker • 1 year ago
The 4GB of VRAM on this thing is going to be very limiting in terms of the kinds of things you can do with it. For instance, most image generation requires at least 6GB for any reasonable resolution.
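For readers who want to sanity-check that kind of claim against their own hardware, here is a minimal PyTorch sketch that compares a card's reported memory against a required budget. The `fits_in_vram` helper and the 6GB threshold are purely illustrative, taken from the comment above rather than from the video.

```python
import torch

def fits_in_vram(required_gb: float, device: int = 0) -> bool:
    """Return True if the given CUDA device reports at least `required_gb` of total memory."""
    if not torch.cuda.is_available():
        return False
    total_gb = torch.cuda.get_device_properties(device).total_memory / 1024**3
    return total_gb >= required_gb

# The A500 in the Pocket AI exposes 4 GB, so a ~6 GB image-generation
# workload (the figure quoted above) fails this check before any model loads.
print(fits_in_vram(6.0))
```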
@Rotwold • 1 year ago
I think it's geared towards edge computing (e.g. LattePanda) for rapid prototyping, expanding existing embedded systems in the field, and being sold as a learning tool to schools.
@ichemnutcracker • 1 year ago
@Rotwold I can agree with that, but the problem is that this thing costs $500-$520. I'm sure there are narrow use cases where that still might make sense (particularly if the absolute smallest form factor is a must), but the mass-market/tinkerer appeal just isn't there. In 99% of cases, there are better options.
@skyguardian18 • 1 year ago
While I don't think running a language model or image generation model would be feasible on this device, I think some other applications (for example image recognition, audio recognition, etc.) could benefit a lot from this accelerator.
@pingtime • 1 year ago
@skyguardian18 The Google Coral TPU does the same thing, is sized like a wireless E-key module, costs $30, and draws 2 W at most.
@fenix20075 • 1 year ago
You got the point... even small text-generation AI models require at least 5-7GB of VRAM...
@shankroidbeast4644 • 1 year ago
It's a cool little device, but for $450 you can do quite a bit better adding an external GPU to a laptop/mini PC with DIY components.
@mrhassell • 1 year ago
The Pocket AI is currently priced at USD $429.
@CJinMono • 1 year ago
@mrhassell Still too expensive.
@tschorsch • 1 year ago
It would be very hard to make anything as portable, especially with its very low power consumption. It would not be a fair comparison.
@ryanbrancel • 1 year ago
How much do you think eGPU enclosures are? Often they are in the $300 USD range. And then for $150 you're probably getting a video card that has similar output to this. But obviously there is a huge difference in portability.
@whybndsu • 1 year ago
eGPU enclosures cost you an arm and a leg and aren't as portable.
@sirshinra • 1 year ago
Good start with this type of tech. Excited to see the next few years bring more efficient chips with more VRAM, etc. This might not be that great, especially for the price, but the idea is solid. We just have to wait for advancements. This will be huge.
@DualWieldingDad • 1 year ago
Things like this would be great for single-board PCs or anything with an ARM chip. Excited to see what comes in the future.
@nathanadhitya • 1 year ago
Excited to see what Red Shirt Jeff will do to this nugget.
@axotical8682 • 1 year ago
Unfortunately it is too small for AI use; I would consider 12GB the minimum to be really useful. Shame on AMD and Intel for lagging behind and allowing Nvidia to monopolize the AI market.
@jimbobcheezeburger2020 • 1 year ago
Damn
@crimehole • 1 year ago
@chrisdejonge611 It's marketed for researchers and data scientists, who are probably not training Stable Diffusion.
@pokepress • 1 year ago
AI video enhancement and some other workflows fit well within 4GB, but I agree 8-12+ GB would be better.
@midiman369 • 1 year ago
Pretty much this, exactly. You can get by with 8-10GB, but more than 10GB these days is much easier to work with without issues.
@Crecross • 1 year ago
Not everything AI is targeted toward Stable Diffusion, goofy 💀
@rodgerfisher219 • 1 year ago
The portability is the most appealing aspect of this. It's expensive for the raw performance, but the fact that you can fit it in your pocket is what makes it so impressive.
@brosplit • 1 year ago
Meh, gaming-performance-wise you are better off with RDNA 3 alone. It already comes in many form factors.
@riklaunim • 1 year ago
The similar GPD G1, or the questionably portable but stronger Asus XG Mobile for Asus ROG Flow laptops, are IMHO much better choices. There is quite a large performance penalty for using the laptop's own screen, so this device will be really bad for FPS per dollar, while the others have stronger GPUs and the option to connect an external display directly to the eGPU for best performance if desired.
@cfilorvyls457 • 1 year ago
At that point there's really no reason not to just get a gaming laptop, which is way more portable and convenient than whatever this thing is.
@fjorddenierbear4832 • 1 year ago
Still, a laptop with a built-in 4080 is going to annihilate it.
@deathtoinfidelsdeusvult2184 • 1 year ago
@fjorddenierbear4832 Pretty sure 4080 laptops also cost 4-5 times as much as this.
@elitedestroyer0083 • 1 year ago
It's honestly a cool idea; it just needs more juice. 8GB minimum, preferably with 12-16GB options. An output port would also be ideal.
@1337mathis • 1 year ago
I'd love to see you test the Pocket AI on an actual AI project, not just games. As a dev I would find it more appealing to see what it could do.
@darkcoeficient • 1 year ago
I think that is outside of his scope.
@RG6Snipers • 1 year ago
@darkcoeficient Yeah, this is a gaming channel.
@mingdug1824 • 1 year ago
As cool as that would be (me too), he just does gaming.
@cldpt • 1 year ago
As others said, it falls out of scope. But the thing is, you don't need those tests, since AI/ML and many other types of compute are not limited AT ALL by the 18-22 Gbps of Thunderbolt. All you need to check the performance of this in AI use cases is any other A500 benchmark.
@1337mathis • 1 year ago
@RG6Snipers I mean, it's an AI product, not a gaming product, so it's kinda pointless to demo it that way, as it wasn't all that impressive for gaming.
@tallpaul9475 • 1 year ago
Well, if there is ever a pocket RTX 4080 or better with at least 16GB of GDDRx RAM, I might consider it.
@MechAdv • 1 year ago
The best you could see with this concept would be a 100W-limited GPU because of the PD charging. So the 4080 doesn't make much sense. I'd like to see a desktop 4060 in a 100x100x50mm form factor with 100W PD power in, OCuLink AND TB4 input, and a miniDP out. OCuLink is good for 60Gbps and has a pretty big performance delta over Thunderbolt.
@justinhockstead3551 • 8 months ago
Literally the only version of one of these external GPUs I've seen that's worth using.
@VelocityBlasters • 1 year ago
Maybe a video where you disassemble it? Would be nice to see the internals. Curious if it's an MXM module mounted on a Thunderbolt adapter.
@xaytana • 1 year ago
All things considered, it'd be cheaper to just produce a single board that has the processor directly wired to the USB4 IC. There's no reason to add modularity to this type of product when there are really no parts on the market other than what they're already producing themselves as an AIB partner. A single board would also cut down on manufacturing, points of failure, and QC testing, especially when this kind of product is at a much smaller business scale than you've probably considered. Also, MXM has long been dead, the modern alternative Dell introduced is a proprietary pinout, and Framework's re-pinning of these connectors probably won't see widespread adoption considering the history of modular mobile-spec components. It just doesn't make sense for this product to be more than a single board.
@BensTechTube • 6 months ago
ETA Prime, the gold standard of YouTube! Tells you the specs, tells you the price, tells you WHERE TO BUY! I hate it when other YouTubers DON'T EVEN TELL YOU HOW MUCH OR WHERE TO BUY! Sounds simple, right? When I make a video I ALWAYS try to remember this!
@Bajicoy • 1 year ago
Very impressive size; tragic about the cost and the significantly lower performance than the 780M. If this was $150 USD it would be much more appealing than at $450. I would really, really like to see USB-C eGPU adapters reduced to this size.
@NinjaManZX10 • 1 year ago
1:27 For real, it's an RTX 3050 Max-Q Refresh 4 GB.
@johnchristianson515 • 1 year ago
For AI applications 4GB is a bit low. It has one major strength though, and that is its size. There is a niche market for mini PCs where size and power restrictions are at a premium. For example, I could see this as part of a truck driver's setup or in a tiny home.
@ishudshutup • 9 months ago
This is a great concept; I hope the company develops it further. The biggest issue is the 4GB of VRAM: it should be 8GB at the least, with a 16GB option to get those 13B-parameter LLMs working. There should also be a fan that can be switched on by the user whenever noise is not an issue. The company should also look into supporting the hundreds of millions of old Intel Macs out there with Thunderbolt 2/3; this would give them a new lease on life. Glad this product exists!
@y4lnux • 1 year ago
I think if this product cost half as much it would be worth it for the current specs; as it is, I would rather go for another eGPU option.
@chlebon • 8 months ago
This is totally awesome for a small local LLM. I NEED THIS!!!!
@patrickw8888 • 1 year ago
If more companies came out with stuff like this for free, it would be a game changer.
@flynntaggart7216 • 1 year ago
It's special hardware for AI training, not for gaming, pal.
@fran2911 • 1 year ago
Why would it be free?
@alexandrunistoroiu452 • 11 months ago
@4:17 Thanks for the Blender benchmark! Please make it a regular part of the reviews, alongside the gaming tests, for other devices too!
@정원우-z5j • 1 year ago
They need to make something like this for gaming.
@MrHamncheez • 1 year ago
GPD G1, Asus XG Mobile, etc.
@anthonypolanc0 • 1 year ago
@MrHamncheez Can you give more examples that aren't $600+? And ASUS's solution is proprietary.
@a64738 • 1 year ago
One alternative is to use a Thunderbolt external enclosure; that is what I did with my 2015 MacBook Pro, running a GTX 1660 Ti for gaming in Windows.
@vishnuvarma2826 • 1 year ago
This is innovation! This is the right direction for mobile GPUs ❤
@blueburger4 • 1 year ago
It would be awesome if you could drop this in as an accelerator to enable RT/DLSS with any GPU and have the GPU dedicated to rasterization while the Pocket AI does all of the RT/DLSS calculations.
@pretentious_a_ness • 1 year ago
That's not how the rendering pipeline works, mate.
@blueburger4 • 1 year ago
@pretentious_a_ness Which is why I said it WOULD be awesome IF you could... but also, just because something is the status quo doesn't mean it can't change. People made using a drop-in Nvidia GPU as a PhysX accelerator a real thing years ago... who's to say someone can't make something that injects into the pipeline, identifies the instructions requiring ray-trace math, "hides" them from your GPU, and redirects them to the Pocket AI... obviously that's still an oversimplification of a very complex (and maybe impossible) task, but everything starts with an idea and a demand.
@juan9033yt • 9 months ago
We need this to be compatible with all of our gaming consoles and laptops.
@artificialrevolution • 8 months ago
For now the AI part of the Pocket AI seems like nothing more than a buzzword. There is no world where 4GB of VRAM is suitable for anything related to AI, especially training.
@exad_neo • 1 year ago
Imagine using this on the Steam Deck; it would have been a huge thing and the coolest thing ever.
@4.0.4 • 1 year ago
This is an excellent idea and would be a game changer... with like 12-24GB of VRAM.
@a64738 • 1 year ago
It also needs better cooling, as it is constantly overheating...
@computron5824 • 1 year ago
Yeah, and it would be twice the game changer with 48GB of VRAM.
@hendrixc6988 • 1 year ago
Next-gen handhelds with Thunderbolt would be amazing with one of these.
@getyroks • 1 year ago
I think you need to add the MSRP to the specs at the beginning of your videos.
@Dekerus • 1 year ago
"I know a lot of you are interested to see how it games"... 😂❤ Is it not at least 95% of us, give or take? That would be my guess. 🙂✌ After watching: that is pretty fracking neat!
@deadpain2483 • 1 year ago
They should make these things for older or low-end CPUs that don't have an iGPU.
@cldpt • 1 year ago
Not possible; there's no interface other than Thunderbolt, which is rare in older machines. And even on the rare ones that do have Thunderbolt, it will be severely limited by an x2 PCIe connection. If you want, you can make your own eGPU over M.2 on computers where you can forfeit that slot, or even go mini-PCIe, but those are extra-exotic, inconvenient setups.
@100Bucks • 1 year ago
Use "Integer Scaler" and be happy 😅 Low-res gaming on an HD display: 720p is the minimum to game on a 4K screen. 360p also works, but scaling 360p to 4K is recommended for 2D games only.
@needsLITHIUM • 1 year ago
I would love to see this used in a virtual machine. Since it's a professional/workstation-series GPU, it should support it.
@dj4monie • 1 year ago
Why? Is this a virtual machine/Docker channel?
@needsLITHIUM • 1 year ago
@dj4monie Because you can play video games in a VM if you have a native platform that doesn't support them? And since it is a workstation card, even shit Nvidia will allow it to work with passthrough?
@GustavoMsTrashCan • 1 year ago
Looks great! Can't wait to plug this into my (future) RPi 5! ...oh
@publicname • 9 months ago
They should make one with 16GB of VRAM if their aim is AI.
@nodewizard • 8 months ago
4GB of GDDR6 VRAM is a bit weak for $500 USD. The RTX 3060 with 12GB of VRAM sells for $250 MSRP right now. Throw that into a dedicated eGPU enclosure for an extra hundred bucks. Just being honest. You can't do much with 4GB of VRAM anymore - not in 2024.
@deathvalleybro9320 • 4 months ago
Truth.
@pekholtz • 1 year ago
As usual, great video ETA. I would love to see this on an SBC.
@santukagameplays5862 • 1 year ago
I wonder if at some point they will put one of these into a handheld device. Since it is so tiny and low-power, maybe they can make it fit into something like the Steam Deck; that would be really nice. I hope we are getting closer to dedicated graphics on a handheld.
@milescarter7803 • 1 year ago
Yeah... you want it integrated with the CPU, which is why the Switch has an ARM chip. Nobody is licensing Nvidia to make x86 cores. I'd love to see a software translation layer for x86 code in Proton, for example, and maybe some kind of legal hardware acceleration for it in RISC or ARM, but who knows 🤷‍♂️
@Bustermachine • 1 year ago
As far as I know, that's kinda the entire point of an APU. You're putting the CPU and GPU on the same die. The only limitations on how powerful the CPU and GPU combo can be are thermal concerns and production yields, and it's typically more space- and power-efficient than a dedicated GPU.
@CMak3r • 1 year ago
Cute, but 4GB is below the baseline for Stable Diffusion and quantized 7B language models. For smaller 4B models and for training small neural networks this GPU may be enough, but...
@PyromancerRift • 1 year ago
The jump in performance also comes from the fact that the GPU and CPU share the same TDP. If you use this external GPU, the whole laptop TDP goes to the CPU and the GPU uses its own TDP.
@hikodzu • 1 year ago
Nice to see external GPUs getting more portable.
@ericneo2 • 1 year ago
What are they going to accelerate with only 4GB of VRAM? SD considers 8GB low, and LlamaGPT requires 6-42GB of VRAM depending on the model you are using.
@vostfrguys • 1 year ago
Yeah, this is useless, and so expensive at $429, when for that price you can get at least a 3060 12GB.
@rohanattackongoole254 • 1 year ago
They should really implement Thunderbolt in phones; it's small enough to fit, and you could hook up a GPU to it. Imagine a ROG 7 with an RTX 4060.
@DK-dp3kk • 1 year ago
Seems like buying a Jetson Nano would be cheaper with similar specs... what do you think?
@rubeng5819 • 1 year ago
Wow, this is stunning. I want a future version of this thing plus the Legion Go. Thanks man, I've spent a lot of time watching your channel for eGPUs and video games. Great channel.
@pokepress • 1 year ago
Honestly, the best use case for this might be someone who does a lot of video editing on location. They can significantly speed up rendering and AI enhancement without needing a 15+ inch laptop. Back in August I was in Japan and needed to upscale some video I had shot, and it took practically forever in Topaz Video AI on the iGPU (I have an eGPU at home, but it's impractical to take overseas). There's also music visualization software that supports GPU acceleration, so between that and Ultimate Vocal Remover, DJs might get some use out of this.
@tigerscott2966 • 1 year ago
Great product... a PC with a Thunderbolt card and that GPU would be very versatile and powerful... thanks.
@neon_arch • 1 year ago
I hope handhelds come out with this GPU. Having an Nvidia GPU means you have DLSS, which is a game changer. FSR doesn't come close.
@simeondodov1142 • 1 year ago
AMD has FSR, and the Asus ROG Ally has the Z1 and Z1 Extreme chips, which are amazing for a handheld.
@erickalvarez6486 • 1 year ago
I don't care about that; we have a GPU that can connect through USB. The future is bright.
@green929392 • 1 year ago
FSR
@collinsgichuhi8255 • 1 year ago
This GPU is $450, lmao. It's meant for AI.
@Saif0412 • 1 year ago
Unfortunately it looks like Nvidia doesn't want to, or care to, compete on handheld SoCs.
@ChristianStout • 1 year ago
I do want something like this, but $450 is a tough pill to swallow. Even the desktop RTX A2000 is less than that.
@rmcdudmk212 • 1 year ago
Not great, but it would be a definite upgrade over onboard graphics like the 90% of Intel HD chips that steal your system memory to run. 👍
@NutellaCrepe • 1 year ago
That used to be the case, but the recent low-power APUs from AMD are starting to outpace something like this A500.
@rmcdudmk212 • 1 year ago
@NutellaCrepe That's why I didn't mention AMD. 👍
@cldpt • 1 year ago
At that size it's actually great. The closest eGPU to this in size was a much larger Lenovo dock from like 5 years ago, and that sported a 1050 Ti Max-Q, while the smallest Thunderbolt eGPU enclosures you can still get today are as large as a mini-ITX case. Yeah, you can do your own M.2-based eGPU, or you can grab an outdated Gigabyte mini box, but you will still be severely outdated and increasing the size by about 300%. (I am of course excluding stuff like Dell's or Asus's proprietary GPU docks, because they aren't really Thunderbolt and they only work on an expensive or outdated subset of machines.)
@kingkrrrraaaaaaaaaaaaaaaaa4527 • 1 year ago
Assuming you can use it with those Intel CPUs. I can't recall any very low-end Intel laptop that could benefit from this and has USB4.
@nutzeeer • 1 year ago
This is nice, a low-power, efficient option. I like hardware not being pushed to the limit all the time anymore.
@Giordanoemanuel • 1 year ago
Amazing!! Could you please compare it with the performance of the Asus ROG Ally? Is it possible to connect it to the Ally?
@johnpp21 • 1 year ago
No, the ROG Ally doesn't have USB4; it has terrible eGPU support. Only their proprietary eGPU will work, or an M.2 eGPU.
@TAGMedia7 • 1 year ago
This is intriguing. If units like this catch on, it could be the start of something very big.
@williamcll • 1 year ago
How does this compare to the GPD G1 with the AMD RX 7600XT?
@abdulrahmanhawarneh1605 • 1 year ago
The GPD is way stronger 😅
@atranimecs • 1 year ago
Pocket AI: $429 MSRP, 4GB VRAM, 25W TGP, 6.54 TFLOPS, Thunderbolt 3.0 connectivity, no expansion ports, can be powered by PD 3.0+ (40W and up) USB battery banks.
GPD G1: $714 MSRP, 8GB VRAM, 75-120W TGP, 21.54 TFLOPS, three USB 3.0 Type-A ports, one SD card slot, two DisplayPorts, one HDMI out, USB4 + OCuLink connectivity, can charge devices with its internal power supply. OCuLink support (15-30% faster than Thunderbolt 3 in real-world gaming performance). Needs an AC adapter plugged into the wall.
They're in completely different categories, really.
@ABlindHilbily • 9 months ago
It's crazy that I hadn't heard about this. Thank you for bringing it to my attention. I've been watching your videos forever but just realized I wasn't subscribed. I fixed that.
@googleevil • 1 year ago
4GB is not enough for Stable Diffusion, and not much use for transformer models.
@hassanfaizan5831 • 1 year ago
Wow, this is amazing. More companies should make this; it's a very useful gadget.
@rod85y • 1 year ago
I wish they'd make it as a pen-drive dongle with an HDMI output. Maybe in a few more years.
@msnirajagrawal • 8 months ago
There is a huge market of people who want this type of portable external GPU. People go to extreme lengths and spend a lot of money to somehow make big cards work on their devices, even if they need to sacrifice a Wi-Fi port to do it. So more companies should invest and do R&D in this field; it surely is a game changer.
@PokeParadox • 1 year ago
Would this be useful for the Steam Deck, to give a boost when docked and playing on a TV?
@vices2744 • 8 months ago
The fact that the Pocket AI has no video out would make it useless for a scenario like that. If you want to dock the Steam Deck to a powerful GPU and then stream that output to a TV, consider a full-fledged GPU in an eGPU enclosure.
@rdxarnav5328 • 7 months ago
I'd rather build a gaming PC for $500 instead of buying this.
@CubbyTech • 1 year ago
Interesting video - curious how you 'switch' between the Iris graphics and the A500 accelerator?
@cldpt • 1 year ago
Windows 10 or 11 now has a way to do this integrated into the OS. Before, you could choose it (depending on Nvidia/AMD driver support) using their buggy user interfaces. Windows seems to do a better, more consistent job. I think even Linux handles this fine these days.
@nowlaswolf2577 • 1 year ago
Probably by unplugging it.
@ghost__017 • 8 months ago
Damn, it's still better than my Intel UHD 630 integrated graphics. I guess I should get one of these.
@TheVincentKyle • 1 year ago
Have to admit I'd love to see what would happen if you hooked this thing up to a Windows-installed Deck.
@reanimationxp • 1 year ago
Can't believe he didn't do this...
@reanimationxp • 1 year ago
@KingmanKingman-nb4tn My guy, as he said, he had several unexpected results throughout the thing. It's worth it just to see anyhow. He has hooked up eGPUs on less.
@khyoyeon554 • 1 year ago
Nothing would happen, because this uses Thunderbolt.
@RozayMalikOG • 1 year ago
What a time to be alive 😮💻
@kgrayman • 1 year ago
The sad thing is that a lot of older PCs/laptops don't have USB4 or Thunderbolt.
@antomilanisti1899 • 1 year ago
Up
@oridavidmusai7184 • 1 year ago
Even new ones.
@kougamecs3876 • 1 year ago
Ya think if I try it with a Lenovo Yoga 900 via USB 3.1 Type-C it would work 😅? Like they say, nothing tried, nothing done. 😂
@unacknowledged3537 • 2 months ago
@kougamecs3876 No, it won't; too little bandwidth, and it only supports USB4, TB3, or TB4.
@kougamecs3876 • 2 months ago
@unacknowledged3537 Good, because I have a Lenovo Legion Go now.
@wliv4173 • 1 year ago
I am just amazed that nowadays integrated graphics can even run some triple-A games. I still remember the days when integrated graphics pretty much meant you couldn't run anything other than a browser and maybe Minecraft, along with some simple 2D games.
@MichaelSkinner-e9j • 1 year ago
For 25 watts, that's not bad performance. That's essentially a 2060.
@lewzealand4717 • 1 year ago
No. It's a cut-down (fewer cores) and power-limited (much lower clock speed) 3050, and the 3050 is already slower than the 2060. It's maybe a 1650 Super, before taking the additional 25% performance hit of looping the video back through TB3/USB4.
@josephdias3968 • 1 year ago
@lewzealand4717 I think it performs like a 1660, basically a 3050.
@nunits • 1 year ago
1650.
@MichaelSkinner-e9j • 1 year ago
@lewzealand4717 Thanks for the correction! For a graphics card to still have this performance at 25 watts, the only other option is a 1030, so look at the relative performance. For that wattage of GPU, the only thing close is a 750 Ti.
@josephdias3968 • 1 year ago
@MichaelSkinner-e9j It would be a sweet SFF GPU if they made it 50 watts.
@SovereignKnight74 • 1 year ago
Give that thing a dedicated graphics output and I'm sold!!
@Gsmiler • 1 year ago
Basically, super excited... repeat to fade...
@vng • 1 year ago
With 4GB of VRAM, it is really not going to be much use for AI training, and larger models won't fit into memory for inference. It sounds like a good idea, but it will only be useful for people with very specific requirements.
@ghardware_3034 • 1 year ago
Yeah, not for training, but what about inference?
@Nik.leonard • 1 year ago
@ghardware_3034 Most inference LLMs like Llama 2 are about 6GB with 7B parameters and 4-bit quantization (not very good results), and for 13B parameters (better results) you need at least 12GB. For image generation, SD 1.5 weighs between 2.5 and 4GB, but even with 2.5GB models you don't have a lot of room for upscaling or ControlNet.
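For anyone who wants to sanity-check those figures, the weights-only arithmetic is simple: parameters times bits per weight, divided by 8. A minimal Python sketch (the helper name is illustrative; actual usage is higher once the KV cache, activations, and runtime overhead are added, which is why the ~6GB figure quoted above exceeds the raw number):

```python
def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """GB needed just to hold the weights: parameters * bits / 8, expressed in GB."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params, bits in [(7, 4), (13, 4), (7, 16)]:
    print(f"{params}B @ {bits}-bit: >= {weights_gb(params, bits):.1f} GB")
# 7B  @ 4-bit : >= 3.5 GB  (already close to the Pocket AI's 4 GB before any overhead)
# 13B @ 4-bit : >= 6.5 GB
# 7B  @ 16-bit: >= 14.0 GB
```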
@lequack6373 • 1 year ago
@qtsssim This thing isn't aimed at gamers.
@vostfrguys • 1 year ago
@lequack6373 Hope it's not aimed at people who want to do inference either; it wouldn't even run LLMs, TTS, or SD... this is useless. It needs at the very least 8GB of VRAM.
@Defensivepepe7 • 1 year ago
I am a proud owner of the GPD G1, and I hope it is the beginning of something.
@maliceflare • 1 year ago
Ugh, 4GB. Not enough for using Stable Diffusion...
@seemoremacstuff • 1 year ago
Yeah, 6GB is the bare minimum for decent performance.
@PremiumLeo • 1 year ago
This is awesome. A couple more generations and we can pretty much replace internal GPUs IMO, or upgrade our existing laptops with a better GPU. Super cool.
@VertegrezNox • 1 year ago
It's only 4GB. While it might be great in another year, this is basically false advertising at this point in time.
@rmcdudmk212 • 1 year ago
Still better than most Intel HD chips that eat system memory to work. 👍
@ThibautMahringer • 1 year ago
@rmcdudmk212 Facts.
@RetroHondo67 • 1 year ago
Engineered for AI, so not built to render graphics.
@RageyRage82 • 1 year ago
It's not false advertising, though. He described what the product is for, and it sounds like the website for it does that, too.
@samgoff5289 • 1 year ago
@RageyRage82 Yeah, but if you are a delusional kid who feels entitled to everything, this is false advertising because it isn't what they want it to be.
@skorpysk • 1 year ago
Finally somebody is talking about this thing.
@tenthant • 1 year ago
This makes me want to see a handheld PC with a dedicated GPU in it instead of an iGPU.
@zigmarsz.7540 • 1 year ago
Live video production in vMix. ABSOLUTELY.
@unveil7762 • 1 year ago
Nice!!! A battery-powered media server gets even better with an expandable GPU over Thunderbolt!! That's pretty handy!! 🎉
@bakedbeings • 1 year ago
This would be great for students in design/3D/games who often own newish laptops that they learn won't cover their needs. Making games isn't the same as running them, the graphics landscape is constantly shifting, and PC marketing is constantly... optimistic, so it can (and does) happen easily to smart kids.
@trexxtechs2857 • 1 year ago
OK, so this is basically a cut-down RTX 2050.
@25_26 • 1 year ago
OK, I will buy this for sure. That looks like a great portable AI card; goodbye to my old massive eGPU setup for AI :D
@AdonMrGveret • 1 year ago
This looks really cool. Imagine if you could strap a cube to the back of your Steam Deck and make it run 25% faster.
@winsucker7755 • 4 months ago
Pocket AI - I like it! I will call my PIPI that from now on - because "it's absolutely tiny"!
@TheInfinite444 • 10 months ago
I wish Steam would do their own external eGPU for the Steam Deck; that'd be nice to see in the future :)
@jasonsaez3668 • 11 months ago
This would be great for my son, who's going to college to be a music teacher. He's also into recording his concert band and jazz bands.
@jonathaningram8157 • 11 months ago
I don't understand how having a GPU will help him with recording?!
@csmemarketing • 11 months ago
4:27 You didn't really have access to the Intel GPU because it was dimmed out in the settings. You have to choose it from Blender's System preferences. The Nvidia GPU was highlighted, so you actually had access to it. If the GPU is dimmed out, it just uses the CPU; that's why it took longer.
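That device toggle can also be set from Blender's Python console instead of the preferences UI. A minimal sketch for Cycles (assuming a recent Blender build; the exact preference names can vary slightly between versions):

```python
import bpy

# Point Cycles at the CUDA (or OptiX) backend and refresh the device list.
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"   # "OPTIX" is also an option on RTX cards
prefs.get_devices()

# Tick the GPU devices (e.g. the external A500) and leave the CPU unticked,
# mirroring the checkboxes that appear greyed out in the UI.
for dev in prefs.devices:
    dev.use = dev.type in {"CUDA", "OPTIX"}

# Finally, tell the scene to render on the GPU rather than the CPU.
bpy.context.scene.cycles.device = "GPU"
```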
@jz-xq4vx • 11 months ago
Thanks for the Blender render! 👍
@Binary0x9 • 1 year ago
Bro, tech is getting wild... 🔥
6 months ago
That thing is exactly what I am looking for, and I want to boost my gaming with it :) I have a laptop with an OLED display with HDR capability, but it only has Iris Xe graphics. The game I play on it runs, but it is not smooth and it sometimes crashes. That means this little device will solve my problems. Additionally, I don't want to connect an external display; I need mobility. This thing is just perfect :)
@SirSneakers • 1 year ago
Makes me wish the Raspberry Pi had TB4 so you could do really small-scale AI stuff.
@GarrettNicholas01 • 5 months ago
That little thing's badass!!
@FloatyCoyote • 10 months ago
I feel like you could get away with the same performance using a Quadro P4000/P5000 for around $50.
@Nik.leonard • 1 year ago
4GB of VRAM is not enough for loading LLMs; it is possible to load SD 1.5 models, but they have to be run with some tweaks. They should offer at least 8GB, and ideally 12 or 16.
@irridiastarfire • 1 year ago
That's a good point, I hadn't thought of the VRAM limitation. I guess there are ML tasks that aren't so memory-intensive, but it's a niche within a niche.
@elphive42 • 8 months ago
The power of a GPU in the palm of my hand 🐙
@linuxphysics • 1 year ago
This is so awesome! It reminds me of the Intel/Movidius Neural Compute Stick (NCS2). But that is a low-powered USB stick meant to connect to a Raspberry Pi, which can do object recognition at 5-30 fps.
@nailsonlandimprogramming • 1 year ago
Kinda the same thing; quite powerful for that. I think that one can do instance segmentation in real time. One selling point of the Movidius chip is the price and power consumption.
@BrianThomas • 1 year ago
This is really for AI on the edge. I would love this in a dev environment for IoT.