I ONLY Used an Intel GPU for 30 DAYS.... Here's What I Learned

136,058 views

Vex

A day ago

Comments: 678
@IntelGraphics · 1 year ago
This was awesome! Great work and thanks for all the feedback and insights into what you experienced. +Subscribed!
@tsubasawolfy · 1 year ago
Wow, the official channel!
@IntelGraphics · 1 year ago
@@tsubasawolfy That's us!
@RedRandy · 1 year ago
Love you, Intel ❤
@notinterested8452 · 1 year ago
@@IntelGraphics Nice to see your comment.
@PumpyGT · 1 year ago
Really? I was expecting your channel to have more subscribers and a checkmark.
@raresmacovei8382 · 1 year ago
Linus: We're doing the 30-day AMD GPU challenge. Also Linus: Yeah, we won't be doing a video on this. We forgot we had the GPUs installed. Everything ran fine. We don't need to tell anyone.
@Metamine0 · 1 year ago
Oh yeah, they never released a part 2
@Michael-wo6ld · 1 year ago
Linus isn't having issues; I think Luke did and took it out, though. They talked about it on the last WAN Show.
@jgvtc559 · 1 year ago
@@Michael-wo6ld 😂 Who's the long-haired, heavyset guy from LTT? He's the only person from that channel I even bother listening to; even if I'm not interested, I hear what homie has to say about whatever he's presenting.
@jamesjohnXII · 1 year ago
@@jgvtc559 I think his name is Allen or something.
@nathanlarson6535 · 1 year ago
@@jgvtc559 Anthony?
@nick-dogg · 1 year ago
I can't wait till they really refine their drivers. Their upcoming iGPUs seem promising.
@cyko5950 · 1 year ago
Intel vs AMD competition in the iGPU space would be insane. They might accidentally end up making APUs so powerful that low-end GPUs become pointless.
@nick-dogg · 1 year ago
@@cyko5950 It would be insane. All we need now is for Intel to step up their drivers =D
@SPG25 · 1 year ago
The drivers have already come a long way; it's only going to get better.
@kotztotz3530 · 1 year ago
Here's to hoping Meteor Lake is as good as the rumors.
@kuronoch.1441 · 1 year ago
@@kotztotz3530 Dunno about the CPU, but the GPU data was based on interpolation from current Arc GPUs, with the drivers as of that time. Given that the iGPU is an Alchemist refresh, it will only go up from there; it might even approach a desktop 3050.
@theftking · 1 year ago
The Fortnite results make me think Intel might be optimizing hard for UE5, knowing nearly every game will use it going forward. Plus, it's a DX12 game, which Arc does best with.
@goldenhate6649 · 1 year ago
I will say one thing about UE5: if done right, holy fuck does it run better than UE4.
@FalseF4CZ · 1 year ago
@@goldenhate6649 Too bad the devs who made Remnant 2 didn't do it right.
@V3RAC1TY · 1 year ago
@@FalseF4CZ The first game used UE4 and also didn't run well at launch, though still not as bad as Remnant 2 does now.
@dreawmy2912 · 5 months ago
@@FalseF4CZ lmao
@jforce321 · 1 year ago
You know Battlemage is going to be a big deal when there are rumors that Nvidia is pressuring their partners not to make cards for them.
@PeterParker-status · 1 year ago
They did the same to AMD, but public outcry made them U-turn; people just don't remember. So it doesn't really mean anything.
@theonelad3028 · 1 year ago
@@PeterParker-status They are pushing for a full block on partners working with Intel. From memory, they just stopped partners from using the same coolers on AMD and Nvidia cards. Still anti-consumer, but not quite the same, and it really does mean something. Nvidia is becoming even more blatantly anti-consumer than ever before, and with AMD doing similar shit, it's just a crap time to be a PC enjoyer.
@OG_Gamers_Live · 1 year ago
Every AIB involved should tell Nvidia, "Well, hate to see you go; call us if you change your mind," and go on about their day. I mean, Nvidia screwed every one of them HARD when they started manufacturing their own FE cards, and that's just for starters.
@ChrisStoneinator · 1 year ago
I support Intel in this endeavour and have no love for Nvidia, but that's not necessarily an indicator. As Mark Knopfler said, if they gon' drown, put a hose in their mouth.
@__aceofspades · 1 year ago
I bought an A750 and have loved the experience. At $200 it beats the more expensive 6600 XT and 3060, and it has more features: AV1, XeSS, XMX cores, better encoding, etc. Highly recommend Intel Arc.
@parthjain5248 · 1 year ago
What are your thoughts now? With games and video rendering? Plus heavy multitasking, like gaming while streaming or rendering?
@HardWhereHero · 10 months ago
@John-PaulHunt-wy7lf I wanna see Nvidia burn. I use their GPUs but don't want to. I am getting a 16GB 770.
@johndc2998 · 9 months ago
The 6700 XT is a closer comparison.
@82_930 · 9 months ago
@@johndc2998 Not anymore; Arc has come such a long way from launch, and it's comparable to a 3070/6750 XT or even a 6800 non-XT now.
@thespianmask8451 · 9 months ago
@@HardWhereHero Seriously. Screw Nvidia. I just upgraded from a 3060 Ti to an RX 7800 XT. Totally worth it.
@QuentinStephens · 1 year ago
With regards to your game recording issues, it most certainly does happen on Nvidia GPUs. The Twitch streamer Nethesem experienced it when streaming Jedi: Survivor with his 3080 Ti. He fixed it by adding a second GPU (a GTX 1060, IIRC) to handle the recording and streaming. I do hope you have yet to return the A770, because you're missing a treat if you don't try it at 4K medium settings. FWIW, I have an A770 and an RTX 4090 (and a 3090).
@Micromation · 1 year ago
If one just needs a second GPU for recording and streaming, the A380 will be a cracking deal.
@QuentinStephens · 1 year ago
@@Micromation The trouble with the A380 is that, the last time I checked, Twitch did not support AV1 streaming.
@zackzeed · 1 year ago
Sorry for the dumb question, but can you use a second GPU for recording? I'm aware of capture cards and such, but I've never used one, so my knowledge is limited.
@Micromation · 1 year ago
@@zackzeed It's not for capturing from an external device; it's for doing the encoding work on the same machine.
@Micromation · 1 year ago
@@QuentinStephens "Will be": I wouldn't personally pay more than $50 for it. Not when I've just grabbed a 3060 12GB for Blender rendering for $180 🤷🏻
@wjack4728 · 1 year ago
I've got the Intel Arc A750 and really like it. I had big problems with the drivers when I first got it about 4 months ago, but with the latest drivers it's running like a champ. I do very little gaming; I use the A750 mainly for normal daily use and encoding videos, and for that it's a great GPU for me.
@alderwield9636 · 1 year ago
Yeah, AV1 is a champ.
@johnbernhardtsen3008 · 1 year ago
I got the A750 and it's a bad match for my R7 5700X CPU! No iGPU on the CPU, and I bought the A750 for Premiere Pro usage! Now I have to do a nice i5-13500 build and use the A750 in that PC!
@vitordelima · 1 year ago
I wonder whether spending time supporting all those legacy APIs (DX9-12, OpenGL, Voodoo, OpenGL ES, ...) in every driver for every GPU is a good idea, instead of just investing effort in Vulkan translation layers for everything (I think the Steam Deck already uses something similar).
@jsteinman · 1 year ago
Same! I really like the A750. On install the card was horrible; everything was corrupted, even Edge. After a BIOS update and the latest drivers, it's near perfect.
@dnsjtoh · 1 year ago
@@johnbernhardtsen3008 Huh? The only difference between using an Intel CPU or an AMD CPU with Arc is in productivity, such as video editing. Deep Link isn't useful for gaming.
@noelchristie7669 · 1 year ago
Been using my A770 for just about everything since I bought it in December. The only game I've ever had a problem with is emulating TotK; it hard-crashes after some time. I've been using the card between my 4K TV and my 1080p monitor. Loving it fr; it's a great upgrade from my GTX 1060! I definitely agree that there are better gaming cards, though. I do a lot more than gaming, so I can't complain!
@guilhermedecarvalhofernand1629 · 1 year ago
My main issue with getting an Arc card is whether they are going to stay in this segment. We gotta remember this is not the first time Intel has made GPUs. If Battlemage actually comes, then we can have a talk. The worst would be buying it and having it get discontinued, especially if you live in a third-world country where entry-level GPUs are already expensive as fuck. I live in Brazil, btw.
@Bajicoy · 1 year ago
Intel quitting on NUCs has me pretty concerned for their future.
@Konkov · 1 year ago
They've already invested billions of dollars into these GPUs and have had great sales so far; it wouldn't make sense to back out now.
@OG_Gamers_Live · 1 year ago
Pretty sure the new Battlemage cards were confirmed as coming in 2024 at a recent Intel investors meeting.
@magnusnilsson9792 · 1 year ago
A770 16GB vs 4060 Ti 16GB on PCIe 3 would be interesting, and so would overclocking the A750 vs the 7600 vs the 4060.
@Kitsoonaye · 9 months ago
I have an Intel GPU as well. Games always get random stutters (example: 14:20), and eventually, after a few seconds, the game you're playing will crash. It is really annoying.
@bhoustonjr · 1 year ago
If you use Arc with Intel onboard graphics, you can get a 40% encoding increase through Deep Link Hyper Encode. That is one of the reasons I went with Intel.
@AndersHass · 1 year ago
I would say both XeSS and FSR have taken an open approach: they are both open source and can work on hardware other than their own. It would be interesting if XeSS could use the dedicated AI cores in AMD and Nvidia GPUs instead of the lower-performing path it uses on them now.
@flintfrommother3gaming · 1 year ago
The Arc A770 has 16 GB of VRAM, at least the Limited Edition one, which I think will still be made by third-party manufacturers (unless Nvidia forces them to stop making more Arc GPUs, ALLEGEDLY). The Arc A750 has 8 GB. The choice is not so black and white: that extra 8 GB of VRAM in the A770 is something to consider if you play AAA games nowadays. I personally lean toward the A770, since they basically give out the extra VRAM for free, considering what Nvidia did with the RTX 4060 Ti.
@gorrumKnight · 1 year ago
The extra VRAM is definitely nice. I bought the A770 and it's been solid af at 1440p. And that's with the Linux gaming tax.
@johnnytex1040 · 1 year ago
Edit: it's been fixed. The Halo Infinite situation is genuinely upsetting, because before season 4, when Halo broke compatibility with Arc, it actually performed really, really well. On the biggest Big Team Battle map I was getting frame rates north of 110 FPS; that's more than what my friend can get with his 6700 XT.
@keiko6125 · 1 year ago
I'm glad I'm not the only person having issues with Halo Infinite. I hope it gets fixed soon; I'm really liking this Intel Arc GPU.
@johnnytex1040 · 1 year ago
@@keiko6125 343 has told me they're aware of the bug, but their own website has now been updated to say Arc isn't currently supported, and I've not been given a time frame for when any fix will come. Rest assured, Intel has also been made aware, and they're seeing if there's anything that could be done on their end as a sort of ad hoc fix. Here's hoping that's possible.
@larrythehedgehog · 1 year ago
Hey guys, it's that crazy guy with the Sonic icon who won't shut up about Arc again. The rumor going around is that Nvidia is preventing AIBs from working with Intel; they're blocking partners from making cards with both them and Intel. If you care about cheap gaming parts, you should reach out to Nvidia and tell them to shove it, so Battlemage will have more AIB partners than just Acer. Have a good day!
@J0ttaD · 1 year ago
Why do they need AIBs? I like the default cooler better. They don't need 'em.
@maddymankepovich1796 · 1 year ago
Ironic that Intel is the victim now.
@larrythehedgehog · 1 year ago
@@J0ttaD They need AIBs because AIBs push cards further than Intel ever will. AIBs are what let Kingpin become a force to be reckoned with at EVGA. And AIBs help competition overall simply by existing.
@J0ttaD · 1 year ago
@@larrythehedgehog Hmm
@redslate · 1 year ago
@@J0ttaD Board partners allow chip manufacturers to meet volume demands. While not necessarily a breaking point this early on, eventually a suitable supply chain will be necessary for success.
@grimwald · 1 year ago
I don't see Nvidia keeping 76% of the market share, honestly, especially with how poorly the 4000 series is doing. The real question is who will gain the most from that: Intel or AMD?
@OG_Gamers_Live · 1 year ago
I've actually had issues with DaVinci Resolve Studio and some recording/streaming with Nvidia this year on my RTX 3090 Ti FE card, but it usually happens when we get a new driver update for a new 40-series card. FYI, you should see a LOT better performance on a system with Resizable BAR enabled. I'll let you know when my Arc A770 OC arrives next week. My current system is running an AMD Ryzen 5900X with Resizable BAR enabled. And your computer case looks an awful lot like a Cooler Master case I have an older build in :P
@perlichtman1562 · 1 year ago
I'd be curious to hear about your experiences. I have been using an RTX 3060 12GB with the free version of DaVinci Resolve and Adobe Premiere Pro 23 and 24 Beta. I'm trying to decide between getting a license for Resolve Studio or moving to an A770 16GB (similar pricing), or possibly another video card, so your experience comparing the A770 on the newest drivers to a 3090 Ti would be extremely helpful. Any chance of running the free PugetBench benchmark for Resolve Studio on each? It's been tricky finding benchmark results that use the same system and just swap the GPU between those cards.
@badmoose01 · 1 year ago
It's interesting that you chose the RX 6700 non-XT for this comparison, because that is the desktop GPU most analogous to the GPU in the PS5.
@J0ttaD · 1 year ago
Vex, could you try the game Control? I swear to god AMD is broken in that game; I keep getting insane 1 FPS drops for like 5 seconds, randomly. Could you check if Intel does this? Also, the RTX and the detail in that game are stunning; I'd like to see an Intel vs AMD comparison on it. Cheers.
@daedalduo · 1 year ago
I second this idea. I want to see Control on an RX 6000-series GPU.
@maddymankepovich1796 · 1 year ago
I tried it on an RX 6600 XT and a 6800 XT. In terms of performance it works fine, but the game was broken on DX12 when I was playing.
@daedalduo · 1 year ago
@@maddymankepovich1796 Ah, I have a 6700 XT and things seem off even though I was getting 100+ FPS without RT and 70 FPS with RT medium. I might try playing on DX11 instead of DX12. Thanks!
@raresmacovei8382 · 1 year ago
Don't use RIS + ReBAR on AMD GPUs in general. Use one or the other; there's a driver bug that leaks VRAM. I played Control earlier this year at 1080p120 with FSR2 Balanced on a 5700 XT, without ReBAR. The game ran perfectly fine. Just use the HDR+FSR2 mod in Control, y'all.
@NipahAllDay · 1 year ago
It worked fine when I owned a 6900 XT and a 6950 XT. It also works fine with my 7900 XTX.
@deneguil-1618 · 1 year ago
A quick point about the shader compilation part in TLOU: having written shaders myself (OpenGL, for a simple 3D renderer), I can guarantee you the CPU is the one doing the compilation. The compiled shaders are then run on the GPU when needed in games.
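To make that point concrete, here is a minimal Python sketch (an illustration, not from the comment) using the glfw and PyOpenGL packages: the glCompileShader call is serviced by the driver on the CPU, and the GPU only runs the compiled result later. Note that some drivers defer part of the compile until the status query or link step, so the timing is indicative only.

```python
import time

import glfw                      # pip install glfw
from OpenGL.GL import (          # pip install PyOpenGL
    GL_COMPILE_STATUS, GL_FRAGMENT_SHADER, glCompileShader, glCreateShader,
    glGetShaderInfoLog, glGetShaderiv, glShaderSource,
)

SRC = """
#version 330 core
out vec4 color;
void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }
"""

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)           # hidden window, offscreen GL context
win = glfw.create_window(64, 64, "ctx", None, None)
glfw.make_context_current(win)

shader = glCreateShader(GL_FRAGMENT_SHADER)
glShaderSource(shader, SRC)                          # hand GLSL source to the driver
t0 = time.perf_counter()
glCompileShader(shader)                              # the driver compiles on the CPU here
elapsed_ms = (time.perf_counter() - t0) * 1000

if not glGetShaderiv(shader, GL_COMPILE_STATUS):
    raise RuntimeError(glGetShaderInfoLog(shader).decode())
print(f"compile call returned in {elapsed_ms:.2f} ms of CPU time")
glfw.terminate()
```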
@Dangeruss · 1 year ago
I can't wait to see what Battlemage brings to the table. I'm definitely switching to Intel. I want Intel to do well; I've always wanted an all-Intel PC!
@VintageCR · 1 year ago
Battlemage's "entry-level" card is going to have close to 4080 performance; entry-level meaning their lower-end Battlemage card. The price is going to be around $450 to $550.
@Dangeruss · 1 year ago
@@VintageCR Are you serious? I've been holding out on a new build for years! I'm still on my Intel 5960X and 1080 Ti.
@horus666tube · 1 year ago
@@VintageCR I haven't been big into tech news for quite some time already, but what you're stating there sounds like dreaming... :D Especially for that price range.
@defaultdan7923 · 1 year ago
Been using an A750 (specifically for gaming) and it has been running like a champ. I don't think I've had a game crash on me yet at all, and only a few run poorly.
@BluesFanUK_ · 1 year ago
Intel could save their reputation by disrupting the GPU market after years of fleecing people in the CPU market.
@thr3ddy · 1 year ago
I was a day-one adopter of the A770 (the same one you tested). Things were rough until about a few months ago, but now it's smooth sailing on Windows and Linux. The main issue I've had is soft locks when really pushing the video card; I think it's overheating-related, since this card runs very hot to the touch. Also, I play CS:GO competitively, and when I run it without a frame cap I get over 1000 FPS, which seems to cause some hitching. I have to limit it to something more reasonable, like 500 FPS.
@mRibbons · 1 year ago
Intel is gonna do what's called a "Ryzen move": solid in value but lagging behind, then close the distance and refine over time.
@redslate · 1 year ago
Nah. First-gen Ryzen was a flop; it wasn't until the second gen that Ryzen offered an alternative worth considering. Intel is already offering a competitive product with their first-gen Arc GPUs.
@redslate · 1 year ago
@bon3d3th Objectively, it offered far less than Intel's offerings at comparable pricing. I agree that it helped lay the groundwork to eventually supersede Intel in later gens.
@inteldruid · 1 year ago
@@redslate Not really; the Ryzen 1600 destroyed the i5-7400 at the same price, especially in terms of longevity.
@redslate · 1 year ago
@bon3d3th Short of early adopters and diehard AMD fans, once performance metrics were widely available, practically nobody wanted Ryzen Gen 1. Their products had to be deeply discounted to move units. It wasn't until Zen+ (Gens 2 and 3) that Ryzen began to become appealing. This really 'woke up' Intel, forcing them to resume R&D, as they were essentially just re-releasing Skylake year after year, with a stopgap of: add 2 more cores... okay, add two _more_ cores...
- Skylake (6xxx)
- Skylake _II_ (Kaby Lake, 7xxx)
- Skylake with extra cores (Coffee Lake, 8xxx)
- Skylake with extra cores _II_ (Coffee Lake II, 9xxx)
- Skylake with _even_ _more_ cores (Comet Lake, 10xxx)
@redslate · 1 year ago
@mirceastan4618 The 7400 ($182 MSRP) competed with the 1500X ($189 MSRP) at launch, not the 1600 ($220 MSRP), which was in a completely different product/price tier; again lending itself to my point that "[AMD's Zen] products had to be deeply discounted to move units."
@OpsMasterWoods · 1 year ago
You should also test Minecraft Java with RT (SEUS PTGI); it looks much better than MC Bedrock with RT.
@theHardwareBench · 1 year ago
My A770 doesn't really seem any worse than any other card so far. It does occasionally crash in MotoGP 19, but I haven't played that with anything else, so it might not be the card's fault. It's rock solid in War Thunder, pinned at 120 FPS, max settings, 1080p.
@nickritz · 1 year ago
The A770 is now running pretty close to the 4060; it's within 2% and occasionally outperforms the 4060 in some games. The only issue I've run into with my Predator BiFrost A770 is that it doesn't let me change my system refresh rate to max out my monitor (by making a custom refresh rate); it's just not as clean a process as Nvidia's way of making a custom resolution. I have a 165Hz monitor, and Windows only lets me go to 144Hz. I got a custom piece of software to make a custom refresh rate, but for some reason I was only able to push it up to 155Hz. I quit chasing those extra 10Hz because, tbh, I was satisfied with 155Hz for the most part. Overall I haven't experienced any bugs or glitches with the card. On mostly medium settings with epic view distance in Fortnite, I'm getting crazy FPS up into the 350s, usually resting around 290 FPS on average when looking into the distance. Overall, a pretty stable experience. Also, this is the only GPU costing $350 or less that offers decent RGB lighting (Predator BiFrost A770 OC by Acer).
@AshenTech · 1 year ago
HDMI, AFAIK, is limited to 144Hz; with DisplayPort you may need to install the monitor driver. Once I did that, mine defaults to 165Hz 10-bit rather than 60Hz 8-bit. Same issue I have with Nvidia, so... probably Windows more than the GPU drivers, honestly.
@OG_Gamers_Live · 1 year ago
And you can get the ASRock Phantom Gaming Arc A770 16GB with RGB on Newegg right now for $309.99 plus tax, with the current $10.00 discount applied.
@narwhal4304 · 1 year ago
As a 6700 owner, I might look toward Battlemage as my next upgrade, given that RX 7000 and RTX 4000 don't offer good value. And honestly, I don't think RX 8000 and RTX 5000 will be any better, given the GPU market right now.
@johndc2998 · 9 months ago
Debating between a 6700 XT 12GB at $330, an Arc A770 16GB at $430, or waiting for Battlemage :/ What do you think? I've compared hardware specs but need real insight.
@RPGWiZaRD · 1 year ago
Bought an A750 when it was on sale. I had been looking forever to upgrade my 1070 Ti to something newer, waiting for prices to come down, which never happened, and Intel was the only option I saw where price vs performance met my expectations. I'm glad I did: I've been running it with surprisingly few issues, and I especially love the constant driver updates, which tend to improve random games quite significantly in some cases. I'm looking forward to Battlemage now and will probably pick up the top SKU this time; the A750 was a bit of an evaluation purchase, and now that Intel has won me over I'll happily pay a bit more next time.
@kolyagreen1566 · 1 year ago
Are you sure about your "upgrade"? I mean, the 1070 Ti and A750 are almost equal.
@EuphoricHardStyleZ · 1 year ago
@@kolyagreen1566 Definitely not, from my experience. I ran actual benchmarks, and they showed 25-75% improvements over the 1070 Ti depending on the game and what we compare (minimum, average, or max FPS). In the 25% case I was more CPU-limited on max FPS in FC6, where minimum FPS again showed a 49-58% difference. Kingdom Come: Deliverance showed the biggest gain, a 70-75% difference.
@kolyagreen1566 · 1 year ago
@@EuphoricHardStyleZ I just saw tests and it's not inspiring. 25% is a really small difference (okay, 50%) for a GPU upgrade, considering the 1070 Ti is from 2017, okay, 2018. It's a marginal upgrade and not worth the money. Still, it can be an okay upgrade if you're keeping an old motherboard, CPU, and so on.
@EuphoricHardStyleZ · 1 year ago
@@kolyagreen1566 The 25% was a CPU-limited scenario; I just mentioned it so as not to sound biased. The typical difference I saw in my tests was 45-65%, which to me equals a great generational shift from Nvidia to the next. I used to make those kinds of upgrades all the time before prices went crazy and I stopped upgrading every 1.5-2 years, but I guess what counts as a "worthy upgrade" is subjective; to me it definitely was. Not to mention that, with the sale price of the old 1070 Ti, this was the cheapest GPU upgrade I've personally done: I bought the A750 for 269€ and sold the 1070 Ti, which dated from Nov 2018, for 120€. For me it's all about performance vs price. I'd been an Nvidia customer my whole life (with lots of great bang-for-buck cards such as the GF4 Ti 4200, 7900 GTO, and 8800 GT, to mention a few) and have probably had 20 or so Nvidia GPUs, but I'm done with them at current pricing; neither AMD nor Nvidia offers what I see as reasonable pricing atm.
@kolyagreen1566 · 1 year ago
@@EuphoricHardStyleZ It's like "upgrading" from a 4070 to a 4070 Ti :D Okay, I got you, bro! Thanks for sharing your experience with the Intel card. Maybe I will consider an Intel card in the future) Sounds cool.
@zedorda1337 · 1 year ago
I cannot express just how happy and excited I was when I heard Intel was getting into the gaming GPU space. It was a long time coming; I had been expecting it for over a decade. They will finally become the third leg of a truly competitive market.
@igavinwood · 1 year ago
Good job working through an entire month and recording your experience, Vex. I'd argue that AMD is actually providing the greatest competitive push on upscaling with their open-source approach, as it can be used by every GPU, meaning that the proprietary software of Nvidia and Intel has to be maintained and developed to stay meaningful. So if you think the GPU market is bad now, imagine how much worse it would be without AMD pushing an upscaler that everyone can use. That's why I don't like all the "dumping" on FSR: not because of its performance, or lack of it, but because of the pressure it applies to keep the proprietary software more "honest", and because, being open source, it can be developed outside of AMD and across all game platforms. Of course, AMD could also go proprietary with FSR and tie it to their CPU development. I'm glad they haven't, but if people keep dumping on FSR without seeing the bigger picture, I can see a point where they say screw it, let's bring it in-house and lock it down as proprietary software, as we're not getting any marketing benefit from keeping it open source. As for the blocking of other upscalers in their sponsored game titles, I haven't seen any developer come out saying that's true, but I've seen reports where some say it isn't. Poor communication, as always, from AMD. Let's see how true it may or may not be. At least they are not doing another Nvidia manipulation like the AIB partnership programme v2 shenanigans. What always baffled me with Arc was that DX is an Intel tech, so how could they screw it up so badly at the start?
@phoenixrising4995 · 1 year ago
XeSS is also open source; Ngreedia's DLSS isn't, but at least they're not locking out other tech like AMD. It seems both AMD and Nvidia are heading in the same crap direction for the value-priced GPU market (sub-$500 at this point). I hope Intel can solidify as a player in the GPU space; otherwise, get ready for more 4060s and 4060 Tis, but $150 more expensive at launch.
@vishnuviswanathan4763 · 1 year ago
When people say "FSR looks bad": yeah, of course it does. It doesn't have dedicated hardware and is nothing but black magic, yet sometimes it comes close to DLSS.
@phoenixrising4995 · 1 year ago
@@vishnuviswanathan4763 FSR is just a slightly more advanced version of fractional upscaling. Nothing more, nothing less.
@Mcnooblet · 1 year ago
@@vishnuviswanathan4763 Which is why it can be open source. Ngreedia somehow gets crapped on for hardware acceleration, which somehow makes them "greedy" since it can't be open source, as it isn't purely software-based. Nvidia with Tensor cores = bad to many people; make it work on AMD with no Tensor cores!
@Felix00007 · 1 year ago
4060 vs A770?
@fantasypvp · 1 year ago
4:55 My 6700 XT can get far more FPS than either of those with a decent *fully path traced* shader pack on Java Edition, while looking far better than the basic ray tracing pack that Nvidia and Microsoft made together. MC Bedrock Edition is not a good benchmark for Minecraft performance, as almost everyone on desktop actively avoids playing Bedrock.
@aspencat2239 · 1 year ago
I'd still buy a 7900 XT rather than a 4070 Ti.
@tomaszzalewski4541 · 1 year ago
A question, just because I'm curious: is it even beneficial for you to upgrade your GPU in the first place?
@Marco-vn1db · 1 year ago
Is it because of the price, or is there another reason? If most games supported DLSS 3, I'd go Nvidia all the way, even if they're bad value compared to AMD.
@superchills · 1 year ago
@@tomaszzalewski4541 It depends, because if you have an RTX 2060, for example, an RTX 3050 won't be much better; it can also depend on the VRAM in the graphics card.
@prabhakardhar1379 · 1 year ago
The 4070 Ti is the worst GPU for its price so far; a great price for just 12GB of VRAM 😑
@MGTEKNS · 1 year ago
I was torn between getting the Intel Arc A770 and the 4060. I'm not using it to game but rather to encode, decode, and transcode AV1, through programs like Handbrake as soon as it's supported there. I know OBS supports it, but I want to transcode all my media, like movies and shows, to AV1 to save hard drive space. I have a 3080 Ti for gaming, so the 8GB model of the 4060 doesn't bother me.
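For that kind of batch job, a minimal Python sketch along these lines would work once your ffmpeg build exposes Arc's Quick Sync AV1 encoder. The encoder name (av1_qsv), the -global_quality value, and the folder names are assumptions to verify against your own setup (check `ffmpeg -encoders`).

```python
import subprocess
from pathlib import Path

SRC = Path("media")        # hypothetical input folder
DST = Path("media_av1")    # hypothetical output folder
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mkv")):
    out = DST / (clip.stem + ".av1.mkv")
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", str(clip),
            "-c:v", "av1_qsv",         # hardware AV1 encode via Quick Sync on Arc
            "-global_quality", "28",   # quality-targeted rate control; lower = better
            "-c:a", "copy",            # keep the original audio untouched
            str(out),
        ],
        check=True,
    )
```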
@mrbabyhugh · 4 months ago
5:30 The main thing is what you said at the beginning: these Arc cards are NOT made specifically for gaming; they are productivity cards, and that's why, out of the gate, they performed better in production software (Adobe, Blender, not so much DaVinci) than they did in games. But as the drivers update, we see Arc catching up in games too. Battlemage is where the FUN will begin in the GPU industry!
@SudeeptoDutta · 1 year ago
After a month's research, I went with an *AM4 5600X + 6700 XT* build. This is my first PC build. I was very tempted by Intel's first-gen GPUs, given their _on paper_ price-to-performance ratio. Alongside new games, I also wanted to play a lot of the DX9 games I missed out on over the last decade or so, so it was sad to see Intel's performance in DX9 and generally older games. I think Battlemage will be great, and it's a great thing that once I upgrade my GPU in the future, I'll have a lot of options, including Intel's offerings.
@Tainted79 · 1 year ago
Good combo, I have the same. I'd add that the 6700 XT outperforms the competition in emulators as well. I used the Adrenalin software to auto-overclock my 5600X to 4.85GHz; I can't believe how cool it runs. And after 15 years, AMD got me to replace MSI Afterburner.
@ARTEMEX · 1 year ago
kzbin.info/www/bejne/a6C5dWmvr86Bers Check out this test; it covers a huge number of games with DX9-11 APIs. P.S. Arc works well in old games. Don't believe the myths: 3 games crashing doesn't mean all the rest won't work.
@82_930 · 1 year ago
@@Tainted79 You can push a 5600X all the way up to 5.2GHz on all cores if you got decent silicon. Mine had amazing silicon and I achieved a stable 5.4GHz @ 1.34 volts until I upgraded to a 5800X3D about a year later. The 0.1% lows were like 30% better @ 5.4GHz, but it drew like 80% more power.
@RobBCactive · 1 year ago
That's a good choice! There have been performance problems with just 8GB, and no matter what this video says, the established benchmarkers covering a lot of games show the Intel card competing with the 3060, not really the 6700 XT, on performance and value. As for ray tracing, he says himself that it needs turning off, which is also true for the 4060 Nvidia cards.
@SudeeptoDutta · 1 year ago
@@Tainted79 Yes, emulation was also a big reason I went AMD. I also dual-boot Linux (haven't done so yet, as my PC is just 2 weeks old :D), and AMD drivers are generally more stable on Linux as well.
@ElJosher · 1 year ago
I'd like to see emulation benchmarks compared against other GPUs.
@saricubra2867 · 1 year ago
Emulators are basically 100% CPU-bottlenecked.
@Nicc93 · 1 year ago
@@saricubra2867 Check out SomeOrdinaryGamers' video where he turned a single 4090 into 32 GPUs. You can basically create multiple PCs in one, and each can have its own GPU.
@Nicc93 · 1 year ago
@@saricubra2867 There are some pretty heavy emulators out there, and virtualization takes it to the next level :)
@saricubra2867 · 1 year ago
@@Nicc93 I only have Intel UHD 770. Playing Smash Bros at 1080p with 8 level-9-CPU Ice Climbers, I have a 100% CPU bottleneck on one of my cores, but it's a perfect, locked 60 FPS with a flat 16.67ms frametime and not a single stutter, while a REAL Nintendo Switch tanks to 40 FPS. It's funny when techtubers talk endlessly about gaming FPS and say nothing about frametimes. "4090 into 32 GPUs": that's nice, but if you use a 4090 for emulation, it's a huge waste of sand. Virtualization adds another massive layer of bottlenecks, or CPU overhead; a really BAD idea. A 4090 makes your mods and PC ports look better at absurd resolutions and settings, but for potato 2015 phone hardware like the Switch has, it's just dumb.
@saricubra2867 · 1 year ago
@@Nicc93 You can run every emulator with top-notch performance on a Ryzen 9 7900X (finally, a proper Ryzen desktop gen with integrated graphics) or a Core i7-12700K as the minimum, without a dedicated graphics card. Anything below will be worse, especially for shader compilation (the 5900X is kind of the exception, but its single-core performance is worse by a significant margin, like 25% or more).
@dracer35 · 1 year ago
I bought the A770 LE 16GB at launch just because I was curious; I wanted to play around with it for fun, and personally I think it's the best-looking GPU on the market right now. Since then I've swapped it in and out a couple of times. The very first drivers were pretty bad, but I noticed things getting better and better with the newer drivers. About a month ago I pulled out my 4090 (which I bought used for $200 under MSRP) and dropped in the A770 again. Obviously there is a huge performance difference in the numbers, but in reality it has been doing what I need it to do, and I haven't felt like I needed to go back to my 4090 to have a good time. That said, it's worth noting that I don't use my GPU for anything other than gaming, so I can't attest to streaming or editing workloads. I'm running triple monitors: two 32" 1440p 144Hz monitors and a third 1080p monitor for watching casual videos while I'm gaming. I like my A770, and I plan to try Battlemage when it comes out as well.
@johndc2998 · 9 months ago
Should I buy an A770 or wait for Battlemage? Or just get a 6700 XT on sale for $100 less than the A770?
@dracer35 · 9 months ago
@@johndc2998 As much as I like the A770 as an enthusiast, if you just want something to play games right now, I'd probably recommend the 6700 XT if it's really $100 less than the A770. I know I'll be buying Battlemage, but I buy them because I enjoy playing with different hardware for fun.
@robertlee6338 · 8 months ago
My Intel Arc 750 crashes my PC once a week.
@VicViper2005 · 9 months ago
It's phenomenal how far Intel graphics have come. Like 10 years ago you couldn't even get 30 FPS on Intel graphics, but now on Arc you can get FPS in the 100s.
@prince.1033 · 1 year ago
Five minutes ago I watched your AMD GPU video and searched "vex Intel arc", and you'd uploaded this 10 minutes earlier 😂
@RedRandy · 1 year ago
Thanks for the 30-day experience, man. Intel has been doing great with their GPU drivers. Hopefully they can be the next big bang-for-the-buck GPUs.
@Tyrelguitarist · 1 year ago
Thank you for your work. Your effort is paramount in getting the information to the people. Looking forward to the future of the GPU market; Battlemage is gonna change the game, and Nvidia is allegedly already mad about it. And thank you, Brian, for providing the product.
@eagle_rb_mmoomin_418 · 1 year ago
I'm sorry, but this is a really shit take regarding upscaling and open source. You want the upscaling algorithm and code to be open source because then it will persist over multiple GPU generations across various manufacturers and can be refined, improved upon, and developed at a faster pace. These AI networks are trained using large amounts of data, yet right now everything is siloed, with everyone doing their own thing 🤦. MS are working on their own one as well 🤦. Using H/W acceleration that adheres to a minimum, baseline, industry-wide standard still allows individual manufacturers to offer improvements and innovation over and above that, targeting their own H/W improvements and efficiencies. When you do that, the consumer wins long term; basic standards actually help drive improvements. Currently it's a mess of everyone doing whatever the fuck they want 🤦. This hurts consumers, not the GPU companies. You do realise that Intel's driver improvements use a big chunk of open-source code to improve DX9-11 performance on their GPUs by running Vulkan under the hood 😂😂😂😂🤦🤷. They'd be utterly, utterly fucked without it. Poor research, fella. Intel are actually benefiting from and building upon work done by Valve, Collabora, AMD, Red Hat (IBM), CodeWeavers (involved with Proton/WINE), and others. That's an example of how OSS graphics technologies can be used to lift several boats, not just one, without impeding progress.
@jskyg68 · 1 year ago
We have enough Nvidia shills on YT; clearly he didn't get the memo...
@eagle_rb_mmoomin_418 · 1 year ago
@@jskyg68 I actually prefer Nvidia's H/W feature set to AMD's currently; I have both. Please don't start that shit. Ironically, Nvidia released an open-source kit to support easier common integration of different upscaling technologies into games. What I want is for AMD to keep doing what they are doing, and for Nvidia to continue improving their attitude toward and support for OSS, along with Intel.
@jskyg68 · 1 year ago
@@eagle_rb_mmoomin_418 Yeah, Nvidia is making sure DLSS can be incorporated into any game, with forced advertising and backward compatibility... "(b) NVIDIA Trademark Placement in Applications with the DLSS SDK or NGX SDK. For applications that incorporate the DLSS SDK or NGX SDK or portions thereof, you must attribute the use of the applicable SDK and include the NVIDIA Marks on splash screens, in the about box of the application (if present), and in credits for game applications. (c) NVIDIA Trademark Placement in Applications with a licensed SDK, other than the DLSS SDK or NGX SDK. For applications that incorporates and/or makes use of a licensed SDK, other than the DLSS SDK or NGX SDK, you must attribute the use of the applicable SDK and include the NVIDIA Marks on the credit screen for applications that have such credit screen, or where a credit screen is not present prominently in end user documentation for the application."
@mtaufiqn5040 · 1 year ago
Judging by your thumbnail, I can see you'd marry your own GPU.
@mistermystery819 · 10 months ago
I wonder why they don't just use the Intel HD drivers for better DX9 and DX10 performance, since I used to game on a UHD 630 and it was really good for DX9 and DX10.
@anasevi9456 · 1 year ago
Great video. My secondary machine has an A770 16GB and is used as a typical single-display, gaming-only machine. One driver didn't like indie Unity-engine games (crashes), but besides a few visual bugs last year, that's it for issues. I've played countless dozens of games on it with zero problems; old games with DXVK also run great. XeSS is functionally the same as DLSS 2 when used on Arc GPUs; I daily an RTX 3080, so I know. If you're considering Arc (performance has improved A LOT even in the last 6 months): 1. motherboard under 6 years old; 2. BIOS under 2 years old; 3. a modern monitor too. Going by support threads, most people having issues are trying to use pro software, or have thrift-shop monitors and/or old DDR3-era computers.
@TanjidulAbsar · 1 year ago
In Bangladesh the A750 (founders edition) is around 290 dollars and the A770 is around 380 dollars, so which one is best? I can buy the A770 by cost-cutting on other components.
@syncmonism · 1 year ago
The biggest issue with Intel graphics cards (for the consumer) is that they require faster CPUs to work well. This is an area where AMD is actually the strongest, and Intel is worse than Nvidia in this regard. It makes Intel less practical as an upgrade for older systems.
@ElDaniWar · 1 year ago
Bro, keep making these great videos. I've seen several YouTubers back when they started with less than 10k subs who are now at 3M+ subs, and you give me that same sense of greatness. Keep it up and you'll go far :)
@danielfernandezaguirre · 1 year ago
I used only Nvidia GPUs for years, then I switched to AMD with the 6950 XT, and my experience has been great; stability is rock solid (I only update to recommended driver versions). Last year I built a second PC with an RTX 4080, and I would say Nvidia and AMD are on par in terms of experience. Of course Nvidia is better for ray tracing and media work, but I don't work in media; I'm a developer, and AMD is better for my work. I would give Intel a chance, but not now; I would wait at least 2 more generations to do that.
@WheresPoochie · 1 year ago
A developer in what regard? I do video and audio editing and am trying to pick between a 7900 XTX and an RTX 4080 for my 4K 60Hz monitor.
@OlavAlexanderMjelde · 1 year ago
The problem in Norway is that the 16GB A770 costs 595 USD, while the 8GB version of the 770 costs around 350 USD, the same as the A750 8GB. At the 16GB price, you can get better cards from AMD and Nvidia in Norway. I was looking into building an Arc rig, but I don't think it's priced right in Norway.
@chev9632 · 1 year ago
It's the same problem I had in Brazil; I would jump from a GTX 980 to an Arc 770, but the price is not competitive.
@tony001212 · 10 months ago
I don't understand why people are complaining. If you take into account that these cards are first-gen products, I think Intel did pretty well, including functionality like upscaling and ray tracing, things that took the competition years to implement. Looking forward to Battlemage.
@penonton4260 · 1 year ago
For me, anything above 30 FPS is very good. That means if I had that GPU, ray tracing would always be ON. 😎
@MarcABrown-tt1fp · 1 year ago
Ray tracing is indeed the future... well, 8 years into the future, yes. None of the ray tracing titles out right now use full ray tracing; it's just ray-traced lighting and shadows on rasterized graphics, and it looks only marginally better in its current state.
@LordBeef · 10 months ago
It's weird that I'm getting to the point where non-child YouTubers are multiple years younger than me. Wack.
@mr.guillotine3766 · 1 year ago
I'm hoping they get those bugs sorted out. I do need to upgrade, especially since I'm disabled and the only way I have to make enough money to buy groceries is through using my computer for certain things (not really GPU-related most of the time, but I've done some freelance photography/editing here and there), so I need to keep my computer up and running, and it's currently on a 1650 Super. I've considered Intel, but I need whatever I buy to just work. Even though I can troubleshoot a bit, I often lack the energy to do so; my pain levels leave me without the patience to sit here and force my computer to do the job it should just do automatically, so consistent bugs are a dealbreaker for me. Still, I like how much they have improved, and if they can get most of it sorted out by Battlemage, then maybe I can consider one then... Maybe I'll just find a cheap 6650 XT that will be "good enough" for 2 or 3 years and see if any of these companies care about consumers by then and release a decent card at a reasonable price? idk... thanks for the video.
@drp111 · 1 year ago
I own a bunch of these and use them in servers for industrial imaging. They perform well, and out-of-the-box compatibility with serious applications is also good.
@MrSmitheroons · 1 year ago
Very interesting and thought-provoking review! I think two things will matter most for Intel soon, and it's not this first gen... it'll be Battlemage.
- One, the performance has to be there. That is the majority of the beginning, middle, and end of the conversation for GPUs. (Related: it has to be a decent price for the performance, but maybe that's obvious.) A new gen with new chips is a chance to change the performance numbers in a big way, and they should use it. (Sidebar: maybe it's dumb since so few people buy the top end, but I think they need a "halo" product to compete with the 4070 and 4080 for performance, just to show they can. People naively, or wishful-thinkingly, transfer their feelings about the halo product onto the lower-end products. That's why they need it, IMO. It's irrational, but that's marketing for you.)
- The other big thing is that they're still proving themselves on quality, and eyes will be on them, fair or not (it's a new product line; I think it's justified). Quality needs to keep going up, up, up for the drivers and software ecosystem integration. If they can nail power efficiency, heat, low idle consumption, rock-solid encoder smoothness, the fewest crashes possible, all the software integrations (OBS, DaVinci, Premiere, etc.), and all the smaller things, that will cover more bases of "quality." A bonus for Intel would be more XeSS adoption, since it seems to be keeping up okay with DLSS (not a trivial thing!!).
Something to look out for: AMD has invested a lot in AI chips over the past few years. Will they get an AI solution baked in and finally use it for a better upscaler??? Can they use it to enhance their encoders??? Catch up to Nvidia?!?! Time will tell for AMD's next (or next-next) gen...
Lastly, on a bit of a tangent, something people aren't talking about much: the low-power Arc (actually Xe) variant is going to be the new iGPUs. Will Intel's iGPUs suddenly be competing with AMD's APUs? Outperforming them? Or do the lower wattage and die size (read: cut-down performance and features?) make them not exciting? The potential there is really, really huge IMO, but the potential for a letdown is equally big. It could just be a light bump over the UHD iGPUs. I'm personally very excited to see what happens with the iGPUs in Meteor Lake and then Arrow Lake (2024), Lunar Lake, etc. Lots to think about, and I hope it shakes up the GPU space for the better.
@marxmaiale9981 · 1 year ago
I have been using an A380 since launch. The November driver updates fixed the last of my in-use issues, and a more recent update fixed the annoying driver console needing administrator permissions to launch. What will make or break Arc is third-party software support: programs that exclusively use Nvidia/CUDA for speed enhancements will continue to perpetuate Nvidia's dominance in the productivity arena. I expect more support for Arc features once the tGPU elements start reaching developers in Meteor Lake laptops. Abundant availability is a great incentive to incorporate features.
@DIvadxat · 1 year ago
Good video. Intel definitely has an opportunity for gains in the GPU space, provided they continue the driver improvements and software updates. Specifically, I can confirm that both the Halo MCC and Halo Infinite launch issues have been fixed with the latest beta driver update. Note that some games, like Halo Infinite, are poorly optimized by the developer, 343.
@jodysin7 · 10 months ago
Maybe it's time to get an A750. We gotta buy some of these cards so Intel keeps going with it.
@anarchicnerd666 · 1 year ago
It's awesome to see Intel doing shockingly well on their first attempt at graphics cards! Unfortunately, it's not for me: I'm a games preservation enthusiast, and while DXVK is a lifesaver, it's a band-aid at best. It gets you more stability, but the 1% lows are still a problem. It's tempting though, ESPECIALLY the encoder; I wonder if running an A380 just for encoding alongside another GPU is worth it? Probably not, given the headaches with conflicting graphics drivers, but it's still cool to think about. Awesome video, Vex :) Keep it up!
@brugj03 · 1 year ago
"Shockingly well" is something completely different in my book. Just acceptable, maybe, if you're not picky.
@anarchicnerd666 · 1 year ago
@@brugj03 AV1 encoding, superior ray tracing performance, machine-learning-based upscaling, a 256-bit bus, and competitive value aren't enough for you? You're hard to please :P
@redslate · 1 year ago
@brugj03 As a competent first-gen product offering competitive price-to-performance with substantial, regular improvements, "shockingly well" is a more than apt description.
@brugj03 · 1 year ago
@@anarchicnerd666 No. Unreliable in games, artifacts in games, different behaviour on different systems, terrible software support, and unusable in productivity. And I'm not even being too picky. How about that...
@brugj03 · 1 year ago
@@redslate If you say so... It seems you don't care about your gaming experience; cheap and good on paper is all that matters. I rest my case.
@nevergibyouupp · 1 year ago
Intel graphics has a lot of potential. I hope it will improve and work even better with Intel CPUs.
@RAaaa777 · 1 year ago
This lasts until they catch up, and then they will dick you around like the rest. They already did this back in the day in the CPU market.
@martim_trindade · 1 year ago
First of all, ray tracing is not the future; there's something called reflections, and if devs want to make them, they will. There are plenty of games out there with reflections as good as ray tracing. Second of all, is an Intel GPU a better choice than AMD?? I had both, and some games wouldn't even open on the Intel 😂😂, like The Last of Us (I don't know if it's fixed; I no longer have the GPU), not to mention the 1% lows and 0.1% lows were awful. Also, the 8% figure for Intel is hardly true; most of it is from iGPUs. Edit: Also, I have no idea why you compared the Arc A770 with the RX 6700 when in reality (just checked 12/07/2023) the 6700 XT is actually cheaper than the Arc A770 and it destroys the Intel GPU; check Daniel Owen's video for the benchmarks. I agree with the conclusion on AMD, Nvidia, and Intel: competition is always good.
@jskyg68 · 1 year ago
Dude's a fanboy, just trying to get people to stop buying AMD (because nobody can recommend Nvidia right now, even fanboys).
@Madvicius · 1 year ago
XeSS is open source, by the way ^^
@CristianLopez-xi4rt · 1 year ago
I would definitely get an Intel GPU before an AMD one.
@hughmungus6941 · 1 year ago
6:35 How come AMD using and supporting open and inclusive upscaling is a minus? FSR is a godsend for aging GPUs, including Nvidia's. The current most popular GPU on Steam is the GTX 1650, which has no specialized upscaling hardware; Nvidia just says "go buy the new shiny," while AMD provides an option to extend your GPU's lifespan no matter the manufacturer. I think AMD deserves a lot more praise for that than it gets.
@hughmungus6941 · 1 year ago
Furthermore, I don't see any upscaling technology as a valid selling point for a NEW graphics card. Why should I buy a new, underpowered, 128-bit-bus GPU to upscale if I can do that on my old one, not terribly worse? Upscaling is a feature for older cards, not newly released ones, unless they're targeting the lower end; but Nvidia is marketing upscaling for a 400 DOLLAR 1080p GPU. That's ludicrous.
@MacTavish9619 · 1 year ago
Another lie about FSR... Weird, because I tried The Witcher 3, Forspoken, and Cyberpunk (I spent a lot of time in that game) with FSR and never had glitches like yours. Really, really weird. Yes, I had some quality loss on trees (leaves), but I never had glitches/bugs in the places you show in your video...
@jskyg68 · 1 year ago
Anyone that bought a 3080 tends to be a fanboy.
@mikeclardy5689 · 1 year ago
Intel is starting to look good. It seems they targeted the low-end spectrum of GPUs to enter the market, and if they fix their performance in some older games, they could easily take the king-of-the-low-end title, because you're right: they packed a lot of good stuff into this card. It just needs better drivers to keep it going at a steady pace. Battlemage is their actual serious GPU; I think this Arc generation was a test run that they can still collect a lot of useful data from. I'm kinda excited to see it: 4080 performance at a price lower than the 7900 XTX... sounds amazing. It will crush me a bit, considering I have a 4080... but hey, it means more PC gamers and upgraded veterans. Also, game devs can target higher specs if the baseline specs of users improve. Besides, Nvidia forced ray tracing to be a thing, and now everyone needs good performance in it. More competition will bring healthy innovation toward that, and probably more tricks with scaling, or simply cutting resource usage (which would help lower-end cards keep up).
@Zxanonblade · 1 year ago
Their whole GPU brand is Arc; this first gen was called Alchemist. Next gen will be *Arc* Battlemage.
@brugj03 · 1 year ago
Well, clearly Arc sucks; it's full of issues, and the worst part is that these are only the ones on your system. Take another system and you've got plenty of different issues. Let's face it: for casual gaming it's maybe just acceptable. For serious gaming, it's a disaster.
@77wolfblade · 1 year ago
It doesn't seem to work that well on emulators either, in some cases.
@raspiankiado · 1 year ago
Damn. As a person who cares 0 about ray tracing, I still don't care about ray tracing.
@jskyg68 · 1 year ago
OH, but it's the FUTURE... so who the hell cares today? (Hmm, improve PQ with RT, but you have to degrade PQ with upscaling... who thought this up?) GPUs are getting too fast; they needed a gimmick to justify $2k graphics cards (create the problem, offer the solution).
@Aerobrake · 1 year ago
Everyone say "Brian is a legend."
@zalin2000 · 1 year ago
I built a new high-end gaming PC 6 months ago. I sank so much money into it that I got tired of spending, so I bought an A770 to use until I bought a 4090. My 4090 is arriving today, but I used the Arc for about 6 or 7 months and I loved it. The card was great for the price point! And I would have used it much longer if it had been able to handle my 240Hz ultrawide screen. I love the design, and I will keep it around because I love this little champ.
@bighersh14 · 1 year ago
I actually had the A750 until I upgraded to a 6950 XT, and Intel is getting so good so fast that I think it won't be long until they actually put some pressure on AMD to make good products.
@samvortex4136 · 1 year ago
Brian is the real hero here! Ty, Brian.
@sandornyemcsok4168 · 1 year ago
To be honest, I am really looking forward to the new generation of Intel GPUs. I smiled at some folks 2 years ago when, during the crypto craziness, they were wishing for Arc to save us... and I planned to switch to AMD for the first time in my life. But now, seeing that AMD follows Nvidia and practically gives up the low end, and potentially the mid-range consumer GPU market, Intel seems to be the only hope. Unfortunately, Battlemage is still far in the future. 😐
@brazz201 · 9 months ago
In Europe, the A770 goes for at least €400.
@Nelthalin · 1 year ago
That 980 Pro is also not stretching its legs on a B350 board and will be limited in certain situations. My 990 Pro still performs better than the 970 Evo Plus in a 3.0 x4 slot, so it's useful, but the 4.0 slot makes the raw throughput 2x as fast; IOPS went up about 20-25%. Good to see Arc is doing quite OK. I wonder what Battlemage will do, and whether they'll still dare to release a GPU the size of the A770. Because let's not forget the die size and power usage difference between the A770 and the RX 6700, or better yet, its fully enabled RX 6700 XT brother: 406mm² vs only 335mm²! That's a major difference in production cost. And the RX 6700 XT and RX 6750 XT are quite a bit faster than the RX 6700. The RX 6700 also uses about 15-20% less power than the A770, so in the long run it will be even cheaper because of the electricity bill. So yes, it's not bad, but some things still need improvement.
@PinkyTech · 1 year ago
I recently switched to the A750, and I am really enjoying AV1 encoding for streaming and recording.
@FhtagnCthulhu · 1 year ago
I've been running an A750 since shortly after launch, and it has been hit or miss for me. You see, I am running Linux (Fedora), and that has complicated things. When the cards were new, things were kind of tricky to get running, but that has been cleared up. Performance is good on Linux with the open-source drivers, and they are actively developed. My main issue has been that a key feature required to run some DX12 games (sparse residency) is not implemented, which makes an ever-growing list of modern games unrunnable. They are looking to sunset the current driver and move to a freshly written one that should support this feature, but that is still in development, and as far as I know the new driver will not support HuC for DG2 cards. I got the card as a novelty, to tinker with, and because I cannot believe how much new AMD and Nvidia cards cost, so I am not really disappointed. However, it is not as clean a story as the improvements to the Windows drivers. Performance updates do keep coming through.
@PeterB144
@PeterB144 1 year ago
Very nice video. Vex, you're totally right: Intel is also a company that wants to make money by selling things to people. But Intel has built a very good, thoroughly modern graphics card line to compete with Ngreedia and AMD, and for a first product, Arc is a really astonishing piece of hardware. They made a very good product when I think about their XeSS upscaling, the video encoding, etc., and it's much better than the others when it comes to price to performance. I'm absolutely looking forward to their future as they keep perfecting their drivers, and the experience they gain will also feed into the next generation of graphics cards, Battlemage. Unfortunately, I don't have the money at the moment to grab an Arc A770 with a future-proof 16 GB, but the time will come, and I believe that with their coming Battlemage generation they will rock the graphics card market.
@johnpaulbacon8320
@johnpaulbacon8320 1 year ago
Great video. Thanks for taking the time to make this and to do such extensive testing.
@gettoecoding1058
@gettoecoding1058 1 year ago
I bought an A750 for my old PC and it's been pretty stable, other than the driver installations blue-screening on Windows 11. I tried DDU and a few other things; every time it gets to the Arc Control installation it BSODs. It is on an old Ryzen 1600, so no ReBAR is possible, so I'm not too sure if it's that. Maybe worth testing with my Ryzen 5700X.
@GoonyMclinux
@GoonyMclinux 1 year ago
It needs ReBAR to work; it's a requirement.
@jashaswimalyaacharjee9585
@jashaswimalyaacharjee9585 1 year ago
Now NVIDIA needs to make CPUs.
@Lucas-ck1po
@Lucas-ck1po 1 year ago
_"Intel makes the best 'non-gaming' cards"_ Remember that video encoding isn't the only task that may need heavy GPU use; CAD software is still best served by Nvidia professional cards. I work with a workstation that has a puny 4GB Nvidia Quadro T1000, massively outpaced by more modern dedicated professional GPUs out there, and it's still the best card I've used in terms of compatibility and stability. It just works. In fact, I still have access to a workstation with a DDR3-era Quadro, and it's smoother than any non-Quadro card I've tried so far. I wish Intel would start working on CAD drivers so I can ditch the damn Quadros forever. I'm sick of their stupid prices.
@noelchristie7669
@noelchristie7669 1 year ago
Intel does have a line of pro workstation cards! I'm not sure how they perform compared to the Quadros, but you want to look for the Intel Arc Pro A40 and A60. Those are the workstation cards.
@Lucas-ck1po
@Lucas-ck1po 1 year ago
@@noelchristie7669 Well, nobody knows exactly how they perform or where to get them yet; they haven't been launched. But Intel taking this route seriously is amazing! My reply was to the content maker here claiming that Intel makes the best non-gaming cards, basically because of their video encoding abilities, but there's A LOT of stuff outside the video editing scope that's untested. Professionals use cards for mining, machine learning, rendering, CAD software, CFD computation, etc. So far Nvidia has been dominating this area, which is why they can basically charge people 4K USD for a 3080 with more memory and CAD drivers (A4000 and A6000). It's insane what companies do without competition. I really REALLY want Intel to bring out pro versions of their A-series cards to compete, and for them to work properly. If so, good riddance, Nvidia.
@rangermatthijs1740
@rangermatthijs1740 1 year ago
I bought the A770 because I wanted to support Intel and increase the competition in the market. I also wanted a cheap but decent second GPU for work (trading) and multimedia (Netflix, HBO, etc.). I must say, after installing the GPU, I never had a problem. But, that being said, I am not an intense user.
@tohur
@tohur 1 year ago
I just don't understand why AMD doesn't make FSR look better on their own GPUs the way Intel does with XeSS. XeSS works on any GPU too, but Intel makes XeSS look better on their own GPUs.
@spindragonkiller
@spindragonkiller 1 year ago
Thanks, Vex, for the insight, time, and dedication. I've followed your vids on AV1 for a while, and now I've subbed! As a proud owner of an A750, I salute you 😊 and Brian for lending the A770 for testing! NB: I recorded a roughly 5-hour 1080p 60 Hz gameplay session in AV1 at a 6K bitrate and exported it in around 2 hours. ❤ Not sure if that's too long, but it seems pretty decent to me.
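For scale, the file size a session like that produces is easy to estimate. A minimal sketch of the arithmetic, assuming "6K" means a constant 6,000 kbps video bitrate and ignoring audio overhead (both assumptions, since the comment doesn't spell them out):

    bitrate_kbps = 6000     # assumed reading of the "6K" bitrate
    hours = 5
    size_gb = bitrate_kbps * 1000 * hours * 3600 / 8 / 1e9
    print(f"~{size_gb:.1f} GB")   # about 13.5 GB for the video stream

So a five-hour recording at that bitrate lands around 13-14 GB before audio.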
@spike178
@spike178 6 months ago
Yeah amazing for free 😂
@nenadcaric5195
@nenadcaric5195 1 year ago
Dude! My mind was blown when I heard that you used it on PCIe Gen 3... I remember that on Gen 3, Intel was absolutely terrible, like much worse than with Gen 4 and ReBAR turned on. Good job, Intel. Bring the competition!
@doz3r943
@doz3r943 1 year ago
The A770 is so beautiful 🥰
@vikrantburde8870
@vikrantburde8870 1 year ago
XeSS is also open source, though, and is not specific to Intel GPUs; it just can't run on as many GPUs as FSR can. But yes, as you said, it looks better and runs better. I used it to play Spider-Man at 50-60 fps at 1080p on my GTX 1650.
@ScottJWaldron
@ScottJWaldron 1 year ago
Interesting; great job on the video! I'm probably not building a new system anytime soon, but it's nice to see there could be a third viable option by then, though any hard locks or even blue screens can be concerning. With GPUs I'm especially interested in encoding and decoding video, which I do more than gaming. I'm still on a 1060 6GB, so I probably won't touch anything under 12GB whenever the time comes. 😄 We will also see if Battlemage has a good price-to-performance ratio like their first generation. Arc seems like a special case with its 256-bit memory bus and 16GB version.
@Voltic_Playz
@Voltic_Playz 1 year ago
I bought my first PC with an i3-12100F + Arc A750 just 4 months ago, and the driver updates are making my editing flow so much faster! I was concerned about my decision and thought I should've gone for the RTX 3060 since its drivers are more stable, but boy, I don't regret having this slick baby running with a stock cooler in this hot climate, rendering my 40GB project within 8 minutes! And by the way, if you use AV1 it'll shrink the size to 10GB, but I love to go the casual OBS way.