By lowering the 2080 Ti to 86% power you are nullifying/gimping the card's real performance. That makes the comparison invalid. The cards should all be left at stock to see their real-world, out-of-the-box performance.
@teflonstestbench 6 months ago
What I'm doing is setting it to its stock settings. Gigabyte bumped the TDP to 300 W while a stock 2080 Ti should be at 250 W, so all I'm doing is rebalancing the card so it doesn't have an unfair advantage.
@pkpnyt4711 6 months ago
@@teflonstestbench Stock settings for 3rd-party cards are whatever they come with out of the box. What you're doing is gimping it, period. It's been the same for every 3rd-party card since forever: they're clocked higher and push their voltages to whatever their default settings are. Agree to disagree.
@teflonstestbench 6 months ago
What you say is true to a degree: they do have higher boost clocks, different voltages and different PCBs, but that doesn't matter much as Nvidia's boost algorithm works on top of that. What I'm doing is limiting the power because I want the performance of the base card, the average for that card, not the extra performance a specific manufacturer gives it, so it levels the playing field as much as possible. It wouldn't be fair to compare a card with extra TDP and a special PCB to a low-end card, as the performance difference forced by their designs would affect the results and make the gap between the cards artificially bigger or smaller than it really is. So by downgrading, or in other cases upgrading, a card I can make sure they are as close as possible to the average performance for each model, and make a fair comparison between the lineups rather than between specific manufacturer cards.
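For anyone curious how a cap like that can be applied, here's a minimal Python sketch using NVML through the pynvml package. This is just one possible way to do it, not necessarily how it's done in the video; the 250 W figure is the reference 2080 Ti board power mentioned above, and the GPU index and admin rights are assumptions.

```python
# Pull a factory-OC card back to the reference board power via NVML.
# Requires the nvidia-ml-py (pynvml) package and admin/root privileges;
# MSI Afterburner's power-limit slider drives the same mechanism.
import pynvml

REFERENCE_TDP_W = 250  # reference-spec board power for the 2080 Ti

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the card is GPU 0

current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
print(f"current power limit: {current_mw / 1000:.0f} W")  # e.g. 300 W on a factory-OC model

# NVML takes the limit in milliwatts; it must stay inside the card's allowed range.
pynvml.nvmlDeviceSetPowerManagementLimit(handle, REFERENCE_TDP_W * 1000)
print(f"new power limit: {pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000:.0f} W")

pynvml.nvmlShutdown()
```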
@pkpnyt4711 6 months ago
@@teflonstestbench The base cards (reference models) were barely sold by Nvidia; it's the same for the 3xxx series. They were very few in number, so your logic is flawed. Most of the people that owned 2080 Tis (including myself) bought 3rd-party cards because they were readily available and performed better than reference. If you're using that logic then you should also use the reference cards for the other two cards, which you aren't. So you're contradicting yourself, which brings me to my original point.
@teflonstestbench 6 months ago
I'm using the default values from Nvidia that most cards follow. 3rd-party cards usually keep the same TDP and really close speeds; the thing is that some have extra power or boost way higher, and only in those cases do I compensate.
@tahinkhan3763 6 months ago
Great video! Very helpful, but I have a question: why was the frametime graph in RDR2 going insane on the 6700 XT?
@teflonstestbench 6 months ago
My guess is AMD drivers. It happens with every AMD card and every system; I've seen it in other people's tests too, and the RAM usage also goes extremely high vs the Nvidia cards, so it's probably a driver thing. Though the game seems to play fine.
@takik.2220 6 months ago
Could you do a benchmark at 1440p with these cards?
@teflonstestbench 6 months ago
Yeah, I'll do it soon; the next videos are all going to be 1440p and ray-tracing tests.
@هيثم-ح6م 2 months ago
@@teflonstestbench thx
@thedjoker3521 a month ago
How do you enable real power consumption?
@teflonstestbench a month ago
I measure it with an external device that reads from the PSU cables; that way it can capture all the current going through them and calculate the real power the card is using.
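For comparison, the driver also exposes its own software-reported power, which tends to read lower than a cable-level measurement like the one described above since it only covers what the GPU's own sensors see. A minimal Python sketch with pynvml, assuming an Nvidia card on GPU index 0:

```python
# Log the software-reported GPU power once per second via NVML.
# This is driver telemetry, not the cable-level measurement described above,
# so it can under-read the true board power.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(10):
    milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)  # reported in milliwatts
    print(f"GPU power: {milliwatts / 1000:.1f} W")
    time.sleep(1)

pynvml.nvmlShutdown()
```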
@feilinzhou2558 6 months ago
I have an RTX 2080 Ti and an RX 6700 XT. I have always wondered why the RX 6700 XT feels so much slower than the RTX 2080 Ti in basically every game I've tested, but in your test it doesn't seem to be that much slower.
@teflonstestbench 6 months ago
I mean, it is noticeably slower than the RTX 2080 Ti, but depending on the game or your test system that difference may change; for example, if you were testing the RX 6700 XT without ReBAR, or had models with higher or lower boost clocks or different board configurations than the ones I had.
@JustAGuy85 5 months ago
@@teflonstestbench Exactly. Gotta have that SAM on the AMD cards to get those huge boosts. And depending on your CPU, some games benefit 0% from SAM with Zen 2 or Intel 10th gen, while in those same games SAM shows improvements with Zen 3 and Intel 12th gen. Not to say that no games benefit from SAM on Zen 2/Intel 10th gen, but I saw that in a test and it really surprised me. Dunno if it's an architectural thing or a "soft" limit, but yeah... if you're running Zen 2/Intel 10th gen, there are going to be games where SAM gives 0% gains, whereas with Zen 3/Intel 12th gen you'll get substantial gains. But the 2080 Ti is a beastly card, too. As is the 3070. It's always surprising to see the 6700 XT 12GB hang with the 3070; two years ago there weren't any games it could hang with it in. AMD "Fine Wine" drivers really are a thing lol. The 6700 XT 12GB today is not the 6700 XT 12GB at launch. I'd imagine the 6700 XT 12GB does substantially better than the 2080 Ti in Call of Duty games. RDNA2, or just AMD in general, really loves that engine or something. Kind of like Forza Horizon 4 and even 5.
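For what it's worth, a rough way to sanity-check whether Resizable BAR (the feature SAM is AMD's branding of) is active on an Nvidia card is to compare the BAR1 aperture to the VRAM size: with ReBAR on, the aperture usually matches the whole VRAM pool instead of the legacy 256 MB window. A hedged Python sketch with pynvml, heuristic only; for an AMD card and SAM you'd check the motherboard BIOS or the Adrenalin software instead.

```python
# Heuristic ReBAR check on an Nvidia GPU: a BAR1 aperture as large as the
# VRAM pool usually means Resizable BAR is enabled; ~256 MB means it is not.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

bar1 = pynvml.nvmlDeviceGetBAR1MemoryInfo(handle)  # BAR1 aperture sizes, in bytes
vram = pynvml.nvmlDeviceGetMemoryInfo(handle)      # total/used/free VRAM, in bytes

bar1_gib = bar1.bar1Total / 1024**3
vram_gib = vram.total / 1024**3
print(f"BAR1 aperture: {bar1_gib:.1f} GiB, VRAM: {vram_gib:.1f} GiB")
print("Resizable BAR likely enabled" if bar1_gib >= vram_gib
      else "Resizable BAR likely disabled (small legacy aperture)")

pynvml.nvmlShutdown()
```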
@_Isshin_The_Sword_Saint 5 months ago
@@JustAGuy85 Between you and me, I gotta say I'm jealous of you having a 2080 Ti... jk 😂. But it's cool. Before the 3000 series came out my friend secured a banger deal on a Strix 1080 Ti, and now I want to do the same with an RTX 2080 Ti. I really wanna get one; even a Founders Edition would be perfect, as I hope to collect these cards in the future. But to the point: you own both AMD and Nvidia, so as of right now, for upcoming games (Black Myth: Wukong, Phantom Blade 0, Doom, etc.) at 1080p high or medium settings, which GPU should I get: an RTX 2080 Ti or an RX 6700 XT?
@JustAGuy85 5 months ago
@@_Isshin_The_Sword_Saint Ah, my bad. I have a 5900X (PBO, 180 A EDC limit, +200 MHz boost, per-core Curve Optimizer) + 6700 XT 12GB (XFX Qick 319 flashed to the XFX Merc model) + 2x16GB DDR4 3600 C16 G.Skill Ripjaws running at DDR4 3666 with 1:1 IF. The RAM has fully tuned timings/sub-timings and it's a dual-rank kit too, so I tend to get better performance than the "usual" benchmarks you see; dual-rank RAM has shown up to 10% performance boosts in some games. (Edit: may as well list my mobo: Asus B550-F Gaming Wi-Fi II with WiFi 6E and BT 5.2, now BT 5.3 on Windows 11 lol. It's an awesome board with a great VRM setup. My 5900X pulls 220 watts in Prime95 with my settings with zero issues and more room to go if I wanted. It's cooled by a Fuma 2 Rev B with 3 fans, and behind the 3rd 120mm fan is an Arctic P14 140mm exhaust fan lol! My case has all 140mm P14s in it, minus the Fuma 2 Rev B.)

But yeah, I have the 6700 XT 12GB. It's definitely a 2080 Ti competitor, and the newer the game, the better RDNA2 does. RDNA2 + UE5 = great; these cards LOVE UE5 engine games. It has 96MB of Infinity Cache, too. Hell, I'd rather have a 6700 XT 12GB or 2080 Ti 11GB than a 3070 with only 8GB of VRAM. My RX 480 had 8GB of VRAM lol. And games USE that VRAM now. I was playing The Callisto Protocol and was using 22-24GB of total system RAM and 11+ GB of VRAM. 8GB just ain't cutting it. I also play DCS: World, which uses 24+GB of RAM (including Windows) and nearly all my VRAM. 8GB of VRAM just won't last through 2024 imo. Look at the 3070 vs the 6700 XT in the latest 2024 UE5 games: the 8GB of VRAM is killing the 3070. BUT Nvidia does have DLSS and superior ray-tracing performance... in "RTX" ray-tracing games. When a game isn't sponsored by Nvidia, AMD does just as well; like in Far Cry 6 or that Avatar game, the 6700 XT beats the 3070 because of VRAM and lumen+nanite.
@JustAGuy85 5 months ago
I mean, everything I play uses well over 8GB of VRAM: F1 22/23/24, The Callisto Protocol, that new Robin Hood game I'm trying out, Forza Motorsport, and Call of Duty games (where the 6700 XT beats the 3070 in anything from Warzone onward). It also beats the 3070 in Battlefield 2042. I WANT a 7900 GRE, though. That's great price per fps if you ask me; that's the next card I plan on getting. I don't think there's a better $500 GPU for that money. I was looking at the 6800 XT... but if I'm gonna upgrade, I may as well really upgrade. So I thought the 6950 XT would be a good choice, then someone reminded me about the 7900 GRE, and yeah, THAT'S the one I'm gonna get. It's pretty much on par with the 6950 XT but has slightly better ray tracing. And since it is RDNA3, there will eventually be new features that only RDNA3 gets and RDNA2 is left out of.
@Baz87100 27 days ago
The 6700 XT is the winner for keeping up with the other two while drawing significantly less power than both.
@adlibconstitution1609 14 days ago
0:58 Look at the real GPU power: the 6700 XT is drawing 210 W, which is close to the 3070. The real TDP of the 6700 XT is 220 W.
@Howlsowls 6 months ago
Hey Teflon, love your benchmarks. I must've watched a hundred of your videos making buying decisions every year.
@teflonstestbench 6 months ago
Thank you so much for the nice comment, and I'm so happy they helped :)
@pizzapr9287 5 months ago
The 6700 XT could probably keep up after an OC.
@AdadG 2 months ago
And it would be even better with an undervolt, but the same can be done with the other two, especially the 3070 (the RTX 3000 card with the best gains from OC and UV). In my case, for example, I can reach 2040 MHz at 968 mV (stable in almost all games) and +1000 MHz on the VRAM (in Afterburner) with my MSI Gaming X Trio, which translates into being 8-9% faster than stock while consuming even less power; and it is already a slightly above-average GPU. But I am very curious to know how much can be squeezed out of a 6700 XT.
@thedjoker3521 a month ago
@@AdadG Is this with a 3070? What's the power consumption with your undervolt? The difference in percentage?
@AdadG a month ago
@@thedjoker3521 Do you mean power drawn from the wall? I don't know, because I don't have the necessary tools. I also couldn't give you an exact figure for what the GPU reports via software because I haven't done metrics on that either; furthermore it varies from game to game, and I also always limit the framerate to have a smooth experience, which affects consumption. But what I can tell you is that the reduction is usually evident in the temperatures and frequencies, because the stock behavior of the RTX 3000 cards is always to seek maximum consumption to reach maximum frequencies, which in turn is counterproductive: it raises temperatures, which in turn lowers frequencies. By the way, when I wrote "stable in almost all games" I meant that those numbers hold in almost all games but not in the most demanding ones, such as Quake II RTX; there, even capping the fps at 90, the frequencies usually drop one or two steps due to temperature. But those values, 2040 MHz @ 968 mV, are themselves stable.
@AdadG a month ago
@@thedjoker3521 Oh, and yes, it is a 3070. While I can hit the above numbers I prefer the following: 2025 MHz @ 937 mV, which is just one step down in frequency but lower in power consumption. To give you a comparison and a rough idea, these are the numbers I remember: stock it used to run around 1925 to 1965 MHz at 1000 to 1050 mV, while with the voltage/frequency curve modification it stays pinned at a limit of 2025 MHz @ 937 mV. But this is not an issue, because you also have to consider the other part: the current (amperage), which is dictated by the GPU usage percentage.
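As a rough sanity check on why that undervolt saves power, dynamic power scales roughly with voltage squared times frequency. A small Python estimate using the approximate numbers above; the stock point is my approximation of the quoted range, and this first-order model ignores leakage, power limits and workload differences, so treat it as a ballpark only.

```python
# First-order estimate of the power saving from the undervolt above,
# using P ~ V^2 * f. Ignores leakage and power-limit behaviour.
def relative_power(freq_mhz: float, volt_mv: float,
                   ref_freq_mhz: float, ref_volt_mv: float) -> float:
    """Power relative to a reference operating point."""
    return (volt_mv / ref_volt_mv) ** 2 * (freq_mhz / ref_freq_mhz)

stock = (1950, 1025)  # approx. stock point (~1925-1965 MHz @ 1000-1050 mV)
uv = (2025, 937)      # undervolted point mentioned above

ratio = relative_power(*uv, *stock)
print(f"Undervolt draws roughly {ratio:.0%} of stock power "
      f"({1 - ratio:.0%} saving) while clocking {uv[0] - stock[0]} MHz higher")
```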
@sc9433 4 months ago
The RX 6700 XT has almost half the power draw.
@pwnomega4562 4 months ago
No it doesn't. Look at the real-time data in the video.
@jzk9meow6962 1 day ago
Who cares about power draw?
@yummerz6302 6 months ago
Seeing the 6700 XT lose sort of hurts my soul. I hope everything is in order with that 6700 XT. Also, god damn, how is that 2080 Ti running so cool?
@teflonstestbench 6 months ago
Yeah, it's just that the RX 6700 XT was meant to compete with the RTX 3060 Ti, so it is performing as it should, even better than I would expect given it's close to RTX 3070 performance. The RTX 2080 Ti is that cool because I had to watercool it: the Gigabyte design is so bad that the cooler stopped making proper contact and the card couldn't be cooled properly, so the only option was to use another cooler, and the only ones compatible with all the models are water-cooling kits like the NZXT Kraken G12 I'm using.
@yummerz6302 6 months ago
@@teflonstestbench Only now I've noticed that I hadn't read the important notes after watching lmao. Ah, here's an interesting video idea: if you ever get a 6750 XT, OC and undervolt the 6700 XT and put them up against each other! I can share my settings for the 6700 XT if you'd like.
@teflonstestbench 6 months ago
@yummerz6302 Yeah, that is something I would love to do with the other x50 refreshes from that generation, as the differences are just OC and power. Though sadly I can't afford to get all those GPUs to test; they would only be relevant for those tests and they are quite expensive. Same with the 7600 and 7600 XT, where the difference is just the memory...
@yummerz6302 6 months ago
@@teflonstestbench The ultimate bottleneck of a gaming PC... budget...
@feisaljatnika9160 6 months ago
The 2080 Ti is the greatest invention from Nvidia; it even gets higher fps than the 4060.
@cfif_asd 6 months ago
In RDR2, enabling SAM sends the 0.1% lows through the roof and the card will suddenly pull ahead of the 3070. No need to thank me ;)
@leonardosc6195 2 months ago
Which is better, the RTX 3070 or the RX 6700 XT??
@teflonstestbench 2 months ago
In terms of performance the RTX 3070 is better most of the time, but depending on the price the RX 6700 XT could be a better option.
@PLAEJA 6 months ago
RAY TRACING ON!!!! WOW...
@هيثم-ح6م 2 months ago
This is an example of a CPU bottleneck.
@teflonstestbench 2 months ago
It clearly is not; this CPU is more than capable for these GPUs, especially at this resolution and these quality settings.
@هيثم-ح6م 2 months ago
@@teflonstestbench Cyberpunk shows the CPU bottleneck crystal clear; look at the GPU utilization and you will figure it out.
@teflonstestbench 2 months ago
OK, I did some investigating and found the issue. It seems it was a bad Cyberpunk update that killed performance on the GPUs, which is why they were not being fully used there while working correctly in other games. So it is not a CPU bottleneck, but there was an issue. For example, I compared the results to other videos of mine like this one: kzbin.info/www/bejne/iHuViohmZqijeMkfeature=shared — here the RTX 3070 performs as it should and gives a performance result more in line with its power.
@هيثم-ح6م 2 months ago
@@teflonstestbench Your work and efforts are so much appreciated, Good job.
@leadersjs8904 2 months ago
RTX 3070 and 6700 XT, great.
@dr.eduardogomes8592 5 months ago
nVidia better AMD white line.
@piyapolphetmunee3879 6 months ago
Most of these tests are CPU limited
@teflonstestbench 6 months ago
Why?
@JustAGuy85 5 months ago
He's running... a 5800X. You know that's on par with a 12600K/12700K in gaming, right? Despite being released in the same generation as Intel 10th gen. I have a 5900X. It was released the same year as Intel 10th gen, and it eats the 10900K's lunch in gaming, beats the 11900K in gaming, and goes back and forth with the 12700K (which is a 12-core CPU, like the 5900X). The 5800X performs practically the same as a 5900X, maybe 2-3% slower due to ever so slightly lower boost clocks plus half the L3 cache. Shi... my 5900X has PBO on and the Curve Optimizer tuned per core, and I have 2x16GB of dual-rank DDR4 @ 3666 C16 with fully tuned timings/sub-timings. I can't get my CAS latency any lower without throwing more voltage at my RAM than I want to because it's dual-rank RAM, but dual-rank RAM has shown anywhere from 3-10% gains in some games. And DDR4 3600 C16 is quite the sweet spot for Zen 3 anyway; I just went to 3666 with the IF at a 1:1 1833 MHz for the lulz. Ryzen cares more about timings than anything; DDR4 3400 C14 would most likely perform better than my RAM.