It's time. (FOR SCIENCE!): Tahiti's Revenge




Back in 2013, the Radeon R9 280X and GeForce GTX 770 faced each other at around the 300 dollar price point, although the GeForce card was actually 30 dollars more expensive. But anyway, these cards often traded blows.


But what about now, in 2019, six entire Earth Cycles later? I want to do my own test, with my own rules because I am curious about this. And it's for SCIENCE!


Yay! I'm doing a benchmark test for the first time in a long time, and I am finally in the mood for it. Oh, this post is just to tell you that I am going to do a write-up and test of performance between custom models of the Radeon R9 280X and GeForce GTX 770, with games I select, now in 2019. I am also waiting for a Kraken G12 to arrive so I can throw a hot-clocked R9 290X into the mix for fun!


Back in 2013, the 280X offered the best deal. I say that not because I am a Rabid AMD Fanboy that is positively foaming at the mouth for all things Radeon, but because it offered similar performance, with 1GB of extra VRAM (which will likely prove critical), for 30 dollars less. Sure, it was a bit less efficient, but not a lot of people really cared about 30-40W when running these high end, aftermarket parts.


I want to test to see just how much of the 'last laugh' everyone's favourite Legendary GPU, Tahiti, has, in 2019. I won't be testing the latest AAA games (I have a couple) because I don't own them all. But I will test a selection of games that I have, including Metro Exodus - still one of the most graphically intense video games on PC to date.


I am thinking of two distinct testing 'modes': VRAM limited and VRAM unlimited. That's right, to get a measure of GPU performance only, the first mode will have me select minimum textures to cram everything, hopefully, into that comparatively diminutive 2GB frame buffer on the GeForce; even the 3GB on the Radeon is a tad on the small side these days. Secondly, I will be testing at settings I deem 'playable' on this performance tier. We are not talking Ultra, we are talking medium/high and mixed. Obviously all settings will be the same between the cards, but in this test I will use medium/high textures. I want to emphasise just how much better an experience you get if you did decide to invest in the Radeon with the extra 1GB.
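For the curious, the run matrix I have in mind can be sketched as a plain Python structure. This is purely illustrative: the game titles (other than Metro Exodus) and the helper name `build_runs` are placeholders, not my final selection or tooling.

```python
# Hypothetical sketch of the benchmark run matrix: every game is run on
# both cards in both 'modes' (minimum textures vs medium/high textures).
from itertools import product

CARDS = ["R9 280X", "GTX 770"]  # plus the hot-clocked R9 290X later, for fun
MODES = {
    "vram_limited": "minimum textures",      # cram into the 770's 2GB buffer
    "vram_unlimited": "medium/high textures",  # 'playable' settings for this tier
}
GAMES = ["Metro Exodus", "Game B", "Game C"]  # placeholder titles

def build_runs(cards, modes, games):
    """Return one (card, texture setting, game) entry per benchmark pass."""
    return [
        {"card": c, "textures": modes[m], "game": g}
        for c, m, g in product(cards, modes, games)
    ]

runs = build_runs(CARDS, MODES, GAMES)
print(len(runs))  # 2 cards x 2 modes x 3 games = 12 passes
```

The point of laying it out this way is just that every game gets tested under identical settings on both cards, with only the texture quality changing between the two modes.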


I am so excited! I am just going to set up my test bench. In case you are wondering, yes, my test bench is somewhat improved from last time.


Ashley's Test Bench

The good old Ryzen 3 1200 has been retired from running the test bench and relegated to 24/7 World Community Grid, YouTube videos and media playback next to my main PC. If you know me well, you'll know that there is no way in Hell I am messing with my main PC with these cards - I will leave that well enough alone. So my test bench will consist of my former living room HTPC with the following spec:


Ryzen 5 3400G @ Stock

16GB (2x8) 3200 MHz CL16

MSI B450M Mortar mATX

WD Green 240GB SATA m.2 SSD


But I will move it from my slim Micro ATX case onto my test bench table, and swap out the TFX 300W PSU for a beefier 1000 W Gold unit I have lying around from ages ago. The 3400G should be enough: it reaches 4.1 GHz+ across all 4 cores in gaming, and the limitation of only having 8 PCIe Gen 3 lanes for graphics won't matter with these GPUs.


Anyway, enough writing. I gotta set this up before I eventually start procrastinating.


:D


©2020 by Sashleycat (Read the Legal stuff)