Looking at many reviews over the past 24 hours, I've come to a conclusion. Better yet, let's call it a theory: the Radeon VII will age better in future games than the current RTX and GTX cards. So why do I think this? Looking at BF V, it's surprising that AMD is performing better with the RX, Vega and Radeon VII cards compared to their Nvidia counterparts, especially since it's a title where Nvidia and DICE worked together to implement raytracing.
My theory is that this is related to Nvidia's driver doing instruction scheduling in software versus AMD's global/warp scheduling in hardware. Basically, the result is that Nvidia's software takes advantage of multi-core CPU power that is not being used by games. The only problem is that performance degrades when the game is optimized for many cores/threads, or when CPU power has to be shared between a game and encoding/streaming software like Streamlabs OBS. In those cases there is no headroom left for Nvidia's driver to 'steal' CPU power in order to boost the GPU.
Most games still draw most of their power from the first two threads of a CPU, with far lower utilization on the rest. As such, Nvidia's driver can use the spare CPU cycles on the other threads to increase draw call throughput. Games that are truly well threaded, like Battlefield V, are rare, but as long as there are more cores than the engine requires, an Nvidia gamer will not run into CPU bottleneck problems. In games that are mostly single threaded, Nvidia can keep the lead. And in DirectX 11 it's relatively easy to hurt performance: with sloppy coding (or on purpose) the primary thread can be fully loaded, leaving AMD GPUs idling. Nvidia's approach has therefore been superior over the last couple of years, but thanks to well-optimized engines like Frostbite we are finally seeing it fail more often whenever Nvidia gamers don't have more cores than the game can use. In a way, now that AMD has started the CPU core war and core counts are rising on Intel CPUs as well, AMD is indirectly propping up Nvidia's approach. AMD can only hope that more developers start releasing games that use 8-16 threads, so that their GPUs age better than Nvidia's, simply because AMD owners will have CPU cycles to spare while Nvidia owners will not, unless they upgrade to 16-24 thread CPUs in the future.
I'm very active on Battlefield and streaming forums, subreddits, you name it. You'll find many topics from Nvidia gamers with high-end hardware encountering unexplained framedrops or issues while playing BF V, and especially while playing and streaming (CPU encoding) on the same system. Just search for "100% cpu usage", "1080ti" or "2080" and you'll find plenty of threads with performance complaints. People try to fix it by rolling back to a previous Windows version, trying other drivers, changing settings, etc. But I think the explanation is plain and simple: the Nvidia drivers increase CPU usage to the point where it impacts the game's performance. Benchmarkers won't notice this because they benchmark singleplayer gameplay, while CPU usage in 64-player Conquest is much higher than in singleplayer.
Is this theory based on just one game? Not really.
When looking at 3DMark's Fire Strike, a DirectX 11 benchmark with low/poor multi-core optimization, you'll see the Radeon VII losing against an RTX 2070.
That's bad, no? A $700 AMD card should be able to match the $700 Nvidia card, the RTX 2080, but in this benchmark even the RTX 2070 kicks its ****.
When looking at Time Spy, from the same developers, a DirectX 12 benchmark with good multi-core optimization, the Radeon VII does seem to keep up and even beat the RTX 2080.
As soon as the game or software requires lots of CPU power, the RTX 2080 can't keep up anymore due to Nvidia's driver overhead. Just like in Battlefield V, where the game is optimized for more than four cores.
Now let's have a look at the CPU usage on BF V when we try to stress the GPU as much as possible:
That's right: on an Intel Core i7-8700K clocked at 5 GHz, there are multiple spikes up to 100% CPU usage with the RTX 2080, while the Radeon VII only reaches 64% CPU usage. Keep in mind that this is without streaming. CPU usage would be even higher while streaming (CPU encoding), and every spike above 90-95% CPU usage would cause framedrops in a livestream.
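If you want to check this on your own system, here's a minimal sketch of the kind of logging I mean (Linux only, standard library only; it reads /proc/stat, and the function names, the 90% threshold and the one-second sampling interval are my own arbitrary choices, not anything from the reviews):

```python
import time

def read_cpu_times(statfile="/proc/stat"):
    """Return per-core jiffy counters (user, nice, system, idle, iowait, ...)."""
    cores = {}
    with open(statfile) as f:
        for line in f:
            # per-core lines look like "cpu0 ..."; skip the aggregate "cpu" line
            if line.startswith("cpu") and line[3].isdigit():
                fields = line.split()
                cores[fields[0]] = [int(x) for x in fields[1:]]
    return cores

def utilization(before, after):
    """Busy percentage per core between two /proc/stat samples."""
    usage = {}
    for core, t0 in before.items():
        t1 = after[core]
        total = sum(t1) - sum(t0)
        idle = (t1[3] + t1[4]) - (t0[3] + t0[4])  # idle + iowait deltas
        usage[core] = 100.0 * (total - idle) / total if total else 0.0
    return usage

if __name__ == "__main__":
    prev = read_cpu_times()
    for _ in range(3):  # sample for a few seconds while the game runs
        time.sleep(1)
        cur = read_cpu_times()
        busy = utilization(prev, cur)
        spikes = {c: pct for c, pct in busy.items() if pct >= 90.0}
        if spikes:
            print("spike:", ", ".join(f"{c}={p:.0f}%" for c, p in spikes.items()))
        prev = cur
```

Let something like this run in the background during a 64-player round and you can see for yourself whether individual logical cores are hitting the ceiling, rather than trusting a single averaged CPU-usage number.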
It would make sense if the RTX 2080 were producing more fps, but that's simply not the case. Have a look at multiple sources:
The same results can be found elsewhere. Visit TechPowerUp, AnandTech or others: the Radeon VII outperforms the RTX 2080 in both Battlefield V and Battlefield 1.
So the framerate is higher and the CPU usage is lower, but what about RAM usage? Again, Nvidia owners will run into RAM bottlenecks faster than AMD users:
Hold on for a second. It must be because this is 4K gaming and these cards are the top models. Surely you won't see the same results at lower resolutions with slower cards?
Well... let's have a look:
Ok, again CPU usage is higher, RAM usage is higher, but what about the framerate?
I personally bought a Radeon VII, simply because Battlefield 1 is still my #1 source of fun when it comes to PC gaming, and looking at the Frostbite engine's evolution, I'm pretty confident that my card will keep outperforming the RTX 2080 in future BF V updates and in Battlefield 6. I can't stress enough that this is just a theory. But go back years in time and you will see the same results over and over again: in every review that shows CPU and RAM usage, the Nvidia cards use more CPU power. Only now are we starting to see games where 4-6 core CPUs simply can't provide all the power a powerful Nvidia card needs in every situation.
On top of that, I stream every game session on the same CPU I play on. And as a streamer, I would hit the limits of my CPU much sooner with an RTX 2080 than with my Radeon VII. Benchmarkers don't actually play the games they test, and they never test streaming performance per card. Personally, I think they should; it would give gamers another perspective on Nvidia hardware.
Sources: Hardware.info, gpureport.cz (charts), TechPowerUp, AnandTech and many others.