Brad_Hawthorne
You do know the history of raytracing, right? Conceptually, this is the last genre that should be embracing raytracing. Then again, it takes all kinds of people to argue just to argue. Not a fan of apologists who would rather have circular arguments than address the issue: a rendering method least suited to FPS habits being embraced by people who would probably even buy Battlefield or Nvidia toilet paper just because...well, it's Battlefield or Nvidia.
To a point, it might make sense (coming from you, and without silly claims like the previous one), but you should rather be talking about scene complexity. Unfortunately, most modern games would behave much the same. The most impressive effect is in first-person games.
The sad (for early adopters) fact is: it doesn't really matter in the end. Attacking some tech out of pure fun is simply wrong. You can't do more with the existing hardware, and it has nothing to do with the skills of the Frostbite programmers. It's about the hardware, its inner workings, and the hybrid pipeline.
It's not that the implementation doesn't work or is poorly done; it's simply that the hardware Nvidia offers is at the start of its evolution. There is a lot of work to do and only so much compute available. What is most taxing for RT are reflective, refractive and scattering surfaces, and it's less about geometry than you'd think. Shuffling data around and feeding it to the RT pipeline is a huge amount of work; the impact is quite high.
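To put rough, purely illustrative numbers on that (this is not Frostbite or DXR code, just my own assumption of the ray budget): every reflective, refractive or scattering hit spawns extra secondary rays, so the per-pixel ray count grows with bounce depth, while a plain diffuse hit is shaded once and terminated. A minimal C++ sketch:

```cpp
// Illustrative sketch only: how ray count per pixel can grow with material type.
// The per-material ray counts are assumptions, not measured engine behaviour.
#include <cstdint>
#include <cstdio>

enum class Material { Diffuse, Reflective, Refractive, Scattering };

// Secondary rays spawned at one hit of the given material (illustrative values).
int secondaryRays(Material m)
{
    switch (m) {
    case Material::Reflective: return 1;  // one mirror bounce
    case Material::Refractive: return 2;  // reflected + transmitted ray
    case Material::Scattering: return 4;  // several samples to tame noise
    case Material::Diffuse:
    default:                   return 0;  // shade once, terminate
    }
}

// Worst-case rays per primary ray if every bounce hits the same material.
uint64_t raysPerPixel(Material m, int maxBounces)
{
    uint64_t total = 0, frontier = 1;     // start with the primary ray
    for (int b = 0; b <= maxBounces; ++b) {
        total += frontier;
        frontier *= secondaryRays(m);     // each live ray spawns this many
        if (frontier == 0) break;
    }
    return total;
}

int main()
{
    const int bounces = 3;
    std::printf("diffuse:    %llu rays/pixel\n", (unsigned long long)raysPerPixel(Material::Diffuse,    bounces));
    std::printf("reflective: %llu rays/pixel\n", (unsigned long long)raysPerPixel(Material::Reflective, bounces));
    std::printf("refractive: %llu rays/pixel\n", (unsigned long long)raysPerPixel(Material::Refractive, bounces));
    std::printf("scattering: %llu rays/pixel\n", (unsigned long long)raysPerPixel(Material::Scattering, bounces));
    return 0;
}
```

Even with these toy numbers, a scene full of mirrors and glass multiplies the ray count many times over compared to diffuse geometry, which is why the surface types matter more than the polygon count.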
Not only is Frostbite one of the best tech solutions out there, its team is also one of the pioneers when it comes to RTX. The performance we see will repeat in Tomb Raider, Redux and other games. Without RTX, which is a highly performance-killing technology, Frostbite is capable of running at 5K on a single 2080 Ti. That's fantastic given the poly count, the resolution and the post-processing they do. As a rendering engineer and researcher, I really can't find many points where you could improve much, or at all. Frostbite is highly optimized for both paths, AMD and NVIDIA.
Not for one second did Nvidia, DICE or anyone else leave room for speculation; since day one we all knew that RTX would be of limited performance and application. How could anyone miss that?
Personally, I wouldn't even have released RTX to consumers yet. Not below the 2080 Ti and the Turing Titan. But then again, many people still play at Full HD. Eye candy is cool.
Also, I'd rather have gone with ray-traced shadows and replaced SSAO instead of doing reflections. I guess the performance benefit or the wow effect would have been less visible.
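For what it's worth, here's a rough C++ sketch of that trade-off, under the assumption that occlusion and shadow rays are short, boolean any-hit queries while a reflection ray needs the closest hit plus full material shading there. The helper names are placeholders, not Frostbite or DXR API calls:

```cpp
// Hedged sketch: why ray-traced AO/shadows tend to be cheaper than reflections.
#include <cstdio>

struct Ray { float origin[3]; float dir[3]; float tMax; };

// Stubbed trace queries so the sketch compiles; a real renderer would
// dispatch these against its acceleration structure.
bool  traceAnyHit(const Ray&)             { return false; }  // cheap: stop at first hit, no shading
float traceClosestHitAndShade(const Ray&) { return 0.1f; }   // expensive: full traversal + material eval

// Ray-traced AO: several short, boolean rays per pixel.
float ambientOcclusion(const Ray* hemisphereRays, int samples)
{
    int occluded = 0;
    for (int i = 0; i < samples; ++i)
        if (traceAnyHit(hemisphereRays[i]))   // short tMax ends traversal early
            ++occluded;
    return 1.0f - float(occluded) / float(samples);
}

// Ray-traced reflection: one long ray, but the hit must be fully shaded.
float reflectionColor(const Ray& mirrorRay)
{
    return traceClosestHitAndShade(mirrorRay);
}

int main()
{
    Ray aoRays[4] = {};
    for (Ray& r : aoRays) r.tMax = 0.5f;      // short occlusion distance
    Ray mirror = {};
    mirror.tMax = 1e30f;                      // effectively unbounded reflection ray

    std::printf("AO term: %.2f, reflection: %.2f\n",
                ambientOcclusion(aoRays, 4), reflectionColor(mirror));
    return 0;
}
```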
I sincerely hope we'll soon be running 7nm chips with double the RT hardware, which would make 1440p a viable resolution. Also, ray tracing is begging to scale across SLI...