EVGA

NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Update

Page: < 12 Showing page 2 of 2
Author
ty_ger07
Insert Custom Title Here
  • Total Posts : 21171
  • Reward points : 0
  • Joined: 2008/04/10 23:48:15
  • Location: traveler
  • Status: offline
  • Ribbons : 270
Re: NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Upda 2019/03/20 13:50:42 (permalink)
atfrico
LOL, I find it funny. Sure, the RTX GPUs may run RT better than the GTX 1080 Ti, but you guys haven't seen the big picture here... which AMD GPU is running the Crytek demo, and how well is it running it? Compared to the recently released cards, the lesser GPU is giving cards twice its price a good run for their money.
Yeah, well, I guess AMD's compute cores are more important than Tensor cores, in my opinion

The AMD Vega 56 running ray tracing in the Crytek demo at 4K at 30 FPS was very impressive. It was beautiful too; better looking than the RTX ray tracing videos I have seen. High-end RTX cards are getting 4K ray tracing frame rates in the mid-30 to 60 FPS range. It really makes you wonder. But the Crytek demo wasn't a true game environment with AI NPCs or multiplayer, so you have to wonder how big an effect that has on performance. I guess we will see some day. Until then, all we can do is speculate.

ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
My EVGA Score: 1546 • Zero Associates Points • I don't shill

#31
GTXJackBauer
Omnipotent Enthusiast
  • Total Posts : 10323
  • Reward points : 0
  • Joined: 2010/04/19 22:23:25
  • Location: (EVGA Discount) Associate Code : LMD3DNZM9LGK8GJ
  • Status: offline
  • Ribbons : 48
Re: NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Upda 2019/03/20 14:29:07 (permalink)
How do we know that the Crytek demo isn't optimized to run perfectly on that AMD GPU? I'm sure NVIDIA could get someone to do the same, no? I'm just guessing here unless someone has more info on this.

What does Team Red get vs. Team Green on 3DMark's RT benchmarks with everything on full blast at 1440p and 4K?

 Use this Associate Code at your checkouts or follow these instructions for Up to 10% OFF on all your EVGA purchases:
LMD3DNZM9LGK8GJ
#32
atfrico
Omnipotent Enthusiast
  • Total Posts : 12753
  • Reward points : 0
  • Joined: 2008/05/20 16:16:06
  • Location: <--Dip, Dip, Potato Chip!-->
  • Status: offline
  • Ribbons : 25
Re: NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Upda 2019/03/20 18:11:18 (permalink)
ty_ger07
atfrico
LOL, I find it funny. Sure, the RTX GPUs may run RT better than the GTX 1080 Ti, but you guys haven't seen the big picture here... which AMD GPU is running the Crytek demo, and how well is it running it? Compared to the recently released cards, the lesser GPU is giving cards twice its price a good run for their money.
Yeah, well, I guess AMD's compute cores are more important than Tensor cores, in my opinion

The AMD Vega 56 running ray tracing in the Crytek demo at 4K at 30 FPS was very impressive. It was beautiful too; better looking than the RTX ray tracing videos I have seen. High-end RTX cards are getting 4K ray tracing frame rates in the mid-30 to 60 FPS range. It really makes you wonder. But the Crytek demo wasn't a true game environment with AI NPCs or multiplayer, so you have to wonder how big an effect that has on performance. I guess we will see some day. Until then, all we can do is speculate.

I understand it's a demo and can be modified, but the same goes for NVIDIA's RT and DLSS demos as well. If you doubt the Crytek ray tracing demo, what makes you think I won't doubt the same for NVIDIA? Everyone needs to draw a neutral line here.
Check the specs of the GPU running it; that alone proves a point.
I am impressed as well, to be honest. Heck, I am looking at my used EVGA GTX 1080 Ti at the moment, questioning its performance against the AMD Vega 56 running the Crytek video, and my EVGA GPU is saying:
post edited by atfrico - 2019/03/20 18:17:50

Those who abuse power are nothing but scumbags! The challenge of power is how to use it and not abuse it. The abuse of power that seems to create the most unhappiness is when a person uses personal power to get ahead without regard to the welfare of others; people are obsessed with it. You can take a nice person and turn them into a slob, into an insane being, craving power, destroying anything that stands in their way.
 
 
Affiliate Code: 3T15O1S07G
#33
NazcaC2
EGC Admin
  • Total Posts : 7420
  • Reward points : 0
  • Joined: 2008/06/21 09:43:08
  • Location: Niagara Falls, Ontario Canada
  • Status: offline
  • Ribbons : 38
Re: NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Upda 2019/03/21 08:18:28 (permalink)
NVIDIA did this to entice GTX users to buy an RTX card, even if they're getting slideshow-like performance with ray tracing. They want to show users with GTX cards what they can expect visually from a faster RTX card intended for the task.

GTX cards get Tier 1 ray tracing support while RTX cards get Tier 2 support.

My system specs are quite dated, so I'm not even going to try to enable it.

Intel i9-12900K
ASUS Prime Z690-A
Corsair 850W RM850x
Windows 11 Professional
Arctic Liquid Freezer II 360 A-RGB
Corsair Dominator 32GB DDR5 5200MHz
EVGA GeForce RTX 3080 Ti FTW3 ULTRA GAMING
4x Samsung 2TB 980 Pro SSD + 1x ADATA 512GB SU800
Corsair iCUE 5000X RGB SIGNATURE SERIES Mid-Tower - Neon Night
#34
Gold Leader
CLASSIFIED Member
  • Total Posts : 3939
  • Reward points : 0
  • Joined: 2009/05/30 03:06:17
  • Location: Dirksland, The Netherlands
  • Status: offline
  • Ribbons : 62
Re: NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Upda 2019/03/21 15:50:44 (permalink)
atfrico
LOL, I find it funny. Sure, the RTX GPUs may run RT better than the GTX 1080 Ti, but you guys haven't seen the big picture here... which AMD GPU is running the Crytek demo, and how well is it running it? Compared to the recently released cards, the lesser GPU is giving cards twice its price a good run for their money.
Yeah, well, I guess AMD's compute cores are more important than Tensor cores, in my opinion


Very well said, man.

So yes, I agree with you 200%. I am done with the NVIDIA garbage: the lies, the false marketing, nothing adding up, this whole RTX scam (money-wise). I'm like, seriously?

The GTX 1080 Ti was 57% faster than the GTX 980 Ti, and the price was about the same.
But the GTX 1080 Ti is only 35% slower than the RTX 2080 Ti, yet the RTX 2080 Ti is more than 200% more expensive...
It's quite a similar picture with the GTX 1080 vs. GTX 980 and the RTX 2080 vs. GTX 1080, but you get the drill.
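As a sanity check on the generational-value argument above, here is a tiny Python sketch using only the rough figures quoted in this post (the percentages are the poster's claims, not verified benchmarks or MSRPs):

```python
# Rough performance-per-dollar comparison between GPU generations,
# using the figures quoted above (illustrative only).

def value_ratio(perf_gain, price_gain):
    """Relative performance-per-dollar of the newer card vs. the older one.

    perf_gain/price_gain are fractional increases, e.g. 0.57 for "57% faster".
    A result above 1.0 means the newer card is the better value.
    """
    return (1 + perf_gain) / (1 + price_gain)

# GTX 980 Ti -> GTX 1080 Ti: ~57% faster at roughly the same price.
pascal = value_ratio(0.57, 0.0)

# GTX 1080 Ti -> RTX 2080 Ti: "35% slower" means the 2080 Ti is
# about 1/(1 - 0.35) - 1 = ~54% faster; "200% more expensive" means 3x the price.
turing = value_ratio(1 / (1 - 0.35) - 1, 2.0)

print(f"Pascal generational value ratio: {pascal:.2f}")  # 1.57
print(f"Turing generational value ratio: {turing:.2f}")  # 0.51
```

On those numbers, Pascal delivered roughly 1.57x the performance per dollar of its predecessor, while Turing delivered about half, which is the "you get the drill" being complained about.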

Every reviewer I watched came up with similar rants, and the real-world prices didn't lie about it either; not something I'd leave out.
NVIDIA kept giving me careless customer support; they could never solve the issue I was having, even after people said it was fixed in a new driver. Nope, nada, nothing. GeForce driver 384.94 WHQL was fine; everything after it was pure drama.

So I decided to sell my card, an EVGA GTX 1080 FTW2, and went to a Sapphire RX Vega 64 Limited Edition 2.0. To me the image quality, image fidelity and driver quality are far superior, and it works much better with the system I am using. I have no more black screens at startup or when I am on the internet. Even in games the 1080 FTW2 ran fine, so it was basically crappy NVIDIA driver issues I was dealing with.
Here is the thread I made about it:
https://forums.evga.com/Random-Blackscreens-with-EVGA-GTX-1080-FTW2-m2730277.aspx
The issue would not get solved; I had it for over 5 months with no improvement, which is when I decided to sell the EVGA GTX 1080 FTW2 and get the RX Vega 64 board from Sapphire's Nitro LE series.
The EVGA forums were helpful, that I won't deny; that has always been a very strong point of this place.
But NVIDIA themselves? Ugh, it was a constant loop of things I had already tried, and at some point they simply ignored my case. That made me upset and made me lose confidence in what I thought was the best GPU company on the planet... talk about inaccurate insights.

Overall I want none of that malarkey nonsense such as DLSS, ray tracing, etc. I posted plenty of evidence in other threads, proven by hardware experts who were not funded by NVIDIA to do their reviews, of the negative impact the entire RTX series had on people. Some like it; that's their choice, but it doesn't have to be everyone's choice, does it?
And you know what I find so hilarious? That none of these RTX users and supporters were able to post neutral evidence that actually proved the real performance of their cards...
Here is the thread for those wondering:
https://forums.evga.com/FindPost/2936073

Mainly the links from HardOCP; they show actual RTX performance with RTX on at all settings. It may have improved since then; well, I hope so. It should make some sense here and there, right?
Anyway, they all showed actual RTX performance averaging 20 FPS less than what everyone is posting here. No matter what they did to their cards, systems, etc., the differences were marginal.
In the end I am just asking myself why nobody could actually prove what they are saying. Hence the curiosity.

Also, it was AMD's Radeon RX Vega 56 that ran the Crytek demo, for those who didn't know, or those who didn't manage to type out the name accurately... even though the video and article said which GPU it was.
30 FPS for an almost three-year-old card doing software ray tracing; AMD's compute power isn't that bad after all, especially if a 1080 Ti does about the same, as some are concluding, rofl.
It's also why AMD GPUs rocked at mining over NVIDIA GPUs; most miners used RX 580s and RX Vegas precisely because of their high compute performance.

"AMD GPUs are like wine, they improve over time." In my experience they do get better over time.
I saw something similar when testing the 3dfx Voodoo5 6000 AGP 128MB Rev.A 3700 engineering sample: it got better over time thanks to improved driver development, game optimizations, etc. It's like that with everything in the end.

As for the GTX 10 and 16 series getting DXR: it's just a new way to peddle false hopes on a card series that was never designed to do ray tracing. Hey, my 10-series card can do ray tracing, but I only get 18 to 22 FPS... great, it can, but it's useless in the end.

Sorry for not agreeing on many points with the RTX users here; I prefer to be to the point and post facts rather than the opposite. To each their own in the end.
Just so everyone knows, I am not here to bash people; that was never my intention. I am bashing NVIDIA's methods of mistreating people. I never support such things, and I'll always share my thoughts on such actions in the end.
post edited by Gold Leader - 2019/03/21 15:55:24


#35
lehpron
Regular Guy
  • Total Posts : 16254
  • Reward points : 0
  • Joined: 2006/05/18 15:22:06
  • Status: offline
  • Ribbons : 191
Re: NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Upda 2019/03/21 17:34:43 (permalink)
It isn't like ray tracing was impossible on graphics cards until Turing; we're ultimately talking about compute programming, and while NVIDIA isolated ray tracing acceleration into dedicated "RT" cores, that doesn't mean the rest of the CUDA cores can't do it.

CUDA, after all, is an acronym for Compute Unified Device Architecture. Tensor cores, by contrast, are specialized cores for a specific task, i.e. deep learning, which effectively makes them ASIC-like processors. That explains the big deal around Tensor cores and why NVIDIA hasn't dropped CUDA and gone all-out on Tensor cores for consumers: a GPU would need more CUDA cores to compensate for fewer Tensor cores.

Up to this point, before NVIDIA brought us Turing and Volta, dedicated ray tracing hardware was done entirely with ASICs; this isn't new. I found an older article that compared a low-wattage PowerVR graphics chip with the same RT performance as the then-flagship GTX 680. I imagine NVIDIA is reacting to that kind of competition by embedding ASICs into their graphics cards.

Anyway, as the cryptomining crazes showed, ASICs do better than central or general-purpose processors, but they are specialized. I don't know if NVIDIA can just raise the Tensor core counts, and until AMD introduces its own, NVIDIA still has to ship rasterization graphics cards.

I don't know what AMD's or Intel's plans for RT are; I'm just saying NVIDIA's dedicated cores aren't the only way. The others can embed their own ASICs and call them something else...
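The point that ray tracing is ultimately just compute can be illustrated with a toy sketch: the core primitive is a ray-object intersection test, which is nothing more exotic than solving a quadratic per ray. Here is a minimal Python version (purely illustrative; this is not how any vendor's hardware implements it, and real renderers run millions of these per frame):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t -- an ordinary
    quadratic, i.e. plain arithmetic that any compute/CUDA core can execute.
    """
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only hits in front of the ray count

# A ray from the origin straight down -z toward a unit sphere at z = -5:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

Dedicated RT hardware accelerates exactly this kind of test (plus scene traversal) in fixed function, which is the ASIC trade-off described above: faster at one task, useless for anything else.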

For Intel processors, 0.122 x TDP = Continuous Amps at 12v [source].  

Introduction to Thermoelectric Cooling
#36
Gold Leader
CLASSIFIED Member
  • Total Posts : 3939
  • Reward points : 0
  • Joined: 2009/05/30 03:06:17
  • Location: Dirksland, The Netherlands
  • Status: offline
  • Ribbons : 62
Re: NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Upda 2019/03/21 18:40:02 (permalink)
Hmm, I remember that the Amiga 500 could do ray tracing; I don't know if any of you remember this. Here is an article about that cool thing:
https://bytecellar.com/2018/08/31/ray-tracing-is-no-new-thing/

Ray tracing is nothing new; it's been around for decades.

But whatever NVIDIA is doing, it's just ripping people off blindly, and that I don't support, like some others. It's good to see that some people are waking up, gladly.
CUDA is one thing, but OpenCL is far more useful, to be honest; it's also multi-platform and multi-vendor friendly, so there are no restrictions.

And AMD, like Intel, will get a ray tracing solution when they think it's time. So far what RTX showed me was the ultimate joke; I was not impressed at all, to be honest. But yeah, I view it from different angles than some RTX users here.


#37