2020/10/18 13:31:21
z1nonly
lantern48
z1nonly
lantern48
z1nonlyProject Cars 2 on the HP Reverb (my primary reason for upgrading) is proof that it's easy to bring a graphics card to its knees without blowing out the vram buffer. 
I'll test it at some point. Don't like the game, so it won't be anytime soon. But again, both games that I happen to be currently playing are limited by the V-RAM at 4k with most settings maxed. 

 
Long Beach. 20 car grid. Night time. Thunderstorm. 
 
^^My 1080Ti gets ~29 FPS with just over 7GB of vram usage. Since the 3080 isn't 3 times faster than a 1080Ti, I don't expect it will be able to get, much less hold, 90 FPS @4K. (Reverb is a little more than 4K IIRC, but close enough)
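As an aside, the "little more than 4K" claim checks out. A quick sanity check in Python (assuming the Reverb's 2160x2160-per-eye panels, which is my understanding of its spec):

```python
# Rough pixel-count comparison: HP Reverb (2160x2160 per eye, two eyes)
# vs. a single 4K UHD (3840x2160) display.
reverb_pixels = 2160 * 2160 * 2   # 9,331,200 pixels across both eyes
uhd_pixels = 3840 * 2160          # 8,294,400 pixels

print(reverb_pixels, uhd_pixels, reverb_pixels / uhd_pixels)
# The Reverb pushes about 12.5% more pixels than 4K.
```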
 








Wait... you don't even have a 3080 and you're arguing about what it can and can't do???
So that's why you put words in my mouth and are arguing for the sake of arguing. Got it. Makes sense now.
I won't be wasting any more time on this with you.




Wait, you didn't read my post?
 
Post number 4...in this thread...
 
I'll copy and paste here since you seem confused:
 
z1nonlyNew games could also tax the GPU itself more than current games and you end up running out of GPU "horsepower" before you run out of vram. 
 
Every graphics card I have owned has run out of GPU grunt before vram. My 1080Ti only uses 7.2GB of vram when I max out my favorite sim on my Reverb -- and the game is absolutely unplayable at around 29 FPS! Once I turn down the settings to where I can hold 90 FPS, I'm only using about half of the 1080Ti's available vram.
 
At some point in the future, the mighty 3080 will not be "enough". The 10GB vram buffer might make running out of vram more likely than it has been for previous generations, but I still don't think it will be the most likely reason for "needing" an upgrade. My 1080Ti has come up short of GPU power in 100% of the games where I have needed to turn down settings. Maybe the 3080 will end up with a game or two that run out of vram in a couple of years, but I bet the list of games that simply bring the GPU to its knees is still much longer than the list that maxes out the vram.

 
 
 
2020/10/18 13:39:17
lantern48
CraptacularOneThe reality of the matter is that there are very few if any games currently available that actually use more than even 8GB of VRAM at 4K. 
This is simply not true. I've already provided 2 games that I happen to be currently playing, and one is 5 years old. Both exceed 8GB of V-RAM usage and are limited only by the V-RAM with most settings maxed.
 
 
CraptacularOneTLDR: No, 10GB VRAM isn't going to be a problem for 4K gaming for quite some time. By the time it does matter, the card in question will have long outlived its useful life. 
It's already an issue -- even in a game that is 5 years old.
 
CraptacularOne99% of the people you see claiming they used all their VRAM simply have no idea what they are talking about.
I think this is a statement you may want to apply to yourself. The claim isn't just that all the V-RAM is being used. It's being clearly explained that when V-RAM usage exceeds a certain point -- in my case 9.8GB -- there is massive FPS loss and the game stutters like crazy. 
 
You can go play either of the games I mentioned and see for yourself. 



2020/10/18 13:46:03
CraptacularOne
lantern48
CraptacularOneThe reality of the matter is that there are very few if any games currently available that actually use more than even 8GB of VRAM at 4K. 
This is simply not true. I've already provided 2 games that I happen to be currently playing, and one is 5 years old. Both exceed 8GB of V-RAM usage and are limited only by the V-RAM with most settings maxed.
 
 
CraptacularOneTLDR: No, 10GB VRAM isn't going to be a problem for 4K gaming for quite some time. By the time it does matter, the card in question will have long outlived its useful life. 
It's already an issue -- even in a game that is 5 years old.
 
CraptacularOne99% of the people you see claiming they used all their VRAM simply have no idea what they are talking about.
I think this is a statement you may want to apply to yourself. The claim isn't just that all the V-RAM is being used. It's being clearly explained that when V-RAM usage exceeds a certain point -- in my case 9.8GB -- there is massive FPS loss and the game stutters like crazy. 
 
You can go play either of the games I mentioned and see for yourself. 





Ahh, again one of the less-informed people. Thanks for standing up to be noticed. Your comment about "thanks to the 450w VBIOS" letting games stay over 60 FPS even though you claim to be VRAM limited did make me chuckle, though. Please, I really suggest you educate yourself before posting nonsense and perpetuating further misinformation. 
2020/10/18 14:05:46
arestavo
Hey guys, let's all panic because one person has issues running a 7-year-old game with settings that aren't optimal. PANIC!
 
In other news, I gamed at 4K on a 2080 Super without issues on actual new titles. EDIT: the 2080 Super has 8 gigs of VRAM, for those interested.
2020/10/18 14:07:45
Endworld
CraptacularOne
There is a difference between games using VRAM and REQUESTING available VRAM. They are not the same, and often people have no idea what they are talking about when they say "X game uses all my VRAM, I need more!". What they are really seeing in apps like Afterburner and other on-screen readouts is what a game is "requesting". Some games just request all the VRAM that is available, regardless of whether it's needed or not. There is nothing inherently wrong with a game requesting all the VRAM that's available, but it does lead to less-informed people thinking they need more than what they have because they see their whole frame buffer being "used" in whatever game they are playing. 
 
The reality of the matter is that there are very few, if any, games currently available that actually use more than even 8GB of VRAM at 4K. Sure, there are some edge-case scenarios like a heavily texture-modded game or what have you, but far and away, 99% of the people you see claiming they used all their VRAM simply have no idea what they are talking about. This used to be easier to demonstrate when you could buy cards with the exact same specs but differing VRAM amounts, but today it is harder since there are no high-end cards with different-size frame buffers. 
 
TLDR: No, 10GB VRAM isn't going to be a problem for 4K gaming for quite some time. By the time it does matter, the card in question will have long outlived its useful life. 




Thank you. This is absolutely correct.
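The allocation-vs-use distinction above can be illustrated with ordinary system memory (a hedged host-RAM analogy, not a VRAM measurement): reserving a large block up front is cheap, and pages only become resident when actually touched, yet naive monitoring reports the full reservation.

```python
import mmap

# Host-memory analogy (NOT VRAM): an anonymous mapping "requests" a big
# chunk of address space, but only the pages we actually write to become
# resident. Tools that report the reservation overstate real usage.
RESERVED = 512 * 1024 * 1024      # "request" 512 MiB up front
TOUCHED = 1 * 1024 * 1024         # actually write only 1 MiB

buf = mmap.mmap(-1, RESERVED)     # lazily-backed anonymous mapping
buf[:TOUCHED] = b"\x00" * TOUCHED  # only these pages become resident

print(f"reserved {RESERVED >> 20} MiB, touched {TOUCHED >> 20} MiB")
```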
2020/10/18 14:13:27
LexR5
I ran a few benchmarks in RDR2 today with the EVGA 3080 FTW3 Ultra at 3440x1440@100Hz. Settings were all set to highest (ultra) and the memory usage was around 5600MB.

(I am aware 4.9M pixels is around 60% of 4K, so not a fair comparison, but wanted to share my findings.)

I too was wondering if 10GB was on the low side for higher-pixel-density displays with higher refresh rates, but RDR2 and TD2 both run at ultra at 3440x1440@100Hz without issue.

Edit: I think it's worth stating that at 3440x1440@100Hz I am not getting a locked 100 FPS at ultra settings, but I'm also not hitting any VRAM issues.
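The "around 60% of 4K" figure is easy to verify (a quick sketch; the resolutions are the ones quoted in the post):

```python
# 3440x1440 ultrawide vs. 4K UHD pixel counts.
uw_pixels = 3440 * 1440    # 4,953,600
uhd_pixels = 3840 * 2160   # 8,294,400

print(uw_pixels / uhd_pixels)  # ~0.597, i.e. roughly 60% of 4K
```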
2020/10/18 14:25:57
ghastlyone
10GB VRAM is plenty. And all the people holding out to spend $1000+ on 20GB models are going to waste their money on cards with zero perceivable performance difference.
2020/10/18 14:29:26
lantern48
ghastlyone
10gb Vram is plenty. And all the people holding out to spend $1000+ on 20gb models are going to waste their money on cards with zero perceivable performance difference.

It's simply a fact that if you exceed the 10GB of V-RAM, you will see a catastrophic loss in performance. Obviously you can limit settings to stay under it, but that proves the whole point: V-RAM is a limiting factor at 4K with most settings maxed.


Instead of just making claims out of left field, you can go try out any of the 3 games I mentioned and see for yourself.
Facts don't care about your incorrect opinions.
2020/10/18 14:35:22
CraptacularOne
Funny how it took me all of 30 seconds to show that you are completely ignorant of what you're trying to argue.
 
Here we have GTA 5 running at 4K with 8XMSAA no less and only "requesting" 9.3GB VRAM
https://www.youtube.com/watch?v=Axd3-007kWM
 
Thanks for also taking the bait and PROVING you have no idea what you're talking about.
lantern48
Since you edited to add this comment, I'll respond. It's not a hard concept to understand. The extra 50W allows higher overclocks, which provide more FPS. There are games where those extra few FPS make a difference when you're hovering around 60 FPS. The 2 I already listed, and Far Cry 3 at 4K, are all examples.
 
You make yourself look really bad when you can't understand something so basic and simple.


If you were really VRAM limited, no amount of overclocking or increased power limits would help. When a game is truly VRAM capped, it starts swapping data between the Windows page file and physical system RAM. Increasing a GPU's core clocks will not alleviate this. Do I need to re-download this myself on my RTX 3090 system as well? 
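Some back-of-the-envelope arithmetic shows why core clocks can't hide a VRAM spill (the bandwidth and spill figures below are rough assumptions, not measurements):

```python
# Assumed figures: ~16 GB/s effective PCIe 3.0 x16 bandwidth, and a
# hypothetical 1 GB of assets per frame that no longer fit in VRAM and
# must be fetched over the bus. Compare against a 90 FPS frame budget.
pcie_gb_per_s = 16.0               # rough effective PCIe 3.0 x16 rate
spill_gb = 1.0                     # hypothetical per-frame spill size

transfer_ms = spill_gb / pcie_gb_per_s * 1000   # time stuck on the bus
frame_budget_ms = 1000 / 90                     # 90 FPS target

print(transfer_ms, frame_budget_ms)  # ~62.5 ms vs ~11.1 ms
# The transfer alone blows the frame budget several times over, and a
# faster GPU core does nothing to shrink it.
```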
2020/10/18 14:35:54
ghastlyone
lantern48
ghastlyone
10gb Vram is plenty. And all the people holding out to spend $1000+ on 20gb models are going to waste their money on cards with zero perceivable performance difference.

It's simply a fact that if you exceed the 10GB of V-RAM, you will see a catastrophic loss in performance. Obviously you can limit settings to stay under it, but that proves the whole point: V-RAM is a limiting factor at 4K with most settings maxed.


Instead of just making claims out of left field, you can go try out any of the 3 games I mentioned and see for yourself.
Facts don't care about your incorrect opinions.




I seem to have missed all the threads of people with 8GB cards getting a "catastrophic loss in performance" in some video games for hitting the VRAM limit.
 
Make sure when you purchase a $1000 20GB 3080, you report back here to tell everyone how you're getting the exact same performance as the 10GB model.
