Reddawne
SSC Member
- Total Posts : 682
- Reward points : 0
- Joined: 2011/03/06 10:56:29
- Status: offline
- Ribbons : 3
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/16 13:21:36
(permalink)
Johnny_Utah, Point Break was one of the coolest movies ever made. Always wondered how a sequel would have played out a few years after it came out.
|
garnetandblack
FTW Member
- Total Posts : 1700
- Reward points : 0
- Joined: 2008/06/21 11:34:51
- Status: offline
- Ribbons : 13

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/16 13:33:25
(permalink)
In 2012 you may see games that really need the extra VRAM (not just some borderline, debatable games), but right now I don't see the point if you aren't playing in Surround or gaming at 2560x1600. Of course, Kepler will be out in 2012, so no real need to worry about it this gen for me.
"There's a box of Twinkies in that grocery store. Not just any box of Twinkies, the last box of Twinkies that anyone will enjoy in the whole universe. Believe it or not, Twinkies have an expiration date. Some day very soon, Life's little Twinkie gauge is gonna go... empty."
|
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/16 13:44:10
(permalink)
Just an FYI: I played Crysis 2 today on my SLI 480s. The keyboard display of Afterburner showed an average of 1470 up to 1505 MB of VRAM usage on both cards.
Intel Core i5-6600K SKYLAKE ~ MSI Z170A-G45 gaming board~ EVGA 1080 FTW ACX 3.0~ G.SKILL Ripjaws V series 16 GB DDR4 3200 ~ CORSAIR Vengeance C-70 Military Green case~ EVGA SuperNOVA 750 Platinum2 PSU~ CORSAIR Force GT 240 Gig SSD~ CORSAIR H80i V2 cooler~ Logitech G502 Proteus Core mouse~ CORSAIR K70 mech. keyboard~ HGST NAS -2TB Hd & 4TB Hd.~ Dell UltraSharp U2412M 24" LED @ 1920 X 1200~ Win 10 x64 STEAM
|
boylerya
FTW Member
- Total Posts : 1284
- Reward points : 0
- Joined: 2008/11/23 19:18:00
- Status: offline
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/16 13:51:48
(permalink)
Reddawne
boylerya
I don't have the source, but it was a legit article, not something off the forums. I read that the VRAM for the GTX 680s is planned to be 2GB. However, 2GB is already becoming inadequate for powerhouse video cards, and I am going to have to wait a bit longer after the GTX 680 release for 4GB when 3GB would have been perfect. But wth, it's always better to have too much VRAM than too little IMO.
I'm with you on waiting after the release. If memory serves me correctly, the 3GB version of the GTX 580 came out 4-5 months after the vanilla? Maybe even wait till the Classified version comes out 8 months after that; I know I will.
I am running a single GTX 275 1792MB card on a Dell U3011 1600p that needs a DisplayPort to get the full 1.07 billion colors. If it weren't for the fact that I am an environmental science major at Rutgers and have no life beyond chatting on EVGA forums, studying, and taking a peek at how recent games bring my vid card to its knees at 1600p with 8x supersampling, I would have snapped and bought a GTX 570 HD 2.5GB, mostly for the DisplayPort. I think it is safe to say the Kepler vid cards will have PCI-E 3.0, but I NEED a DisplayPort; that is a deal breaker for me if they don't have one. Otherwise I could probably keep on waiting until the 2nd generation of Kepler vid cards is released, as long as Rutgers keeps on sucking away my life and free time. On the positive side, I have never run out of VRAM without making my games look like a slideshow first.
post edited by boylerya - 2011/10/16 14:16:30
|
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/16 20:50:02
(permalink)
gregjc
Remember that just because a game is using that much VRAM doesn't mean it NEEDS that much VRAM to provide a smooth, consistent gameplay experience. For instance, I have used 1.25GB GTX 570s in SLI and have never noticed any kind of sudden lag spikes in the games referenced above. This is at 1080p with higher AA settings. Of course, I don't notice micro stutter either, so maybe I'm not the best judge. A single GTX 570 with 1.25GB of VRAM costs well over $300 in most cases, so I highly doubt you'll see more than one or two game developers create games that will hiccup on less than 2-3GB of VRAM in the near future, even at the highest settings. Unless you're using multiple monitors with higher AA settings, of course. So I think the VRAM thing may be an issue within ~2 years, but not in the short term unless you have very specific needs.
May I suggest you go back to post #14... you will read why this is the case (highlighted portion). And in an attempt to prove once and for all that this very same subject has been "been there, done that" (like I said above, from 256MB to 512MB): http://www.pureoverclock....w.php?id=33&page=1
|
jose1980
SSC Member
- Total Posts : 752
- Reward points : 0
- Joined: 2009/01/03 18:35:59
- Location: NewJersey
- Status: offline
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/16 21:16:36
(permalink)
Brad_Hawthorne
Afterburner Ok... I am really tired and probably not going to add anything of great value here... but I thought the 590 was a pair of 1.5GB VRAMs equaling the 3GB... So in essence it is still just 1.5GB?
Yes, the GTX 590 is really just two GPUs with 1.5GB each. It's not a true 3GB setup. That's why I always recommend GTX 580 3GB SLI over a GTX 590 to anyone running NVIDIA Surround: you'll VRAM bottleneck. I VRAM bottleneck all the time on my GTX 470s. Even my 5870 E6 2GB cards VRAM bottleneck depending on the game and settings. VRAM capacity is the "dirty little secret" of all triple-head gaming rigs these days. It's not really a definitive thing until you go triple-head, though; that's where it really shows. So much about triple-head gaming is a compromise that this is often just explained away as another thing to accommodate by dialing down settings. With the bottom having fallen out of the commodity RAM market in the last year, there's no real reason they can't slap double-capacity RAM on these cards by default now. I'd really like to see 3GB be the norm for the GTX 670 and GTX 680 series cards next year. It would be cool if in the future they let you add memory to a video card manually, like you would on a motherboard. But that's just wishful thinking...
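To put a number on the "two GPUs with 1.5GB is not a 3GB setup" point, here's a minimal sketch. The function and figures are illustrative (the 1536 MB per-GPU value is the GTX 580/590 spec; everything else is an assumption for the example), not from any benchmark:

```python
# Under alternate-frame rendering (AFR), every GPU keeps its own full copy
# of the frame data, so adding cards multiplies shader power but NOT usable VRAM.

def effective_vram_mb(per_gpu_mb: int, num_gpus: int, mode: str = "AFR") -> int:
    """Usable VRAM for a multi-GPU setup.

    AFR: textures/buffers are mirrored on every card -> capacity stays per-GPU.
    (Total board memory is per_gpu_mb * num_gpus, but it is not additive.)
    """
    if num_gpus < 1:
        raise ValueError("need at least one GPU")
    if mode == "AFR":
        return per_gpu_mb  # mirrored, not pooled
    raise NotImplementedError(mode)

# A GTX 590 is marketed as 3GB, but it is two GPUs with 1.5GB (1536 MB) each:
board_total = 1536 * 2                # what the box says: 3072 MB
usable = effective_vram_mb(1536, 2)   # what a game can actually address: 1536 MB
print(board_total, usable)            # 3072 1536
```

The same arithmetic applies to GTX 580 1.5GB SLI: two boards, still a 1536 MB working set.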
Heat -http://www.heatware.com/eval.php?id=67041 Case: Silverstone Temjin TJ07B/MurderMod Faceplate/Serpentin Acrylic top. PSU: Antec TPQ-1200w Mobo: EVGA X58 SLI LE/EK-FB Block CPU: Intel LGA 920 DO@4.1Ghz/EK Supreme block/Thermochill pa 120.4 Rad./EK 250 res/MCP 655 pump Ram: Corsair CMT6GX3M3A 1600C7 6gig VGA: EVGA GTX 580/EK Block HD: Western Digital 500gig Sound: Creaive X-FI Titanium Fatal1ty Champion series
|
SirWaWa
FTW Member
- Total Posts : 1404
- Reward points : 0
- Joined: 2010/08/04 07:55:54
- Status: online
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/16 21:17:32
(permalink)
So much for NVIDIA's flagship card.
Intel i7 960 @ 3.2GHz with Intel EE Heatsink/Fan Delta DBX-A Asus P6T6 WS Revolution X58 LGA 1366 Asus BW-12B1ST CD-RW/DVD-RW/BD-R (x2) Corsair Obsidian 800D Corsair HX850W Professional Series Corsair Dominator GT DDR3 1600 6GB 7-7-7-20 eVGA Nvidia GeForce GTX 780 Ti SC 3.0GB DDR5 ACX WD VelociRaptor 300GB 10,000 RPM SATAII WD Caviar Black 2TB/1TB/1TB 7,200 RPM SATAII Razer Megalodon 7.1 Headset Logitech G502 Proteus Spectrum Razer Onza Tournament Edition Xbox 360/PC Controller Logitech G810 Orion Spectrum Logitech X-540 5.1 Speaker System LG M2362D 1920 x 1080 23" 60Hz (x2) Windows 7 Ultimate 64-bit
|
Nulltime
New Member
- Total Posts : 59
- Reward points : 0
- Joined: 2008/05/27 05:18:02
- Status: offline
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/17 08:09:56
(permalink)
Great info in this thread. I will make sure my next card has the max RAM available. This clears up why I keep seeing 1.5GB and 3GB cards. I was wondering what the difference was... now I know!
|
1ceTr0n
CLASSIFIED Member
- Total Posts : 2789
- Reward points : 0
- Joined: 2011/09/27 00:21:20
- Status: offline
- Ribbons : 9
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/20 22:35:38
(permalink)
Meh, I don't run AA on my games, so I'm not really concerned.
|
dukenuke88
FTW Member
- Total Posts : 1698
- Reward points : 0
- Joined: 2010/11/28 15:00:23
- Status: offline
- Ribbons : 0

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/20 22:39:15
(permalink)
1ceTr0n
Meh, I don't run AA on my games, so I'm not really concerned.
But it looks so much nicer with AA!
|
1ceTr0n
CLASSIFIED Member
- Total Posts : 2789
- Reward points : 0
- Joined: 2011/09/27 00:21:20
- Status: offline
- Ribbons : 9
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/20 22:40:50
(permalink)
dukenuke88
1ceTr0n
Meh, I don't run AA on my games, so I'm not really concerned.
But it looks so much nicer with AA!
Pfffft, a matter of opinion, nothing more. Honestly, at resolutions of 1600x1200 and above it really becomes moot IMO. I'm far too busy playing and enjoying the game to worry about how "jaggy" the edges look.
|
HeavyHemi
Omnipotent Enthusiast
- Total Posts : 10929
- Reward points : 0
- Joined: 2008/11/28 20:31:42
- Location: Western Washington
- Status: offline
- Ribbons : 57
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/20 22:46:59
(permalink)
1ceTr0n
Pfffft, a matter of opinion, nothing more. Honestly, at resolutions of 1600x1200 and above it really becomes moot IMO. I'm far too busy playing and enjoying the game to worry about how "jaggy" the edges look.
It's certainly subjective, but I find jaggies on a 40" Sony unacceptable when I have more than enough GPU power to go for the smooth. As long as my min frame rates stay over 60, I'm good to go.
EVGA E758/i7 980x 4.3ghz 1.36v 12GB Corsair Dominator 2000C9 GTX 980Ti w/GTX TITAN SC for PhysX Crucial M4 512GB SSD/2x WD RE4 Black 2TB Corsair H50 Cooler/Corsair AX1200/Window 10 Pro
|
dukenuke88
FTW Member
- Total Posts : 1698
- Reward points : 0
- Joined: 2010/11/28 15:00:23
- Status: offline
- Ribbons : 0

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/20 22:55:08
(permalink)
1ceTr0n
Pfffft, a matter of opinion, nothing more. Honestly, at resolutions of 1600x1200 and above it really becomes moot IMO. I'm far too busy playing and enjoying the game to worry about how "jaggy" the edges look.
I'll agree that "gaming" in general can be subjective. But there are certain games out there that look absolutely atrocious without any AA... Fallout 3 and Fallout: New Vegas being two that stand out in my head.
|
sputnik7913
SSC Member
- Total Posts : 555
- Reward points : 0
- Joined: 2007/06/22 09:58:58
- Location: New York, USA
- Status: offline
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/20 23:05:10
(permalink)
|
Johnny_Utah
CLASSIFIED Member
- Total Posts : 4330
- Reward points : 0
- Joined: 2008/02/13 16:26:04
- Status: offline
- Ribbons : 5
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/20 23:17:28
(permalink)
Reddawne
Johnny_Utah, Point Break was one of the coolest movies ever made. Always wondered how a sequel would have played out a few years after it came out.
I agree! I suppose now we will never know ;)
5930k (H20) // ASUS Rampage V Extreme EVGA Titan SC x 4 (H20) Gskill Ripjaws 2400 EVGA 1600G2 // DD Torture Rack
|
kalyyy
New Member
- Total Posts : 54
- Reward points : 0
- Joined: 2011/10/15 11:30:51
- Status: offline
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/21 01:17:44
(permalink)
Hm, you have 1450 MB max, not 1500.
|
creep7t9
New Member
- Total Posts : 42
- Reward points : 0
- Joined: 2007/03/26 13:14:16
- Location: MN
- Status: offline
- Ribbons : 2
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/24 08:03:29
(permalink)
boylerya
[...] I think it is safe to say the Kepler vid cards will have PCI-E 3.0, but I NEED a DisplayPort. That is a deal breaker for me if they don't have one. [...]
You might find this of interest for your DP connectivity issue: newegg.com/Product/Product.aspx?Item=N82E16814162082 I just found out about it yesterday!
"Never tell me get a life, I'm a gamer.. I have many lives!"
|
sheckie81
ACX Member
- Total Posts : 468
- Reward points : 0
- Joined: 2010/12/21 16:50:15
- Status: offline
- Ribbons : 2
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/10/24 09:08:51
(permalink)
So to start off, I'm not trying to be a douche, and I respect someone who can come on here and give some idea of the differences between 1.5GB and 3GB cards in today's demanding games. +1 for that. I personally have SLI GTX 580 SCs (1.5GB) and game at 1920x1080. I have never played Rage, but I could 100% play that game at 50-60 FPS with all the eye candy turned up if there weren't so many bugs and memory leaks going on; it's a console port. The GTX 590 being essentially two underclocked GTX 580s, the reports of it getting 1 FPS and not loading textures I find very hard to believe (again, not trying to be a douche, just my opinion). Also, this whole VRAM war going on is getting crazy. People getting angry at each other because someone's got 1.5GB versus 3GB and so on is nuts... haha. Yeah, seeing the benefits of higher VRAM in SOME games makes me a little green with envy, but honestly I can maintain a solid 60 FPS with vsync on (I play all my games this way). Someone with the same settings on 3GB GTX 580s in SLI might get 90 FPS where I get 75; I couldn't care less, as long as I stay above 60 :)
Core i7 2600k Oc'd to 4.5 Antec Khuler 920 GTX 980ti FTW Coolermaster HAF X case with GPU cooling shroud (120mm fan) Soundblaster Fatality X-Fi Titanium EVGA FTW Z68 WD 1 TB HD Corsair AX1200 PSU Windows 10 Home Premium 64 bit BENQ 1080p 27" 144hz LED monitor
|
carye
ACX Member
- Total Posts : 278
- Reward points : 0
- Joined: 2011/03/01 16:10:24
- Location: Tennessee
- Status: offline
- Ribbons : 0

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/10 16:56:53
(permalink)
Harmony is a troll
Just to let all you forum members know, Harmony (his OCUK forum name) is an idiot; his threads get shut down, and anyone with any intelligence at OCUK consistently proves him wrong time and time again.
Please do not listen to this fool.
O.K., thanks for the information. Harmony = thinkfly? WTH?? If that's so, I would hardly consider him a fool. Go play elsewhere, please.
post edited by carye - 2011/11/10 17:02:08
|
slyde
New Member
- Total Posts : 25
- Reward points : 0
- Joined: 2006/12/12 08:11:04
- Status: offline
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/10 19:05:04
(permalink)
I hate to display my ignorance, but oh well. If you add a second 580 (1.5GB), does that double your available VRAM to 3GB? My 580 gets very close to max on Crysis 2 and BF3 with all ultra settings on a 1920x1200 display. If I added a second card, would it still use ~1.5GB, or would it use roughly the same as one 3GB card?
|
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/10 19:10:35
(permalink)
slyde
I hate to display my ignorance, but oh well. If you add a second 580 (1.5GB), does that double your available VRAM to 3GB? My 580 gets very close to max on Crysis 2 and BF3 with all ultra settings on a 1920x1200 display. If I added a second card, would it still use ~1.5GB, or would it use roughly the same as one 3GB card?
In a word... no...
I was made a cannibal to fix problems like you
|
arestavo
CLASSIFIED Member
- Total Posts : 3010
- Reward points : 0
- Joined: 2008/02/06 06:58:57
- Location: Through the Scary Door
- Status: offline
- Ribbons : 8

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/10 19:11:41
(permalink)
Negative, you would still only have 1.5GB of VRAM.
EVGA affiliate code: 9ZWDWFNW6A (Don't forget to upload your invoice or no credit is given!) FOLD ON
|
DHLEVGA
ACX Member
- Total Posts : 357
- Reward points : 0
- Joined: 2009/07/30 16:22:42
- Status: offline
- Ribbons : 2
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/10 23:18:23
(permalink)
There is a review of both MSI Lightning GTX 580 cards (the 1.5GB and the 3GB Xtreme Edition) that suggests the opposite of what the OP is claiming. See http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/43643-msi-gtx-580-lightning-xtreme-edition-xe-3gb-review-16.html There are differences, but only after frame rates get so low as to be unplayable. The GPU is suggested to be the bottleneck, so the additional 1.5GB is a questionable investment with 500-series GPU technology. Here's the excerpt:
Giving 3GB a Workout
In this section we will do some simple tests: use two notorious GPU memory hogs, Shogun 2: Total War and Metro 2033, to see how far we have to push things in order to reach the memory limit of a standard MSI Lightning (at its factory overclocked levels). We'll then compare and contrast those results to those we encountered while using the 3GB Lightning XE. For Metro 2033 we used our standard gameplay playback. We decided to bypass Shogun 2's built-in benchmarking utility and instead use a playback from a custom battle. This effectively removes any CPU bottlenecks from the equation since AI is disabled in playback mode. Additionally, the config file was modified to allow the 1.5GB card to read as having 3GB of texture memory available, which is necessary as any other option would limit the available video card settings. First up we have Shogun 2, an impressive-looking game when DX11 is enabled, but also extremely tough on hardware. As a purely anecdotal point, we read a whopping 2.1GB of texture memory being used at 2560x1600 with 8x MSAA enabled. It seems like even with such a demanding game, the core actually becomes the bottleneck here rather than the available memory. The 3GB card can easily win when the settings are cranked, but it doesn't make a difference since playable framerates were impossible to achieve with MSAA enabled at 2560x1600 regardless of the card being used.
At 1920x1200 up to 4x MSAA things are a bit easier, but since available video memory wasn't an issue, both cards drew even. Metro 2033 shows much of the same thing as Shogun: the 3GB framebuffer does make a difference, but once again the GPU itself becomes the limiting factor when asked to push around massive rendering workloads. On the positive side of things, the 3GB card was actually able to play Metro 2033 with everything maxed out... albeit at a paltry 17 frames per second. Minimum framerates, on the other hand, were absolutely dominated by the Xtreme Edition.
|
thinkfly
ACX Member
- Total Posts : 323
- Reward points : 0
- Joined: 2011/08/11 18:41:49
- Status: offline
- Ribbons : 3

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/11 06:37:32
(permalink)
DHLEVGA
There is a review of both MSI Lightning GTX 580 cards (the 1.5GB and the 3GB Xtreme Edition) that suggests the opposite of what the OP is claiming:
You've missed my point. A minor VRAM shortage does NOT necessarily mean decreased average fps / ad-hoc min fps. Also, do you see why I tested 1.5GB vs 3GB in SLI? Because when you put GPUs into SLI, the GPU power goes up while the VRAM does not stack. That's when you want large VRAM.
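The "power stacks, VRAM doesn't" argument can be sketched with a toy model. Every number and the 0.9 SLI-scaling factor below are made-up illustrations chosen for the example, not measurements of any real card:

```python
# Sketch of the SLI argument: GPU throughput scales with card count but VRAM
# does not, so the same settings can flip from GPU-bound to VRAM-bound
# as you add cards. All figures are illustrative assumptions.

def bottleneck(workload_gflops: float, vram_needed_mb: int,
               gpu_gflops: float, per_gpu_vram_mb: int,
               num_gpus: int, sli_scaling: float = 0.9) -> str:
    total_power = gpu_gflops * (1 + (num_gpus - 1) * sli_scaling)  # power stacks (imperfectly)
    usable_vram = per_gpu_vram_mb                                  # VRAM does not stack (AFR)
    if vram_needed_mb > usable_vram:
        return "VRAM-bound (swapping over PCIe -> stutter)"
    if workload_gflops > total_power:
        return "GPU-bound (low but consistent fps)"
    return "neither (smooth)"

# One 1.5GB card at a heavy setting: the core gives out before the memory does.
print(bottleneck(2000, 1400, 1580, 1536, num_gpus=1))
# Two in SLI with AA cranked past ~1.5GB of demand: now there is core headroom,
# and the fixed 1536 MB is the wall you hit first.
print(bottleneck(2000, 1700, 1580, 1536, num_gpus=2))
```

This is exactly why a single-card 1.5GB vs 3GB comparison can show no difference while an SLI comparison does.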
post edited by thinkfly - 2011/11/11 06:40:27
|
Zibri
ACX Member
- Total Posts : 350
- Reward points : 0
- Joined: 2011/09/30 09:33:17
- Location: Italy
- Status: offline
- Ribbons : 2

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/11 09:18:34
(permalink)
garnetandblack In 2012, you may see games that really need the extra vram (not just some borderline debatable games), but right now, I don't see the point if you aren't playing in surround/gaming at 2560X1600. Of course, Kepler will be out in 2012, so no real need to worry about it this gen for me.
Running on Core i7 950 @ 4.2 ghz 2x EVGA GTX 580 in SLI @ 920 / 2250 / 1.150v 24/7 on 335.23 whql
|
Zibri
ACX Member
- Total Posts : 350
- Reward points : 0
- Joined: 2011/09/30 09:33:17
- Location: Italy
- Status: offline
- Ribbons : 2

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/11 09:26:27
(permalink)
dukenuke88
But it looks so much nicer with AA!
Hmm... it depends on the game and resolution. For example, at 2560x1600 on a 27" or 30" monitor AA is less useful and less noticeable than at 1920x1080 or 1920x1200. Also, some games already draw antialiased "lines" and use antialiased textures, so much less full-screen antialiasing is needed. In BF3 (for example) AA can be turned down on my 30" wide-gamut H2-IPS monitor (HP ZR30W), but I must set it higher on my old 24" 1920x1200 TN panel (a Dell E248WFP). In the end, AA is more needed at lower resolutions (which use less VRAM) and less needed at higher ones (which use more VRAM), and the total VRAM usage always stays manageable with 1.5GB. Setting everything at MAX may make you feel "cool", but in reality the differences are minimal in the majority of "good" games. I already posted a lot of screenshots of BF3 with various settings: framerates get hit with higher settings (though I'm still over a 60fps minimum framerate), while the quality difference is barely noticeable. And if you look at 2560x1600 images on a 1920x1080 monitor, you see them "enlarged" when panning at a 1:1 pixel ratio... that's not how they look on a native 2560x1600 monitor.
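The resolution/AA trade-off has a concrete memory side to it. Here's a naive back-of-the-envelope for how MSAA sample counts inflate the buffer footprint (real drivers add compression and extra surfaces, so these are lower-bound illustrations, not actual VRAM readings):

```python
# Naive MSAA buffer cost: the multisampled color and depth buffers scale with
# the sample count, plus one single-sample resolve target. Texture memory,
# which usually dominates, is deliberately excluded from this sketch.

def msaa_buffers_mb(width: int, height: int, samples: int,
                    bytes_per_color: int = 4, bytes_per_depth: int = 4) -> float:
    pixels = width * height
    color = pixels * samples * bytes_per_color   # multisampled color buffer
    depth = pixels * samples * bytes_per_depth   # multisampled depth/stencil
    resolve = pixels * bytes_per_color           # single-sample resolve target
    return (color + depth + resolve) / (1024 * 1024)

for w, h, s in [(1920, 1200, 1), (1920, 1200, 8), (2560, 1600, 1), (2560, 1600, 4)]:
    print(f"{w}x{h} {s}x MSAA: ~{msaa_buffers_mb(w, h, s):.0f} MB")
```

Even in this stripped-down model, 8x MSAA at 1920x1200 costs more buffer memory than no AA at 2560x1600, which is the shape of the trade-off described above.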
Running on Core i7 950 @ 4.2 ghz 2x EVGA GTX 580 in SLI @ 920 / 2250 / 1.150v 24/7 on 335.23 whql
|
thinkfly
ACX Member
- Total Posts : 323
- Reward points : 0
- Joined: 2011/08/11 18:41:49
- Status: offline
- Ribbons : 3

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/11 09:57:47
(permalink)
Zibri
In the end, AA is more needed at lower resolutions (which use less VRAM) and less needed at higher ones (which use more VRAM), and the total VRAM usage always stays manageable with 1.5GB.
Then which case does Total War: Shogun 2 at 1920x1200 with 8x AA belong to? (The first game in the OP.)
|
Johnny_Utah
CLASSIFIED Member
- Total Posts : 4330
- Reward points : 0
- Joined: 2008/02/13 16:26:04
- Status: offline
- Ribbons : 5
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/11 11:23:22
(permalink)
thinkfly
You've missed my point. A minor VRAM shortage does NOT necessarily mean decreased average fps / ad-hoc min fps. Also, do you see why I tested 1.5GB vs 3GB in SLI? Because when you put GPUs into SLI, the GPU power goes up while the VRAM does not stack. That's when you want large VRAM.
I see exactly what you are saying, and it makes sense. When you SLI cards, you increase power but not VRAM. With one card, it won't make a difference. With two or three cards, it sure will.
5930k (H20) // ASUS Rampage V Extreme EVGA Titan SC x 4 (H20) Gskill Ripjaws 2400 EVGA 1600G2 // DD Torture Rack
|
Rich Z
New Member
- Total Posts : 18
- Reward points : 0
- Joined: 2011/10/13 15:48:13
- Status: offline
- Ribbons : 0
Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/11 12:11:51
(permalink)
thinkfly
DHLEVGA
There is a review of both MSI Lightning GTX 580 cards (the 1.5G and the 3.0 G Extreme addition) that suggests that opposite of what the OP is claiming:
You've missed my point. A minor VRAM shortage does NOT necessarily mean decreased average fps / ad-hoc min fps.
Also, do you see why I tested 1.5GB vs 3GB in SLI? Because when you put GPUs into SLI, the GPU power goes up while the VRAM does not stack. That's when you want large VRAM.
Well, I guess I'm just confused about all this memory "stacking", or not... What does SLI actually DO, then? I thought it divided the workload between the two GPUs that were ganged together. And how does it divide the workload? Well, I thought it would divide the pixels needing to be pumped to the screen by the number of GPUs handling the workload. And where is that pixel info actually stored? Well, I THOUGHT in the VRAM of each of the GPUs in an SLI configuration. Is this not the way it works? And if not, then why bother having separate VRAM for each GPU on those cards (i.e. the GTX 590) that have two processors on one board, if there is no gain in doing so? Surely the manufacturers wouldn't just waste that additional RAM by putting it on the board, would they? Sorry if I am overlooking something obvious. Back in the day when Voodoo first used SLI, it stood for Scan-Line Interleave, whereby each processor took alternate scan lines of the display, thereby speeding things up by dividing the workload in half. Maybe this isn't how it works any longer...
|
thinkfly
ACX Member
- Total Posts : 323
- Reward points : 0
- Joined: 2011/08/11 18:41:49
- Status: offline
- Ribbons : 3

Re:[In] Formal proof of the limitation of 1.5GB VRAM at 1920x1200 resolution
2011/11/11 12:39:26
(permalink)
Rich Z
Well, I guess I'm just confused about all this memory "stacking", or not... What does SLI actually DO, then? [...] Maybe this isn't how it works any longer...
A while ago both NVIDIA and ATI used split-frame rendering (SFR), which divided the screen into parts, reducing the resolution processed by each GPU. In that case VRAM did stack up in SLI and CF. However, SFR was quickly dropped by both companies, as it brought too many problems, mostly complicated bugs and the ridiculous cost it added to driver maintenance. Now the mainstream way of doing SLI and CF is alternate-frame rendering (AFR), which is essentially multiple GPUs each rendering the full screen but outputting the rendered frames in turn. This is why VRAM does not stack up nowadays. AFR is a lot easier than SFR for those writing drivers, though if the output of the frames is not evenly distributed, it leads to microstuttering.
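The SFR/AFR difference can be captured in a toy model of per-GPU VRAM demand. The split between "framebuffer" and "textures" and all the numbers below are illustrative assumptions, not driver internals:

```python
# Toy model of the two multi-GPU schemes: under split-frame rendering (SFR)
# each GPU holds only its slice of the framebuffer (though textures are still
# mirrored); under alternate-frame rendering (AFR) each GPU renders whole
# frames in turn and therefore needs the entire working set.

def per_gpu_vram_mb(scheme: str, framebuffer_mb: float,
                    textures_mb: float, num_gpus: int) -> float:
    """VRAM needed on each GPU, in MB, under the given scheme."""
    if scheme == "SFR":
        # framebuffer is divided across GPUs; texture/geometry data is not
        return framebuffer_mb / num_gpus + textures_mb
    if scheme == "AFR":
        # every GPU renders complete frames: full framebuffer + full textures
        return framebuffer_mb + textures_mb
    raise ValueError(scheme)

# Hypothetical scene: 150 MB of framebuffers, 1200 MB of textures, 2 GPUs.
print(per_gpu_vram_mb("SFR", 150, 1200, 2))  # 1275.0
print(per_gpu_vram_mb("AFR", 150, 1200, 2))  # 1350.0
```

Note that even in the SFR case most of the working set (textures) is mirrored, so the capacity gain from splitting frames was always modest; AFR gives up even that, which is why VRAM is simply per-GPU today.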
|