EVGA

NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants

Author
rjohnson11
EVGA Forum Moderator
  • Total Posts : 85038
  • Reward points : 0
  • Joined: 10/5/2004
  • Location: Netherlands
  • Status: offline
  • Ribbons : 86
Monday, September 05, 2022 7:43 AM (permalink)
https://www.techpowerup.com/298560/nvidia-geforce-rtx-4080-comes-in-12gb-and-16gb-variants
 
NVIDIA's upcoming GeForce RTX 4080 "Ada," a successor to the RTX 3080 "Ampere," reportedly comes in two distinct variants based on memory size, memory bus width, and possibly even core configuration. MEGAsizeGPU reports having seen two reference designs for the RTX 4080: one with 12 GB of memory and a 10-layer PCB, and the other with 16 GB of memory and a 12-layer PCB. A higher PCB layer count allows denser wiring around the ASIC. At debut, the flagship product from NVIDIA is expected to be the RTX 4090, with its 24 GB of memory and 14-layer PCB. Apparently, the 12 GB and 16 GB variants of the RTX 4080 feature vastly different PCB designs.

We've known from past attempts at memory-based variants, such as the GTX 1060 (3 GB vs. 6 GB), or the more recent RTX 3080 (10 GB vs. 12 GB), that NVIDIA turns to other levers to differentiate variants, such as core-configuration (numbers of available CUDA cores), and the same is highly likely with the RTX 4080. The RTX 4080 12 GB, RTX 4080 16 GB, and the RTX 4090, could be NVIDIA's answers to AMD's RDNA3-based successors of the RX 6800, RX 6800 XT, and RX 6950 XT, respectively.
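For context on why memory size and bus width move together here: GDDR6X chips each sit on their own 32-bit channel, so with the usual 2 GB chips a 12 GB card implies six chips on a 192-bit bus, while a 16 GB card implies eight chips on a 256-bit bus. A quick sketch of that relationship (the 2 GB-per-chip figure is an assumption of mine, not something stated in the report):

# Illustrative sketch: how GDDR6X capacity maps to an implied bus width,
# assuming 2 GB memory chips, each on its own 32-bit channel (my assumption,
# not a confirmed spec from the report above).
CHIP_CAPACITY_GB = 2
BITS_PER_CHIP = 32

def implied_bus_width(total_vram_gb: int) -> int:
    """Return the implied memory bus width in bits for a given VRAM size."""
    chips = total_vram_gb // CHIP_CAPACITY_GB
    return chips * BITS_PER_CHIP

for vram in (12, 16, 24):
    print(f"{vram} GB -> {implied_bus_width(vram)}-bit bus")
# 12 GB -> 192-bit, 16 GB -> 256-bit, 24 GB -> 384-bit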
 
Looks like NVIDIA wants to offer 16GB of memory as an option to compete against AMD's next gen GPUs. 

AMD Ryzen 9 7950X,  Corsair Mp700 Pro M.2, 64GB Corsair Dominator Titanium DDR5  X670E Steel Legend, MSI RTX 4090 Associate Code: H5U80QBH6BH0AXF. I am NOT an employee of EVGA

#1

5 Replies

    Miguell
    FTW Member
    • Total Posts : 1112
    • Reward points : 0
    • Joined: 4/16/2008
    • Location: Portugal
    • Status: offline
    • Ribbons : 0
    Re: NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants Monday, September 05, 2022 4:41 PM (permalink)
    so can we assume... 16GB is the new base VRAM from now on?
     
    I mean... if people truly want sweet eye-candy texture details at 4K and even 8K then 12GB may not be enough!
    right?

    Case: Cooler Master Stacker 830
    Display: 32" AOC Q3279VWFD8 @2560x1440@75Hz
    Cpu: Intel Core i7-8700
    Cpu Cooler: Cooler Master - MasterLiquid ML120L - RGB
    Mobo: Asus ROG Strix Z390-H Gaming
    Vga: Asus Dual RTX 4060 Ti 16GB Advanced Edition
    Ram: 32GB DDR4  G.SKILL - RIPJAWS V @3200Mhz
    Sound: Hama uRage soundZbar 2.1 Unleashed  - (Optical)
    Storage: 500GB SSD M.2 A2000  NVMe  Kingston (OS) + 8TB (4+4) HDD X300 Toshiba (Data)
    Psu: SeaSonic M12 700W
    Os: W10 Pro 64Bit
    #2
    CraptacularOne
    CLASSIFIED ULTRA Member
    • Total Posts : 6266
    • Reward points : 0
    • Joined: 6/13/2006
    • Location: Florida
    • Status: offline
    • Ribbons : 222
    Re: NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants Monday, September 05, 2022 5:13 PM (permalink)
    Miguell
    so can we assume... 16GB is the new base VRAM from now on?
     
    I mean... if people truly want sweet eye-candy texture details at 4K and even 8K then 12GB may not be enough!
    right?


    How can you even possibly infer that "16GB is the new base VRAM" from a report that covers both 12GB and 16GB variants? The reality is that most people do not use a 4K monitor for their PC (it's not even close, actually); the vast majority still use a 1080p monitor. Hell, the latest Steam survey only had 4K as the primary monitor resolution at 2.5% usage. You know what was nearly 3 times higher than that? The old, outdated 1366x768 resolution, at 6.2% usage. Also, we are in no real danger of being ready to game at 8K, even on next-gen GPUs. They would need to increase their performance output by at least a factor of 3 to even start to be viable at 8K with high graphical settings. We just aren't there yet and probably won't be for at least another 2 or 3 generations at the earliest.
     
    The point I'm trying to make is that no, 16GB will not be the baseline for VRAM, not this generation. Just because a high-end video card has 16GB of VRAM does not make that the new baseline. The standard will more than likely continue to be 8GB, with entry level starting at 4 and 6GB.
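    (To put a rough number on the 8K point: 8K has four times the pixels of 4K, so even with perfectly linear scaling you'd need roughly 3-4x the throughput just to hold the same frame rate. A quick back-of-the-envelope sketch, with the figures being mine rather than anything from the Steam survey:)

# Back-of-the-envelope pixel counts per frame at common resolutions.
# Illustrative only; assumes per-frame rendering cost scales roughly with pixel count.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels_4k = 3840 * 2160
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / pixels_4k:.2f}x the pixels of 4K")
# 8K works out to exactly 4x the pixels of 4K (and ~16x the pixels of 1080p).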

    Intel i9 14900K ...............................Ryzen 9 7950X3D
    MSI RTX 4090 Gaming Trio................ASRock Phantom RX 7900 XTX
    Samsung Odyssey G9.......................PiMax 5K Super/Meta Quest 3
    ASUS ROG Strix Z690-F Gaming........ASUS TUF Gaming X670E Plus WiFi
    64GB G.Skill Trident Z5 6800Mhz.......64GB Kingston Fury RGB 6000Mhz
    MSI MPG A1000G 1000w..................EVGA G3 SuperNova 1000w
    #3
    rjohnson11
    EVGA Forum Moderator
    • Total Posts : 85038
    • Reward points : 0
    • Joined: 10/5/2004
    • Location: Netherlands
    • Status: offline
    • Ribbons : 86
    Re: NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants Monday, September 05, 2022 7:02 PM (permalink)
    CraptacularOne
    Miguell
    so can we assume... 16GB is the new base VRAM from now on?
     
    I mean... if people truly want sweet eye-candy texture details at 4K and even 8K then 12GB may not be enough!
    right?


    How can you even possibly infer that "16GB is the new base VRAM" from a report that covers both 12GB and 16GB variants? The reality is that most people do not use a 4K monitor for their PC (it's not even close, actually); the vast majority still use a 1080p monitor. Hell, the latest Steam survey only had 4K as the primary monitor resolution at 2.5% usage. You know what was nearly 3 times higher than that? The old, outdated 1366x768 resolution, at 6.2% usage. Also, we are in no real danger of being ready to game at 8K, even on next-gen GPUs. They would need to increase their performance output by at least a factor of 3 to even start to be viable at 8K with high graphical settings. We just aren't there yet and probably won't be for at least another 2 or 3 generations at the earliest.
     
    The point I'm trying to make is that no, 16GB will not be the baseline for VRAM, not this generation. Just because a high-end video card has 16GB of VRAM does not make that the new baseline. The standard will more than likely continue to be 8GB, with entry level starting at 4 and 6GB.


    I agree with you for the most part. 16GB of video RAM will also come in handy for people at home working with CAD, animation, and Photoshop rendering.

    AMD Ryzen 9 7950X,  Corsair Mp700 Pro M.2, 64GB Corsair Dominator Titanium DDR5  X670E Steel Legend, MSI RTX 4090 Associate Code: H5U80QBH6BH0AXF. I am NOT an employee of EVGA

    #4
    Brad_Hawthorne
    Insert Custom Title Here
    • Total Posts : 18001
    • Reward points : 0
    • Joined: 6/6/2004
    • Location: Dazed & Confused
    • Status: offline
    • Ribbons : 39
    Re: NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants Thursday, September 08, 2022 5:09 PM (permalink)
    Not a fan of a single product having different memory bus widths. That'll have an impact on performance numbers even though both will carry the same name.
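    For a sense of how much the bus width alone can matter: theoretical peak bandwidth is just the bus width in bytes times the memory data rate. The 192-bit / 256-bit widths and the 21 Gbps GDDR6X speed in the sketch below are rumored/assumed figures, not confirmed specs:

# Rough sketch of theoretical peak memory bandwidth for the two rumored configs.
# Bus widths and the 21 Gbps data rate are assumptions, not confirmed specs.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * data_rate_gbps

for label, bus_bits in [("12 GB variant (rumored 192-bit)", 192),
                        ("16 GB variant (rumored 256-bit)", 256)]:
    print(f"{label}: {peak_bandwidth_gb_s(bus_bits, 21):.0f} GB/s")
# 192-bit -> 504 GB/s vs 256-bit -> 672 GB/s: roughly a third more bandwidth
# for the wider card, despite both carrying the "RTX 4080" name.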
    #5
    Cool GTX
    EVGA Forum Moderator
    • Total Posts : 31353
    • Reward points : 0
    • Joined: 12/12/2010
    • Location: Folding for the Greater Good
    • Status: offline
    • Ribbons : 123
    Re: NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants Thursday, September 08, 2022 10:04 PM (permalink)
    Yawn. OK, someone wake me once Nvidia makes an official announcement of specs, availability & price
     
    Looking more like late 2022 & more likely 2023 before we can buy these, based on Nvidia's Q2 report

    Learn your way around the EVGA Forums, Rules & limits on new accounts Ultimate Self-Starter Thread For New Members

    I am a Volunteer Moderator - not an EVGA employee

    Older RIG projects RTX Project  Nibbler


    When someone does not use reason to reach their conclusion in the first place, you can't use reason to convince them otherwise!
    #6