EVGA

Nvidia's Dumbest Decision (AdoredTV)

Page: 1 2 > Showing page 1 of 2
Author
vulcan1978
iCX Member
  • Total Posts : 284
  • Reward points : 0
  • Joined: 2014/05/25 02:18:19
  • Status: offline
  • Ribbons : 0
2020/09/15 11:57:55 (permalink)

8700k @ 5.1 GHz - 0 AVX @ 1.386v Dynamic Offset w/ EK Monoblock + Delid | Gigabyte Z370 Aorus Gaming 7 | EVGA 2080 Ti XC2 Ultra @ 2130 Mhz core, 7950 MHz memory @ 1.063v w/ 375W FTW3 vbios + Phanteks Glacier Block  | EK CE 420 + EK XE 360 | 2x16GB G-Skill Trident Z Royal 3600 MHz 17-20-20-38 | 2 TB Sabrent Rocket | Corsair RM1000x | Thermaltake View 71 | Alienware AW3418DW + Asus ROG Swift PG278Q (for 3D Vision) on Amazon Basics Arms | Win10 Pro 1809
 
philosophersbunker.blogspot.com
#1

31 Replies Related Threads

    kevinc313
    CLASSIFIED ULTRA Member
    • Total Posts : 5004
    • Reward points : 0
    • Joined: 2019/02/28 09:27:55
    • Status: offline
    • Ribbons : 22
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 12:11:37 (permalink)
    vulcan1978
    https://youtu.be/tXb-8feWoOE
     




    I'll watch it later but do you have cliff notes please?
    #2
    vulcan1978
    iCX Member
    • Total Posts : 284
    • Reward points : 0
    • Joined: 2014/05/25 02:18:19
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 12:30:17 (permalink)
    kevinc313
    vulcan1978
    https://youtu.be/tXb-8feWoOE
     




    I'll watch it later but do you have cliff notes please?




    Basically NV went with Samsung 8nm over TSMC 7nm because of wafer cost (~$3,000 vs ~$8,000, per AdoredTV's estimates). But when you break down the yields with a 425mm² chip (the 7nm equivalent of GA102-200), the per-die cost works out to roughly $75 vs $47, a saving of only about $28 per good die. Having made that decision, NV has had to run the 8nm chips at much higher voltage and wattage (given the increased die size) and spend $155 on the FE cooler to mitigate the heat (per Igor's Lab analysis). There's also no Titan card this time around (on the larger node there is no room for a bigger, more powerful die than the 3090), so they lose the margin from Titan sales, and they will struggle to cool the mobile variants of this node in laptops next year. They also pass the cost on to the consumer in the form of much higher electricity bills. 
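    That cost-per-die arithmetic is easy to sanity-check. Here's a rough sketch using the standard dies-per-wafer approximation and a Poisson yield model; the wafer prices are AdoredTV's estimates, and the defect density is purely my assumption for illustration:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross die per wafer: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def good_die_fraction(die_area_mm2, defects_per_cm2):
    """Poisson yield model: bigger dies catch more defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def cost_per_good_die(wafer_cost, die_area_mm2, defects_per_cm2=0.1):
    good = dies_per_wafer(die_area_mm2) * good_die_fraction(die_area_mm2, defects_per_cm2)
    return wafer_cost / good

# ~$8,000 TSMC 7nm wafer with a 425 mm2 die, vs ~$3,000 Samsung 8nm wafer
# with the actual 628 mm2 GA102 die
tsmc = cost_per_good_die(8000, 425)
samsung = cost_per_good_die(3000, 628)
print(round(tsmc), round(samsung), round(tsmc - samsung))
```

    With these assumptions the cheaper wafer still wins per die, but only by a few tens of dollars, which is the video's whole point: the saving is small next to the downstream cooling and power costs.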
     
    Jim also points out that the 3090 is essentially a rebranded $1500 80 Ti (it's clearly not a Titan card, irrespective of their allusion to the Titan in their marketing), so yes, they are basically shafting everyone again with ridiculous prices. 

    Ultimately they didn't save anything and this was a really stupid decision. 
     
    The video is absolutely worth a watch, you can turn on closed captioning if you struggle with Jim's thick Irish accent. 
     
    This is 100% spot-on journalism; there's no error-prone speculation, it's very factual. 
     
    I tried to post this over at r/Nvidia and the mods took it down within 5 minutes. 
     
     

    8700k @ 5.1 GHz - 0 AVX @ 1.386v Dynamic Offset w/ EK Monoblock + Delid | Gigabyte Z370 Aorus Gaming 7 | EVGA 2080 Ti XC2 Ultra @ 2130 Mhz core, 7950 MHz memory @ 1.063v w/ 375W FTW3 vbios + Phanteks Glacier Block  | EK CE 420 + EK XE 360 | 2x16GB G-Skill Trident Z Royal 3600 MHz 17-20-20-38 | 2 TB Sabrent Rocket | Corsair RM1000x | Thermaltake View 71 | Alienware AW3418DW + Asus ROG Swift PG278Q (for 3D Vision) on Amazon Basics Arms | Win10 Pro 1809
     
    philosophersbunker.blogspot.com
    #3
    hallowen
    CLASSIFIED ULTRA Member
    • Total Posts : 5644
    • Reward points : 0
    • Joined: 2008/06/18 15:38:00
    • Location: In a Galaxy Far, Far Away...
    • Status: offline
    • Ribbons : 14
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 12:39:05 (permalink)
    Man, now that half-hour clip explains a lot that I wanted to know about Nvidia.

    ASUS: Rampage VI Extreme | i9-7940X | 2X RTX 2080 Ti Kingpin SLI | 32GB DDR4 3200MHz Memory - SAGER: NP9870-G | i7-6700K | GTX 980M 8GB | 64GB DDR4 | 950 PRO M.2 512GB | 17.3 QHD 120Hz Matte G-Sync | Prema bios - EVOC Premamod:  P870TM1 | i9-9900K-LM | RTX 2080N 8GB | Modded Vapor Chamber | 32GB 3000MHz Ripjaws | 960 EVO M.2 1TB | 17.3 3K QHD 120Hz Matte G-Sync | Intel 8265 -
     
     
    #4
    aka_STEVE_b
    EGC Admin
    • Total Posts : 17692
    • Reward points : 0
    • Joined: 2006/02/26 06:45:46
    • Location: OH
    • Status: offline
    • Ribbons : 69
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 12:42:18 (permalink)
    I could have told you Samsung has spotty QA and yields, and you shouldn't put all your eggs in one basket with them.

    AMD RYZEN 9 5900X  12-core cpu~ ASUS ROG Crosshair VIII Dark Hero ~ EVGA RTX 3080 Ti FTW3~ G.SKILL Trident Z NEO 32GB DDR4-3600 ~ Phanteks Eclipse P400s red case ~ EVGA SuperNOVA 1000 G+ PSU ~ Intel 660p M.2 drive~ Crucial MX300 275 GB SSD ~WD 2TB SSD ~CORSAIR H115i RGB Pro XT 280mm cooler ~ CORSAIR Dark Core RGB Pro mouse ~ CORSAIR K68 Mech keyboard ~ HGST 4TB Hd.~ AOC AGON 32" monitor 1440p @ 144Hz ~ Win 10 x64
    #5
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 21170
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: offline
    • Ribbons : 270
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 14:27:18 (permalink)
    Interesting breakdown.  Makes sense.

    ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
    My EVGA Score: 1546 • Zero Associates Points • I don't shill

    #6
    arestavo
    CLASSIFIED ULTRA Member
    • Total Posts : 6916
    • Reward points : 0
    • Joined: 2008/02/06 06:58:57
    • Location: Through the Scary Door
    • Status: offline
    • Ribbons : 76
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 14:37:35 (permalink)
    "because with the higher sized node there is no room for a larger more powerful die than the 3090"

    Tell that to the quadro version. Granted the leaks show only about 6% more Cuda cores - but it's still more.
    #7
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 21170
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: offline
    • Ribbons : 270
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 14:46:09 (permalink)
    arestavo
    "because with the higher sized node there is no room for a larger more powerful die than the 3090"

    Tell that to the quadro version. Granted the leaks show only about 6% more Cuda cores - but it's still more.


    Did he say that in the video? I must have missed that quote.

    Near the end he concluded that there isn't room for a Titan version because the efficiency and noise is not good enough to make a marketable Titan.  To have similar power consumption and noise levels to previous Titans, the performance would be worse than a 3080. Having no room in the product stack isn't the same as having no room on the die.
     
    Do we know that the Quadro version uses the same process, node size, and Samsung? If the reported inefficiency is true, the Quadro must be using something different.
    post edited by ty_ger07 - 2020/09/15 16:44:48

    ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
    My EVGA Score: 1546 • Zero Associates Points • I don't shill

    #8
    hallowen
    CLASSIFIED ULTRA Member
    • Total Posts : 5644
    • Reward points : 0
    • Joined: 2008/06/18 15:38:00
    • Location: In a Galaxy Far, Far Away...
    • Status: offline
    • Ribbons : 14
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 14:52:41 (permalink)
    I was particularly interested in Igor's Lab analysis concerning the extra heat that will be generated by the air-cooled Ampere 30-series due to the higher voltage/wattage Nvidia is implementing.
    Looks like if you don't need an extra heater in your room, you had better use a water block for cooling these cards.

    ASUS: Rampage VI Extreme | i9-7940X | 2X RTX 2080 Ti Kingpin SLI | 32GB DDR4 3200MHz Memory - SAGER: NP9870-G | i7-6700K | GTX 980M 8GB | 64GB DDR4 | 950 PRO M.2 512GB | 17.3 QHD 120Hz Matte G-Sync | Prema bios - EVOC Premamod:  P870TM1 | i9-9900K-LM | RTX 2080N 8GB | Modded Vapor Chamber | 32GB 3000MHz Ripjaws | 960 EVO M.2 1TB | 17.3 3K QHD 120Hz Matte G-Sync | Intel 8265 -
     
     
    #9
    crazyst888
    Superclocked Member
    • Total Posts : 107
    • Reward points : 0
    • Joined: 2009/03/23 22:24:59
    • Status: offline
    • Ribbons : 1
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 14:55:13 (permalink)
    hallowen
    I was particularly interested in Igor's Lab analysis concerning the extra heat that will be generated by the air-cooled Ampere 30-series due to the higher voltage/wattage Nvidia is implementing.
    Looks like if you don't need an extra heater in your room, you had better use a water block for cooling these cards.


    Heat is heat; you still have to pump it out of the room. 


    #10
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 21170
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: offline
    • Ribbons : 270
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 14:55:39 (permalink)
    hallowen
    Looks like if you don't need an extra heater in your room, You had better use a water block for cooling these cards.

    Does not compute.  The amount it will heat the room is the same either way, unless you plumb the radiator outside through your wall.

    ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
    My EVGA Score: 1546 • Zero Associates Points • I don't shill

    #11
    hallowen
    CLASSIFIED ULTRA Member
    • Total Posts : 5644
    • Reward points : 0
    • Joined: 2008/06/18 15:38:00
    • Location: In a Galaxy Far, Far Away...
    • Status: offline
    • Ribbons : 14
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 14:59:50 (permalink)
    crazyst888
    hallowen
    I was particularly interested in Igor's Lab analysis concerning the extra heat that will be generated by the air-cooled Ampere 30-series due to the higher voltage/wattage Nvidia is implementing.
    Looks like if you don't need an extra heater in your room, you had better use a water block for cooling these cards.


    Heat is heat; you still have to pump it out of the room. 


    No Problem, I'll be using Chilled Water Cooling with Hydro Coppers to be sure.

    ASUS: Rampage VI Extreme | i9-7940X | 2X RTX 2080 Ti Kingpin SLI | 32GB DDR4 3200MHz Memory - SAGER: NP9870-G | i7-6700K | GTX 980M 8GB | 64GB DDR4 | 950 PRO M.2 512GB | 17.3 QHD 120Hz Matte G-Sync | Prema bios - EVOC Premamod:  P870TM1 | i9-9900K-LM | RTX 2080N 8GB | Modded Vapor Chamber | 32GB 3000MHz Ripjaws | 960 EVO M.2 1TB | 17.3 3K QHD 120Hz Matte G-Sync | Intel 8265 -
     
     
    #12
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 21170
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: offline
    • Ribbons : 270
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 15:02:48 (permalink)
    hallowen
    crazyst888
    hallowen
    I was particularly interested in Igor's Lab analysis concerning the extra heat that will be generated by the air-cooled Ampere 30-series due to the higher voltage/wattage Nvidia is implementing.
    Looks like if you don't need an extra heater in your room, you had better use a water block for cooling these cards.


    Heat is heat; you still have to pump it out of the room. 


    No Problem, I'll be using Chilled Water Cooling with Hydro Coppers to be sure.


    Due to chiller inefficiencies, you will be increasing the heat in the room more than if you left the stock heatsink on it.  Gotcha.
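    The point about chiller inefficiency is just an energy balance: with the chiller's condenser in the same room, the room absorbs the card's heat plus the compressor's electrical input. A rough sketch (the COP value is an assumption for illustration):

```python
def heat_into_room_watts(gpu_watts, chiller_cop=3.0):
    """The chiller moves gpu_watts of heat out of the loop, but its
    compressor draws gpu_watts / COP of electricity to do it; both
    end up dumped at the condenser."""
    compressor_watts = gpu_watts / chiller_cop
    return gpu_watts + compressor_watts

print(heat_into_room_watts(350))  # a 350 W card heats the room like ~467 W
```

    Unless the condenser exhausts outside (as hallowen later claims his does), the chiller makes the room warmer, not cooler.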

    ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
    My EVGA Score: 1546 • Zero Associates Points • I don't shill

    #13
    hallowen
    CLASSIFIED ULTRA Member
    • Total Posts : 5644
    • Reward points : 0
    • Joined: 2008/06/18 15:38:00
    • Location: In a Galaxy Far, Far Away...
    • Status: offline
    • Ribbons : 14
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 15:06:53 (permalink)
    My A/C Conversion Chiller Exhausts the Heat outside my room.
    Back atcha

    ASUS: Rampage VI Extreme | i9-7940X | 2X RTX 2080 Ti Kingpin SLI | 32GB DDR4 3200MHz Memory - SAGER: NP9870-G | i7-6700K | GTX 980M 8GB | 64GB DDR4 | 950 PRO M.2 512GB | 17.3 QHD 120Hz Matte G-Sync | Prema bios - EVOC Premamod:  P870TM1 | i9-9900K-LM | RTX 2080N 8GB | Modded Vapor Chamber | 32GB 3000MHz Ripjaws | 960 EVO M.2 1TB | 17.3 3K QHD 120Hz Matte G-Sync | Intel 8265 -
     
     
    #14
    HeavyHemi
    Insert Custom Title Here
    • Total Posts : 15665
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 135
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 15:08:54 (permalink)
    Adored's track record is about 50%. He's just another popular guy on the net with no actual industry experience to speak of. His opinions should carry no more weight than any other random on the web. Progress.

    EVGA X99 FTWK / i7 6850K @ 4.5ghz / RTX 3080Ti FTW Ultra / 32GB Corsair LPX 3600mhz / Samsung 850Pro 256GB / Be Quiet BN516 Straight Power 12-1000w 80 Plus Platinum / Window 10 Pro
     
    #15
    jonkrmr
    SSC Member
    • Total Posts : 952
    • Reward points : 0
    • Joined: 2006/09/19 13:05:11
    • Location: California USA
    • Status: offline
    • Ribbons : 16
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 15:52:58 (permalink)
    He keeps asking why Nvidia all of a sudden massively increased CUDA cores with Ampere (10,496 on the 3090) compared with the normal generation-to-generation increase. 10,496 is not the true CUDA core count for the Ampere 3090; it is half that. Nvidia is using it as a marketing gimmick. The 3090 has 5,248 true CUDA cores that can each do two instructions per clock cycle, so Nvidia markets it as 10,496 CUDA cores.

    Intel i9-10850K @ 5 GHz
    MSI MEG Z490 Unify
    Corsair Vengeance RGB RT 32GB 3600MHz DDR4
    ASUS Strix RTX 3080 OC 12GB Gaming @ 2175 MHz core - peak \ 20004 MHz mem
    Samsung 970 EVO Plus 500GB NVMe M.2 SSD
    2x Samsung 970 EVO Plus 2TB NVMe M.2 SSD RAID 0
    SoundBlasterX AE-5 
    EVGA SuperNova 1000 P2
    Corsair Obsidian 500D SE
    Custom water cooling on CPU & GPU
    Acer XV272U 27" 2k 170Hz
    #16
    Intoxicus
    iCX Member
    • Total Posts : 406
    • Reward points : 0
    • Joined: 2009/10/23 19:03:35
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 15:56:07 (permalink)
    HeavyHemi
    Adored's track record is about 50%. He's just another popular guy on the net with no actual industry experience to speak of. His opinions should carry no more weight than any other random on the web. Progress.



    Indeed. Dude is good at seeming more intelligent and informed than he actually is. Where did he get all those numbers from? What are his verifiable sources?

    As far as I can tell it's biased poop talk that appeals to people that share the same bias. Confirmation bias goes waaayyy too far these days.

    He's not even that popular at less than 100k subs on YT. 

    "Humans are not rational animals, humans are rationalizing animals." -Robert A Heinlein
    #17
    Intoxicus
    iCX Member
    • Total Posts : 406
    • Reward points : 0
    • Joined: 2009/10/23 19:03:35
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 16:03:10 (permalink)
    "The RTX 3000 cards are built on an architecture NVIDIA calls "Ampere," and its SM, in some ways, takes both the Pascal and the Turing approach. Ampere keeps the 64 FP32 cores as before, but the 64 other cores are now designated as "FP32 and INT32.” So, half the Ampere cores are dedicated to floating-point, but the other half can perform either floating-point or integer math, just like in Pascal.
    With this switch, NVIDIA is now counting each SM as containing 128 FP32 cores, rather than the 64 that Turing had. The 3070's "5,888 cuda cores" are perhaps better described as "2,944 cuda cores, and 2,944 cores that can be cuda.""

    https://www.engadget.com/...LwfJ8-uW2Sbp49C3qZNm-R|

    The truth is more interesting and nuanced. And none of those biased sources are accurately reporting what the core count actually is and what that actually means.
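    For what it's worth, the counting scheme the Engadget quote describes lines up exactly with the published SM counts (46 SMs on the 3070, 82 on the 3090); a quick sketch:

```python
# Ampere SM counting per the Engadget explanation: each SM has 64 dedicated
# FP32 cores plus 64 cores that can do FP32 *or* INT32 each clock, and
# NVIDIA counts both groups as "CUDA cores".
def ampere_marketed_cuda_cores(sm_count):
    dedicated_fp32 = sm_count * 64
    flexible_fp32_or_int32 = sm_count * 64
    return dedicated_fp32 + flexible_fp32_or_int32

def turing_cuda_cores(sm_count):
    return sm_count * 64  # Turing counted only the dedicated FP32 cores

print(ampere_marketed_cuda_cores(46))  # RTX 3070: 5888
print(ampere_marketed_cuda_cores(82))  # RTX 3090: 10496
```

    So "half that" isn't quite right either: half the marketed cores are always FP32, and the other half are FP32 only when they aren't busy doing integer work.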

    "Humans are not rational animals, humans are rationalizing animals." -Robert A Heinlein
    #18
    flyingtoaster85
    iCX Member
    • Total Posts : 282
    • Reward points : 0
    • Joined: 2016/12/10 22:24:59
    • Location: Los Angeles
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 16:24:38 (permalink)
    The 3000 series is shaping up to be a watercooler's dream. Unlike the power-limited Pascal, I think the core speed will improve dramatically with cooling.

    Main: 10700k @ 5.4ghz, 3090 K|NGP|N Hydrocopper, 4x8 @4300 16-16-16-36, EVGA 1200 P2, MSI Z490 Unify, 5 radiators, 2 pumps. Heavily modified Evolv ATX
     
    Travel size: Zen3 5800H, RTX 3060 Laptop GPU w/unlocked bios, 2x16 Kingston HyperX 3200 C20
    #19
    kevinc313
    CLASSIFIED ULTRA Member
    • Total Posts : 5004
    • Reward points : 0
    • Joined: 2019/02/28 09:27:55
    • Status: offline
    • Ribbons : 22
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 16:37:12 (permalink)
    vulcan1978
    kevinc313
    vulcan1978
    https://youtu.be/tXb-8feWoOE
     




    I'll watch it later but do you have cliff notes please?




    Basically NV went with Samsung 8nm over TSMC 7nm because of wafer cost (~$3,000 vs ~$8,000, per AdoredTV's estimates). But when you break down the yields with a 425mm² chip (the 7nm equivalent of GA102-200), the per-die cost works out to roughly $75 vs $47, a saving of only about $28 per good die. Having made that decision, NV has had to run the 8nm chips at much higher voltage and wattage (given the increased die size) and spend $155 on the FE cooler to mitigate the heat (per Igor's Lab analysis). There's also no Titan card this time around (on the larger node there is no room for a bigger, more powerful die than the 3090), so they lose the margin from Titan sales, and they will struggle to cool the mobile variants of this node in laptops next year. They also pass the cost on to the consumer in the form of much higher electricity bills. 
     
    Jim also points out that the 3090 is essentially a rebranded $1500 80 Ti (it's clearly not a Titan card, irrespective of their allusion to the Titan in their marketing), so yes, they are basically shafting everyone again with ridiculous prices. 

    Ultimately they didn't save anything and this was a really stupid decision. 
     
    The video is absolutely worth a watch, you can turn on closed captioning if you struggle with Jim's thick Irish accent. 
     
    This is 100% spot-on journalism; there's no error-prone speculation, it's very factual. 
     
    I tried to post this over at r/Nvidia and the mods took it down within 5 minutes. 
     
     




    Hot damn. 14 hours till the FE 3080 review drops.
    #20
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 21170
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: offline
    • Ribbons : 270
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 16:38:51 (permalink)
    flyingtoaster85
    The 3000 series is shaping up to be a watercooler's dream. Unlike the power-limited Pascal, I think the core speed will improve dramatically with cooling.


    Rather than being artificially power limited, they may be power hungry. Hardly a plus, but I think you are right that cooling will be the key to the performance of this series.
    post edited by ty_ger07 - 2020/09/15 16:44:15

    ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
    My EVGA Score: 1546 • Zero Associates Points • I don't shill

    #21
    Delirious
    EVGA Forum Moderator
    • Total Posts : 17474
    • Reward points : 0
    • Joined: 2007/11/15 13:34:04
    • Location: at my computer
    • Status: offline
    • Ribbons : 61
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 16:45:42 (permalink)
    hallowen
    crazyst888
    hallowen
    I was particularly interested in Igor's Lab analysis concerning the extra heat that will be generated by the air-cooled Ampere 30-series due to the higher voltage/wattage Nvidia is implementing.
    Looks like if you don't need an extra heater in your room, you had better use a water block for cooling these cards.


    Heat is heat; you still have to pump it out of the room. 


    No Problem, I'll be using Chilled Water Cooling with Hydro Coppers to be sure.


    going the same route, minus the Chiller......winters coming.

    "Be quick to listen, slow to speak and slow to anger" 
    Affiliate Code XZUMV9TJW5
    Associate Code: 7PM43CU71IB2IAP
    education may be expensive but wait until you get the bill for ignorance
    A wise man once said that we can't make anyone feel or do anything. We can throw things into the wind, but it's up to each person to decide how they want to react, where they want to stand when things fall.
    #22
    HeavyHemi
    Insert Custom Title Here
    • Total Posts : 15665
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 135
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 17:30:15 (permalink)
    flyingtoaster85
    The 3000 series is shaping up to be a watercooler's dream. Unlike the power-limited Pascal, I think the core speed will improve dramatically with cooling.

    Hmm... Pascal was primarily temp limited. I bounce off 130% power with impunity, even undervolted. Your first temp bin is all the way down around 39°C, and it drops like a stone after that.

    EVGA X99 FTWK / i7 6850K @ 4.5ghz / RTX 3080Ti FTW Ultra / 32GB Corsair LPX 3600mhz / Samsung 850Pro 256GB / Be Quiet BN516 Straight Power 12-1000w 80 Plus Platinum / Window 10 Pro
     
    #23
    rjbarker
    CLASSIFIED Member
    • Total Posts : 3214
    • Reward points : 0
    • Joined: 2008/03/20 10:07:05
    • Location: Vancouver Isle - Westcoast Canada
    • Status: offline
    • Ribbons : 21
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 19:07:18 (permalink)
    My 1080 Ti's have had EK blocks on 'em since day 1... been running at 2050 MHz with no issues, although I have since dialed back to 2035... no added voltage, no BIOS mods... never get to much more than 42°C under full load. Really hoping the 3080 with a block can reach somewhere around the same... we'll see soon.
     
    Been a very long time since I'll have had only 1 card in my case!

    I9 12900K EK Velocity2 / ROG Z690 Apex/ 32G Dominator DDR5 6000/ Evga RTX 3080Ti FTW3  EK Vector / 980 Pro 512G / 980 Pro 1TB/ Samsung 860 Pro 500G/ WD 4TB Red / AX 1600i /  Corsair 900D & XSPC 480 * 360 * 240 Rads   XSPC Photon 170 Rez-Vario Pump Combo - Alienware 3440*1440p 120Hz/ W11
     
    #24
    flyingtoaster85
    iCX Member
    • Total Posts : 282
    • Reward points : 0
    • Joined: 2016/12/10 22:24:59
    • Location: Los Angeles
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 19:34:00 (permalink)
    rjbarker
    My 1080Ti's have had EK Blocks on em since day 1......been running at 2050 Mhz with no issues....although I have since dialed back to 2035 .....no added voltage ...no BIOS Mods.....never get to much more than 42c under full load.......really hoping the 3080 with Block can reach somewhere around the same....we'll see soon.
     
    Been a very long time since I'll have had only 1 Card in my Case !


    Was the 15 MHz loss due to degradation?

    Main: 10700k @ 5.4ghz, 3090 K|NGP|N Hydrocopper, 4x8 @4300 16-16-16-36, EVGA 1200 P2, MSI Z490 Unify, 5 radiators, 2 pumps. Heavily modified Evolv ATX
     
    Travel size: Zen3 5800H, RTX 3060 Laptop GPU w/unlocked bios, 2x16 Kingston HyperX 3200 C20
    #25
    rjbarker
    CLASSIFIED Member
    • Total Posts : 3214
    • Reward points : 0
    • Joined: 2008/03/20 10:07:05
    • Location: Vancouver Isle - Westcoast Canada
    • Status: offline
    • Ribbons : 21
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 19:57:13 (permalink)
    flyingtoaster85
    rjbarker
    My 1080Ti's have had EK Blocks on em since day 1......been running at 2050 Mhz with no issues....although I have since dialed back to 2035 .....no added voltage ...no BIOS Mods.....never get to much more than 42c under full load.......really hoping the 3080 with Block can reach somewhere around the same....we'll see soon.
     
    Been a very long time since I'll have had only 1 Card in my Case !


    Was the 15 mhz loss due to degradation?




    No, not at all... was doing some bench runs and sniffing around for 5.3 GHz on the 9900K, and decided to drop my GPUs down from 2050 to 2035 MHz... left 'em there. I haven't run many benchmarks since building this 9900K system, and gaming sees minuscule gains from 2000 MHz to 2050 MHz. I actually had both cards running at 2065 MHz for quite a while when I first got 'em... anything more wasn't stable.
    They literally have never seen more than 42°C / 43°C... it's a pretty decent size loop ;)

    I9 12900K EK Velocity2 / ROG Z690 Apex/ 32G Dominator DDR5 6000/ Evga RTX 3080Ti FTW3  EK Vector / 980 Pro 512G / 980 Pro 1TB/ Samsung 860 Pro 500G/ WD 4TB Red / AX 1600i /  Corsair 900D & XSPC 480 * 360 * 240 Rads   XSPC Photon 170 Rez-Vario Pump Combo - Alienware 3440*1440p 120Hz/ W11
     
    #26
    kialdadial
    New Member
    • Total Posts : 1
    • Reward points : 0
    • Joined: 2020/09/06 19:25:30
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 20:36:06 (permalink)
    vulcan1978
    kevinc313
    vulcan1978

     




    I'll watch it later but do you have cliff notes please?




    Basically NV went with Samsung 8nm over TSMC 7nm because of wafer cost (~$3,000 vs ~$8,000, per AdoredTV's estimates). But when you break down the yields with a 425mm² chip (the 7nm equivalent of GA102-200), the per-die cost works out to roughly $75 vs $47, a saving of only about $28 per good die. Having made that decision, NV has had to run the 8nm chips at much higher voltage and wattage (given the increased die size) and spend $155 on the FE cooler to mitigate the heat (per Igor's Lab analysis). There's also no Titan card this time around (on the larger node there is no room for a bigger, more powerful die than the 3090), so they lose the margin from Titan sales, and they will struggle to cool the mobile variants of this node in laptops next year. They also pass the cost on to the consumer in the form of much higher electricity bills. 
     
    Jim also points out that the 3090 is essentially a rebranded $1500 80 Ti (it's clearly not a Titan card, irrespective of their allusion to the Titan in their marketing), so yes, they are basically shafting everyone again with ridiculous prices. 

    Ultimately they didn't save anything and this was a really stupid decision. 
     
    The video is absolutely worth a watch, you can turn on closed captioning if you struggle with Jim's thick Irish accent. 
     
    This is 100% spot-on journalism; there's no error-prone speculation, it's very factual. 
     
    I tried to post this over at r/Nvidia and the mods took it down within 5 minutes. 
     
     



    I wouldn't go as far as to say 100% spot-on journalism here; he uses numbers he just makes up and rolls with them through the second half of the video. Literally everything after the wafer pricing, which he bases on that pricing, is speculation, not fact.

    I also don't agree with the rebranding of it being an 80 Ti.

    If anything, the 90 is a new model they are putting in between the 80 series and the Titan series. Reasoning:
    The 80 series has room in the current lineup for a Ti and a Super model down the road.
    They have marketed the 90 series as a Titan-class GPU for the consumer market, but Titans weren't built for games, so why is that a point they are making now? Either they are dropping the Titan name (and some of the specs that make a Titan), or they are using the 90 series as a buffer to release a Titan or 90 Ti variant at a later date. Either way, it is not an 80 Ti equivalent. 
    #27
    vulcan1978
    iCX Member
    • Total Posts : 284
    • Reward points : 0
    • Joined: 2014/05/25 02:18:19
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 21:04:53 (permalink)
    kevinc313
    vulcan1978
    kevinc313
    vulcan1978
    https://youtu.be/tXb-8feWoOE
     




    I'll watch it later but do you have cliff notes please?




    Basically NV went with Samsung 8nm over TSMC 7nm because of wafer cost (~$3,000 vs ~$8,000, per AdoredTV's estimates). But when you break down the yields with a 425mm² chip (the 7nm equivalent of GA102-200), the per-die cost works out to roughly $75 vs $47, a saving of only about $28 per good die. Having made that decision, NV has had to run the 8nm chips at much higher voltage and wattage (given the increased die size) and spend $155 on the FE cooler to mitigate the heat (per Igor's Lab analysis). There's also no Titan card this time around (on the larger node there is no room for a bigger, more powerful die than the 3090), so they lose the margin from Titan sales, and they will struggle to cool the mobile variants of this node in laptops next year. They also pass the cost on to the consumer in the form of much higher electricity bills. 
     
    Jim also points out that the 3090 is essentially a rebranded $1500 80 Ti (it's clearly not a Titan card, irrespective of their allusion to the Titan in their marketing), so yes, they are basically shafting everyone again with ridiculous prices. 

    Ultimately they didn't save anything and this was a really stupid decision. 
     
    The video is absolutely worth a watch, you can turn on closed captioning if you struggle with Jim's thick Irish accent. 
     
    This is 100% spot-on journalism; there's no error-prone speculation, it's very factual. 
     
    I tried to post this over at r/Nvidia and the mods took it down within 5 minutes. 
     
     




    Hot damn. 14 hours till the FE 3080 review drops.




    You would like my replies over at the sister thread on OC.net (Vulcan1978): 
     
    https://www.overclock.net/threads/nvidias-dumbest-decision-adoredtv.1773304/
     

    8700k @ 5.1 GHz - 0 AVX @ 1.386v Dynamic Offset w/ EK Monoblock + Delid | Gigabyte Z370 Aorus Gaming 7 | EVGA 2080 Ti XC2 Ultra @ 2130 Mhz core, 7950 MHz memory @ 1.063v w/ 375W FTW3 vbios + Phanteks Glacier Block  | EK CE 420 + EK XE 360 | 2x16GB G-Skill Trident Z Royal 3600 MHz 17-20-20-38 | 2 TB Sabrent Rocket | Corsair RM1000x | Thermaltake View 71 | Alienware AW3418DW + Asus ROG Swift PG278Q (for 3D Vision) on Amazon Basics Arms | Win10 Pro 1809
     
    philosophersbunker.blogspot.com
    #28
    Nereus
    Captain Goodvibes
    • Total Posts : 18915
    • Reward points : 0
    • Joined: 2009/04/09 20:05:53
    • Location: Brooklyn, NYC.
    • Status: offline
    • Ribbons : 58
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/15 21:14:36 (permalink)
     
    Improved performance is improved performance.
     


      BUILD 1 2   |   MINI-ITX BUILD   |   MODSRIGS $1K WIN   |   HEATWARE 111-0-0   |   ASSOCIATE CODE CSKKXUT5Q9GVAFR

    #29
    vulcan1978
    iCX Member
    • Total Posts : 284
    • Reward points : 0
    • Joined: 2014/05/25 02:18:19
    • Status: offline
    • Ribbons : 0
    Re: Nvidia's Dumbest Decision (AdoredTV) 2020/09/16 08:01:59 (permalink)
    I'm not active here; if you want to see my thoughts and replies to similar comments, follow the OC.net link at the bottom. Here's my last update, though:
     
    Here's my comment from last night, before the reviews even went live: 
     
    "
    And basically, yeah Jim from AdoredTV is correct, the 3090 is ultimately the 2080 Ti all over again, this time +300 on top of $1200, and at 375 with a 450-500w maximum power draw on the efficiency curve it wont overclock like TU-102 (from 260w to 475w). You might get another 20% out of it at 450w whereas TU-102 @ 2200 MHz is like a 31% overclock (17,500-17,750 Timespy GPU). Basically NV had to pre-overclock the cards from the factory in order for Samsung 8nm EUV GA-102-300 to have a ~25% gap on TU-102 and for GA-102-200 to have a ~45% gap on it but when you overclock both cards reduce the final figure by 10% because TU-102 can overclock higher more (~30% vs ~20%). Again, 260w to 475w = 83% increase in power320 to 475w = ~48.3% increase in power Non-overclocked / shunted, top tier AIB (i.e. Kingpin): 260-320w =23% increase in power320-350 = 9% in power ("reference")320-370 = 15% in power (FE, limited run)   Overclocked 2080 Ti will be 15% slower than overclocked 3080 and 35% slower than the 3090 in rasterization. RT is a different story (leaked bench shows the 3080 doing Port Royal 45% faster than 2080 Ti, but we may as well reduce this amount by 10% at least comparing overclocked 2080 Ti to the 3080 considering that Samsung B-Die can do +1000 MHz whereas Micron runs too hot / requires more voltage and doesn't clock as high. I picked up 1k points in Port Royal solely from a memory overclock of +1000 MHz. That's fairly massive just from a memory overclock. 10,369: https://www.3dmark.com/pr/251502 vs the 3080 @ 11,412" And follow up this morning:  "Well well well, the 3080 does not clock higher just like I stated, and those leaked benches? Yeah those were legit, making the 3080 only 25% faster than the 2080 Ti in rasterization. Oh and hey, how about that non-existent overclock, a whopping +25 MHz on the core! Look at that 500 point increase in Timespy! What is this, 18,300 with an
    overclock? 
     
    Look at those screaming clocks! "1950-2000 MHz" OOOH YEAH, IT'S CLOCKING WAY HIGHER THAN TURING.
     
    https://www.guru3d.com/ar...founder_review,31.html "Absolute Hype Disappoints Absolutely" - LTT 
     
    https://youtu.be/AG_ZHi3tuyk
     
    https://www.overclock.net/threads/nvidias-dumbest-decision-adoredtv.1773304/page-2  
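    The power deltas quoted above are just percent increases over the old limit; a quick check with the figures from the post:

```python
def pct_increase(old_watts, new_watts):
    """Percent increase going from old_watts to new_watts."""
    return (new_watts - old_watts) / old_watts * 100

# Pairs from the post: TU102 stock->max, GA102 stock->max, and the
# stock/shunt/reference/FE steps
for old, new in [(260, 475), (320, 475), (260, 320), (320, 350), (320, 370)]:
    print(f"{old}W -> {new}W: {pct_increase(old, new):.1f}%")
```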

    8700k @ 5.1 GHz - 0 AVX @ 1.386v Dynamic Offset w/ EK Monoblock + Delid | Gigabyte Z370 Aorus Gaming 7 | EVGA 2080 Ti XC2 Ultra @ 2130 Mhz core, 7950 MHz memory @ 1.063v w/ 375W FTW3 vbios + Phanteks Glacier Block  | EK CE 420 + EK XE 360 | 2x16GB G-Skill Trident Z Royal 3600 MHz 17-20-20-38 | 2 TB Sabrent Rocket | Corsair RM1000x | Thermaltake View 71 | Alienware AW3418DW + Asus ROG Swift PG278Q (for 3D Vision) on Amazon Basics Arms | Win10 Pro 1809
     
    philosophersbunker.blogspot.com
    #30