Flint 1760
Omnipotent Enthusiast
- Total Posts : 8149
- Reward points : 0
- Joined: 2009/04/26 15:44:26
- Status: offline
- Ribbons : 45
VideoCardz - Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapter to comply with PCIe Gen5 specs

“We have been covering PCIe Gen5 power connectors for a while now. The first information emerged back in October last year, when rumors about the GeForce RTX 3090 Ti's possible specs were leaked. Nearly 4 months later, we know a lot more about both the new NVIDIA card and the connector itself.

The PCIe Gen5 power cable, also known as 12VHPWR or the 16-pin power cable, is a new standard that will replace existing 8-pin 150W power cables with a single 600W power cable. In order to comply with the 600W spec, the cable needs 12 wires for the current and 4 data paths for the signal. The data connectors ensure that the connection has been established and that the card actually requires more than 450W of power. If those data signals are not present, the power supply will feed up to 450W of power.

According to Gigabyte, next-gen PCIe Gen5 graphics cards will be powered either through a single 16-pin power cable or a three-8-pin-to-one-16-pin cable adapter:

‘The UD1000GM PCIE 5.0 power supply supports PCIe Gen 5.0 graphics cards and is capable of delivering the increasing power that high-end graphics cards demand. Traditional power supplies need a three-8-pin-to-16-pin adapter to support the latest PCIe Gen 5.0 graphics cards. The new UD1000GM PCIE 5.0 power supply needs only a single 16-pin cable to directly supply power to PCIe Gen 5.0 graphics cards. Moreover, the PCIe Gen 5.0 16-pin cable not only provides up to 600 watts of power to the graphics card, it also reduces the number of cables, significantly cutting cable clutter, making graphics card installation easier and helping with airflow in the chassis.
In addition, the UD1000GM PCIE 5.0 also provides four PCIe 8-pins for graphics cards, so it can meet the needs of current and next-generation high-end graphics cards at the same time.’ - Gigabyte PR

This is actually the first-ever mention of a triple-8-pin-to-16-pin adapter. It is unclear whether Gigabyte is referring to the 12+0-pin cable or the actual 16-pin cable. The latter must have the data signals connected to comply with the 600W spec; otherwise, 450W of power will be supplied. Official PCI-SIG PCIe Gen5 power specs are not yet available, hence it is unclear how different the PCIe Gen5 “12+0-pin” cable is from NVIDIA's 12-pin. What we do know is that they are compatible with each other, but NVIDIA never confirmed how much power can go through this cable. NVIDIA RTX 30 Founders Edition models equipped with a 12-pin connector make use of a 2x 8-pin to 1x 12-pin adapter, theoretically feeding only up to 300W of power, whereas the ASUS Thor 12-pin connector and cable (not adapter) are officially rated up to 450W. Due to this distinction, one must assume that they are not ‘exactly’ the same. Things should become clear once the GeForce RTX 3090 Ti and PCIe Gen5 power supplies launch. Those are expected in the coming weeks.”

My thoughts: Eventually we will need either adapters or new PSUs. With the power requirements that are projected for the next generation of upper-end GPUs, 1200W-plus PSUs are going to be a hot item.
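The 450W-vs-600W fallback behavior the article describes can be sketched as a small function. This is purely illustrative: the official PCI-SIG sense-signal table was not public at the time, so the function name, parameters, and the simple present/absent model here are assumptions, not the real negotiation protocol.

```python
# Illustrative sketch of the 12VHPWR behavior described in the article:
# the four data/sense lines tell the PSU whether the card may draw the
# full advertised power; if they are absent, delivery is capped at 450 W.
# This mapping is hypothetical -- the PCI-SIG spec was not yet published.

def max_power_watts(sense_signals_present: bool, advertised_watts: int = 600) -> int:
    """Return the power the PSU will agree to deliver, per the article."""
    FALLBACK_LIMIT = 450  # per the article: no data signals -> up to 450 W
    if not sense_signals_present:
        return min(advertised_watts, FALLBACK_LIMIT)
    return advertised_watts

print(max_power_watts(True))   # full 16-pin cable with data lines: 600
print(max_power_watts(False))  # adapter without data lines: capped at 450
```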
|
Michapolys
Superclocked Member
- Total Posts : 199
- Reward points : 0
- Joined: 2021/12/25 09:26:18
- Status: offline
- Ribbons : 2
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 08:37:36
(permalink)
Three 8-pin to 16-pin seems overstated.
The whole press release looks like an infomercial advertising their new PSU.
|
Miguell
FTW Member
- Total Posts : 1112
- Reward points : 0
- Joined: 2008/04/16 14:43:51
- Location: Portugal
- Status: offline
- Ribbons : 0
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 09:39:28
(permalink)
well.. with the price of electricity rising every year, and the obvious need to have a Chernobyl nuclear reactor in the backyard to power newer GPUs, it's time to stop... PC gaming has been fun these last 22 years.. altho I'll wait for 2025 and see if by then they still keep up this trend...
Case: Cooler Master Stacker 830 • Display: 32" AOC Q3279VWFD8 @2560x1440@75Hz • Cpu: Intel Core i7-8700 • Cpu Cooler: Cooler Master - MasterLiquid ML120L - RGB • Mobo: Asus ROG Strix Z390-H Gaming • Vga: Asus Dual RTX 4060 Ti 16GB Advanced Edition • Ram: 32GB DDR4 G.SKILL - RIPJAWS V @3200Mhz • Sound: Hama uRage soundZbar 2.1 Unleashed - (Optical) • Storage: 500GB SSD M.2 A2000 NVMe Kingston (OS) + 8TB (4+4) HDD X300 Toshiba (Data) • Psu: SeaSonic M12 700W • Os: W10 Pro 64Bit
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 10:09:15
(permalink)
Michapolys Three 8-pin to 16-pin seems overstated.
What do you mean?
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
Hoggle
EVGA Forum Moderator
- Total Posts : 8899
- Reward points : 0
- Joined: 2003/10/13 22:10:45
- Location: Eugene, OR
- Status: offline
- Ribbons : 4
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 10:48:20
(permalink)
I will wait and see, since it could be just a lot of marketing fluff that will not really matter. I know the connector is coming, but as long as the graphics cards have an adapter in the box like the Founders Edition had, I am not going to rush out and buy a PSU.
|
Michapolys
Superclocked Member
- Total Posts : 199
- Reward points : 0
- Joined: 2021/12/25 09:26:18
- Status: offline
- Ribbons : 2
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 11:48:44
(permalink)
ty_ger07
Michapolys Three 8-pin to 16-pin seems overstated.
What do you mean?
Exactly what I said. An adapter with three 8-pin inputs for one 16-pin output does not make sense.
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 12:09:43
(permalink)
Michapolys
ty_ger07
Michapolys Three 8-pin to 16-pin seems overstated.
What do you mean?
Exactly what I said. An adapter with three 8-pin inputs for one 16-pin output does not make sense.
I was wondering if that is what you were saying. Actually it does make sense. Specifications and standards exist for a reason. 16-pin is coming for a reason, and we will need something to bridge the gap during the transition.
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
Michapolys
Superclocked Member
- Total Posts : 199
- Reward points : 0
- Joined: 2021/12/25 09:26:18
- Status: offline
- Ribbons : 2
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 12:31:55
(permalink)
ty_ger07 Actually it does make sense. Specifications and standards exist for a reason. 16-pin is coming for a reason, and we will need something to bridge the gap during the transition.
Why? Three 8-pin inputs for a single 16-pin output seems nonsensical here. Why does it make sense to you?
|
yaymz
SSC Member
- Total Posts : 736
- Reward points : 0
- Joined: 2006/09/08 07:14:31
- Status: offline
- Ribbons : 4
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 12:45:45
(permalink)
ty_ger07
Michapolys
ty_ger07
Michapolys Three 8-pin to 16-pin seems overstated.
What do you mean?
Exactly what I said. An adapter with three 8-pin inputs for one 16-pin output does not make sense.
I was wondering if that is what you were saying.
Actually it does make sense. Specifications and standards exist for a reason. 16-pin is coming for a reason, and we will need something to bridge the gap during the transition.
I believe the 16-pin (12+4) is supposed to be on its own dedicated 12V rail (coming out on PCIe 5.0 PSUs). The three 8-pins would use a shared rail and hence may cause power-related issues, due to a number of variables within a build. I believe the article misspoke when it claimed that adapters will be a viable option. Would three 8-pin connectors work? I would say probably yes.. but you increase the chance of power issues happening at high load. This is just my assumption.

edit: There are also no signal lanes with an adapter, so that alone could cause issues with any upcoming PCIe 5.0 GPUs.
post edited by yaymz - 2022/02/21 12:52:22
cpu: Intel 12900k - EK Quantum Magnitude waterblock • mobo: Asus z690 Apex • ram: G.skill DDR5 @ 6000+ • gpu: MSI 4090 Suprim Liquid X on EK-Quantum Vector² Trio ABP waterblock • ssd: Samsung m.2 980 Pro 2TB (x2) • psu: beQuiet Dark Power Pro 1500w • case: Lian-li o11d xl • monitor: Asus ROG Swift 27" 1440p @240hz (PG279QM)
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 14:04:09
(permalink)
Michapolys
ty_ger07 Actually it does make sense. Specifications and standards exist for a reason. 16-pin is coming for a reason, and we will need something to bridge the gap during the transition.
Why? Three 8-pin inputs for a single 16-pin output seem nonsensical here. Why do they make sense for you?
The rating of the connectors (contact pressure, contact area, size, metallurgy, coating) and the rating of the plastic (durability, heat resistance, flash point) dictate the amperage capability, not just the number of pins. Sixteen higher-rated contacts can handle what 24 lower-rated contacts can. As stated, standards mean something. We are heading to a 16-pin connector because it can handle more with less. In the meantime, these adapters are a bridge. Would you rather that replacing the PSU was the only option?

What I hope for is that one day the reserved/unused pin gets used for remote voltage sensing. I think we are long past the point where the PSU should have remote voltage sensing capability, in order to detect and counteract voltage droop in the wires, and to shut down on fault detection (overload) if the voltage droop becomes too great.
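The "more with less" point comes down to per-contact current. A rough back-of-the-envelope check (assuming, as the article states, 6 current-carrying 12V/ground pairs on the 16-pin, and the conventional 3 12V pins on an 8-pin, all on a 12V rail):

```python
# Rough per-pin current comparison: why fewer, higher-rated contacts can
# carry more total power. Pin counts follow the conventions quoted in this
# thread; the results are just I = P / V arithmetic, not spec values.

VOLTAGE = 12.0  # nominal 12V rail

def amps_per_pin(total_watts: float, current_pins: int) -> float:
    """Current each 12V contact must carry to deliver total_watts."""
    return total_watts / VOLTAGE / current_pins

# 16-pin (12VHPWR): 6 x 12V contacts carrying up to 600 W
print(round(amps_per_pin(600, 6), 2))  # ~8.33 A per contact

# 8-pin PCIe: 3 x 12V contacts carrying its rated 150 W
print(round(amps_per_pin(150, 3), 2))  # ~4.17 A per contact
```

So the new connector asks roughly twice the current of each contact that the 8-pin spec does, which is why the contact and housing ratings, not the pin count, are the limiting factor.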
post edited by ty_ger07 - 2022/02/21 14:42:45
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
Michapolys
Superclocked Member
- Total Posts : 199
- Reward points : 0
- Joined: 2021/12/25 09:26:18
- Status: offline
- Ribbons : 2
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 14:50:35
(permalink)
ty_ger07 The rating of the connectors (contact pressure, contact area, size, metallurgy, coating) and the rating of the plastic (durability, heat resistance, flash point) dictate the amperage capability, not just the number of pins. 16 higher-rated contacts can handle what 24 lower-rated contacts can. As stated, standards mean something. We are heading to a 16-pin connector because it can handle more with less. In the mean-time, these adapters are a bridge. Would you rather that replacing the PSU was the only option? What I hope for is one day the reserved/unused pin gets used for remote voltage sensing. I think we are long past the point that the PSU should have remote voltage sensing capability in order to detect and counteract voltage droop in the wires, and shut down due to fault detection (overload) if the voltage droop becomes too great.
This provides zero explanation on why three 8-pin inputs would make sense. The number seems nonsensical here.
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 18:23:56
(permalink)
Michapolys
ty_ger07 The rating of the connectors (contact pressure, contact area, size, metallurgy, coating) and the rating of the plastic (durability, heat resistance, flash point) dictate the amperage capability, not just the number of pins. 16 higher-rated contacts can handle what 24 lower-rated contacts can. As stated, standards mean something. We are heading to a 16-pin connector because it can handle more with less. In the mean-time, these adapters are a bridge. Would you rather that replacing the PSU was the only option? What I hope for is one day the reserved/unused pin gets used for remote voltage sensing. I think we are long past the point that the PSU should have remote voltage sensing capability in order to detect and counteract voltage droop in the wires, and shut down due to fault detection (overload) if the voltage droop becomes too great.
This provides zero explanation on why three 8-pin inputs would make sense. The number seems nonsensical here.
Because three 8-pins can do what two 8-pins can't. Are you being serious?
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
Michapolys
Superclocked Member
- Total Posts : 199
- Reward points : 0
- Joined: 2021/12/25 09:26:18
- Status: offline
- Ribbons : 2
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 19:26:35
(permalink)
ty_ger07 Because three 8-pins can do what two 8-pins can't. Are you being serious?
Totally serious. What can you do with three 8-pin inputs that is not possible with two or four of them?
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 19:27:42
(permalink)
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
Nereus
Captain Goodvibes
- Total Posts : 18192
- Reward points : 0
- Joined: 2009/04/09 20:05:53
- Location: Brooklyn, NYC.
- Status: offline
- Ribbons : 58
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/21 22:56:17
(permalink)
Might make cable management a little easier, so there's that.
|
rjohnson11
EVGA Forum Moderator
- Total Posts : 85038
- Reward points : 0
- Joined: 2004/10/05 12:44:35
- Location: Netherlands
- Status: offline
- Ribbons : 86
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 00:06:26
(permalink)
Well I don't see the need right now to replace my PSU, but maybe worth considering in a few months.
|
the_Scarlet_one
formerly Scarlet-tech
- Total Posts : 24080
- Reward points : 0
- Joined: 2013/11/13 02:48:57
- Location: East Coast
- Status: offline
- Ribbons : 79
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 04:44:48
(permalink)
ty_ger07 Because three 8-pins can do what two 8-pins can't. Are you being serious?
I’ll be serious. The three 8-pins are 24 connections going to a 16-pin connector. You can say that it is required to reach the higher wattage because the PCIe ratings of 75W for a 6-pin and 150W for an 8-pin would be too low, but really? REALLY? We have seen GPUs modified to pull WAY more than that. It's a conservative rating for a reason.

The 8-pin has 3 12V, 3 ground, and 2 sense. The 6-pin only has 3 12V and 3 ground. So two 8-pins: six 12V and six ground, with 4 sense. You can say the mini 16-pin has different ratings, but seriously, where are the extra 12V, ground, and sense cables magically going? The cable provides the exact same power across the rest of the cable and connectors. The current mini 12-pin has 12 pins, and NVIDIA *tries* to hide the fact that the two 8-pins do not use the sense connections, since there is nowhere for them to go.

You can act like people are stupid or ignorant, but until Gigabyte provides a reason for where the extra 8 pins are actually going, just saying “ratings” isn't going to cut it.

The 3070 Founders I picked up has a single 8-pin to a full mini 12-pin. There are 12 cables running between the single 8-pin and the mini 12-pin, and 6 of the pins inside the mini 12 are removed. Why wouldn't NVIDIA just run the 6 pins that are actually used, and why isn't the card rated at 125W instead of 225W? The 6 pins are providing 150W that they aren't rated for, because the ratings are severely outdated.
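The pin accounting in the post above can be tabulated. This is just bookkeeping of the conductor counts as stated in the thread, not a claim about how any particular adapter is actually wired:

```python
# Conductor tally per connector, using the counts given in the post.
CONNECTORS = {
    "6-pin":   {"12V": 3, "GND": 3, "sense": 0},
    "8-pin":   {"12V": 3, "GND": 3, "sense": 2},
    "12VHPWR": {"12V": 6, "GND": 6, "sense": 4},  # 12 power + 4 data
}

def total_pins(name: str) -> int:
    return sum(CONNECTORS[name].values())

three_8pin = 3 * total_pins("8-pin")
print(three_8pin)                           # 24 conductors on the PSU side
print(total_pins("12VHPWR"))                # 16 positions on the card side
print(three_8pin - total_pins("12VHPWR"))   # the 8 "extra" conductors at issue
```

Which makes the disputed question concrete: 24 PSU-side conductors have to land on 16 card-side positions, so either some pins are doubled up or some conductors go unused.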
post edited by the_Scarlet_one - 2022/02/22 04:56:49
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 04:51:41
(permalink)
The 16-pin connector is rated for up to 600 watts. When used with an adapter connecting to three 8-pins, it is rated for up to 450 watts. Two 8-pins are rated for 300 watts. So, in order to reach 450 watts, three 8-pins are needed.

Just because two 8-pins from a specific company may be able to safely handle 450 watts doesn't mean the manufacturer can release a product based on that assumption. If they did, it could destroy their company and reputation. Standards mean something. It doesn't matter what we believe about where the wires go or what they connect to. What matters is that they release a product which complies with the standards so that they don't ruin their company and reputation.

I assume that the "extra" wires share pins, but I am not sure. I mean, EVGA released an adapter where all the pins were connected together, and that was apparently fine. Whatever the standard allows them to do is what they will do. The important thing for all of us is that the standards make things universal; without standards, this would be chaos.
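The rating math above works out as simple addition over the official PCIe connector ratings (75W per 6-pin, 150W per 8-pin; the 450W adapter figure is the one claimed in this thread):

```python
# Standards-based wattage budget: why an adapter needs three 8-pin inputs
# to reach 450 W without exceeding any single connector's official rating.
RATED_WATTS = {"6-pin": 75, "8-pin": 150}

def adapter_budget(inputs: list[str]) -> int:
    """Sum of the official ratings of the PSU-side connectors."""
    return sum(RATED_WATTS[c] for c in inputs)

print(adapter_budget(["8-pin", "8-pin"]))           # 300 W -- short of 450
print(adapter_budget(["8-pin", "8-pin", "8-pin"]))  # 450 W -- meets the target
```

Whether a given PSU's two 8-pins could physically deliver more is beside the point; the budget has to be met on paper, from the rated numbers.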
post edited by ty_ger07 - 2022/02/22 05:01:22
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
the_Scarlet_one
formerly Scarlet-tech
- Total Posts : 24080
- Reward points : 0
- Joined: 2013/11/13 02:48:57
- Location: East Coast
- Status: offline
- Ribbons : 79
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 04:59:24
(permalink)
Again, three 8-pins provide 24 connections. Where are the extra 8 connections actually going?
You keep saying standards and ratings, but cables vanishing doesn't actually make sense.
Again, the 3070 Founders is using a 6-pin's worth of wires at a 150W rating. 6-pins don't have 150W ratings, just like you stated.. so… back to the drawing board: why are we already using more wattage than the rated spec? Because the spec is outdated and the rating is low.
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 05:02:02
(permalink)
I assume that "extra" wires share pins, but I am not sure. Why do you think otherwise? I mean, EVGA released an adapter where all the pins were connected together, and that was apparently fine. Whatever the standard allows them to do, is what they will do. The important thing for all of us is that the standards make it so that things are universal; without standards, this would be chaos. the_Scarlet_one Again, the 3070 founders using a 6 pin with a 150w rating. 6 pins don’t have 150w ratings, JUST like you just stated.. so… back to the drawing board, why are we already using more wattage than the rated spec? Because the spec is outdated and the rating is low.
Because some companies have enough money to risk breaching spec. I think it is a super dumb thing for a company to do ... to assume that a PSU company didn't cut corners on their 6-pin connector -- because they could -- to save money.
post edited by ty_ger07 - 2022/02/22 05:06:44
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
the_Scarlet_one
formerly Scarlet-tech
- Total Posts : 24080
- Reward points : 0
- Joined: 2013/11/13 02:48:57
- Location: East Coast
- Status: offline
- Ribbons : 79
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 05:26:06
(permalink)
Could you show me the connector that EVGA released that you are referring to? If you are going to send a link to the 12-pin to dual 8-pin, I would advise actually wiring the cable before assuming.
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 05:30:26
(permalink)
Let's try a different angle. Let's assume that they are not allowed to share pins, and that EVGA released a product which doesn't conform to spec. Even if they don't share pins, does this adapter make sense? Yes. How? Why?

What is the difference between a 6-pin and an 8-pin PCI-E connector? Not much, huh? The number of wires which do the work is the same. So what is the difference? The difference is the rated amperage per pin that does the work. This means the gauge of the wire, the quality of the insulation, the quality of the plastic, and the quality of the connector are all chosen based on a higher amperage requirement. The 8-pin connector is physically different for the sole purpose of indicating and ensuring that it meets higher standards. Conversely, the 6-pin connector is allowed to meet lower standards across the same number of working wires.

So, why does this adapter have three 8-pins instead of three 6-pins? Because it needs to ensure that it gets the rated power via approved connectors rated to provide that power. Since the 8-pin connector has "extra" wires which are not required to do work in the first place, it would be acceptable for those extra wires to go unused. Gigabyte could choose to use those extra wires for redundancy, if allowed, or not use them. Either way, 8-pin connectors are needed to meet the wattage rating of the adapter.
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 05:31:48
(permalink)
the_Scarlet_one Could you show me the connector that EVGA released that you are referring to?
EVGA Power Link
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
the_Scarlet_one
formerly Scarlet-tech
- Total Posts : 24080
- Reward points : 0
- Joined: 2013/11/13 02:48:57
- Location: East Coast
- Status: offline
- Ribbons : 79
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 05:55:40
(permalink)
Did you just pull up a Power Link picture to justify the connectors on a triple 8-pin to 16-pin Mini-Fit? LoL.

Alright, the difference between a 6-pin and an 8-pin is the sense cables. For someone trying to make everyone look ignorant in this thread, I am sure you knew that. The only thing the sense cable does is verify that the connected product requires the extra power, just like the 12- and 16-pin Mini-Fit. The 12-pin uses only 12 connections. The 16-pin adds 4 sense connections, nothing more. The current mini 12-pin only uses the 12V and ground, none of the sense pins. It is a dual 6-pin connection. Nothing more. The cables are not wired together, the pins are not rated higher, they aren't a different material, the cables aren't different. There aren't two 18-gauge or 16-gauge wires running into one Mini-Fit connector.

Here are a few examples that are actually relevant to the topic. ![](https://i.imgur.com/PKBrPWz.png) ![](https://i.imgur.com/jTMd9Li.png) ![](https://i.imgur.com/pg69L3l.png)

Quick note: the *only* companies trying to make it look like more pins are being utilized aren't actually utilizing them. So, again, for those blindly defending the explanation of three 8-pins to a 16-pin: where are the extra connections terminating? Until you or the company can provide a schematic showing 24 cables terminating at 16 pins, your argument is invalid and a waste of time. Once there is a schematic showing 24 cables from the PSU terminating at a 16-pin Mini-Fit, and how that works, I will gladly listen.
post edited by the_Scarlet_one - 2022/02/22 06:05:06
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 07:52:14
(permalink)
the_Scarlet_one Did you just pull up a power link picture to justify the connectors on a triple 8 pin to 16 pin mini-fit?
No, I did it to show that EVGA shared pins, as you requested. What is your argument? That an 8-pin is certified the same as a 6-pin? No.

Edit: I don't understand your confusion. Let's try this another way. Let's say that a power supply has a 20-gauge 6-pin connector and a 16-gauge 8-pin connector. As a manufacturer, what do you do to try to ensure that the user connects to the 16-gauge 8-pin connector? You make it so that it fits the 8-pin connector, and you advertise it to be used with the 8-pin connector. The 6-pin connector is not rated for that load, so you design the product to keep people from using that connector. Otherwise, things you don't have control over (PSU manufacturers who cut corners to save money, because they are allowed to) will cause your product to fail. You use standards to your advantage to ensure your product functions.
post edited by ty_ger07 - 2022/02/23 04:39:13
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
the_Scarlet_one
formerly Scarlet-tech
- Total Posts : 24080
- Reward points : 0
- Joined: 2013/11/13 02:48:57
- Location: East Coast
- Status: offline
- Ribbons : 79
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 08:02:55
(permalink)
I really wish you read plain English sometimes.
You showed that EVGA utilized a PCB in a completely irrelevant product in an irrelevant context. You have proven that you don't actually care what people are asking and only want to try to look smart while being incredibly ignorant of the question being asked. You are just showing your true colors at this point.
|
Michapolys
Superclocked Member
- Total Posts : 199
- Reward points : 0
- Joined: 2021/12/25 09:26:18
- Status: offline
- Ribbons : 2
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 08:36:36
(permalink)
the_Scarlet_one I really wish you read plain English sometimes.
You showed that EVGA utilized a PCB in a completely irrelevant product in an irrelevant context. You have proven that you don’t actually care what people are asking and only want to try and look smart while being incredibly ignorant to the question that is asked. You are just showing your true colors at this point.
In any case, what Gigabyte states is nonsensical. You would either need four 8-pins to a single 16-pin if the 8-pins follow the standard ratings for wire gauge, length and male/female pins, or two 8-pins to a single 16-pin if the 8-pins use HCS (High Current System) pins and thicker/shorter wires. In my opinion, you are better off using dedicated 16-pin to 16-pin or 2x 8-pin to 16-pin cables for each PSU. People are stupid; it is only a matter of time until someone uses a single cable with two 8-pin connectors on a dual 8-pin to 16-pin adapter, with destructive results.
|
ty_ger07
Insert Custom Title Here
- Total Posts : 16602
- Reward points : 0
- Joined: 2008/04/10 23:48:15
- Location: traveler
- Status: offline
- Ribbons : 271
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 10:08:40
(permalink)
the_Scarlet_one I really wish you read plain English sometimes.
You showed that EVGA utilized a PCB in a completely irrelevant product in an irrelevant context. You have proven that you don’t actually care what people are asking and only want to try and look smart while being incredibly ignorant to the question that is asked. You are just showing your true colors at this point.
Huh? I do not understand what is bothering you. A 6-pin is not rated to handle what an 8-pin is rated to handle. You asked what they were going to do with the extra wires, and I said that they could be used and combined like EVGA has done in the past, or not used. Either option is immaterial to why an 8-pin is used instead of a 6-pin. You asked when EVGA has combined pins. I provided an example. And then you ask why I did. Because you told me to. You are driving this conversation. What is frustrating you? Drive it a different way, if that is what you want.

The minimum required amperage rating per working pin of a 6-pin plug is less than that of an 8-pin plug. Each has the same number of pins which do actual work, but the minimum quality requirement of the 8-pin plug is higher. In order to meet a 450-watt rating for the adapter, it must use three 8-pin plugs. It's a matter of standards. I don't know what is causing confusion.
post edited by ty_ger07 - 2022/02/22 10:13:02
ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium
|
Hoggle
EVGA Forum Moderator
- Total Posts : 8899
- Reward points : 0
- Joined: 2003/10/13 22:10:45
- Location: Eugene, OR
- Status: offline
- Ribbons : 4
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 10:51:50
(permalink)
rjohnson11 Well I don't see the need right now to replace my PSU, but maybe worth considering in a few months.
For me, it would take the 3090 Ti using the connector, and me deciding to buy one, to even consider a PSU replacement this year. If the 3090 Ti does make the switch and I decide not to upgrade to it from my 3080 FTW3, then I might look at a new PSU before the next generation comes out.
|
Michapolys
Superclocked Member
- Total Posts : 199
- Reward points : 0
- Joined: 2021/12/25 09:26:18
- Status: offline
- Ribbons : 2
Re: Gigabyte confirms that next-gen GPUs will require a new 16-pin power cable or an adapt
2022/02/22 10:55:03
(permalink)
ty_ger07 The minimum required amperage rating per working pin of a 6-pin plug is less than the minimum required amperage rating per working pin of a 8-pin plug. Each have the same number of pins which do actual work, but the minimum quality requirement of the 8-pin plug is higher. In order to meet a 450 watt rating for the adapter, it must use three 8-pin plugs. It's a matter of standards. I don't know what is causing confusion.
No. Just no. On both 6-pin and 8-pin PCIe plugs, each pin is specified for a minimum of 75 watts. You are also the only one stating that the hypothetical adapter has a 450-watt rating. The problem is that instead of just answering the questions thrown at you, you use conversation-derailing tactics: a combination of deflection, projection and word salad with some catchy keywords thrown into the mix.
|