Feklar
I don't get it. All these 3090 FTW3 cards use an identical design and identical components, so why is there such a difference in per-plug power draw between cards, even when the power supply isn't the issue? Bad design? Bad components?
EVGA used a downgraded, inexpensive analog voltage controller instead of the commonly used, more expensive digital voltage controller. When they did that, it's possible their implementation became more heavily reliant on resistor, capacitor, and/or inductor values for accurate analog measurement, and they may not have adequately taken the manufacturing tolerances of those supporting components into account. Additionally, if they cheaped out on the voltage controller, what's to say they didn't also use cheaper supporting components with looser manufacturing tolerances? There are standardized tiers of manufacturing tolerance, and you pay more for a component with a tighter guaranteed tolerance: standard tolerances range from +/-20% of rated value for cheap components down to +/-1% for expensive ones.
Manufacturing tolerances, and failing to engineer with those tolerances in mind, can easily explain why different cards read different values and why some are more unbalanced than others.
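To make the tolerance-stacking argument concrete, here is a minimal sketch in Python (the divider topology and component values are made up for illustration, not taken from EVGA's board) showing how the tolerance tier of two ordinary resistors in a sense divider spreads the reading from board to board:

```python
# Minimal sketch (hypothetical circuit, not EVGA's actual design):
# Monte Carlo of a resistor divider feeding an analog monitoring input,
# comparing loose-tolerance parts against tight-tolerance parts.
import random

def divider_reading(v_in, r_top_nom, r_bot_nom, tol, n=10000):
    """Return the min/max voltage reported across n simulated boards."""
    readings = []
    for _ in range(n):
        # Each board gets resistors anywhere inside the tolerance band.
        r_top = r_top_nom * random.uniform(1 - tol, 1 + tol)
        r_bot = r_bot_nom * random.uniform(1 - tol, 1 + tol)
        readings.append(v_in * r_bot / (r_top + r_bot))
    return min(readings), max(readings)

V_IN = 12.0                 # rail being monitored
R_TOP, R_BOT = 30e3, 10e3   # nominal 4:1 divider -> 3.0 V ideal reading

for tol in (0.20, 0.05, 0.01):
    lo, hi = divider_reading(V_IN, R_TOP, R_BOT, tol)
    print(f"+/-{tol:.0%} parts: reading spans {lo:.2f} .. {hi:.2f} V")
```

With +/-20% parts, the nominal 3.0 V reading can land anywhere from roughly 2.2 V to 4.0 V depending on which resistors a given board happened to get; with +/-1% parts it stays within about +/-0.05 V. Same design, same schematic, very different readings per card.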
mberohn
GPU PCIe +3.3V Input Power (est) is also holding firm at 0.336 W. Is this too low, or is HWiNFO64 showing the decimal in the wrong place?
As far as I know, these cards don't draw power from the +3.3V input, so I would expect the reported value to be very low; what you're seeing is essentially just measurement error.
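A quick sanity check on the magnitude (the arithmetic is straightforward; the interpretation as sense-circuit offset is my assumption):

```python
# Back-of-the-envelope: what current would the reported +3.3V power imply?
reported_power_w = 0.336  # value shown by HWiNFO64
rail_voltage_v = 3.3      # nominal PCIe +3.3V rail

implied_current_a = reported_power_w / rail_voltage_v
print(f"Implied current: {implied_current_a * 1000:.0f} mA")  # ~102 mA
# On a card that doesn't actually draw from the +3.3V input, ~0.1 A is
# small enough to plausibly be offset/noise in the current-sense circuit.
```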