TDP stands for "Thermal Design Power" and is not necessarily an indication of the maximum amount of power the card will consume. According to Techpowerup's measurements, each air-cooled GTX 980 Ti can consume 10 to 40 watts more than a GTX 780 under load:
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980_Ti/29.html

Additionally, your liquid-cooled cards will consume more power than the numbers reported above because they run at lower temperatures and at higher frequencies. At the same voltage, frequency, and load, a cooler chip actually leaks slightly less power, but it has been shown many times that cooler cards hold higher and more consistent boost clocks, and the extra frequency (and the voltage that comes with it) more than makes up the difference. I imagine that each of your GTX 980 Ti graphics cards could easily consume at least 30 watts more than each of your GTX 780 graphics cards.
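To put a very rough number on that, here is a quick back-of-the-envelope sketch in Python. The card count is my assumption (you haven't said how many 980 Tis you are running), and the per-card deltas simply reuse the figures above, so treat the output as an illustration rather than a measurement.

```python
# Back-of-the-envelope estimate of the extra GPU load versus the old GTX 780s.
# CARD_COUNT is hypothetical; the per-card deltas come from the discussion above.
CARD_COUNT = 4                 # assumption: set to the actual number of cards
PER_CARD_DELTA_LOW_W = 10      # TPU's low end for an air-cooled 980 Ti vs a 780
PER_CARD_DELTA_HIGH_W = 40     # TPU's high end for an air-cooled 980 Ti vs a 780
LIQUID_COOLING_FLOOR_W = 30    # my "at least 30 W more per card" guess above

low_total = CARD_COUNT * max(PER_CARD_DELTA_LOW_W, LIQUID_COOLING_FLOOR_W)
high_total = CARD_COUNT * PER_CARD_DELTA_HIGH_W
print(f"Estimated extra draw vs the 780s: at least {low_total} W, "
      f"plausibly {high_total} W or more")
```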
If only your computer's PSU were plugged into the UPS, and if your PSU were brand new, it would draw up to 1381 watts from the wall at max load in order to deliver 1207 watts to your computer hardware. At max load its power factor would be between 0.98 and 0.99, which means the UPS would see another 14 to 28 volt-amperes of apparent power on top of that, for a total of up to about 1409 VA. If you have anything else plugged into the UPS (such as a monitor, printer, modem, router, etc.), chances are that their power factors are much worse, closer to 0.80 to 0.90, and they will quite easily add another 50 to 100 VA of load on the UPS.
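In case it helps to see that arithmetic laid out, here is a small sketch of the efficiency and power-factor math. The DC load, wall draw, and power-factor figures are the ones used in this post; the efficiency is simply implied by the 1207 W out / 1381 W in pair rather than taken from a spec sheet.

```python
# Apparent power the UPS has to supply for the PSU alone, using the figures above.
DC_LOAD_W = 1207             # estimated DC output to the computer hardware
AC_INPUT_W = 1381            # wall draw at max load for a brand-new PSU
POWER_FACTORS = (0.98, 0.99) # PSU power factor range at max load

efficiency = DC_LOAD_W / AC_INPUT_W
print(f"Implied PSU efficiency at max load: {efficiency:.1%}")

for pf in POWER_FACTORS:
    apparent_va = AC_INPUT_W / pf
    print(f"PF {pf:.2f}: {apparent_va:.0f} VA from the UPS "
          f"({apparent_va - AC_INPUT_W:.0f} VA beyond the real power)")
```

Anything else on the UPS with a 0.80 to 0.90 power factor adds apparent load in exactly the same way, just with a bigger gap between its watts and its VA.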
Finally, since your PSU and UPS are used, their efficiency, power factor, and output capability have all degraded to some degree. It all adds up, in my opinion.
I plugged all your information into an online power supply calculator using my best guess based on the information you have provided, and it recommended that your power supply be at least 1100 watts (you got that covered) and that your UPS be at least 2000 VA (another confirmation that you are probably hitting its limit).
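As a sanity check against that 2000 VA recommendation, here is one more hedged sketch that stacks everything up: the PSU's apparent load, some extra VA for peripherals, and a derating factor for an aged UPS. The peripheral load and the derating percentage are guesses on my part, not measurements, and you should plug in your UPS's actual VA rating.

```python
# Rough headroom check against the UPS's VA rating.
# PERIPHERAL_VA and AGE_DERATING are assumptions for illustration only.
UPS_RATING_VA = 2000        # set to your UPS's rating; 2000 VA is the calculator's minimum
PSU_APPARENT_VA = 1409      # worst case from the power-factor math above
PERIPHERAL_VA = 100         # guess: monitor, router, etc. at a poor power factor
AGE_DERATING = 0.10         # guess: 10% capacity loss on a used UPS

usable_va = UPS_RATING_VA * (1 - AGE_DERATING)
total_va = PSU_APPARENT_VA + PERIPHERAL_VA
print(f"Estimated load: {total_va} VA of ~{usable_va:.0f} VA usable "
      f"({total_va / usable_va:.0%} of derated capacity)")
```

Even against a unit that meets the recommended 2000 VA, that leaves very little margin once the UPS has aged, so a smaller or older unit would be right at (or past) its limit.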