So I have an Asus 1440p monitor and 2 Samsung 1080p monitors.
For some reason I can only get the Asus to work if I plug it in via HDMI. It still looks sharp and I'm using a RedMere 10.2 Gbps cable for it, but I wanted to hook it up to DisplayPort instead. When I do, Windows and the NVIDIA software recognize that it's there, but when I enable it I just get flashing for a few seconds and then it's back to disconnected.
It does work on the DisplayPort if I unplug one of the other two monitors. I just can't figure it out; I've tried three different DisplayPort cables that I verified are good, and that didn't help.
Right now one 1080p monitor is plugged into the DisplayPort through a DP-to-HDMI converter, the 1440p is on the middle HDMI port (it's running at 1440p and looks good), and the last monitor runs off a DVI-I-to-HDMI adapter. All on the same #1 card.
Everything works fine in this setup and looks good, and the RedMere cables do make a difference (I had it running on a standard HDMI cable before and had framerate stuttering), so I don't actually need to use the DisplayPort, but I'm curious what's going on.
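Out of curiosity I did a quick back-of-envelope check on whether 1440p should even be anywhere near the 10.2 Gbps limit of that cable. This is just a rough sketch assuming 60 Hz, CVT-RB-style blanking, and 8-bit RGB; I haven't read the actual timings out of the monitor's EDID:

```python
# Rough TMDS bandwidth estimate for 2560x1440 @ 60 Hz over HDMI.
# Assumes CVT-RB-style blanking (~2720x1481 total) and 8 bits per channel;
# these numbers are assumptions, not values read from the monitor.

h_total, v_total = 2720, 1481   # 2560x1440 active plus reduced blanking
refresh_hz = 60
bits_per_pixel = 24             # 8-bit RGB

pixel_clock = h_total * v_total * refresh_hz     # ~241.7 MHz
data_rate = pixel_clock * bits_per_pixel         # ~5.8 Gbit/s of pixel data
tmds_rate = data_rate * 10 / 8                   # 8b/10b encoding overhead on the wire

print(f"Pixel clock: {pixel_clock / 1e6:.1f} MHz")
print(f"TMDS rate:   {tmds_rate / 1e9:.2f} Gbit/s (High Speed HDMI tops out at 10.2)")
```

So on paper 1440p60 fits comfortably inside 10.2 Gbps, which lines up with it looking sharp over the RedMere cable.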
Any ideas?
- CPU: 5900X / 3800XT
- MB: Asus Strix X570-E / Asus TUF B550
- RAM: 32 GB TridentZ 3200 CL14 (stock timings) / 64 GB Ripjaws 3600 CL16
- GPU: EVGA 3080 Ti Hybrid (converted from FTW3 Ultra) / EVGA 3080 Hybrid (converted from FTW3 Ultra)
- Storage: Sabrent 2 TB PCIe 4.0 SSD, WD SN750 1 TB PCIe 3.0 SSD / 2x 4 TB WD Red Pro, 1x Sabrent Rocket Pro 1 TB, 4x Crucial MX500 2 TB
- Cooling: H115i Pro Platinum / Vetroo 360mm AIO
- Case: Corsair 680X / Corsair 5000D Black