At the end of the day Afterburner's voltage curve is cool, but essentially useless unless there's a way to stop Nvidia's GPU Boost 3.0/4.0 from doing what it likes.
You can spend hours moving those little dots and save what you think is the perfect voltage curve, but the next time you start up, the curve will look nothing like the one you spent hours setting up. The curve will look the same when you load the profile you saved it to in Afterburner, but the moment you hit that 'apply' button, all bets are off.
It doesn't matter whether you set the curve under idle temps or full load temps, Nvidia Boost will do what it likes!
If you set the curve at idle temps, Boost will push your clock well past the maximum you set in your voltage curve, and that can cause instability in gaming. If you set the curve at full load temps, Boost will interfere once again, and you will end up with a lower clock speed than you asked for.
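To make that idle-vs-load discrepancy concrete, here's a toy model of how Boost sheds clock as temperature rises. The ~13 MHz bin size and the 5C-spaced thresholds are my illustrative assumptions, not Nvidia's published algorithm - the point is the shape of the behaviour, not the exact numbers.

```python
# Toy model (NOT Nvidia's actual algorithm) of temperature-based boost bins:
# Boost drops the clock by one small step each time the GPU crosses a
# temperature threshold. Step size and thresholds below are assumptions.
STEP_MHZ = 13                    # assumed size of one boost bin
THRESHOLDS_C = range(40, 80, 5)  # assumed temps where a bin is shed

def effective_clock(requested_mhz, temp_c):
    """Clock after temperature-based bin drops, per this toy model."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return requested_mhz - STEP_MHZ * bins_dropped

# A curve "locked" to 2000 MHz at idle (30C) vs the same card at 65C:
print(effective_clock(2000, 30))  # no thresholds crossed at idle
print(effective_clock(2000, 65))  # several bins shed under load
```

So a curve that looked stable at idle is effectively running a higher offset than the card can hold once it heats up - which is exactly the instability described above.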
Save yourself a bunch of time and stick to the sliders.
Start by resetting everything in Afterburner, then run AIDA64 or whatever GPU stress test you like - for me RealBench is the most realistic in terms of replicating actual gaming loads across the entire system.
See what your GPU does running at stock, as that will give you a good idea of what it 'likes'.
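If you'd rather log what the card does than eyeball the Afterburner graphs, a small script can poll nvidia-smi while the stress test runs. The query fields (`clocks.gr`, `temperature.gpu`) are real nvidia-smi fields, but treat the rest as a sketch - `poll_once` and `parse_sample` are names I made up, and for a real session you'd loop it and write the samples to a file.

```python
# Hedged sketch: read the actual core clock and temperature via nvidia-smi
# so you can see what Boost is really doing during a stress run.
import subprocess

def parse_sample(line):
    """Parse one 'clocks.gr, temperature.gpu' CSV row from nvidia-smi,
    e.g. '1911 MHz, 63' -> (1911, 63)."""
    clock_str, temp_str = line.split(",")
    return int(clock_str.strip().split()[0]), int(temp_str.strip())

def poll_once():
    """One-shot query; requires an Nvidia driver with nvidia-smi installed."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=clocks.gr,temperature.gpu",
         "--format=csv,noheader"],
        text=True)
    return parse_sample(out.strip().splitlines()[0])

# The parser on a captured sample row (no GPU needed to try it):
print(parse_sample("1911 MHz, 63"))
```

Logging a stock run this way gives you a baseline clock/temp trace to compare your slider overclock against.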
Lower your fan speed in Afterburner so that the card hovers around the 65C mark, which for Pascal seems to be about as hot as it will get assuming you have good cooling.
Then increase your core/memory offsets until you find the highest stable values and save them to a profile.
Then max your fan speed using the fan curve in Afterburner.
There will be many who say the voltage curve somehow magically creates more stable overclocks, but unless you have a custom BIOS with the power limits removed and higher voltages allowed, it's really a waste of time. I expect to be flamed, but I challenge anyone to prove that Afterburner's voltage curve can be locked static to what the user set on stock Nvidia firmware, without ever being changed or ignored by Nvidia Boost.
I've spent countless hours across Pascal and Turing cards, and Nvidia's Boost 3.0/4.0 has a mind of its own.