2020/04/02 14:29:46
Jayrock
ProDigit
Just like with Nvidia, AMD allows you to lower the wattage on the GPU, at the cost of a minor PPD drop.
 
The Vega 64/Radeon VII can be run as low as 190W, with only minimal PPD loss.
I would run the 2080 Ti at 180W for Core 21, and 225W for Core 22. You'll still get 3.3M PPD on average (peaking at 4.4M PPD on some beta projects, which is higher than running it stock).
 
Same with the rest of the RTX GPUs.
The 2060 through 2070 run best at <129W
The 2070S and 2080 at <149W
Granted, that's with an overclock; one you wouldn't be able to apply at stock power settings, because the GPU would run at too high a frequency and error out.
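Incidentally, the power caps described above can be set from the command line. A minimal sketch, assuming an Nvidia card at index 0 under Linux (for AMD, rocm-smi offers a similar control via the ROCm stack); the wattages are just the examples from this post:

```shell
# Nvidia: enable persistence mode, then cap GPU 0 at 180 W (root required)
sudo nvidia-smi -i 0 -pm 1
sudo nvidia-smi -i 0 -pl 180

# AMD (ROCm stack): cap card 0 at 190 W
sudo rocm-smi -d 0 --setpoweroverdrive 190
```

Note the allowed power-limit range varies by card; `nvidia-smi -q -d POWER` shows the min/max the driver will accept.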

Wow. I love this post. PPD/W is what I want to understand as I get back in this. Thanks.
2020/04/03 06:56:37
Cool GTX
AMD Releases Radeon Software Adrenalin 20.4.1 Beta Drivers
 
  "Running Folding@Home while also running an application using hardware acceleration of video content can cause a system hang or black screen. A potential workaround is disabling hardware acceleration for the application that has it enabled."  
2020/04/03 07:16:26
ipkha
Sure, AMD doesn't make bad cards, just bad drivers.
I hate to think I'm dunking on AMD for past mistakes, but I swear there are always ongoing issues with AMD's attempts at drivers.
Between H.264 hardware encoding issues, OpenCL glitches and such, it is hard to recommend AMD even though they make good hardware.  Not that Nvidia is perfect; they just seem to have a better overall focus on driver quality and on fully supporting the advertised features.  And before anyone complains that Nvidia has a lot more money and developers: you can't fix quality by throwing manpower and money at a problem.  You need a culture of not releasing crap.
2020/04/03 15:33:48
turbomadman
That AMD card is folding on par. 
If anyone is wondering which card to purchase for PPD, this chart has been very accurate and helpful for me. Link
2020/04/04 11:48:58
Cool GTX
turbomadman
That AMD card is folding on par. 
If anyone is wondering which card to purchase for PPD, this chart has been very accurate and helpful for me. Link

The chart is helpful, but it is just a guide. Pascal & Turing PPD seems very low compared to my experience, though I do OC.
 
PPD can show big variations depending on the specific WU.
 
2020/04/08 03:44:21
QuintLeo
jedi95
 
AMD:
Radeon VII: 1.5M PPD (~250W)
Vega 64: 1.35M PPD (~250W)
5700 XT: 1.3M PPD (~200W)
5500 XT: 550K PPD (~110W)
 
Nvidia:
RTX 2080 Ti: 3.5M PPD (~300W)
RTX 2070 Super: 1.8M PPD (~200W)
RTX 2070: 1.55M PPD (~180W)
GTX 1650 Super: 450K PPD (~100W)
 

My GTX 1080 Ti cards pull anywhere from 180 to 220 watts, and routinely land in the 1.2-1.4M PPD range, sometimes 1.5M, despite being a generation old.
I have ALL of them TDP-lowered for cooling/longevity reasons.
 
One outlier on an x1 slot is only pulling 700-800K. It appears that Core 22 is a LOT more bandwidth sensitive than Core 21 was, as this is under Xubuntu Linux.
 
AMD is still quite a bit behind Nvidia in the folding competition, though they've closed the gap SOME in the last couple generations.
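Putting the thread's figures on a PPD-per-watt basis makes the comparison concrete. A rough sketch using the numbers quoted above (the 1080 Ti entry uses my midpoint estimate of ~1.3M PPD at ~200W; none of these are fresh measurements):

```shell
# PPD per watt, computed from the figures reported in this thread
awk 'BEGIN {
  printf "RTX 2080 Ti: %.0f PPD/W\n", 3500000 / 300
  printf "GTX 1080 Ti: %.0f PPD/W\n", 1300000 / 200
  printf "Radeon VII:  %.0f PPD/W\n", 1500000 / 250
  printf "Vega 64:     %.0f PPD/W\n", 1350000 / 250
}'
```

By this measure Turing leads comfortably, and even a power-limited 1080 Ti stays ahead of the Radeon VII.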
2020/04/09 02:47:50
ProDigit
QuintLeo
jedi95
 
AMD:
Radeon VII: 1.5M PPD (~250W)
Vega 64: 1.35M PPD (~250W)
5700 XT: 1.3M PPD (~200W)
5500 XT: 550K PPD (~110W)
 
Nvidia:
RTX 2080 Ti: 3.5M PPD (~300W)
RTX 2070 Super: 1.8M PPD (~200W)
RTX 2070: 1.55M PPD (~180W)
GTX 1650 Super: 450K PPD (~100W)
 

My GTX 1080 Ti cards pull anywhere from 180 to 220 watts, and routinely land in the 1.2-1.4M PPD range, sometimes 1.5M, despite being a generation old.
I have ALL of them TDP-lowered for cooling/longevity reasons.
 
One outlier on an x1 slot is only pulling 700-800K. It appears that Core 22 is a LOT more bandwidth sensitive than Core 21 was, as this is under Xubuntu Linux.
 
AMD is still quite a bit behind Nvidia in the folding competition, though they've closed the gap SOME in the last couple generations.


Yes, Core 22 saturates a PCIe 3.0 x4 slot (~80%) for anything faster than a 2080, vs <28% for Core 21.
You really want Linux, as Windows will need a PCIe x8 slot.
Many motherboards nowadays only support up to 2 Nvidia GPUs anyway (even if they have more slots available).
Better still is if you can find a board running PCIe 3.0 in an x8/x4/x4 configuration (or x8/x4/x4 'x4/x1' for extended ATX). But finding a board that does a 4th GPU at x4 speeds is rare.
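To check what link a card is actually negotiating (and so whether bandwidth is the bottleneck), nvidia-smi can report the current PCIe width per GPU. A sketch, assuming a reasonably recent driver, with a filter that flags links narrower than x4:

```shell
# Flag GPUs whose negotiated PCIe link is narrower than x4
nvidia-smi --query-gpu=index,pcie.link.width.current --format=csv,noheader \
  | awk -F', ' '$2 < 4 { print "GPU " $1 ": x" $2 " link may bottleneck Core 22" }'
```

Note the reported width is the currently negotiated one; idle cards may downshift, so check while folding is running.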