2016/11/07 17:24:17
delicieuxz
fightinfilipino
Troyhe98
gara024
That was a rhetorical question. Also "EVGA has investigated these reports and after extensive testing, below are our findings:
  • On ACX 3.0, EVGA focused on GPU temperature and the lowest acoustic levels possible. Running Furmark, the GPU is around 70C +/- and the fan speed is running approximately 30% duty cycle or lower.
  • However, during recent testing, the thermal temperature of the PWM and memory, in extreme circumstances, was marginally within spec and needed to be addressed."
 


This is something that would never have happened had EVGA not cut corners on cooling. They could easily have shipped this card with proper thermal padding, and the entire issue would have been prevented.

I think this quote from GamersNexus sums it up best: "The ACX coolers have fallen prone to a number of design flaws over the past few years, but this one is one of the easiest to prevent."

Source: http://www.gamersnexus.ne...mment-comment=10005906



ffs they're not cutting corners. DON'T. USE. FURMARK.
 
http://nvidia.custhelp.com/app/answers/detail/a_id/2603/~/using-furmark-and-other-stress-tests-with-geforce-graphics-cards
 
In real-world gaming scenarios, you're NEVER going to hit the heat levels Furmark generates. THAT'S THE PROBLEM.

What? Stop saying crazy stuff. You are nobody to tell people what programs they can or can't run with their GPU. It is NOT the problem that people are running a particular program. The problem is that the GPUs don't have proper cooling, and so running high-intensity programs is particularly risky.
2016/11/07 17:24:53
Troyhe98
Hadens
Are you supposed to flash both bios, or just the one? And if the case is the latter, primary or secondary?
I have the GTX 1080 FTW DT card.


If you have 2 BIOSes on your card, you should flash the master (primary), turn off your machine and switch to the slave (secondary), and flash that one as well. After that, turn off your machine, switch the card back to master, and you should be good to go.
2016/11/08 02:06:06
Cyricor
It's a bit worse than EVGA just deciding to cut corners. It's a design flaw. Whoever designed the cooling solution on this card lacked common sense.
They covered the MOSFETs, some of the hottest components on the card, with the midplate? Whereas Asus and MSI, with far more overbuilt VRM solutions that by design run cooler thanks to the much higher amperage they can handle, have them actively cooled.

It's a facepalm moment and worthy of my pity, to be honest. If they had tried to cut corners I would be angry, but all the evidence points to project failure, absent quality control, and bad management decisions overall (whoever is responsible for the cooler design must be fired, even if he is the CEO's nephew, as it seems :P ).
Because it's my firm belief that anyone with half a brain could have designed it better at the same or lower cost.

I'm seriously considering taking a Dremel to the midplate to expose the MOSFETs to direct airflow, and attaching some low-profile copper heatsinks.
2016/11/08 02:06:36
Mencius_
I have now tested both the previous and the new secondary 1070 FTW BIOSes over 5 minutes 30 seconds of Unigine Heaven. Ambient temps were 28-30 deg C. My HW monitoring results are below. I have included only the GPU core temp, fan %, and fan tachometer columns, but can update with the rest.
 
From my testing it appears there is no (or negligible) fan speed increase in the new secondary BIOS; in fact, there appears to be a slight fan speed reduction. I have only tested up to the mid-60 deg C range, as I'm not comfortable testing higher. The new secondary BIOS also appears slower to ramp up the fan speeds. As previously mentioned, the idle fan RPM has also been halved, from 1,000 to 500 RPM.
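For anyone wanting to do the same comparison from exported monitoring logs, a quick script can average the tachometer column per run. This is only a sketch: the CSV layout (temp_c,fan_pct,fan_rpm) and every number in it are made-up stand-ins for a real hardware-monitor export, not my actual data.

```python
# Sketch: compare average fan RPM between two BIOS test logs.
# The CSV format (temp_c,fan_pct,fan_rpm) and all values below are
# hypothetical stand-ins for a real hardware-monitor export.
import csv
import io

def avg_fan_rpm(log_text: str) -> float:
    """Average the fan_rpm column of a monitoring log."""
    reader = csv.DictReader(io.StringIO(log_text))
    rpms = [float(row["fan_rpm"]) for row in reader]
    return sum(rpms) / len(rpms)

old_bios = """temp_c,fan_pct,fan_rpm
60,38,1520
64,40,1600
66,42,1680
"""

new_bios = """temp_c,fan_pct,fan_rpm
60,35,1400
64,37,1480
66,39,1560
"""

delta = avg_fan_rpm(old_bios) - avg_fan_rpm(new_bios)
print(f"New BIOS spins {delta:.0f} RPM slower on average")
```

With these example rows it reports the new BIOS averaging 120 RPM slower, which is the kind of difference I'm describing above.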
 
I don't really understand these changes and I will probably roll back to the previous BIOS. 
 
I would be grateful if anyone has information to help me understand these changes to the 1070 FTW secondary BIOS, since they appear to be simply a fan speed reduction, which seems contrary to my understanding of the purpose of this update.
 
[Edit 1: I tried to add the data as JPGs because I don't know how to add it as a .txt file or as a table, but they're unreadable due to the size reduction applied. I tried adding a Google Sheets link, but the forum doesn't allow it. I don't know how best to share HW monitoring data here, but if anyone can let me know how, I will add it.]
 
[Edit 2: to try to make the JPGs legible, I have included only the start and end of the test run, cropping most of the middle.]
 
Previous 1070 FTW secondary BIOS

New (current) 1070 FTW secondary BIOS

Attached Image(s)

2016/11/08 02:15:56
GFAFS
delicieuxz: What? Stop saying crazy stuff. You are nobody to tell people what programs they can or can't run with their GPU. It is NOT the problem that people are running a particular program. The problem is that the GPUs don't have proper cooling, and so running high-intensity programs is particularly risky.

 
You're right. What he is also omitting is that the other brands have no specific grievance with Furmark, and perform pretty well under intense stress. fightinfilipino is just recycling an old argument thrown out by EVGA at the beginning of the crisis for everyone to chew on... a clumsy way to deflect responsibility.
 
2016/11/08 02:48:54
ilyama
Reading you all, I get the feeling we're telling you not to play GTA V, BF1, or any other game...
 
It costs you nothing not to launch Furmark; just play games, FFS...
2016/11/08 02:54:08
GFAFS
ilyama
Reading you all, I get the feeling we're telling you not to play GTA V, BF1, or any other game...
 
It costs you nothing not to launch Furmark; just play games, FFS...



Personally, I rarely play games with my cards; I do intense computing/hashing/rendering. Way more demanding than a game.
2016/11/08 04:29:02
emsir
Cyricor
It's a bit worse than EVGA just deciding to cut corners. It's a design flaw. Whoever designed the cooling solution on this card lacked common sense.
They covered the MOSFETs, some of the hottest components on the card, with the midplate? Whereas Asus and MSI, with far more overbuilt VRM solutions that by design run cooler thanks to the much higher amperage they can handle, have them actively cooled.

It's a facepalm moment and worthy of my pity, to be honest. If they had tried to cut corners I would be angry, but all the evidence points to project failure, absent quality control, and bad management decisions overall (whoever is responsible for the cooler design must be fired, even if he is the CEO's nephew, as it seems :P ).
Because it's my firm belief that anyone with half a brain could have designed it better at the same or lower cost.

I'm seriously considering taking a Dremel to the midplate to expose the MOSFETs to direct airflow, and attaching some low-profile copper heatsinks.


You think you're a pro, or that you work somewhere that makes GPUs?
This issue is absolutely not a project failure. No card has been damaged by it, and it has been handled professionally by EVGA. Your post is just another attempt to put yourself in the spotlight and get your two minutes of fame.
Your solution is rubbish. Why? Because the thermal pad solution and BIOS update work just great. There is no heat issue, and people are playing games just as they normally would. The cards work just fine, without issues.
2016/11/08 04:33:00
emsir
GFAFS
delicieuxz: What? Stop saying crazy stuff. You are nobody to tell people what programs they can or can't run with their GPU. It is NOT the problem that people are running a particular program. The problem is that the GPUs don't have proper cooling, and so running high-intensity programs is particularly risky.

 
You're right. What he is also omitting is that the other brands have no specific grievance with Furmark, and perform pretty well under intense stress. fightinfilipino is just recycling an old argument thrown out by EVGA at the beginning of the crisis for everyone to chew on... a clumsy way to deflect responsibility.
 


The Furmark problem is not just EVGA; it affects all Nvidia cards. It's well known that Nvidia has warned against Furmark for years. Why? Because you'll never push your GPU that hard when playing games. Don't blame it on EVGA, blame it on Nvidia.
2016/11/08 04:45:58
GFAFS
emsir
 
The Furmark problem is not just EVGA; it affects all Nvidia cards. It's well known that Nvidia has warned against Furmark for years. Why? Because you'll never push your GPU that hard when playing games. Don't blame it on EVGA, blame it on Nvidia.




I'm focusing on EVGA right now, do you mind. If EVGA doesn't put a big "WARNING: do not do this or that" in its marketing/propaganda brochures and/or on its packaging, the consumer has every right to assume it's fine and run Furmark all day long.
