
AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price

RainStryke
The Advocate
  • Total Posts : 15872
  • Reward points : 0
  • Joined: 2007/07/19 19:26:55
  • Location: Kansas
  • Status: offline
  • Ribbons : 60
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/03 09:26:56 (permalink)
lol yeah... The R9 290X 8GB has been out for a while now... It has more VRAM than a Titan Black, and the VRAM on the R9 290X is also faster than the Titan Black's. And... it's less than half the price.

Main PC | Secondary PC
Intel i9 10900K | Intel i7 9700K

MSI MEG Z490 ACE | Gigabyte Aorus Z390 Master
ASUS TUF RTX 3090 | NVIDIA RTX 2070 Super
32GB G.Skill Trident Z Royal 4000MHz CL18 | 32GB G.Skill Trident Z RGB 4266MHz CL17
SuperFlower Platinum SE 1200w | Seasonic X-1250
Samsung EVO 970 1TB and Crucial P5 1TB | Intel 760p 1TB and Crucial MX100 512GB
Cougar Vortex CF-V12HPB x9 | Cougar Vortex CF-V12SPB-RGB x5
 
3DMark Results: Time Spy | Port Royal

#31
seta8967
FTW Member
  • Total Posts : 1813
  • Reward points : 0
  • Joined: 2010/03/03 05:18:45
  • Status: offline
  • Ribbons : 2
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/03 12:02:34 (permalink)
RainStryke
lol yeah... The R9 290X 8GB has been out for a while now... It has more VRAM than a Titan Black, and the VRAM on the R9 290X is also faster than the Titan Black's. And... it's less than half the price.


The only thing is DPP (double-precision performance). Which is why I think it was a mistake to call it a GeForce product; it should have been a Quadro and advertised less as a gaming card.
#32
Bruno747
CLASSIFIED Member
  • Total Posts : 3909
  • Reward points : 0
  • Joined: 2010/01/13 11:00:12
  • Location: Looking on google to see what Nvidia is going to o
  • Status: offline
  • Ribbons : 5
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/03 13:52:08 (permalink)
Brad_Hawthorne
Vlada011
The Titan Black is a piece of art compared to AMD Radeon.
From whatever side you look at it, NVIDIA has had that chip for two years and AMD still has no card stronger than the fully unlocked GK110.
If you compare the chip, performance, drivers, stock cooler, the huge amount of video memory...
AMD has never had anything like that in their lineup.


Keep drinking the koolaid and buying $1000 video cards. Nvidia love you long time for it.


I was thinking it, glad you said it.

X399 Designare EX, Threadripper 1950x, Overkill Water 560mm dual pass radiator. Heatkiller IV Block Dual 960 EVO 500gb Raid 0 bootable, Quad Channel 64gb DDR4 @ 2933/15-16-16-31, RTX 3090 FTW3 Ultra, Corsair RM850x, Tower 900
#33
mistermister
CLASSIFIED ULTRA Member
  • Total Posts : 5804
  • Reward points : 0
  • Joined: 2008/03/29 02:38:09
  • Location: San Diego
  • Status: offline
  • Ribbons : 13
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 08:38:46 (permalink)
How is he wrong though?

AMD 3700x / X-570 Aorus Ultra / RTX-3090 FTW3
#34
RainStryke
The Advocate
  • Total Posts : 15872
  • Reward points : 0
  • Joined: 2007/07/19 19:26:55
  • Location: Kansas
  • Status: offline
  • Ribbons : 60
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 10:19:59 (permalink)
The original Titan, released in February 2013, is basically on par in performance with the R9 290X, which sold for just about half the price of a Titan when it released in October 2013. The Titan Black came out on February 18th, 2014 and basically matched the performance a GTX 780 Ti puts out in games. From a gaming perspective, it makes no sense to get a Titan Black if you can get the same thing out of a GTX 780 Ti.
 
The Titan Black actually released at $1099. The official retail price was maybe $999... but all of the stores were selling them for $1099 at release.
 
Not sure what he means about "piece of art"... since the cooler on the card is basically the same as on the original GTX 780 reference card. It's the same reference blower-style fan that gets loud under load. Also... AMD's reference cooler is the same damn thing with different colors. lol
 
Also not sure what he's talking about when comparing drivers. From the perspective of a user who wants to run multiple displays, AMD is way ahead of Nvidia.

Main PC | Secondary PC
Intel i9 10900K | Intel i7 9700K

MSI MEG Z490 ACE | Gigabyte Aorus Z390 Master
ASUS TUF RTX 3090 | NVIDIA RTX 2070 Super
32GB G.Skill Trident Z Royal 4000MHz CL18 | 32GB G.Skill Trident Z RGB 4266MHz CL17
SuperFlower Platinum SE 1200w | Seasonic X-1250
Samsung EVO 970 1TB and Crucial P5 1TB | Intel 760p 1TB and Crucial MX100 512GB
Cougar Vortex CF-V12HPB x9 | Cougar Vortex CF-V12SPB-RGB x5
 
3DMark Results: Time Spy | Port Royal

#35
mistermister
CLASSIFIED ULTRA Member
  • Total Posts : 5804
  • Reward points : 0
  • Joined: 2008/03/29 02:38:09
  • Location: San Diego
  • Status: offline
  • Ribbons : 13
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 15:49:24 (permalink)
Thank you for quantifying that.

AMD 3700x / X-570 Aorus Ultra / RTX-3090 FTW3
#36
seta8967
FTW Member
  • Total Posts : 1813
  • Reward points : 0
  • Joined: 2010/03/03 05:18:45
  • Status: offline
  • Ribbons : 2
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 16:28:59 (permalink)
RainStryke
The original Titan, released in February 2013, is basically on par in performance with the R9 290X, which sold for just about half the price of a Titan when it released in October 2013. The Titan Black came out on February 18th, 2014 and basically matched the performance a GTX 780 Ti puts out in games. From a gaming perspective, it makes no sense to get a Titan Black if you can get the same thing out of a GTX 780 Ti.
 
The Titan Black actually released at $1099. The official retail price was maybe $999... but all of the stores were selling them for $1099 at release.
 
Not sure what he means about "piece of art"... since the cooler on the card is basically the same as on the original GTX 780 reference card. It's the same reference blower-style fan that gets loud under load. Also... AMD's reference cooler is the same damn thing with different colors. lol
 
Also not sure what he's talking about when comparing drivers. From the perspective of a user who wants to run multiple displays, AMD is way ahead of Nvidia.




I'm not sure what he means by "piece of art" either, but for my friend who was getting into 3D rendering and other work that uses DPP, the Titan was a godsend. He was saving for a Quadro when the Titan came out, and it saved him a bunch of money. Really, they should have marketed the Titan as a Quadro for amateurs who need DPP, not at gamers.
 
I don't think it's the same reference cooler as AMD's, as AMD's was much louder than the Titan's reference cooler at full throttle. I haven't heard the Titan's myself, but I have heard the 980's and the 290's, and the 290 was indeed louder. Whether that's because it spins at a higher speed than the 780's, I don't know.
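 
For a rough sense of what that DPP gap looked like on paper, here's a back-of-the-envelope sketch (Python) using the commonly quoted reference specs for the Titan Black and the GTX 780 Ti; treat the core counts, clocks, and FP64 rates as approximations rather than measured numbers:

# Rough peak-throughput arithmetic, not a benchmark.
def tflops(cores, clock_ghz, flops_per_core_per_clock=2):   # 2 = one fused multiply-add counted as two FLOPs
    return cores * flops_per_core_per_clock * clock_ghz / 1000.0

titan_black_fp32 = tflops(2880, 0.889)        # ~5.1 TFLOPS single precision
titan_black_fp64 = titan_black_fp32 / 3       # GK110 runs FP64 at 1/3 rate when enabled -> ~1.7 TFLOPS
gtx_780ti_fp64   = tflops(2880, 0.875) / 24   # GeForce 780 Ti is capped at 1/24 rate -> ~0.21 TFLOPS

print(round(titan_black_fp64, 2), round(gtx_780ti_fp64, 2))

So on paper the Titan Black has roughly eight times the double-precision throughput of a 780 Ti, which is the whole "Quadro for amateurs" argument in one number.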
#37
RainStryke
The Advocate
  • Total Posts : 15872
  • Reward points : 0
  • Joined: 2007/07/19 19:26:55
  • Location: Kansas
  • Status: offline
  • Ribbons : 60
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 17:12:15 (permalink)
Yeah, I have no knowledge of 3D rendering, but I guess it's a good card in that respect. I don't know much about either Quadro or FirePro cards; it seems to be even more of a niche market than PC gaming, by far.
 
I suppose it's not as loud, but it's still loud. According to this review, the reference cooler can get up to 70dB:
http://www.xbitlabs.com/articles/graphics/display/gigabyte-geforce-gtx-titan-black-ghz-edition_4.html#sect0
 
My R9 290X is nowhere near that, but I also don't have a reference card. My Tri-X is 37dB according to this review:
http://www.techpowerup.co..._290X_Tri-X_OC/23.html
post edited by RainStryke - 2015/02/04 18:11:22

Main PC | Secondary PC
Intel i9 10900K | Intel i7 9700K

MSI MEG Z490 ACE | Gigabyte Aorus Z390 Master
ASUS TUF RTX 3090 | NVIDIA RTX 2070 Super
32GB G.Skill Trident Z Royal 4000MHz CL18 | 32GB G.Skill Trident Z RGB 4266MHz CL17
SuperFlower Platinum SE 1200w | Seasonic X-1250
Samsung EVO 970 1TB and Crucial P5 1TB | Intel 760p 1TB and Crucial MX100 512GB
Cougar Vortex CF-V12HPB x9 | Cougar Vortex CF-V12SPB-RGB x5
 
3DMark Results: Time Spy | Port Royal

#38
seta8967
FTW Member
  • Total Posts : 1813
  • Reward points : 0
  • Joined: 2010/03/03 05:18:45
  • Status: offline
  • Ribbons : 2
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 17:18:41 (permalink)
RainStryke
Yeah, I have no knowledge of 3D rendering, but I guess it's a good card in that respect. Not even sure how they do against FirePro cards, and I don't really know anything about either Quadro or FirePro cards. It seems to be even more of a niche market than PC gaming, by far.
 
I suppose it's not as loud, but it's still loud. According to this review, the reference cooler can get up to 70dB:
http://www.xbitlabs.com/articles/graphics/display/gigabyte-geforce-gtx-titan-black-ghz-edition_4.html#sect0
 
My R9 290X is nowhere near that, but I also don't have a reference card. My Tri-X is 37dB according to this review:
http://www.techpowerup.co..._290X_Tri-X_OC/23.html




My friend said the Quadro and FirePro will destroy it, but for amateur rendering and design it does well enough. He would love to have a dedicated professional card, but $4k+ is too much. It is a niche market: big companies will use Quadro/FirePro and gamers will use GeForce/Radeon. I know some upstart game studios like Cloud Imperium Games use Titans, and indie devs too, but that is still such a small niche; it might be growing, but not fast enough to demand a card of its own, imo.
#39
lehpron
Regular Guy
  • Total Posts : 16254
  • Reward points : 0
  • Joined: 2006/05/18 15:22:06
  • Status: offline
  • Ribbons : 191
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 23:01:39 (permalink)
seta8967
 
My friend said the Quadro and FirePro will destroy it, but for amateur rendering and design it does well enough. He would love to have a dedicated professional card, but $4k+ is too much. It is a niche market: big companies will use Quadro/FirePro and gamers will use GeForce/Radeon. I know some upstart game studios like Cloud Imperium Games use Titans, and indie devs too, but that is still such a small niche; it might be growing, but not fast enough to demand a card of its own, imo.
I would have said the same.
 
 
nVidia's Titan series is like Intel's X-platform: they are ported and rebranded server/workstation parts with a few things disabled to bring the price down for consumers, but still priced higher than what most consumers would pay, yet with enough extra features enabled to sell exclusively to those who want those features. All the while, regular consumer parts can come close to or exceed them at a cheaper price. As such, both companies dominate their respective markets and can afford to waste high-grade dies to get extra premium customer support; AMD can't do either for CPU or GPU. Not being dominant means they can't afford to make a consumer version of their FirePro cards or give enthusiasts unlocked versions of their Opterons above 8 cores; they would lose money on precious dies that could fetch higher premiums if they remained professional.
 
nVidia and Intel are lucky to have the opportunity to invent and maintain these new product segments; the difference is that on the CPU side Intel may not see competition for their X-platform for a while, whereas graphics is still competitive and nVidia may temporarily discontinue the Titan if AMD gets back in the game.
 
nVidia doesn't have a major CPU division (Tegra is growing, but it's not x86) and Intel doesn't have a major GPU division (Iris is growing, but not ARM), so AMD has their work cut out for them (Skybridge will be both ARM & x86), and I applaud the fact that they still exist despite what they are up against.

For Intel processors, 0.122 x TDP = Continuous Amps at 12v [source].  

Introduction to Thermoelectric Cooling
#40
seta8967
FTW Member
  • Total Posts : 1813
  • Reward points : 0
  • Joined: 2010/03/03 05:18:45
  • Status: offline
  • Ribbons : 2
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/04 23:34:35 (permalink)
lehpron
seta8967
 
My friend said the Quadro and FirePro will destroy it, but for amateur rendering and design it does well enough. He would love to have a dedicated professional card, but $4k+ is too much. It is a niche market: big companies will use Quadro/FirePro and gamers will use GeForce/Radeon. I know some upstart game studios like Cloud Imperium Games use Titans, and indie devs too, but that is still such a small niche; it might be growing, but not fast enough to demand a card of its own, imo.
I would have said the same.
 
 
nVidia's Titan series is like Intel's X-platform: they are ported and rebranded server/workstation parts with a few things disabled to bring the price down for consumers, but still priced higher than what most consumers would pay, yet with enough extra features enabled to sell exclusively to those who want those features. All the while, regular consumer parts can come close to or exceed them at a cheaper price. As such, both companies dominate their respective markets and can afford to waste high-grade dies to get extra premium customer support; AMD can't do either for CPU or GPU. Not being dominant means they can't afford to make a consumer version of their FirePro cards or give enthusiasts unlocked versions of their Opterons above 8 cores; they would lose money on precious dies that could fetch higher premiums if they remained professional.
 
nVidia and Intel are lucky to have the opportunity to invent and maintain these new product segments; the difference is that on the CPU side Intel may not see competition for their X-platform for a while, whereas graphics is still competitive and nVidia may temporarily discontinue the Titan if AMD gets back in the game.
 
nVidia doesn't have a major CPU division (Tegra is growing, but it's not x86) and Intel doesn't have a major GPU division (Iris is growing, but not ARM), so AMD has their work cut out for them (Skybridge will be both ARM & x86), and I applaud the fact that they still exist despite what they are up against.




Agreed. A lot of people think I hate AMD; it's not that I hate them, it's that they aren't willing to take a risk or be innovative. I wish AMD would step up to Intel and Nvidia, and I wish they could be as power efficient and well-performing as Nvidia. They just don't do it anymore. I also wish they would let EVGA make cards, and custom cards, for them, and that EVGA would do it.
post edited by seta8967 - 2015/02/05 00:35:54
#41
ARMYguy
FTW Member
  • Total Posts : 1050
  • Reward points : 0
  • Joined: 2005/06/23 12:06:19
  • Status: offline
  • Ribbons : 0
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/06 13:51:07 (permalink)
Nvidia is smart. By locking me into Nvidia cards because of the Shield tablet and my Asus Swift LCD with G-Sync, I literally couldn't buy an AMD card even if I wanted to.

Asus Strix Z790 F - Intel 13700 K - Gigabyte 4090 Windforce - 32gb DDR 5 - 1TB Samsung SSD 850 evo - Windows 10 Pro - 1TB Samsung M.2 860 - Inland 2 TB NVMe - Acer Predator X27
#42
Baltothewolf
CLASSIFIED Member
  • Total Posts : 3762
  • Reward points : 0
  • Joined: 2012/03/23 23:27:34
  • Status: offline
  • Ribbons : 5
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/06 15:56:58 (permalink)
ARMYguy
Nvidia is smart. By locking me into Nvidia cards because of the Shield tablet and my Asus Swift LCD with G-Sync, I literally couldn't buy an AMD card even if I wanted to.

Can't speak on the Shield, but at least us AMD users don't have to spend extra for adaptive sync :P

My Laptop (GE63VR-7RF):
-7700HQ.
-16GB RAM.
-GTX 1070.
-128GB SSD.
-1X 1TB 7200 spinny drive.

#43
seta8967
FTW Member
  • Total Posts : 1813
  • Reward points : 0
  • Joined: 2010/03/03 05:18:45
  • Status: offline
  • Ribbons : 2
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/06 17:57:31 (permalink)
Baltothewolf
ARMYguy
Nvidia is smart. By locking me into Nvidia cards because of the Shield tablet and my Asus Swift LCD with G-Sync, I literally couldn't buy an AMD card even if I wanted to.

Can't speak on the Shield, but at least us AMD users don't have to spend extra for adaptive sync :P


You do know that the monitors that support VESA Adaptive-Sync are going to be more expensive than their non-Adaptive-Sync models?

"Such LCD panels naturally cost more to manufacture and validate"

Source: http://support.amd.com/en-us/search/faq/225

I am curious as to how FreeSync will work. From the FAQ:

"Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display."

So is it different from G-Sync, where the module in the monitor reads the GPU to set the refresh rate? Whenever I ask on reddit or other tech sites, the fans from both sides come spouting improper sources and misinformation.
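 
To make that quoted FAQ line a little more concrete, here's a minimal conceptual sketch (Python) of frame pacing inside a variable-refresh window, assuming a made-up display that reports a 40-144 Hz Adaptive-Sync range; real drivers and the G-Sync module do this in hardware, so the names and numbers here are purely illustrative:

import time

MIN_HZ, MAX_HZ = 40, 144              # range reported by the display (hypothetical values)
MIN_INTERVAL = 1.0 / MAX_HZ           # can't present frames faster than the panel's max refresh
MAX_INTERVAL = 1.0 / MIN_HZ           # the panel must be refreshed at least this often

def present_loop(render_frame):
    last = time.monotonic()
    while True:
        render_frame()                            # takes however long the scene takes
        elapsed = time.monotonic() - last
        if elapsed < MIN_INTERVAL:
            time.sleep(MIN_INTERVAL - elapsed)    # frame finished too fast: hold it briefly
        elif elapsed > MAX_INTERVAL:
            pass                                  # frame is late: the panel just repeats the old frame
        last = time.monotonic()                   # "present" happens here; no fixed vblank to wait for

The gist is that within that window the display refreshes whenever the GPU has a frame ready instead of the GPU waiting on a fixed vblank; from what I've read, G-Sync ends up in the same place, just with the timing negotiated by the module in the monitor rather than through plain DisplayPort Adaptive-Sync.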
#44
Baltothewolf
CLASSIFIED Member
  • Total Posts : 3762
  • Reward points : 0
  • Joined: 2012/03/23 23:27:34
  • Status: offline
  • Ribbons : 5
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/07 04:52:13 (permalink)
seta8967
Baltothewolf
ARMYguy
Nvidia is smart. By locking me into Nvidia cards because of the Shield tablet and my Asus Swift LCD with G-Sync, I literally couldn't buy an AMD card even if I wanted to.

Can't speak on the Shield, but at least us AMD users don't have to spend extra for adaptive sync :P


You do know that the monitors that support VESA Adaptive-Sync are going to be more expensive than their non-Adaptive-Sync models?

"Such LCD panels naturally cost more to manufacture and validate"

Source: http://support.amd.com/en-us/search/faq/225

I am curious as to how FreeSync will work. From the FAQ:

"Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display."

So is it different from G-Sync, where the module in the monitor reads the GPU to set the refresh rate? Whenever I ask on reddit or other tech sites, the fans from both sides come spouting improper sources and misinformation.

Apparently a monitor will be able to have FreeSync and not have the certification done. Linus was talking about it in one of his CES videos. The validation is a lot like NVIDIA's for SLI; most boards not certified for SLI still support it.

My Laptop (GE63VR-7RF):
-7700HQ.
-16GB RAM.
-GTX 1070.
-128GB SSD.
-1X 1TB 7200 spinny drive.

#45
seta8967
FTW Member
  • Total Posts : 1813
  • Reward points : 0
  • Joined: 2010/03/03 05:18:45
  • Status: offline
  • Ribbons : 2
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/07 05:13:14 (permalink)
Baltothewolf
seta8967
Baltothewolf
ARMYguy
Nvidia is smart. By locking me into Nvidia cards because of the Shield tablet and my Asus Swift LCD with G-Sync, I literally couldn't buy an AMD card even if I wanted to.

Can't speak on the Shield, but at least us AMD users don't have to spend extra for adaptive sync :P


You do know that the monitors that support VESA Adaptive-Sync are going to be more expensive than their non-Adaptive-Sync models?

"Such LCD panels naturally cost more to manufacture and validate"

Source: http://support.amd.com/en-us/search/faq/225

I am curious as to how FreeSync will work. From the FAQ:

"Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display."

So is it different from G-Sync, where the module in the monitor reads the GPU to set the refresh rate? Whenever I ask on reddit or other tech sites, the fans from both sides come spouting improper sources and misinformation.

Apparently a monitor will be able to have FreeSync and not have the certification done. Linus was talking about it in one of his CES videos. The validation is a lot like NVIDIA's for SLI; most boards not certified for SLI still support it.


Ahh, hmm, that may be confusing for novice buyers down the road. It will still be interesting to see how it all gets handled.
#46
peteo_85
SSC Member
  • Total Posts : 693
  • Reward points : 0
  • Joined: 2008/09/14 13:46:13
  • Location: Kadena Air Base, Japan
  • Status: offline
  • Ribbons : 3
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/07 11:34:46 (permalink)
Lmao!!

Intel Core i7-6700K Skylake 4.0GHz LGA 1150   
EVGA Z170 Stinger
Corsair Dominator Platinum DDR4 16GB
ZOTAC GTX 1080 Arctic Storm 
CORSAIR SFF 600W PSU
#47
peteo_85
SSC Member
  • Total Posts : 693
  • Reward points : 0
  • Joined: 2008/09/14 13:46:13
  • Location: Kadena Air Base, Japan
  • Status: offline
  • Ribbons : 3
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/07 11:35:58 (permalink)
Brad_Hawthorne
techpcgamer101
lol amd should not even taik at all there only doin this 2 try and sell more cards what amd needs 2 werry about is makeing a cpu that can keep up with intel and not werry about there apu and maybe make a radeom card that dose not use alot of power or heat but yet this is why i stay with nvidia and intel amd can taik all the smack thay want when thay should look them self befor taiking cuz thay are still so far behide on so much but i did start a step up fro my 970  switching it out for 980 


Man, I would have hated to be your English teacher.


Lol +1

Intel Core i7-6700K Skylake 4.0GHz LGA 1150   
EVGA Z170 Stinger
Corsair Dominator Platinum DDR4 16GB
ZOTAC GTX 1080 Arctic Storm 
CORSAIR SFF 600W PSU
#48
donta1979
Primarch
  • Total Posts : 15888
  • Reward points : 0
  • Joined: 2007/02/11 19:27:15
  • Location: In the land of Florida Man!
  • Status: offline
  • Ribbons : 72
Re: AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price 2015/02/09 19:59:22 (permalink)
RainStryke
seta8967
RainStryke
candle_86
You know what, this is insane. No one threw this much of a fit back in '03, when Radeon and FX were cheating on 3DMark.




I remember when Nvidia had the upper hand in Vantage for a long time due to PhysX.


But that's a feature for all Nvidia cards, not a lie/cheat on demo cards falsifying scores.



No, when you had PhysX on, it assisted the CPU score and highly inflated it. It was also falsifying scores, because the CPU tests were meant for the CPU, not for the GPU and CPU together.


Yes and no. 3DMark is supposed to be a gamer's benchmark, right? Some games have PhysX, some do not... it is a PhysX test, where AMD could not utilize it because of no support, so the load went to the CPU, whereas Nvidia GPUs could, so the load went to the CPU and GPU. It just depends on what games you played at the time and your hardware setup.

Heatware   

Retired from AAA Game Industry
Jeep Wranglers, English Bulldog Rescue
USAF, USANG, US ARMY Combat Veteran
My Build
Intel Core I9 13900K@6.1ghz, ASUS ROG Ryujin III 360 ARGB, 32gb G.Skill Trident Z5 RGB 7200mhz CL34 DDR5, ASUS Rog Strix Z790-E, ASUS Rog Strix OC 4090, ASUS ROG Wingwall Graphics Card Holder, Seagate limited Edition Cyberpunk 2077 m.2, 2x Samsung 980 m.2 1TB's, 980 & 990 Pro m.2 2TB's, ASUS ROG Hyperion GR701, ASUS ROG Thor 1200W Platinum II, Cablemod RT-Series Pro ModMesh Sleeved 12VHPWR Carbon, ASUS Rog Swift PG35VQ 35", Acer EI342CKR Pbmiippx 34", ROG Harpe Ace Aim Lab Edition mouse, Rog Claymore II keyboard, TCL home entertainment Sound Bar w/Wireless Sub, Steelseries Johnny Silverhand Headset Microsoft Cyberpunk 2077 Xbox controller
#49