A word on whether games are CPU- or GPU-bound: First, you have to research your games; not all of them use resources the same way. In a CPU-bound scenario, performance benefits from a better CPU, whereas a GPU-bound scenario doesn't benefit from a better CPU but from a better GPU. If you end up playing more of one type of game than the other, you're wasting money on hardware that goes unused. Generally speaking: RTS, MMO, RPG, and simulator games tend to be CPU-bound, especially in multi-player. GPU-bound describes just about any game in single-player at max detail with a single GPU and a single display. Of course, changing the resolution or varying the game settings can shift a game from CPU- to GPU-bound and vice versa. So research your games and the types you may have an interest in; Google your game title together with the component in your PC and see what results appear.
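The CPU-/GPU-bound distinction can be sketched with a toy frame-time model (every number below is made up purely for illustration): each frame takes roughly as long as the slower of the two components, so speeding up the component that isn't the bottleneck buys you nothing.

```python
# Toy bottleneck model: frame rate is capped by whichever component
# takes longer per frame. All timings here are hypothetical.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frames per second when the slower component gates each frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# CPU-bound scenario (e.g. an RTS with many units): a faster GPU changes nothing.
print(fps(cpu_ms_per_frame=20, gpu_ms_per_frame=10))  # 50.0 fps, CPU is the cap
print(fps(cpu_ms_per_frame=20, gpu_ms_per_frame=5))   # still 50.0 fps

# GPU-bound scenario (max detail, high resolution): a faster CPU changes nothing.
print(fps(cpu_ms_per_frame=5, gpu_ms_per_frame=25))   # 40.0 fps, GPU is the cap
print(fps(cpu_ms_per_frame=2, gpu_ms_per_frame=25))   # still 40.0 fps
```

Dropping the resolution shrinks the GPU's per-frame time, which is exactly how a GPU-bound game flips to CPU-bound.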
The following is taken from an AnandTech review of the Haswell processor generation (Core i7-4770K) when it debuted last year. The first chart is an example of a CPU-bound game, the second of a GPU-bound one, both showing frame rates on a Radeon HD7970 (the rough equivalent of a GeForce GTX770) at 2560x1440 resolution; blue bars are Intel, red are AMD:
If you tend to play your games at max detail with one display in single-player, AMD just happens to be cheaper, and for the price it isn't that much slower than Intel. But online multi-player tends to make games CPU-bound because of the internet traffic and all the extra characters; it's more work for the CPU, so you might as well have Intel. Also, some games don't support multiple GPUs: your hypothetical pair of dual-GPU R9-295X2's would have three GPUs sitting unused in some games. nVidia has a list of games that support their SLI tech, and AMD's Crossfire list shouldn't be that different.
A word about video memory in multi-GPU: The video memory in dual-GPU graphics cards like the Titan-Z, R9-295X2, GTX690, HD7990, etc. doesn't add up or get shared. For the R9-295X2 it is still 4GB per GPU; you're only paying for the power and space savings versus getting a pair of R9-290X's. Performance doesn't scale 100% with each additional card either; it depends on the game, but while GPU-bound, the second card can add roughly 85% and the third more like 60%, and again the scaling changes from game to game. The reason the third and fourth GPU lose scaling in many reviews is that the game isn't GPU-bound anymore; it is being bottlenecked by something else, such as the CPU clock speed, or the resolution becomes too low for the available rendering power (as if you put a future high-end card in an older system).
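Those scaling figures turn into a quick back-of-the-envelope estimate. A minimal sketch, using the rough 85%/60% numbers from the paragraph above; the fourth-card gain is my own placeholder, and real scaling varies per game and driver:

```python
# Rough multi-GPU scaling estimate while a game stays GPU-bound.
# Gains per extra card: ~85% (2nd), ~60% (3rd), 40% (4th, hypothetical).

def relative_performance(num_gpus, extra_card_gain=(0.85, 0.60, 0.40)):
    """Performance relative to a single card, under the assumed gains."""
    total = 1.0
    for i in range(1, num_gpus):
        # Each additional card contributes a diminishing fraction of one card.
        total += extra_card_gain[i - 1] if i - 1 < len(extra_card_gain) else 0.0
    return total

for n in range(1, 5):
    print(f"{n} GPU(s): {relative_performance(n):.2f}x one card")
# Roughly 1.00x, 1.85x, 2.45x, 2.85x under these assumed gains
```

The takeaway is the same as the paragraph: you pay 4x the money for well under 4x the performance, and less still once something other than the GPUs becomes the bottleneck.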
A word about processor cores: Even though it has been 8 years since the first quad-cores came to the desktop PC, most games still use just 4 cores. If you're going to get more cores, have something else to do with them besides games, because it could be another decade before popular games support more. Support has to be written into the program code itself, which is up to the developer; and as a business, developers want the most sales, so they aren't going to target their games at the minority of high-end PC gamers. Rather, they aim their games at everyone else, where the dual-core with Hyper-Threading (Core i3) and the quad-core without (Core i5) still reign supreme. We can't blame them, since most people won't spend much on hardware for a game (consoles have created this complacency); if a game's requirements are too high, most people won't buy it. There aren't as many high-end users as enthusiasts think, so developers won't focus on high-end customers specifically-- unless someone made a high-end console just for us that guaranteed smooth max detail at 4K for the next few years. I'm sure it's possible; I just don't know how many would pay for it.
A word on the differences between AMD and Intel: Just so you know, AMD and Intel processors are not the same, so don't assume the FX9590 is faster just because it runs at 4.7GHz. The issue is the silicon itself: AMD's current microarchitecture, codenamed "Steamroller", uses fewer transistors per core than Intel's current "Haswell", which results in lower IPC (instructions per clock); the higher frequency is an attempt to compensate for that low IPC. In CPU-bound tests, AMD's latest is still about 40% behind Intel; then again, Intel costs more than 40% extra. An AMD part would need roughly 5GHz just to match an Intel part at 3GHz-- this is the primary reason for the fanaticism you'll read where people chant to the heavens about how good Intel is. AMD still makes new designs, so some hope the coming Skybridge in 2015 and the "Zen" architecture in 2016 close the gap, since Intel's upcoming Skylake next summer isn't expected to be leaps and bounds ahead of Haswell; it doesn't have to be until AMD gets their act together.
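The IPC argument reduces to simple arithmetic: single-thread performance is roughly IPC times clock frequency (a simplification that ignores memory, turbo, and everything else). The IPC figures below are made-up placeholders chosen only to illustrate the relationship, not measured values:

```python
# Single-thread performance ~ IPC * frequency. IPC values are hypothetical
# placeholders: baseline 1.0 vs a part with 0.6x the IPC at a higher clock.

def relative_speed(ipc, ghz):
    return ipc * ghz

baseline = relative_speed(ipc=1.0, ghz=3.0)   # e.g. a 3GHz high-IPC part
low_ipc  = relative_speed(ipc=0.6, ghz=4.7)   # lower IPC, higher clock

print(low_ipc / baseline)   # ~0.94: still behind despite the 4.7GHz clock
print(baseline / 0.6)       # about 5: the clock needed to match at 0.6x IPC
```

This is why a big GHz number on the box tells you nothing across different architectures; frequency only compares like with like.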
A word on reviewer bias: That being said, even when you search for hardware reviews online to gauge performance in certain games, those reviewers aren't being fully objective and telling you everything; they have biases and leave things out so you end up with the same conclusion as their opinion. Even people on forums like this will do the same, so be open-minded when researching; don't base your money on a rumor or an opinion, look for facts. (Please don't flag this post as helpful just because I write a lot; give me the impression you read it, ask questions. I doubt I can read minds and answer what you haven't asked.)
For example:
- Pro-AMD processor bias -- They will test GPU-bound games at high resolutions and details, where Intel and AMD perform about the same, showing no point in spending more on Intel. But these tests also show there would be no point in upgrading, since older processors are on par.
- Anti-AMD processor bias -- They will test CPU-bound scenarios, including office apps and media encoding where Intel has reigned supreme (beating a dead horse), while games are tested CPU-bound (medium details at a low resolution most of us don't play at-- which also downplays the IGP in AMD processors, which is usually better than Intel's IGP).
- High video memory or high-priced components -- They won't test the products in the scenarios they're meant for. When nVidia's Titan and Titan-Z first appeared last year, because people hated the high price, most reviews wouldn't test them in a way that showed the benefit of the 6GB per GPU or the double-precision capability. These products are hybrids meant to bridge the gap between the consumer GeForce and professional Quadro lines. Yet most reviewers would test with just one monitor, where performance against a cheaper card made them look bad in games. Plus, games are written in single-precision, so the double-precision capability was never exercised and got ignored. Just because someone has no use for something doesn't make it useless. The truth is most people were just jealous and defaulted into an "if I can't have it, no one should" attitude, where they felt anyone who bought one was overspending. It is just another bias.
- Intel gets the same backlash for their $999 processors: since games benefit from frequency and not cores, most reviews imply they are worthless for gaming at the cost of extra heat and power. That said, Intel's high-end gaming platform is just a ported workstation platform with unlocked Xeons rebranded as "Core i7". Likewise, Titans are just Quadro/Tesla cards rebranded as GeForce.
A word on Status Quo: The following leads on from the previous and has nothing to do with computer hardware; it's about the attitude of customers. Some follow an unspoken ideology: if you aren't with them, you're against them. If you're not going to build your own system or you have a restricted budget, these people will recommend what you can't afford as if it's the only way; if you have a massive budget, they will recommend less as if you're a fool for spending too much. FYI, most hardware reviews assume you keep up with technology, since they only compare against 1-2 previous generations. If you're the type to keep stuff for a long time, either you learn to daisy-chain multiple reviews to get an idea of what to expect and whether the change is even worth it, or you naively hope for the best as a reward for waiting longer.
Above all, there is a fine line between 'saying it like it is' and 'stating your opinion as fact based on experiences'.
A word on future-proofing: The reason anyone will tell you this is impossible is that they crave the bleeding edge, such that they are forced to upgrade often to satisfy themselves. If you only ever get exactly what you need, you too will feel like future-proofing is impossible, because you'll have to upgrade often just to maintain your less-than-best satisfaction level. But if you get way more than you need and never grow used to it, that is how you future-proof: you're upgrading toward a future use that is overkill today. High-end folks can't experience this because there are no options beyond the best, so it is only an option for everyone else; though most people who don't normally buy the best can't afford much more either, so very few really get to experience 'future-proofing'.
For instance, Middle-earth: Shadow of Mordor asks for a minimum graphics card of a GTX460-- roughly the equivalent of a 3-way SLI GeForce 8800GTX setup from 2007, the best at the time. Next year some game will come along with a minimum requirement beyond that old 3-way SLI setup-- making it officially obsolete. So a rig with those specs would have lasted 7-8 years to near-obsolescence for someone who couldn't afford to upgrade the whole time and wondered how long was practical. Hypothetically, even though 4-way SLI doesn't scale well today, four GTX980's bought today could last the next 7-8 years; scaling should improve as future games use more resources.