It's the GPU manufacturers who have to take the lead if PC gaming is going to become a massive force again, says James.
By James Pinnell on August 8, 2013 at 6:13 pm
Back in 1998, my love of 2D gaming was shattered forever by the arrival of the dedicated 3D graphics card: an utterly mind-blowing Riva TNT2. I had watched some of my luckier brethren purchase Voodoos and Rages, but my 17th birthday present allowed me to play Starsiege: Tribes (yes, that very one!) in full 32-bit magnificence. No more software rendering for me, no sir; I had nVidia's 4th-generation, 128-bit mean machine, the card that became the first true mainstream introduction of the GPU.
Not only did the TNT2 feature the first "triple-play" chipset (a fully 32-bit frame buffer, colour palette and Z/stencil buffer), it was also the first to offer a complete, affordable solution for both 2D and 3D hardware processing. Before the end of the year, almost everyone had their hands on a variant (unless you were a 3DFX man), at least until the release of the GeForce 256 in 1999, a single-chip processor with all the trimmings.
Back then, it wasn't just nVidia and ATI; a host of competitors were battling it out for your PCI slot, and as a result, innovation came thick and fast. Very little love was shared between the players, from the once-powerful 3DFX and S3 to the now refocused Matrox, and of course the two powerhouses, ATI and nVidia. The early noughties turned that competition into a complete bloodbath, thanks to the sheer amount of money it took to create new chipsets and the risks required to secure partnerships with console developers and PC hardware houses such as Dell and HP.
One high-profile failure, such as 3DFX's disastrous play for the Sega Dreamcast, was quickly followed by nVidia's flattening of the Voodoo 4 and 5. By the end of 2003, only two were left standing, the corpses of their defeated enemies chopped up and sold off to various other technology houses for their patents.
Nowadays, AMD and nVidia face enemies playing on different fields: Qualcomm, Intel, Samsung and others battling it out in the mobile space (using technology, ironically, that stemmed from the continued work of S3 and 3DFX), as well as each other. But diversification risks alienating the environment that put them in their positions in the first place: the desktop. The last 10 years have done little to inspire a new generation of gamers to stuff bits into a custom case while bragging about their new tech, as the seductive branches of mobile devices offer a far simpler route to quick and easy gaming. Consoles, once derided as "toys" by hardcore PC gamers, have become the dominant force in AAA titles, even though they feature obsolete technology before they even hit the market.
Sure, nVidia and AMD are usually the ones filling those stockings, but they are kidding themselves if they think they can continue to rest on their laurels and live off royalties from half-decade-old hardware.
The fact of the matter is that the big two have dropped the ball entirely when it comes to evangelising their products and markets. Confusing, expensive nonsense like the nVidia Shield (above) risks diluting the core message when there is an original marketplace of millions just waiting to be seduced. Because, let's face it: the way you research, buy and install your cards hasn't changed at all. It's still a confusing mess trying to compare models on performance, figure out whether current or future games will work with a particular series, or troubleshoot when things go wrong. I know of at least three or four friends who have bought consoles simply to avoid having to figure out whether a Radeon HD 7970 is better than a GeForce GTX 550. Sometimes a higher number means higher performance. Sometimes it doesn't! How is anybody supposed to make sense of that?
Then there's the process of getting the right case and power supply, sorting out cooling, benchmarking and overclocking… I know a lot of you are reading this and screaming "BUT THIS IS WHY WE LOVE IT", and I agree; I love putting together builds myself. But in all honesty, you shouldn't have to read a ton of reviews, post on forums and check benchmarks just to find a graphics card that suits your needs and budget. We expect your average punter to drop $400 on a card when they probably only need something that costs $200. That's an enormous difference in cost; just listen to the slapfights when the Xbox One was announced at $100 more than the PS4! (And that's with a Kinect on top!)
Then there's the lack of change in form factors: cards are getting bigger, louder and hotter. You shouldn't have to buy something the size of your forearm and stuff it into your case at just the right angle so that it doesn't overheat. I get the innovations in power, but what about innovations in size, or a rethink of how stock cards are cooled?
Then there's the software. AMD has arguably been on a downhill slope of epic proportions when it comes to the state of its Catalyst software suite. The quality of the drivers is absolutely dreadful: each release is rushed (usually thanks to a developer push for day-one optimisations), and the lineup splinters into multiple versions (stable, beta, with or without profiles, Catalyst display software). Full of bugs, game-breaking glitches and the perennial claimed "20%" performance increases, each release seems worse than the previous one. Then there's the fact that the installer still doesn't offer automatic updates, but is instead filled with convoluted, useless settings in place of those that actually affect games, and the developers seem more focused on pushing out the next product than supporting the one they released three months ago. Why the hell do I have to download a game profile? Why doesn't it just do that for me?
nVidia, for all its faults, has improved dramatically over the past five years. It has cleaned up its site and introduced GeForce Experience, a product I initially wrote off as PR garbage but which is a fantastic first step towards simplifying and streamlining the PC gaming experience. It automatically checks for driver updates, optimises supported games based on your card's proven performance, and will (soon) be able to record game footage without any external software. These are actual functions that actual gamers want from their GPUs. While the nerdier among us enjoy overclocking and tinkering, the large majority just want BF3 to run without stuttering, crashing or requiring a special profile to download.
But nVidia is not off the hook just yet. As previously mentioned, its mobile diversification risks pulling resources away from innovation in the desktop, and ultimately console, gaming fields, where arguably the bulk of the more profitable and more engaging games still live. A graphics card company that is not pushing boundaries is failing its base and thinking far too short-term. Both companies need to work with case makers to ensure that building a system doesn't mean wrestling with short cables or poor air circulation. Cards need to be smaller and cooled more cleverly to fit the smaller, more portable systems of a changing market. And there need to be huge strides in improving drivers, UI and the entire experience from the moment a new machine boots up.
Otherwise, sadly, consumers will move towards the developers that understand the new world order in gaming, and that world lies away from the PC.
Header image courtesy Tbreak.