Saturday, April 12, 2014

Who cares about DirectX 12?

So are you excited about DirectX 12?

Does the promise of better efficiency, reduced CPU load, and new features make you want to run out and scoop up a new Nvidia Titan Black in anticipation of its release...someday?  You do know you'll have to wait until holiday 2015...maybe.  Microsoft says that 50% of current hardware will be able to take advantage of it.  They don't mention which 50%, however...

Maybe AMD's "Mantle" is more your speed.  It promises pretty much the same thing, so long as you have the hardware to support it.  If you haven't bought an AMD GPU in the past two years you're out of luck: only Radeon HD 7000 series cards and above, built on AMD's proprietary GCN (Graphics Core Next) architecture, need apply.

At least Mantle is available now while DirectX 12 is still vaporware.  Assuming you meet all the prerequisites, you can try it out in Battlefield 4 and Thief.  Reviewers are claiming marked improvements with Mantle, although they point out that the gains aren't consistent across all PCs.

Performance claims don't hold much water, however, as Battlefield 4's continuing issues make any claimed benefit suspect.  Anyone who regularly plays it knows that keeping it running for more than five minutes without crashing is an accomplishment at this point.  Don't even get me started on the whole netcode thing...
Here's the rub...

I honestly couldn't care less...

How many times in the past decade have we been forced to throw out perfectly good hardware because a highly anticipated game requires an API our hardware doesn't support?  Worse, aside from efficiency improvements (hmm, sounds familiar), we never really saw much of an improvement in gaming.  The first time I got bitten was the release of Battlefield 2.  That game required Shader Model 3.0, meaning my "old" card with its Shader Model 2.0 support was destined for the bin.
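
For the curious, that gatekeeping is easy to picture in code.  Here's a minimal sketch using the Direct3D 9 capability query (the check itself is my own illustration, not anything from an actual shipping game; it builds against the DirectX SDK's d3d9 headers and library):

```cpp
// Minimal sketch: how a D3D9-era game could refuse to run on Shader Model 2.0
// hardware. Illustrative only -- not actual Battlefield 2 code.
// Build against d3d9.h / d3d9.lib from the DirectX SDK.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Query the default adapter's hardware capabilities.
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    // The Battlefield 2-style gate: demand Shader Model 3.0 or refuse to start.
    if (caps.VertexShaderVersion < D3DVS_VERSION(3, 0) ||
        caps.PixelShaderVersion  < D3DPS_VERSION(3, 0)) {
        std::printf("Shader Model 3.0 required -- your card is headed for the bin.\n");
    } else {
        std::printf("Shader Model 3.0 supported -- carry on.\n");
    }

    d3d->Release();
    return 0;
}
```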

Did Battlefield 2 really look that much better than Battlefield 1942?  Let's be honest, it didn't, but it marked the beginning of the upgrade treadmill for me and thousands of others.  Hundreds of dollars thrown to the wind for a piece of $40 software we'd tire of in a few months.  A practice repeated at least half a dozen times.

You have to remember that it wasn't always this way.  Back in the '90s, DirectX was just one of a handful of APIs used for gaming.  It was only the rise of Windows (specifically Windows 95) that pushed it to the forefront.

Veteran gamers remember the days of DOS, when games shipped atop the DOS/4GW extender and APIs like OpenGL.  3dfx was the enthusiasts' choice, pioneering now-commonplace features like SLI and offering its Glide API for direct access to special hardware functionality.


Glide was technically superior to the early versions of DirectX and OpenGL, but the ability to program to an interface instead of to proprietary hardware was a boon for game developers.  Soon GPU vendors found themselves entirely reliant on Microsoft's DirectX specification.  Even if they could do better, they were hamstrung by what had become an industry standard.

So everything old is indeed new again.  One of the promises of both Mantle and DirectX 12 is closer, if not direct, access to GPU hardware.  It's an old idea.  Get the middleman out of the way and performance will naturally increase.

Except...

We're right back where we started with proprietary hardware.  It's 3dfx all over again.  DirectX is currently a universal API that any GPU can use so long as it supports the basic feature set.  With DirectX 12 squarely aimed at countering Mantle (or perhaps AMD forcing Microsoft's hand), we're rapidly approaching a resurgence of proprietary hardware standards.  This time around, however, it's Microsoft, and to a lesser degree AMD, calling the shots.

Even more interesting is how much really isn't happening with all these purported "advancements."

Early reviews of Mantle show that while some improvement is seen with enthusiast-grade graphics hardware, the real story is in the more pedestrian examples.  Both Mantle and the upcoming DirectX 12 are designed to take advantage of current-generation graphics hardware.  That means getting cozier with the GPU and taking the rest of the system out of the equation as much as possible.  The result lightens the load on your system's CPU, freeing resources and eliminating potential CPU bottlenecks.
So is it really any surprise that cheap video cards benefit?  It isn't about any great advance in API development.  It's a page taken from the days of 3dfx, when the best performance came from leveraging the hardware and getting the clutter out of the way.
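
If you want a feel for why that helps, here's a deliberately oversimplified sketch.  None of these types are the real Mantle or DirectX 12 APIs (neither was public as I write this); it's just the shape of the idea: a "fat" API pays CPU overhead on every single draw call, while a "thin" one mostly just records commands for the GPU to consume later:

```cpp
// Conceptual sketch (all types hypothetical, not a real driver or API):
// why a thin, Mantle/DX12-style interface cuts CPU cost.
#include <cstdint>
#include <vector>

struct DrawCmd { uint32_t vertexCount, firstVertex; };

// DX9/DX11-style: the runtime and driver do per-call bookkeeping on the CPU.
struct FatApi {
    void draw(uint32_t /*count*/, uint32_t /*first*/) {
        validateState();   // runtime sanity-checks every single call
        translateState();  // driver re-translates state behind your back
        kernelSubmit();    // user-to-kernel transition burns more CPU time
    }
    void validateState() {}
    void translateState() {}
    void kernelSubmit() {}
};

// Mantle/DX12-style: the app records commands up front; the driver trusts it.
struct ThinCommandList {
    std::vector<DrawCmd> cmds;
    void draw(uint32_t count, uint32_t first) {
        cmds.push_back({count, first});  // just record -- near-zero CPU cost
    }
};

int main() {
    FatApi fat;
    ThinCommandList thin;
    // 10,000 draw calls: the fat path pays driver overhead on each one;
    // the thin path mostly just fills a buffer the GPU reads directly.
    for (uint32_t i = 0; i < 10000; ++i) { fat.draw(3, 0); thin.draw(3, 0); }
    return 0;
}
```

On a weak CPU, shaving that per-call overhead is exactly where the "free" performance comes from, which is why the cheap seats benefit most.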

Great, but let's not forget that Microsoft caused the problem in the first place.  They made development easier but stuck a layer of abstraction between the game and the hardware.  A trade-off to ensure that games continued to be developed for the platform, regardless of how flawed the end result was.

It's not just about performance, either.  Anyone who plays online knows that cheating is a persistent problem, and with DirectX, hacks are easy.  Put a chunky API between the player and the game and you have plenty of opportunity for exploits.

Take, for example, a common hack in FPS games known as the "wallhack."  What makes it possible is that DirectX requires all the information about an active game to be available in memory all the time, whether the game is local or online.  Hack into the area of memory containing current player locations and you can literally see, and maybe even shoot, through walls.
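
To be clear about the mechanics (the process ID, address, and struct layout below are all made up; this is the shape of the problem, not a working cheat), nothing at the architecture level stops another process from reading that memory with ordinary, documented Win32 calls:

```cpp
// Illustrative sketch only: player data lives in ordinary client memory,
// readable by any sufficiently privileged process. The pid, address, and
// layout here are hypothetical.
#include <windows.h>
#include <cstdio>

struct PlayerPos { float x, y, z; };  // hypothetical in-game layout

int main() {
    DWORD pid = 1234;  // hypothetical game process id
    HANDLE game = OpenProcess(PROCESS_VM_READ, FALSE, pid);
    if (!game) return 1;

    PlayerPos pos = {};
    SIZE_T bytesRead = 0;
    // Hypothetical address where the game keeps an enemy's position.
    // Nothing stops this read at the API level; any detection by the
    // developer happens after the fact.
    LPCVOID addr = reinterpret_cast<LPCVOID>(0xDEADBEEF);
    if (ReadProcessMemory(game, addr, &pos, sizeof(pos), &bytesRead)) {
        std::printf("enemy at %.1f %.1f %.1f\n", pos.x, pos.y, pos.z);
    }
    CloseHandle(game);
    return 0;
}
```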

There's not a lot that can be done about it, either.  If the developer cares at all they may issue a patch, but everything they do is after the fact.  The sin can't be prevented, only addressed once committed.  This is where convenience trumps sound design, and gaming has suffered for it.

Let's also not forget about Microsoft's shady history with DirectX...

DirectX 11 was introduced in 2009 and, with only marginal changes since, has remained largely untouched.  That means graphics vendors will have waited the better part of seven years before they finally get to fully leverage their current hardware designs.

Remember DirectX 10?  Back in 2007, if your video card didn't support it, you might as well have thrown away your keyboard.  Or so we believed.  It had a relatively short shelf life once Windows 7 launched and introduced DirectX 11.  In spite of the hype there wasn't a whole lot of difference between the two, but almost overnight graphics vendors found themselves with obsolete hardware selling at bargain-basement prices.

And for what? 

It took years before developers took DirectX 11 seriously.  Even recent blockbuster titles like BioShock Infinite were written for DirectX 9.  Battlefield 4, with its shiny new Frostbite 3 engine, works perfectly fine in DirectX 10, just like its predecessor, Battlefield 3!



So what can we take from this latest announcement from Redmond?

Not much.  Some say it's a reaction to AMD's Mantle, while others claim it's been in the works for years.  Regardless, don't expect anyone to take DirectX 12 seriously for at least two years after it ships.  That's 2017 for most of us, and with it will come a graphics market even more influenced by Microsoft.

That's fine if you like things the way they are now, but don't expect much competition.  The money in enthusiast hardware is still on the Windows platform (which includes the Xbox), regardless of whether or not the SteamBox takes off.  That translates into Microsoft dictating hardware designs even more than it does now.


Which means nothing has really changed, and we can still expect baseless hype, inflated graphics card prices, and empty promises.  At least the cheap seats will see some benefit, and that's good news for all those board makers who earn most of their money on mainstream graphics hardware.  After all, you can only sell so many Titan Blacks.