Ok, so I'm an old man. Irritable, opinionated and curmudgeonly at times. Mind you, that's not my general disposition; I only get that way when I'm force-fed a load of marketing BS.
I hate hype, you know that. What I hate even more is when hype is used to cover something far more sinister: an intentional misdirection that presents the subject as something it's not.
I'll suffer the barbs from the fanboys for expecting to get five years out of a hardware platform before pronouncing it obsolete. That's because I know I'm right, a position vindicated by the few who have chosen to dig deeper.
I remember having a conversation one day with the owner of a software company I worked for. He was shaking his head as he surveyed a sea of developers feverishly coding the latest release in Borland Delphi. "When I started this company," he said, "I could write an entire accounting system in 64K. What a waste."
He hated visual programming environments and the overhead they introduced, which in his estimation did nothing to make the code any better.
That was almost 20 years ago. I understood what he meant, but I was also a big fan of "easy." If he saw the amount of cruft that goes into software development these days, however, he'd just give up.
So who cares about some boring accounting package from 20 years ago? What does that have to do with gaming?
Everything...
You see, regardless of whether the software is an accounting package, a game or a hardware driver, there's a layer of abstraction that takes you further and further from optimal use of resources.
That's the case with Nvidia and AMD drivers. Even if you forgo the installation of all that extra GameWorks or Crimson crap, you're still dealing with the layer of abstraction introduced by DirectX or OpenGL. That not only causes extra overhead but opens entry points for hacks and cheats. Another thing that brings out the curmudgeon in me....
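To make the overhead point concrete, here's a toy C++ sketch of my own; the names (IRenderer, drawTriangle) are invented for illustration and have nothing to do with any real driver. It times the same trivial "work" called directly versus called through an abstract interface, one crude stand-in for the layering an API and driver stack add:

```cpp
// Toy illustration only: one extra layer of indirection between the
// caller and the "hardware." Real driver stacks pile on far more.
#include <chrono>
#include <cstdio>

struct IRenderer {
    virtual long long drawTriangle(long long v) = 0; // the abstraction layer
    virtual ~IRenderer() = default;
};

struct GpuRenderer : IRenderer {
    long long drawTriangle(long long v) override { return v * 3 + 1; }
};

static long long drawTriangleDirect(long long v) { return v * 3 + 1; }

int main() {
    const long long N = 100'000'000;
    GpuRenderer gpu;
    IRenderer* api = &gpu; // the caller only ever sees the abstraction

    auto t0 = std::chrono::steady_clock::now();
    long long direct = 0;
    for (long long i = 0; i < N; ++i) direct += drawTriangleDirect(i);

    auto t1 = std::chrono::steady_clock::now();
    long long layered = 0;
    for (long long i = 0; i < N; ++i) layered += api->drawTriangle(i);
    auto t2 = std::chrono::steady_clock::now();

    using ms = std::chrono::milliseconds;
    std::printf("direct:  %lld ms (sum %lld)\n",
        (long long)std::chrono::duration_cast<ms>(t1 - t0).count(), direct);
    std::printf("layered: %lld ms (sum %lld)\n",
        (long long)std::chrono::duration_cast<ms>(t2 - t1).count(), layered);
}
```

A single virtual call is chump change, and a good compiler may even optimize it away, but a real driver stack puts many such layers, plus validation and state tracking, between your code and every draw call. That's the overhead all this convenience buys you.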
"But Dudz! All that stuff (that I really don't understand) makes game development easier and makes games come out faster. Now developers don't have to produce custom code for every possible graphics adapter! "
Yes and no. Remember that nothing's free, and all that convenience has rapidly become an excuse for what can only be described as sloppy coding. Add in all of the "customization" that driver add-ons like GameWorks stack on top, and suddenly we're back to the bad old days of games coded for specific graphics hardware, with the bonus of extraneous crap to slow them down even more.
I can't accept anything that exists for its own sake, and that's what current-gen gaming is.
If you're wondering, the video above does an excellent job of describing one particular vendor's "optimizations" that have effectively built in an artificial obsolescence. It's all for no other reason than to sell newer video cards. There's also the added benefit of making the competition's hardware look bad because it can't use all that "custom" code.
This goes back to my original point. If you watched the video above, you saw an example of a tessellation effect from Nvidia that's marketed as adding greater realism. The problem is that games have to be coded to take full advantage of the effect, which requires using Nvidia's development kits, which in turn benefit only Nvidia's own hardware.
But the requirement is a sham. Be it the driver mods or sloppy coding, the net result is wasting resources for no reason: rendering an entire unseen ocean, for example, when all that's visible is a small lake. I mean, c'mon now, is it really necessary to add tessellation effects to a bare concrete wall or to a texture I can't even see?
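If I were writing that tessellation path, a sane approach would look something like the sketch below: cull patches the camera can't see and scale the tessellation factor by distance and surface detail. This is my own back-of-the-napkin version, not anything from an actual SDK; the names, numbers and thresholds are all invented for illustration.

```cpp
// Back-of-the-napkin adaptive tessellation: spend triangles only where
// they buy you visible detail. Everything here is invented for the
// example; real engines do shader-side versions of the same idea.
#include <algorithm>
#include <cstdio>

struct Patch {
    const char* name;
    float distance;   // metres from the camera
    float roughness;  // 0 = billiard-table flat, 1 = very bumpy
    bool  visible;    // did it survive frustum/occlusion culling?
};

// Returns 0 for "don't even submit the patch", otherwise 1..64 subdivisions.
int tessFactor(const Patch& p) {
    if (!p.visible) return 0;           // the unseen ocean costs nothing
    if (p.roughness < 0.05f) return 1;  // a flat concrete wall is left alone
    float f = 64.0f * p.roughness * (10.0f / std::max(p.distance, 10.0f));
    return std::clamp(static_cast<int>(f), 1, 64);
}

int main() {
    const Patch patches[] = {
        {"rocky cliff, close",     12.0f, 0.90f, true},
        {"concrete wall, close",    8.0f, 0.02f, true},
        {"ocean behind the hill", 300.0f, 0.60f, false},
    };
    for (const Patch& p : patches)
        std::printf("%-24s tess factor %d\n", p.name, tessFactor(p));
}
```

None of this is rocket science; culling and level-of-detail scaling are standard practice. Which is exactly why cranking the factor to maximum on flat, invisible geometry looks less like a technical necessity and more like a sales tool.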
Worse, because older hardware may not have the capability to absorb this waste of resources, you suddenly end up with an artificial obsolescence.
Meaning that for no reason other than marketing hype and/or sloppy coding, your last-generation card (or a competitor's card) either won't run the game at all or will run it very poorly.
I cannot and will not accept the rate of hardware churn when there's nothing real driving it. It's like somebody flattening the tires on your car and then telling you the only fix is to buy a whole new car.
That's insane, but that's the model the gaming world is built upon.
I couldn't care less about how much better a 980 Ti runs The Witcher 3 versus an R9 290X because, contrary to the assumption, all things are not equal. Even so-called "benchmarks" are affected by what can only be described as an artificial metric!
What needs to happen is that drivers need to be drivers and software needs to be software. We're going to be stuck with hardware abstraction layers and bloated development libraries even on consoles, but there should be no need to upgrade hardware just because somebody is too lazy to tighten up their code or, worse, builds in an artificial limitation for the sake of unit sales.
It seems hardware vendors are as full of BS as game publishers now. I don't know if there's much hope for the future if that continues.