Thursday, February 24, 2011

Imadirtycheater - Cheating in games


Yes, I admit it, I'm a cheater.
The title of this post comes from a cheat entered at the console command line of the old Battlezone game for the PC. It allowed the use of console commands that made you invulnerable to attack (god mode) or let you fly through the scenery (no clipping), amongst other things.
These commands are usually accessed by activating an in-game console left over from the game's development. The console was typically used to test gameplay mechanics not normally accessible when playing the game "straight up".
Maybe the developer wanted to see just how the game reacted with the maximum amount of action on the screen. If you're in god mode, for example, you could conceivably have hundreds of opponents on screen at the same time. That isn't going to happen if your character is dying every 5 seconds.
Other uses for the console include running benchmark tests or game customizations like mods. Mods are generally produced by a community who enjoy the game but want to change the experience to something other than the "as delivered" condition.
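To make that concrete, here's a minimal sketch, in Python and entirely hypothetical (not Battlezone's actual code), of how a dev console might map typed commands to debug flags like god mode and no clipping:

```python
# Hypothetical in-game console: typed commands toggle debug flags.
class Player:
    def __init__(self):
        self.god_mode = False   # invulnerable to attack
        self.no_clip = False    # fly through the scenery

def console_dispatch(player, line):
    """Map a typed console command to its effect on the game state."""
    commands = {
        "godmode": lambda: setattr(player, "god_mode", not player.god_mode),
        "noclip": lambda: setattr(player, "no_clip", not player.no_clip),
    }
    action = commands.get(line.strip().lower())
    if action:
        action()
    else:
        print(f"Unknown command: {line}")

p = Player()
console_dispatch(p, "godmode")
print(p.god_mode)   # True: damage is now ignored
```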
So is it ethical to go the easy route? 
Definitely maybe could be kinda yes or no...
Ok, to be serious, it depends.  If you're concerned about climbing to the top of the leaderboards then you may be tempted to gain a little advantage.  Early on this was more of a problem than it is now, with players posting unrealistically high scores to the dismay of non-cheaters.

Games are much more sophisticated now and have anti-cheating routines written right into them.  It could be as simple as disabling the console, or utilizing anti-cheating programs such as PunkBuster, which constantly monitor for known exploits. Another option is to do nothing at all but invalidate any accomplishment gained while using the cheat.
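Here's a rough sketch of that last approach, assuming a simple session flag the engine sets the moment any cheat is used; the names and structure are mine, not any particular engine's API:

```python
# Hypothetical session: cheats still work, but trip a flag that
# invalidates anything earned afterward.
class Session:
    def __init__(self):
        self.cheats_used = False
        self.achievements = set()

    def enable_cheat(self, name):
        print(f"Cheat enabled: {name}")
        self.cheats_used = True   # one-way flag for this session

    def award(self, achievement):
        if self.cheats_used:
            print(f"'{achievement}' not recorded (cheats active)")
        else:
            self.achievements.add(achievement)

s = Session()
s.award("First Blood")       # counts
s.enable_cheat("godmode")
s.award("Untouchable")       # invalidated
```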
These days you see less concentration on cheating in single player games, since you're only really cheating yourself.  It's a tradeoff: you still get to play the game and explore all its dimensions, but you don't get the bragging rights.  You get to enjoy the carrot but still take a beating from the stick.

Steam is probably the best example of this, since most of its games have single- and multiplayer achievements that function in exactly this way.  Sometimes it means that you don't get the "Big Gun" or are ineligible for a tournament.
There are games that are almost unplayable for a lot of people because they're simply too difficult for anyone but the most adept game controller freak to master.  Then there are games so easy that it's just insanely stupid to cheat.  Personally, I feel that if I plunked down my hard-earned cash I have the right to do whatever I want with the game.  However, I don't try to pass off my "assisted" accomplishments as anything but cheating.
Sounds kind of harsh but that's what it boils down to.  Do what you like, but don't expect the same rewards as someone who's invested more effort.  To me it's a game and it's supposed to be fun, not a career.

8 hours trying to beat a level is a waste of time and I don't invest that much time in anything that doesn't put money in my wallet or add value to my investments.  To someone else, however, it may be a lifestyle and I wouldn't deprive them of their "hardcore" version of the gaming experience.

So does that mean I'm totally against cheating?  Not at all, but I make it conditional.  I'm at an age where I choose how I spend my time very carefully.  I like games to be challenging but not impossible.  I'd rather figure out a puzzle than a complex control sequence.  This happened when I played Portal.  I got through 3/4 of the game without a cheat, but I hit one room that I couldn't get past after hours of trying.  At that point I enabled the cheat mode and got through it.

I love Portal and the concept is infinitely entertaining but there are parts of it that come down to jumping around in a certain sequence that ruined the experience for me.  There was a great backstory and ambience to the game that didn't lend itself well to "Donkey Kong" gameplay.

So as far as single player gameplay goes, I'm in the anything-goes camp.  The picture changes quite a bit for multiplayer gameplay, however.
I will drop a game faster than a handful of scorpions if I experience cheating in multiplayer.  Multiplayer should be nothing but a level playing field.
I'm reminded of a few years back when I was playing Test Drive Unlimited.  It was a gorgeous game visually, but it had some faults, the most glaring being the bad car control.  It took me months to get used to how the controls worked, but it was a worthwhile effort.  So I decided to try the multiplayer component one day and soon found so many players using cheats that you almost couldn't find a match without someone cheating.  Atari (the publisher) tried in vain to prevent multiplayer cheating but was never successful.

Apparently I wasn't alone in my opinion, because a few months later there was nobody on the multiplayer servers.  Atari had to clear the leaderboards at least 3 times because of it.  When the average speed of the top 10 players is 99999, you know something's up.
You shouldn't run into much cheating in multiplayer these days (in newer games)*, since there's so much technology to prevent it and the multiplayer maps are generally hosted on servers under the developer's control.

Multiplayer is the big cash cow for game companies, and anyone who "games the game" threatens that.  If left unchecked it will eventually translate into a drop-off in new sales.  Lately the industry has moved toward what I like to call "Dynasty Games".

Dynasty games are games that follow a common theme like Need For Speed, Call of Duty or Diablo but offer a new experience every few years in the form of a sequel or new incarnation of the franchise.  It's not unlike those movie sequels that seem to come out every year.  I'm sure SAW 94 is due out any day now :)
If a game gets a reputation for cheaters, however, it can threaten the franchise and thus the cash cow.  So in the end it's all about the cash.  It's why World of Warcraft is a billion-dollar franchise while Test Drive Unlimited died within a year of its release, mostly because of (in my opinion) Atari's inability to deal effectively with cheating, among other issues.

*2015 Note:  I was a bit naive about cheating in 2011...lol

Thursday, February 17, 2011

Co-Op gameplay and video game promotion

I'm sick of it...

I've written (and said) before that I enjoy cooperative gameplay in PC games.  RPGs like Dungeon Siege and FPSs like Call of Duty have had support for cooperative multiplayer modes for years.  So why does it seem like most game developers don't include it?  I know Blizzard's made billions with World of Warcraft, but the model has been copied to death.  (If you haven't guessed, I don't like it or play it.)

Co-op does not mean joining up with your friends to go against other players online, either.  That's just multiplayer, not co-op.  Yes, real people should be more interesting to play against than AI, but most of the time they're not.  You get the Rambo types who think they're the next "Fatal1ty", players in "clans" who take themselves way too seriously, and noobs who are more "nooby" than me.  Any or all of which make me regret even bothering with the game in the first place.

I guess I'm the type of gamer who likes his gaming to be more like playing a part in a movie than thrashing around in a mosh pit.  I understand that there's a place for MMO-whatever games, but it's not for me.  I've played some truly exemplary titles that offered incredible single player gameplay and good multiplayer but had NO cooperative mode.  If you already have a good AI system, how hard is it to just allow your buddy to go through a campaign mission with you?!  It can't be that hard, because Battlefield 2 had no co-op till someone figured out how to join a single player game over a LAN.

You can't assume that because one game follows the formula of another it will have the same options, even if it's from the same developer.  Call of Duty: Black Ops is a good example of this.  You only get cooperative gameplay in a tacked-on zombie mode.  Treyarch was the developer and also did Call of Duty: World at War two years earlier.  World at War had zombie AND campaign co-op modes, so what happened with Black Ops?!  Was it so hard to put in a co-op campaign mode?  Call of Duty: Modern Warfare 2, developed by Infinity Ward, had an excellent co-op mode that paralleled the single player campaign and received widespread acclaim.  It's still a favorite of mine.

Then you get to the "Mystery Games".
Take the new game BREACH for example.  It's your run-of-the-mill Call of Duty clone with a neat twist on destructible objects.  So why did I have to spend 30 minutes trawling the Internet to find out whether it had a co-op mode or not?!  BTW, it doesn't have co-op OR single player.

So LET ME KNOW THAT UP FRONT! I don't like this bait and switch.  It's a sorry business model that relies on half truths and misdirection.  You wouldn't buy a car without knowing what features it had, or canned goods without labels, so what makes it OK to hide specifics about a game that's going to cost $20 to $60?

I have a solution: either put the information on your website or release a demo like in the old days of shareware titles.  Money's harder to come by these days, and I can't be the only one with a short fuse for marketing trickery.

Game Controllers

I'm going to try to shorten up my blog entries, since I have this nasty habit of writing the great American novel for every post and I know most gamers have a touch of ADD.  Or at least I do... :)


That said...


I've found there are 4 main types of game control these days with some crossover between PC and console gaming.


1. Keyboard
2. Game pad
3. Joystick
4. Motion sensing


Keyboards are the classic PC game controller, eclipsed only by the joystick (at least till DOOM came out). The keyboard is the choice of most gamers on PCs (not that I've seen a keyboard on a game system since ColecoVision).   A whole enthusiast market has grown up around ever cooler features like custom keys, LCD displays and backlighting so you can see the keys in the dark (most PC gamers aren't touch typists).


Gamepads showed up with the first modern home game systems like the Nintendo NES, with controls similar to the arcade shooters of the 80's.  Gamepads were a successor to the joystick as games developed more complex and numerous controls than a simple stick could feasibly handle.  Before gamepads got popular there were (and still are) joysticks with upwards of a dozen special buttons to try to accommodate newer games with more control options.

You can get gamepads that work with PCs, but you won't find much use for them unless you're a console gamer playing a multiplatform game on both PC and console. Some people swear by gamepads; I swear at them, mostly because they're always too small for my big hands and they don't feel natural to me.  That's just me, however, and your mileage may vary.


Joysticks are the undisputed classic multiplatform game controller, dating all the way back to the first Atari VCS systems.  Back when most games were based on 2 dimensions and simple control options, these were the ideal choice, offering the most accurate control available. There were game "paddle" and "trackball" controllers, but they were just variations on the joystick targeted at specific games like Pac-Man, Centipede or Missile Command. (Am I dating myself here?)
Joysticks are great, but they hold secondary status these days because they just don't work as well for FPS or RPG games, and are largely relegated to driving titles and flight simulations.

I'd have to add steering wheel controllers to this group as well, since most simply offer an alternate physical controller that nonetheless sends the same signals as a joystick.  I personally prefer wheel and pedal setups for driving games since they're more natural and allow for subtler movement than a joystick can offer.


Motion sensing got popular with the Nintendo Wii and is probably responsible for more broken TV screens than any other gaming platform :)  The whole idea behind motion controllers is that your body movements control the action on the screen instead of a key press or pressure on a control.  

The Wii controller is like a wireless joystick from the standpoint of having to actually hold on to something to control your game, but that's where the similarity ends.  It's got internal accelerometers, Bluetooth and infrared sensing that transmit signals created by your movements to the Wii.  This video explains it in more depth: Wii Remote.  The new Sony PlayStation Move is a similar system to the Wii remote, utilizing a handheld controller setup.


The newest popular motion controller is Microsoft's Kinect.  Kinect doesn't use a separate controller; rather, it tracks body movements using an infrared-based motion sensing system, kind of like the motion detectors in an alarm system but a lot more sensitive, with a lot more processing going on to track movement.

If you were to view a darkened room with infrared goggles while a Kinect was active, you'd see most of the room in front of it painted with thousands of small dots.

Motion sensing may be the way of the future, so long as it's sensitive enough to pick up the subtleties of current conventional controllers.  For most gaming this probably won't be a problem, but for those used to complex keystrokes in their FPS games it may take some adjustment and patience.

Me, I'm still a joystick and keyboard guy.  I play FPS games with keyboard and mouse, but I use a trackball mouse, which I believe gives me a slight edge in FPS titles since the cursor motion is more precise and sensitive with less wasted motion on my part.  However, someone playing the same title with a motion controller has a more natural relationship to game control, since there's less translation in the way you think about moving around in a game.  For example, in most FPS games, to move right I'm probably going to have to hit the "D" key on the keyboard, and the "W" key to move forward.


The same movement sequence on a motion controller can literally be a hand gesture to the right and then to the front. It's also possible that with adequate tracking of player movement you could easily surpass a keyboard, simply because a keyboard is not that different from the old joystick controllers(1) when used for movement in a game.  A key is a simple button that's either on or off, without many subtleties other than how fast you press it.  A motion controller that's sensitive enough could pick up orders of magnitude more information than ON and OFF while offering more precise control.
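A toy sketch of the difference, with hypothetical values standing in for what real drivers would report: a key is sampled as on or off, while a motion axis arrives as a continuous value:

```python
# Keyboard movement: each key is a digital switch, on or off.
def move_from_keys(d_pressed, w_pressed, speed=1.0):
    dx = speed if d_pressed else 0.0   # strafe right: full speed or nothing
    dz = speed if w_pressed else 0.0   # forward: full speed or nothing
    return dx, dz

# Motion controller: each axis is continuous, from -1.0 to 1.0.
def move_from_motion(tilt_right, tilt_forward, speed=1.0):
    return tilt_right * speed, tilt_forward * speed   # graded response

print(move_from_keys(True, True))     # (1.0, 1.0): all or nothing
print(move_from_motion(0.15, 0.8))    # (0.15, 0.8): slight lean, slight move
```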

I've already seen applications of the Kinect being used with PCs for desktop control and even as a stand-in for a mouse.  There's a music video whose visuals were made almost entirely of sensor data captured from the Kinect sensor.   It reminded me of those pop videos from the 80's that had strange video effects.


Kinect Music Video

So there's no doubt motion sensing is a popular technology.  The idea has been popular for years, with TV shows like CSI: Miami using motion-sensitive control to manipulate screen images.  It was the stuff of fantasy 5 years ago, but it appears the technology is mature enough to be almost mainstream now.


I'm hoping for a gaming future with motion sensing and holographic projection for display!
1 Old joysticks had 4 switch contacts activated by a plastic tab connected to the actual stick portion of the joystick.  Diagonal movement was achieved by combining the input from two of those contacts.
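A little sketch of what that footnote describes, with the four contacts represented as bits; diagonals simply read two contacts at once (my own illustration, not actual hardware code):

```python
# Each direction is one switch contact; a diagonal is two contacts closed.
DIRECTIONS = {
    (1, 0, 0, 0): "up",
    (0, 1, 0, 0): "down",
    (0, 0, 1, 0): "left",
    (0, 0, 0, 1): "right",
    (1, 0, 1, 0): "up-left",     # up + left contacts together
    (1, 0, 0, 1): "up-right",
    (0, 1, 1, 0): "down-left",
    (0, 1, 0, 1): "down-right",
    (0, 0, 0, 0): "center",
}

def read_stick(up, down, left, right):
    return DIRECTIONS.get((up, down, left, right), "invalid")

print(read_stick(1, 0, 0, 1))   # "up-right": two contacts added together
```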

Sunday, February 13, 2011

Re-playability

Had another great gaming night.  It was great because my friend and I finally completed a particularly difficult co-op mission in Call of Duty: Modern Warfare 2 called "Snatch and Grab".

This mission was difficult because you never really know what's coming at you next.  The map is set in an airplane graveyard, affording more than ample cover for you and your opponents.  Add to this the most difficult opponent, known as the Juggernaut.  Juggernauts are enemy soldiers outfitted in a suit resistant to explosives and small arms fire that can literally absorb rockets.  They look like a cross between the Michelin Man and an astronaut.

They annoy me to no end...

Mostly because you usually have to go up against them with weapons that aren't up to the task.  The analogy of bringing a knife to a gun fight comes to mind.
We had been trying to beat this map for months and always got close, but we never seemed to be able to close the deal, because one of these bulging annoyances always showed up just when we thought the worst was over.
For months we'd gotten within sight of the endgame but always got cut down by one of the bulbous baddies, leaving us pounding our fists on the desk and falling back on weak praise about lesser victories during the exercise.  Nothing but victory over this blasted map would bring real satisfaction, however.  We were determined to win.

Last Friday night we did it!  My friend suggested an alternate tactic and I enthusiastically agreed.  We employed it and found it was working quite well.  We got closer to victory than ever before but still lost.  Undeterred, we tried again; once more unto the breach, as they say...

WE WON!  Hooray for us!  That moment was so satisfying and worth the effort.  Right there is the very definition of replayability.  This game kept beating us but kept our interest, so we'd keep trying.  Every loss to that point still had great moments and kept us coming back for more instead of leaving us disgruntled and murmuring obscenities.

I've played games that I've literally given up on after less than an hour.  It's why I like demos.  If it's something I'll get a few months out of without breaking any furniture, then I'll make the purchase.  If it's just an endless string of boring puzzles or weird control sequences, I'm done with it in 5 minutes.  I want immersion; I want to feel like I'm participating in something.  That's what games are for.  They're a window into another perspective, or even another reality, that would be otherwise impossible to experience.  In the process you may even learn something.

Now, to some, what I'm about to say may sound hollow, but bear with me; there's a point here...
Since I've been playing games like Call of Duty and Battlefield 1942, I've gained more appreciation for those who had to experience the real thing.  In games like those, the concept of war isn't as abstract as just blowing things up for points or map control.  You have to deal with overwhelming numbers of opposition, badly functioning weapons, and all the challenges of less than accommodating locations such as the jungles of Vietnam or the shooting gallery of Omaha Beach.  Trying, and often failing, to complete these virtual missions gave me more of a sense of the hopelessness, and ultimately the determination to succeed.

There's a lesson there, more than just the old "if something's worth having, it's worth working for".  To win at these games you have to get a sense of how the real thing works.  Look at the credits and you'll see plenty of military advisers and scholarly types who lent their experiences to the creation of the game.

It's that attention to detail and thoughtfulness of design that makes these games valuable beyond just the initial pass through the single player campaigns.  It's kept my friend and me playing games like these long after the buzz over them has subsided.  It doesn't just have to be FPS games, either. We often play FlatOut 2 in the Stunt mode tournament and have great fun with it.  That game is about 5 years old now, but we still play it on a regular basis because it still holds our interest.  In some games you can figure out the formula for success, ignore the rest, and easily win every time.  That doesn't apply in FlatOut 2, where repeatability is offset by factors beyond your control.  There's a randomness to every game situation that keeps things interesting.  You won't win by driving a certain line on the track or repeating a controller sequence.  My friend is actually better at this game than I am but can still be surprised when a sure win is suddenly snatched away by a variable that was unaccounted for.

There's the key to a good game: replayability.  Long after the shrink wrap has been tossed away, a good game will keep our interest and offer something new or unexpected.  This is why I'm not a fan of pure multiplayer online games.  After a while they get stale, and aside from maps and game resources there's little that keeps them interesting.  Games like that eventually become training grounds for groups of players who organize themselves into "clans", making the underlying game less appealing to new players.  There's an almost mechanical vibe to such gaming that ruins the experience.  When gaming is just about who is best on their keyboard or game controller, I usually take a pass.  If I'm going to invest my time in something, it had better test more than just my reflexes.

Friday, February 11, 2011

Online Gaming - Specifically Battlefield Bad Company 2

So I finished Call of Duty: Black Ops a few weeks back.  Decent game, very playable, and my buddies and I have a great time with the Zombie maps every Saturday.  Finished Battlefield: Bad Company 2 about a week later and now I'm exploring the multiplayer gameplay.
Well, typical of the Battlefield franchise, the controls are overly complicated, but you kind of get used to them.  The single player was amusing with all the chit-chat going on behind me in every mission.  After I finished the game I was going through the Battlefield forums and saw one post that said, "You owe it to yourself to try the multiplayer."

Ok, so I tried it.  It's kind of funny, because I've been playing the Battlefield games since Battlefield 1942 and it's almost always the same experience.  Of course it looks better than it did back then, but the gameplay is mostly the same and entirely dependent on the server you join.  Yes, you have the spawn killers, superhero Rambo types, and guys who maintain multiple accounts just so they can join a "noob" server and wax everyone who's still trying to figure out what keystroke switches to the next weapon.  Ok, whatever...

So I went into the multiplayer not expecting much.  Battlefield has had helicopters you can fly since Battlefield 2, but they seem a bit easier in Bad Company 2, even though I crashed the first time I got in one.  Actually, the best helicopter control I ever had in the Battlefield franchise was in a custom mod for Battlefield 1942 called "DC Final" that was set primarily in desert maps from the 1942 game.  In fact, gameplay and control were so good in it that I went out and bought Battlefield 2 hoping for the same experience with better graphics, since most of the development team from "DC Final" was brought in for Battlefield 2 (DICE).  I don't know what happened between DC Final and Battlefield 2, but I was disappointed.  In fact it was so bad that I didn't bother with another Battlefield game till Bad Company 2.

I picked up Bad Company 2 for the single player and because Steam had a sale.  Note that there's nothing about online in that last sentence.  I knew there was some hope when, glancing at the control setup screen, I noticed that Bad Company 2 can use a game controller for flight control.  I personally think flying machines should be controlled with joysticks, not keyboards or gamepads.  At least DICE (the developer) finally figured this out, and that's a good thing.

Bad Company 2's multiplayer gameplay must be better than my previous experiences: I've actually invested some time in it and almost enjoy it, which is rare for me.  See, I usually don't enjoy online multiplayer because of the previously mentioned superheroes and spawn killers.  That's not tolerated as much in the multiplayer component of Bad Company 2.  I usually stick to co-op play against AI opponents, and I still find it preferable to straight PvP.  In fact I was incensed when I found out that Call of Duty: Black Ops had no option to undertake missions in co-op mode.  Modern Warfare 2 and World at War allowed it, but not Black Ops.  Sure, the zombie thing is fun, but it would have been more fun to have the co-op option.  My friends and I still enjoy Call of Duty: Modern Warfare 2 in co-op mode, especially the support missions like "Big Brother", where you or your buddy can take control of the gun on an attack chopper and try to support the other guy, who's on the ground facing hordes of baddies.

Now remember, I'm no kid.  My gaming experiences go beyond the game itself; they're actually social experiences with other real live people in the same room.  I can't count how many times my buddy on the other side of the office has reminded me that I have a perfectly good hot cup of coffee that's quickly turning to the iced variety because I'm so engrossed in the game.

I sometimes think game developers believe the only people buying their games are 15-year-olds with no friends.  Reality check, guys: the population is getting older and we don't get our thrills on the playground anymore.  Sometimes it's fun to dive into the masses, but the experience isn't enough to justify the 30 to 60 dollar price of admission.  I mean, how hard can it BE!  Most good FPS titles have a decent single player component with ever more challenging AI.  So how hard is it to just allow your buddy to do a mission with you?

Battlefield 2 had the same issue till someone figured out a hack to allow another player to join in a single player mission.
Anyway, I saw some improvement in online multiplayer, but I still want to see more co-op gameplay.  I just hope someone comes up with a Bad Company 2 mod that allows co-op, and maybe gameplay on some of the multiplayer-only maps!

Observations from a 40 something video gamer

Maybe I'm too old for this stuff...

Maybe not.  See, I like to play video games.

Shouldn't be that much of a surprise for someone my age, considering I grew up with them.  I'm old enough to remember (and have probably played) all the classics.  So everything old is new again.  Now there's a subculture, and even subscription services, to play games I still have in a box somewhere in the garage.

Now, before you run off and start assuming I'm one of those strange people who live in their mother's basement saving their pennies to attend the next QuakeCon in an outfit modeled after their favorite character, I invite you to STOP RIGHT THERE!  I don't even have a copy of WoW, nor do I want one.  It's not my thing.  My general opinion of MMORPGs is that they're for folks who need something to do when they tire of splattering their life on their Facebook page for the day.

I'm a normal guy.  I do computer consulting for a living and often wish I'd chosen something else for a career.  I like working on my old El Camino more than removing the latest rootkit that someone downloaded.  I'd like to think I'm reasonably intelligent and sociable, and I get a bit stir crazy if I don't get out amongst the populace on a regular basis.  So I'd like to think I'm fairly average, albeit a bit off-center from the norm.

So what about the gaming?  Well, I'm not an early adopter... of anything.  I don't buy the first model year of a new car, and by extension I don't buy games when they first come out.  For one thing, I think it's ridiculous to pay $60 or more for a product I don't even own.  You never own a game; you're "licensed" to use it, which means a perpetual rental that some faceless corporation can pull the plug on at any time.  So I like to wait till the hype simmers down and pick up something that seems interesting if Steam happens to have a sale on it.  Don't believe it?  Try to play Need For Speed: Carbon online sometime.

Anyway, that's how I picked up the entire Half-Life 2 collection.  I don't care that I wasn't the first to see all the cut scenes.  I like the fact that by the time I got it, I had a boundless treasure of walkthroughs, cheats and YouTube videos to help me over the rough parts.  Hey, I appreciate the work that went into the game and the long hours it took the developers to figure out how to confound me.  In the end, however, I figure I paid for the right to play this game and invest my time in it.  If I get stuck, I need to get past it and get on with it.  Nothing is more irritating than having to give up in the middle of a game because you can't get past one stupid level.  I paid full price (meaning whatever Steam charged me), so I expect to see the ending credits at some point before I ring in the new year.  So I do what I must.

For me, time is precious and valuable.  Look, I want to be entertained and immersed in someone's vision of an impossible reality.  I'm not looking to make this a career.  Now, if I were twenty years younger and indeed lived in my mom's basement, maybe I'd be a bit more tolerant, but as it is... I'm not.  I enjoy a good driving simulation like GRID or Need For Speed: Shift as well as the more arcade-leaning NFS: Most Wanted or FlatOut 2.  I also enjoy a good first person shooter, so long as I don't have to be 14 to last more than 5 seconds online.

I look at gaming as a semi-social event, not unlike the typical poker night, which means I prefer to be in the company of other people when I'm doing it.  It's important to me to have a good selection of titles with long replay value.  It's also critical that there's a good co-op component.  I don't believe that online gaming should mean being thrown in with the masses.  I enjoy going up against AI opponents with a few friends.  I also don't appreciate confusing controls, or being subject to "Games for Windows" consoles that ruin and frequently block the gaming experience.  Ask anyone who's played DiRT 2 on Windows XP 64 and you'll see what I mean.

So that's it, a basic profile of a more "mature" gamer.
I've got lots of observations, and I've finally decided to start dropping them on the world.


Let Me Be the Bad Guy...

So I was at a friend's house recently during our regular Saturday gaming night (think poker night with keyboards and LCD panels), and during one of the breaks he was telling me about his progress through Dragon Age.  Dragon Age is a role-playing game (RPG) in the vein of Elder Scrolls: Oblivion, Dungeon Siege and the like.  Anyway, he was lamenting the apparent one-sided view of game development in such titles, i.e., always being the good guy.  Now don't get the wrong idea; my friend doesn't have goat skulls hanging above his mantle or run around in black robes chanting Bible verses backwards.  His observation shines a light on the undeniable fact that RPGs follow a preset formula that hasn't changed since the first incarnation of Zork on a DEC mainframe.

It seems all these games want to cast you as the squeaky clean hero, or at best a Clint Eastwood-esque anti-hero.   Regardless of the spin, you're the good guy out to foil the nefarious plans of the stereotypical bad guy.
Some RPGs allow more depth and character development than others, such as Dragon Age's inclusion of a kind of relationship engine that influences your game experience as a direct consequence of your interactions with other characters.  Still, the balance is tipped in favor of being the bearer of all things nice.

There have been RPGs that let you take the good or evil path, but it's just shades of grey: Dungeons & Dragons Online, Fallout, etc.  The question is, where is the real character development?  Pick the good side and you're on some great quest to save whatever.  Pick evil and your goal is to screw up the previously mentioned good guy quest.  Yay...  Is there no more depth to being the bad guy than egotistical monologues and a bad attitude?

How 'bout an RPG that has nothing to do with the aforementioned "good guy" quest?  Architecting the downfall of civilization takes effort!  Mostly because of those pesky good guys running around.  To be honest, I cannot think of a single RPG that has developed the dark side of gameplay beyond just being an obstacle to the hero character.  Even if you have the choice to be an "evil" hero, you are still just playing a slightly modified version of the hero's quest.  It's the same basic experience, except your character gets to wear cooler clothes...

Ever notice how cool the bad guy's digs are?  They always have complex dungeons, massive castles or impenetrable fortresses.  How'd they get all that?  What adventures did they undertake to acquire such bounty?  Chances are you'll never find out.  Oh yeah, maybe some 20-second cutscene here and there, but where's the fun in that?

Perhaps there's some fear of societal backlash against becoming involved in a game that demands large amounts of time spent thinking about how to do bad things to nice people.  I remember a time when the original Dungeons & Dragons RPG (with dice, not computers) was accused of being affiliated with Satanism and all things unsavory.  Never mind that most of the character types were variants drawn from pagan, Roman and Greek mythology.  Show one big red guy with horns and everyone is suddenly at risk of losing their mortal soul, I guess.

I'd hope that as a society we were beyond such judgements, but in a world where video games have been blamed for everything from petty theft to murder it's understandable why this spin on the typical RPG is less common.  Of course, I'd argue that society can't grow without exploring ALL sides of the human condition, and that includes the undeniable truth that deep inside all of us is the desire to explore the darker side of our nature and beat up on the nice guy.

A different but similar type of gameplay is the first person shooter, or FPS.   The big difference here is of course the point of view while playing.  Instead of what I call the "helicopter view", you play as though you are on the playing field.  That big gun is YOUR big gun in YOUR hands.  No "god games" here.  It's mano a mano, soldier!

Most of these games involve some variant of armed conflict, ranging from the stone age to some far-flung future with anti-grav boots.  This type of game is primarily concerned with action, and less about inventory or character interaction beyond shooting or being shot.  Still, some of the same biases apply.   In almost all cases you start out as the good guy going after the bad guy with whatever weapons you can procure.  Perhaps you're an Allied GI in World War 2 pursuing a German soldier across a battlefield.  Maybe you have the option to play as the German soldier.  Other than a change of uniform and weapons, it's still the same experience either way.

I have noticed a recent (albeit small) change in this type of game, however.  Specifically, I am referencing the release of Dead Space 2.  In its multiplayer mode you can actually play as the evil bad guy(s).  Now granted, it's not the same amount of commitment as an RPG, but it's the first time I can remember where you can actually play as something you would normally be mowing down with your super turbo minigun.  Yes, I know that in most FPS games you could always play one side or the other, but from the point of view of an FPS, sides don't involve morality, just a change of uniform.  Even in World War 2 themed games you may play as a German soldier, but there will never be any perks or achievements for genocide.  In my view that's as it should be.  Exploration of that dark a psyche is better left out of the realm of a game and placed squarely in the office of your psychiatrist.

What I'm getting at is that you can now BE the zombie death machine instead of just mowing down wave after wave of staggering AI baddies with the intelligence of a rotten banana.  So what's the big deal?  Well, if you ARE the zombie, you have to be at least as much of a threat as the good guy.  In fact, I remember another recent conversation after playing an FPS game my friend and I enjoy called Killing Floor.  It's a zombie shooter that doesn't try to be anything more than that, which in my view makes it far superior to the Left 4 Dead franchise, which I personally find boring.  There's only one thing that could make Killing Floor even better: the ability to BECOME a zombie.  Say, for instance, you die in a cooperative multiplayer mission.  Your character is dead from the zombie attack.  Well, what if you actually became a zombie, with all the abilities and weapons you had when you were so hastily cut down?  Now your former teammate has to deal with an equal instead of the aforementioned waves of AI.  That would add an entirely new dimension to the game.  You go from good guy to bad guy, which could be very entertaining.

I'm not advocating some kind of virtual evil academy in the guise of a video game.  I'm just looking for a little creativity in game design.

Eyefinity, Is It Worth It?

A friend of mine recently decided to take the plunge and try out an Eyefinity multiple monitor setup on his Intel Core i7 950 based gaming computer.  He has an ATI Radeon 6950 and switched from a Dell 24" LCD to three Dell 21" IPS LCD panels, which is pretty much the minimum for Eyefinity's eye candy.

My initial impression upon seeing the 3 displays on his desk was one of surprise.  I knew he was toying with the idea, but I wasn't sure if he was going to really take the plunge and do it.  A great deal on the displays was the primary catalyst for him to pull the trigger on this upgrade.

My surprise sprang from the fact that my friend never seemed that excited about multiple display setups.  I've been a multiple display aficionado for years and find it difficult to work with only one monitor.  He was always more interested in picture quality and found LCD panels deficient in performance.  In fact, until recently he was running a 21" CRT, which he used for years because he just didn't feel an LCD could match its performance.  He's changed his opinion somewhat since LCDs have improved, and he had a good experience with his 24" Dell.

The 3 displays my friend now owns are arranged on his desk in a semi-circle, and they just fit, with a bit of the monitor stands on the end units hanging off the edges.  The aesthetics of the Eyefinity configuration reminded me of those 80's video arcade racing games that you'd actually climb into, like the Atari TX-1 or Sega's OutRun.

So what's the first game he shows me?  Of course it had to be a racing game, and the best choice in this case was EA's Need for Speed: Shift.  It's probably one of the better racing games to test an Eyefinity setup, since it's new enough to be aware of the technology but old enough not to require an overly expensive gaming rig to run higher quality graphics settings.

I have to say this: as far as gaming goes, Eyefinity is literally made for racing games.  My friend sat me down and let me run a few races in both Shift and Codemasters' Race Driver: GRID.  While my attention was still focused on the center monitor during gameplay, the side monitors provided a more immersive experience.

Eyefinity does more than just scale up the size of the viewable display area, as would be the case with simply using a larger monitor.  Instead, games that support it render a wider field of view, showing you parts of the scene a single display would crop off.  For example, while playing a driving game I had the sensation of not only seeing the racetrack directly in front of me but also being aware of what was happening in the lanes next to me.  This can be a real advantage, especially if you've ever been frustrated by being beaten by an opponent that seemed to come out of nowhere.  For FPS games like Call of Duty, it can help in those annoying online matches where someone always sneaks up on you and ruins your day.
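For the curious, this is the usual "Hor+" math a display-aware game applies: keep the vertical field of view fixed and widen the horizontal one to match the wider aspect ratio. The numbers are illustrative, not taken from any specific title:

```python
import math

def horizontal_fov(vertical_fov_deg, width, height):
    """Horizontal FOV for a given vertical FOV and display aspect ratio."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * width / height))

# Same 60-degree vertical FOV; the triple-wide setup simply sees more.
print(round(horizontal_fov(60, 1920, 1080), 1))  # single 16:9: ~91.5
print(round(horizontal_fov(60, 5760, 1080), 1))  # Eyefinity:   ~144.0
```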

Come to think of it, with a USB steering wheel and pedals this kind of setup would probably make a great training simulator for new drivers.  I'd imagine the same would be true if you were using a flight simulator to help study for your pilot's license.

All was not perfect, however.  My friend had tested a number of games and found issues with titles that weren't Eyefinity-aware (as I like to call it).  One of the most glaring was Killing Floor, which had no setting to allow the larger display resolutions Eyefinity uses.  My friend found an article on the Internet that involved changing some configuration files to allow the higher resolutions, but the result was less than satisfying: the image was vertically compressed and the top and bottom were cut off, making gameplay unsatisfactory.

Other games such as Company of Heroes, Dragon Age, and Modern Warfare 2 exhibited similar issues, although not to the degree of Killing Floor.  Unreal Tournament 3 exhibited the same image compression issues as Killing Floor.  Killing Floor uses the same Unreal engine as Unreal Tournament 3, which suggests that the engine, and thus anything built on it, will have similar problems with Eyefinity.

I should mention that for all of this gameplay we had the monitors in landscape orientation, which is the default for most monitor setups.  Eyefinity can be configured to use the monitors in portrait orientation as well, which may help with the display issues we had in some of the games tested.  The stretched and cropped visuals described above could be minimized or eliminated thanks to the less severe aspect ratio imposed on the image.  In landscape mode, resolutions of 5760 wide by 1080 tall across three 1080p panels aren't uncommon.  Rotating the displays 90 degrees changes the resolution to 3240 wide by 1920 tall.  This is something my friend will be testing out.
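The arithmetic behind those numbers, assuming three 1920x1080 panels:

```python
# Combined resolution of three 1920x1080 panels, landscape vs. portrait.
# Portrait's 3240x1920 (about 1.69:1) is much closer to a single 16:9
# screen (1.78:1) than landscape's extreme 5760x1080 (5.33:1).

panel_w, panel_h, count = 1920, 1080, 3

landscape = (panel_w * count, panel_h)   # panels side by side
portrait = (panel_h * count, panel_w)    # panels rotated 90 degrees

for name, (w, h) in (("landscape", landscape), ("portrait", portrait)):
    print(f"{name}: {w}x{h}, aspect {w / h:.2f}:1")
```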

Other issues that cropped up were hardware related.  My friend told me that on initial setup he could not get all three monitors to show up in the Windows display applet, and if all the monitors don't show up there, Eyefinity isn't going to work.  He found the fix was related to the connections for the additional 2 monitors.  He wanted the center monitor to use the DVI connection with the flanking monitors connected via DisplayPort, but this configuration confused Windows.  Moving the center monitor to a DisplayPort connection corrected the problem.  This may have something to do with the way Windows enumerates displays.  I've found that in my own multi-monitor setups, the monitor with the digital connection always defaults to the primary position when an analog connection is used for the other monitor.  Perhaps the DisplayPort connection takes preference over DVI.


Another strange issue happened when he powered off his system.  My friend has LED power switches on his hard drives, and instead of the LEDs going out when he powers off the system, they stay on.  The motherboard has a status LED that also stays on with the power off.  The only way to get the lights to go out is to either unplug the PC from the wall or remove the DisplayPort connectors from the video card.  I can't say if this is a problem with Eyefinity, but the issue didn't show up with the old single monitor setup.
So, is it worth it?  With the prices of higher end ATI video cards and LCD panels at reasonable levels, it's hard not to at least entertain the idea.  If you're an avid gamer who can take advantage of the benefits of Eyefinity, then there's probably no question that Eyefinity is in your future.  Please take note of my use of the word "reasonable", however.  Reasonable means spending $600 on three displays and at least another $300 for an ATI card that can drive games at higher resolutions with better image quality.

For the average user who just wants some extra screen real estate, you can do just fine for a third of that money.  As a gamer I'd definitely entertain the idea, but it is a luxury, and it's not without some minor issues.  There's nothing about it that makes it a necessity for the average PC user or even the average gamer.  Still, multiple monitors are more commonplace on desktops now than they used to be.  It's entirely possible that Eyefinity's extension of multiple monitor technology is simply the next step.
Even with technologies like Eyefinity and 3D Vision, it's my view that display technology is starting to hit the wall.  You may have a larger display, or more of them, but your eyes are still forced to stare at a screen of some type.  Display technology will need to move past evolution and on to revolution to justify the price premiums.  Let's get on to holographic gaming and leave the monitors to the more mundane tasks.

UPDATE:
My friend did some checking and found his power feedback issue may have been related to the DisplayPort cables.  He switched them out and the problem disappeared.  Ultimately, though, he decided to send the monitors back and instead ordered a Dell 30" LCD.  It seems Eyefinity works out better on larger monitors, and for him the costs involved outweighed the benefits.


The Upgrade Mill


Ok, I admit it...

I too have an affinity for shiny objects.  It doesn't have to be computer related, either.  In fact, today I saw somebody driving a brand new Chevy Camaro SS with the temporary registration still attached to the license plate frame.  Now, I really like and want to get a new Camaro in the near future, specifically an SS, because I've had way too many wannabe V6's in my past.  So I should have been excited, but instead all I could think was... yuk!

See, this Camaro was white, and I mean boring refrigerator white that belongs on a Celica or something.  It had stripes, but they were some off-grey color that was almost invisible.  It didn't matter that drivetrain-wise it was outfitted exactly as I would order one; it just didn't -pop- for me.  It should've been orange or red or blue or black, not white.  I mean, who buys an SS Camaro (with a 5 to 10K price premium) just to blend in with the rest of rush hour!  I own a red Mustang GT because I like V8's, pony car styling, and got tired of almost being run over by people who didn't see me.

So what does this have to do with upgrades? Well, the fact that I like Camaros and have owned 4 of them doesn't negate the fact that part of my affection is driven by hype.  I mean, how many Camaros, Mustangs and Challengers would sell if there were no marketing hype?   I firmly believe that at least 50% of PC hardware upgrades are driven by marketing hype aimed squarely at our ingrained affinity for shiny objects.  We get lulled into believing that the flashier the upgrade, the better the product, even if the payoff is something less than the promise.  Case in point: the most recent round of video cards.

The video card wars have two primary players: Nvidia and AMD/ATI.  For years these two goliaths of pixel pushing have been engaged in a war of hype eclipsed only by that of the motherboard platforms they plug into.  The crown passes between them, but the hype never stops.  If one doesn't have the best offering, the hype will more than make up for it.  It's a flurry of statistics, features and benchmarks designed to blur the fact that for 90% of consumers there will be no noticeable difference between one and the other.


I frequent sites like PC Perspective, HardOCP and Tom's Hardware, as well as read publications like CPU and Maximum PC.  If you happen to have a preference for one vendor over another, and you don't care about raw facts that contradict the latest competitor's offering, you will at some point be labeled a "fanboy".  The fanboy phenomenon is really nothing more than an extension of the marketing hype designed to capture your affinity for... wait for it... yes!  The affinity for shiny objects.

You can apply this formula to any product, but I think the computer industry does it best.  I can think of few industries whose sales are driven almost completely by the desire to own a shinier object than your neighbor's, with little if any basis in fact for that contention.  Yes, in the case of video cards there was a time when ATI had a marked deficit in performance compared to Nvidia's offerings (and vice versa), but that didn't stop the hype or prevent sales of current products in the channel, thanks to a halo effect.  In the absence of a better product, companies turn to promises of a superior product "just around the corner" that may or may not materialize.

You'll see engineering mockups hit all the popular review sites with great fanfare.  Even if the sample is little more than a piece of wood with a spiffy paint job and a few flashing LEDs, a flurry of imaginary statistics will always be close behind.  Sites will write pages of supposition based on little more than masterfully placed rumor, undoubtedly created by marketing departments.  This is the "vaporware" phenomenon.  It's great for sales forecasts but lousy for the consumer, who often ends up with a less capable product or no product at all.

For me the most irritating example has been the crop of video cards released over the last 3 years.  The hype machine has been in full swing at both ATI and Nvidia, and product cycles are literally shorter than 6 months before a product is reduced to "legacy" status.  That's fine if there's real innovation going on, but I've seen outrageous hype over a 5% performance premium versus a competitor, or even versus the preceding product from the same vendor.  Never mind that for the supposed "upgrade" in performance you end up with a higher power requirement and more heat inside your PC case.

Allow me to make a comparison to another popular product, namely cars.

I love American cars.  I'm a stalwart against imports for a variety of reasons, but I'll be the first to admit that there was a time when American cars by and large were no better than Yugos in terms of quality and workmanship.  Still, in the 60's and 70's Americans shunned imports.  It took an oil crisis (real or manufactured) to turn the American public to vehicles from other shores that at least "seemed" to be of better quality and cost less to operate.

The fact is that many of them weren't, and I can't think of too many '73 Datsuns or Toyotas still running around, but it began a new halo effect.  Even in the face of evidence to the contrary, there are many people who will claim the supremacy of the imported car over the domestic product.  This is nothing more than a very well-tuned marketing construct designed to ensure a loyal consumer.

In the case of cars, the 70's brought not only the invasion of the import but the mastering of the concept of churn.  Yes, there was always strong marketing to get the newest model, but most people up to that point would buy a car and expect to keep it at least 5 years.   It wasn't uncommon to have "your father's Oldsmobile" passed down to you as your first car after many years of service to ol' dad.  With the advent of imports we shortened that cycle.  Engineered obsolescence and relentless marketing were designed to make the consumer feel bad about holding on to that inferior older model, even if it was perfectly serviceable.  Think about how many cars from the first half of the 20th century survived to this day in good enough shape to be restored and returned to service.  Not so with a 70's or 80's car that was rusting to the ground after 3 years of Michigan winters.


This is the exact same formula applied to PC upgrades.  Even if your current hardware is performing perfectly, the never-ending hype machine will work on you until you rationalize a purchase by fitting your perceived need into the hype.  I've done it and regretted it.  That's why benchmarks are nice: they can at least give you something to go by, so long as you're the one doing the comparisons.  I have literally built a brand new gaming rig from the ground up because I believed the newer platform was better.  In a limited way it was, by about 5%.  It cost me much more than 5% to build that new rig, however, and I felt a bit cheated.  I learned my lesson the hard way.
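A toy version of that lesson, with made-up numbers, just to show the kind of comparison a benchmark lets you do:

```python
# Made-up benchmark numbers, just to illustrate weighing gain against cost.

current_fps = 100.0     # frames per second on the existing rig
new_fps = 105.0         # frames per second on the shiny new platform
upgrade_cost = 900.0    # dollars to build the new rig

perf_gain = (new_fps - current_fps) / current_fps * 100   # 5.0%
cost_per_percent = upgrade_cost / perf_gain               # $180 per 1%

print(f"Performance gain: {perf_gain:.1f}%")
print(f"Cost per 1% of performance gained: ${cost_per_percent:.0f}")
```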

I'm not saying that you won't see remarkable gains from moving from a Pentium 4 to a Core i7; you will, and I have.  The difference is that I don't upgrade because of glossy magazine ads and cool Flash-laden websites.  I like to upgrade when there are hard facts to justify the cost and trouble.  It is only when I start to hit the performance wall that I start looking for upgrades.  For example, I've completely ignored the latest crop of Nvidia cards, starting with the 480 series and continuing with the current 580 series, because for the purported gains the cost of entry makes no sense.  If I'm playing games at less than 1920x1200, then what benefit do I get from the extra power and heat tax these cards exact?  My current gaming rig is a Core i7 940 with 2 Nvidia GTX 285's, and I've yet to tax them with any game I currently run on them.  Oh, but to listen to the hype, I'm supposed to run out and get a new P67/H67 Sandy Bridge motherboard and CPU (and we know how well that's working out for early adopters), get a pair of Nvidia 580's or ATI 6970's, and all will be right with the world.

I DON'T THINK SO.  For one thing, the latest and greatest Intel offerings are called "mainstream" products, which means that by the time the "enthusiast" products come out in 6 months, your shiny new motherboard (which you won't get back from RMA till April) will be old and somehow not as good as the "new" stuff you'll pay a premium for.  And even setting aside the early adopter growing pains, you'll likely be among the 90% of consumers who will see virtually no benefit in their gaming experience from the investment, should you give in to the hype.

Ahh, churn is a fact of life in the marketplace, but that doesn't mean you have to be a slave to it!