Wednesday, July 6, 2011

The Upgrade Cycle Looms...

I'm a fairly eclectic gamer -- I have never considered myself to be exclusively a PC gamer or a console gamer, and I appreciate the strengths of both.  For purposes of this discussion, I'm going to ignore many differentiating factors, like controllers, networking support, moddability (is that a word?) and software libraries.  I'm mostly interested in how the respective PC and console hardware life cycles operate.

Consoles are meant to be affordable and simple to use -- every game published for a specific machine should run on every version of the console available, with no fundamental performance differences.  The box itself needs to be refreshed every so often to keep up with technology standards, of course, and each new generation usually means buying a whole new set of accessories and games.  Still, there's something to be said for plugging a new console in, hooking it to the TV, loading up a game and jumping into the action.

PCs do a lot of the pushing of gaming technology standards -- horsepower that gamers and critics are thrilled to see on a new console is usually at least a few years behind the state of the art on the PC.  When consoles do occasionally take the lead in some area, like the SNES' sample-based music chip or the original Playstation's dedicated 3-D hardware, PCs are usually able to catch up with and surpass those new features within a year or two.  But this also means that PCs are never completely "current" -- there is always some new hardware, some new investment that can be made to make the gaming experience that much more "complete."

I've been thinking about this lately because console gamers don't have to deal with "max settings" and frame rates -- either a game looks good and plays smoothly or it doesn't, and there's usually nothing that can be done to fix any issues.  PC gamers, on the other hand, often end up playing the endless "performance tweaking" game more than any other.  I have spent many hours adjusting both PC and game settings, trying to get the most out of whatever hardware I'm using at the time, and then testing out familiar games, relishing any discernible improvement after I've added new hardware or think I've struck a better performance balance.

What I'm dealing with at the moment is this: I do most of my PC gaming (and a lot of my old-school emulated console gaming for this blog) on a laptop computer.  It's a decent mid-range 2009 machine, with a dual-core CPU and enough memory to eliminate a lot of the "wait time" that always used to annoy me about Windows.  It runs my standard tools and utilities with aplomb, and I rarely feel like I'm waiting on the computer; I'm very happy with it for general purposes.  But given space and heat limitations, it's inevitable that my laptop's integrated graphics hardware is far from state of the art -- it can manage a passable 50 frames per second on the Passmark Medium Complexity 3D test, but falls precipitously to a chuggy, unplayable 9 fps on the High Complexity test.

Now, I'm not playing Crysis 2 or anything that really pushes the standards on my laptop, but I do play a lot of the current adventure games and might play some other types of games if my hardware were up to it.  But even with something graphically straightforward like Tales of Monkey Island, I can't turn the detail all the way up without seriously impacting the frame rate.  And any scene with fog or smoke effects tends to make my hardware stutter -- it will do its best, and I can make just about anything playable if I lower the resolution and settings, but it's a far cry from what I'm used to on the XBox 360, and revives those old PC gamer cravings for just a bit more power.

The larger issue is that laptops are self-contained and not generally very upgradable; they're also designed to use minimal electricity, which is why they usually don't have powerful graphics chips.  Aside from playing with the built-in graphics hardware's settings, there isn't much I can do about this situation.  So I find myself contemplating a new hardware purchase -- but not a gaming laptop, as they are expensive and my current machine is fine for everything else I do.

I am considering buying a relatively cheap tower PC with a decent amount of memory, installing a reasonably current graphics card with fast, dedicated memory, hooking it to the TV with an HDMI cable, and using a wireless keyboard, mouse and controller to make the experience as console-like as I can.

This project, of course, will entail quite a few decisions about cost vs. power.  But I enjoy doing that kind of research every once in a while, and I suspect an inexpensive desktop machine today will be more capable than my laptop of two years ago, especially with a graphics hardware upgrade.  And, to be honest, all I truly care about is being able to run, say, Back to the Future: The Game at 720p or 1080p with full detail settings; that's demanding by my current hardware's standards, but it shouldn't be a lot to ask on a desktop machine.
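For a bit of perspective on the resolution question, here's a back-of-the-envelope sketch in a few lines of Python -- purely illustrative, assuming performance is roughly fill-rate bound (a big simplification), and using a made-up 60 fps figure rather than any real benchmark:

    # Rough sketch: if rendering is fill-rate bound, frame rate scales
    # roughly with the inverse of the pixel count (a big simplification).
    pixels_720p = 1280 * 720      # 921,600 pixels per frame
    pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame

    print(pixels_1080p / pixels_720p)  # 2.25 -- 1080p pushes 2.25x the pixels

    # Hypothetical example: a card managing 60 fps at 720p on a given scene
    # would land somewhere near 60 / 2.25, or about 27 fps, at 1080p.
    fps_720p = 60
    print(fps_720p * pixels_720p / pixels_1080p)  # ~26.7

The point being that "full detail at 1080p" asks for well over twice the raw pixel-pushing of 720p -- exactly the kind of headroom my integrated graphics doesn't have, and a cheap desktop with a dedicated card should.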

So this may be a new little project for me to tackle.  Of course, I will also have to do a little self-analysis in the process, i.e., wondering whether I'm tempted to do this because there isn't a new console launch on the horizon this year, and subconsciously I really just want a new toy to tinker with in the meantime.  That can be a whole game in and of itself.  I'll keep you posted.
