The PC Master Race Is Ending, Not In A Bang, But A Derp

I was looking at Old Man Murray the other day and noticed that they shared a common beef with me: they absolutely hated what the PC gaming platform had become, a clone-ridden mess.  Virtually every one of their articles was about lampooning this, just as virtually every one of my blog entries is an ongoing journey of desperately scrabbling for an exception to this terrible norm.  But computer technology is a pretty fast-growing industry, and a lot has changed since the last entry on that site.
http://sasukekun17.deviantart.com/art/Glorious-PC-Gaming-Master-Race-460847076

For starters, tablet gaming is well on its way to becoming bigger than PC gaming, if it isn't already: I recently ran across an article discussing micro-transactions in a relatively obscure mobile app that mentioned having 1.7 million players.  That is about a hundred times more players than you can expect to ever try a relatively obscure PC game, which makes me suspect that most people have simply moved on.

That is just speculation, of course.  One thing I can see quite clearly is that we of the PC gamer master race are getting kicked right in our most precious jewels: our beefy hardware specs.  I recently bought an NVIDIA Shield Tablet whose eight-inch, 1920 x 1200 display shows 230,400 more pixels than my 1920 x 1080 LCD PC monitor.  It also has a quad-core 2.2 GHz processor that puts it at a reasonable fraction of the processing power of my PC's AMD FX-8120.
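For anyone who wants to check the pixel math, here is the back-of-the-envelope calculation (assuming the Shield Tablet's published 1920 x 1200 panel resolution):

```python
# Back-of-the-envelope pixel math for the comparison above.
# Assumes the Shield Tablet's published 1920 x 1200 panel;
# the monitor is a standard 1080p display.
tablet_pixels = 1920 * 1200    # NVIDIA Shield Tablet, 8-inch display
monitor_pixels = 1920 * 1080   # 1920 x 1080 LCD PC monitor
print(tablet_pixels - monitor_pixels)  # prints 230400
```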

A decent showing, but surely the PC is more powerful, right?  Well, this dandy little tablet only set me back $300, which is less than I would pay for my CPU and motherboard alone, never mind the case I would need to put them in, the RAM and the hard drive, the display, the mouse and keyboard... all of which the tablet has built in.

Granted, this tablet is probably the most powerful Android device out there, or close to it, and there's barely any software that justifies this much power.  However, for those keeping score: in the '60s we had computers the size of rooms, in the '80s they had moved to our desktops, and now they're so small that we hold them in our hands as we use them.
http://www.gizmobeat.info/nvidia-shield-tablet-review
Sooner than you think, we will be able to buy a tiny headset that looks something like Google Glass, except its display will beam directly onto our retinas.  Despite its diminutive size, it will have such computational firepower that it makes our current computers look like Commodore 64s in comparison.  Our mouse will be a gesture-recognition mechanism that works so well it puts the Kinect to shame, and our keyboard a flawless, non-invasive brain-computer interface.  I find this incredible device feasible because all of these features exist as present-day technologies; all they lack is the refinement to work well enough to go mainstream.  Given how technology only comes down in price over time, the whole package will eventually cost less than a microwave oven.

When that happens, PC computing as we know it will be gone, the commonplace definition of "personal computer" having subtly shifted to mean "something something we wear on our person."   No longer will there be a big block-like device in our offices or bedrooms, nor will we use our computers there unless we want to sit or lie down while using them.  Good thing our cars will be driving themselves, as I know what I would rather be doing.
http://en.wikipedia.org/wiki/Google_Glass
In this way, the hardware has proven easy enough for the human spirit to conquer: we just keep finding ways to make it smaller and more user friendly.  Unfortunately, we seem to be having a serious problem with how we're using it.

First off, computer security is deplorable.
  • Invasive software is out of control.  My poor elderly parents keep getting their computers infected with malware because virtually every download site on the Internet is littered with fake-download scams.  Several pieces of software I want to install, including Adobe Flash, require me to opt out of other pieces of software bundled with them.  Some don't even bother to ask, and obnoxious pop-up advertising software is finding its way onto people's computers, resisting all attempts at removal.  Buy a new laptop from Best Buy and it will come with dozens of bloatware offers already installed.
  • Internet crime is out of control.  While I was playing WildStar and ArcheAge, there was a ton of invasive gold-farmer activity going on; the moderators were banning them left and right, but it barely slowed them down because they were hacking player accounts effortlessly.  Software piracy is rampant, especially on the PC platform.  DDoS attacks are launched by bored script kiddies looking for attention anywhere they think they will be noticed.  Password database leaks are now so common that the news hardly bothers to report them anymore.
Now, imagine all of that coming from a powerful computing device you wear on your head, and I could conceive of a World War III caused not by geopolitical pressures but by border skirmishes among people who snapped under the unchecked activities of obnoxious Internet marketers and cyber criminals.  Legislators needed to get a lid on these issues during the last decade but, thus far, have demonstrated that they are barely warming up to the idea of what a personal computer is.

Second and possibly just as important as averting major catastrophe, we need something of worth to run on our computers.  This is the major sticking point that drove me to write this entry to begin with.
While hardware has been easy enough to progress, software has been a dreadful mirror of how stupid we are as a species.  I say this because, if you look at the average game released these days, what we're getting are worse versions of what was already made a decade ago.  For example:
  • I replayed Hostile Waters at the end of 2012.  It was originally a 2001 game, and there's simply been nothing better made since.
  • The same could be said of the Descent: FreeSpace series, the last of which was also made in 2001.
  • Lately, I have been doubting that World of Warcraft really did a better job than EverQuest.  Yes, WoW had severalfold more players and smoother, more refined gameplay all around, but was it as good a virtual world, or was its quest-hub-centric approach too much of a throwback to Diablo?
  • How about that Windows 8?  It seems even the company responsible for the most popular operating system on the PC platform would rather make software for tablets.  Never have I been so tempted to install Linux.
I'm sure many people can come up with their own examples to supplement these, but the deeper problem is that the primary thing that has changed since the days of Old Man Murray is that we no longer consider clones to be a problem.  Our desire for something new has been crushed under the far easier task of idiots trying to reinvent the wheel, and failing.

You can point out that modern games have better-looking textures and more triangles than ever, but beauty is only skin deep, and it's meaningless when the gameplay is no better than it was ten years ago.  The fact that there were more creative games in 1980s arcades than I can expect to find in 90% of the Steam library is grounds to accuse humanity of just getting stupider in general.

At the rate things are going, we will probably end up using our incredible futuristic computer-in-a-pair-of-glasses to play a clone of Bejeweled or Flappy Bird.  You would think we could manage at least a GPS-and-accelerometer-supplemented first-person shooter, but the point I am making here is that, while hardware has been moving forward, we have been intellectually stagnant, and it is getting increasingly hard to justify why we need new hardware at all.
