Tuesday, November 7, 2017

Is The Single Player Game A Dying Breed?

Did the Internet Kill the Single Player Star?

If you can, think back to the nineties, the golden age of games. Too far back? Well, think back to the early 2000s, before online matchmaking became a staple. Still too early? Wow, I feel old, but to an extent that's part of the reason behind the steady change in gaming culture. Anyway, let me take you back to the nineties, when Nintendo ruled the market and the World Wide Web was just becoming a household tool.

During the nineties, the term multiplayer didn't refer to a pool of thousands of players to compete against; it meant something closer to one to four players. And a game's reach wasn't worldwide, it went only as far as whoever was sitting on the same couch as you. But obviously games, and more importantly the hardware that makes them possible, have grown, and the reason is very simple. Like any good business, the gaming marketplace continued to raise the bar to entice a larger and larger player base.

And so, game consoles continued to evolve alongside the demands of the market. What was single player became couch co-op, which became LAN parties, and soon became online matchmaking. The consumers' lust for convenience and the companies' ability to read the needs of their market and act on them have created what we see today. This might sound like a win-win, but somewhere along the way something got lost.

Today, any game whose main selling point is its campaign seems to need some sort of added value. The easiest route is bolting multiplayer onto an otherwise perfectly good game. We've seen it in plenty of games such as Tomb Raider, Uncharted, and even BioShock. A good single player story no longer seems to equal a $60 investment to the average consumer. There have only been a couple of exceptions, and they have failed or succeeded due in part to the history attached to the game, history that resonated with the now older gamer market.

This may seem like progress to the average consumer, but how many games have been scared away or rejected because of this new mentality? Obviously that's a speculative question that can't be answered, but it still deserves our attention. Because of our own greed, a game can't exist without another layer to it, whether that's multiplayer, an open world, or a shared world. Part of the reason may be the ever-changing age of the gaming audience.

Today's younger gamers are used to this. To them, a game without replayability isn't worth their time or their parents' money, especially when there are plenty of fun free-to-play games available. It's as if you were raised on getting two bowls of ice cream every night, and then one day you only got one. I can imagine your reaction would immediately be anger, frustration, or confusion as to why you only got one when two is better. But you never took the time to notice the extra care put into the single bowl of ice cream. The single bowl has several flavors and pieces of sugar cone sprinkled over it. If you tried it, you might realize that the care put into that single bowl is better than receiving two. But after receiving two bowls for so long, anything less is immediately associated with being worse.

Thankfully, there's still hope. Although they may seem a bit more rare, there are single player games out there that hinge on the game's quality and still survive. Games like the new Wolfenstein 2, as well as the entire horror genre, continue to thrive despite their reliance on single player storytelling. Hopefully we continue to see games like these succeed and encourage others not to turn away just because they don't have a second bowl of ice cream to offer.


What do you think? Can a game entertain without a form of multiplayer?
