Contributed by: Jeff Rivett (site admin) Friday, June 26 2015 @ 07:49 AM -08
A recent post on The Verge[*1] reports that game developers are increasingly unlikely to include split-screen multiplayer in their games. According to the article, the main reason for this is that split-screen gaming doubles a game's hardware resource requirements.
In fact, while the hardware resources required for split-screen are greater, the difference is usually not that large. That's because much of the work the console does (game logic, physics, loading assets) only needs to happen once per frame; it's mainly the rendering that must be repeated for each player's view. Of course, this depends on the quality of the programmers and the development process used. If split-screen is added to a game as an afterthought, it's likely to be much less efficient.
But the real reason game producers are moving away from split-screen is pressure from management, which sees split-screen gaming as lost revenue. If I can go to my friend's house and play a split-screen game with him, that's one less game sold. Game producers would much rather force us to each buy a copy of the game and play on two separate consoles. The extra expense involved (multiple copies of a game, multiple consoles, multiple online gaming subscriptions) is great for game producers, but a crappy deal for people who just want to play games with their friends.
In any case, this trend is certainly sad. As noted in the Verge article, some of the best gaming experiences come from playing alongside friends in the same game, while sitting next to each other on the couch. It's a much more social experience than online gaming.
Of course, some games will always include the ability to play with and against friends on one screen. Many sports games, like EA Sports' NHL Hockey series, have always allowed this kind of play, and they typically do it without even needing a split screen. This no doubt contributes to the long-term success of such games.