Games fall into two main categories based on their hardware requirements. One camp demands souped-up Alienware machines that are constantly upgraded, rewarding the player with the most realistic and technically advanced graphics and sound possible. The other camp settles for lower-end graphics and sound, but runs on any dinosaur made in this century (and sometimes even older).
I started EverQuest in late April 1999, shortly after its release. When the Shadows of Luclin expansion came out in December 2001, it caused problems for me and a number of my friends: our hardware simply could not run the new expansion without significant performance problems. We all had to either pay for upgraded hardware or quit the game. In effect, the expansion added roughly $200-300 to the cost of playing, on top of the monthly subscription fee. Many players simply quit and moved on to another game.
There are many people who will gladly pay the extra hundreds of dollars to play (and keep playing) the latest graphics-intensive games. The problem is that this group is generally limited to younger players with sufficient discretionary income and free time; millions more have families, bills, and mortgages claiming their time and money. The other problem is that these games are extremely expensive to develop, frequently approaching the cost of producing a feature film. That combination leads to lower profits, and even losses, for most of them.
The pushback against this model is increasing, in favor of the “anything thrown from the back of a truck in the last five years” model, as described by Dana Massey (and also discussed by Richard Aihoshi). One of the reasons World of Warcraft became so popular is that a computer with any video card, any motherboard, a reasonable minimum of memory, and an old copy of XP will run it fine, even over an ancient dial-up connection.
As more people lose their jobs in this economy, fewer of them will have a spare $100 to throw at an upgraded video card to play a game (and realistically, what other reason is there to ever upgrade a video card?). Many won’t be putting money into more RAM or faster motherboards either. I don’t see the economics working out for anyone wanting to release a new graphics-intensive game without a very strong pre-existing customer base.
Another angle: browsers are becoming even more ubiquitous with smartphones, netbooks, and Google’s plan to turn the Chrome browser into an OS. These new, cheaper netbooks won’t be able to run any of the games that require special graphics cards, but they will play any Flash or Silverlight game. Why raise barriers to playing that may deter paying customers?
The growth and the profits are in games that anyone can learn easily and play cheaply, but that hide a complexity that keeps players coming back for more. Take the simple game of Rock, Paper, Scissors. It can be taught to a young schoolchild in minutes, yet it has international competitions (and yes, even a unique F2P Facebook version called StripPRS). It does not require a souped-up video card to enjoy.
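To underline just how small that ruleset is, here is a minimal sketch (purely illustrative, in Python) of the entire game logic. Everything beyond these few lines is psychology and opponent prediction, which is exactly the hidden complexity that keeps people playing.

```python
# Minimal sketch: the complete Rock, Paper, Scissors ruleset.
# The rules fit in a dozen lines; the depth lives in reading
# your opponent, not in the mechanics.
import random

BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def play(player: str, opponent: str) -> str:
    """Return 'win', 'lose', or 'draw' from the player's perspective."""
    if player == opponent:
        return "draw"
    return "win" if BEATS[player] == opponent else "lose"

if __name__ == "__main__":
    moves = list(BEATS)
    me, bot = random.choice(moves), random.choice(moves)
    print(f"you: {me}, bot: {bot} -> {play(me, bot)}")
```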
The future of gaming is Free-to-Play games with low barriers to entry: modest hardware requirements, rules that are easy to learn, and no commitment or subscription. That is why they are spreading like wildfires in Los Angeles.