Cloud computing is the future, but keep an eye on those monthly bills
There was a time when all computing was work computing. Before personal computers, nearly all computer technology served some sort of business use. Yes, there was educational use too, but that primordial educational computing existed either to solve scientific problems, manage the school (so, business), or teach computing itself.
Even then, there were glimmers of the recreational in computing technology. Some of the earliest games[1], like Tic-Tac-Toe and Spacewar[2], ran on minicomputers and mainframes. In fact, early versions of many of the recreational computing activities we enjoy now already existed in the 1960s and 1970s.
Early forms of social networking can be traced back to Usenet[3], along with early forums and even early MUDs[4]. Many of these recreational computing activities originated in universities, but they were also enjoyed after hours at work. The thing is, until people could bring computers home[5], recreational computing rarely happened at home.
The rise of personal computing
The personal computer changed all that. Whether we're talking about an Altair 8800[6], an Apple II[7], a Macintosh[8], or a Windows-based machine, the one-computer-one-person model helped personalize computing. People began using computers at home to manage finances, communicate with friends and family, play games, and support creative endeavors.
What made personal computers interesting was that they were useful for both work and home activities. The Apple II became the poster child of home computers[9], but its true popularity blossomed because