Ben Schwalb | Das Coding
Cloud computing: Too big to fail
Published: Tuesday, April 24, 2012
Updated: Tuesday, April 24, 2012 05:04
The standard definition of cloud computing is a service that allows you to store your files in the cloud (i.e., on a server connected to the Internet) so that you can access them anywhere, anytime. But of course, there’s a little more to it than that.
The idea of using somebody else’s computer to do your work is actually incredibly old. The very first computers had to be programmed in person. Of course, everybody wanted a turn, so people wrote programs at home and waited until it was their turn to run them. But soon people realized that a computer could manage the list of programmers waiting for their turn better than an administrator could.
Thus, so-called timesharing computers were born. The system could handle multiple “users” at different terminals networked to the mainframe computer, each submitting programs to be run. If Jack was running a program and Jill told the computer to run hers, the computer would wait until Jack was done and then run Jill’s program.
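That first-come, first-served arrangement amounts to a simple job queue. The sketch below is a toy model of the idea, not any real mainframe's software; the class, the users, and the jobs are all invented for illustration.

```python
from collections import deque

class Mainframe:
    """Toy timesharing batch queue: jobs run one at a time, in order."""

    def __init__(self):
        self.queue = deque()  # jobs wait in first-come, first-served order

    def submit(self, user, program):
        self.queue.append((user, program))

    def run_all(self):
        # Run each queued job to completion before starting the next,
        # just as Jill's program waits for Jack's to finish.
        results = []
        while self.queue:
            user, program = self.queue.popleft()
            results.append((user, program()))
        return results

mainframe = Mainframe()
mainframe.submit("Jack", lambda: 2 + 2)
mainframe.submit("Jill", lambda: 3 * 3)
print(mainframe.run_all())  # Jack's result appears before Jill's
```

The key point is that the queue itself is just data the computer manages, which is exactly the bookkeeping job the human administrator used to do.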
As computers became smarter and more powerful, many even ran multiple programs at once by rapidly switching back and forth among them. This would allow Jack and Jill to get intermediate results relatively quickly, and allow Joe to pause both of their programs and run his if it was more important (assuming his account was allowed to do that).
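That switching trick is called round-robin time slicing: each program gets a short turn, then goes to the back of the line. Here is a minimal sketch using Python generators to stand in for interruptible programs; the names and step counts are made up.

```python
from collections import deque

def program(name, steps):
    # Each yield is one unit of work, after which the program can be
    # paused and another program given the processor.
    for i in range(1, steps + 1):
        yield f"{name} step {i}"

def run_round_robin(programs):
    """Toy scheduler: give each program one time slice, round and round."""
    ready = deque(programs)
    log = []
    while ready:
        prog = ready.popleft()
        try:
            log.append(next(prog))   # run this program for one slice
            ready.append(prog)       # then send it to the back of the line
        except StopIteration:
            pass                     # program finished; drop it
    return log

log = run_round_robin([program("Jack", 2), program("Jill", 2)])
print(log)  # Jack's and Jill's steps interleave
```

Because the slices are short, both users see progress "at once" even though only one program runs at any instant.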
Computing power is now fairly widespread via personal computers, and the next big change is high-speed Internet everywhere. Most well-populated areas are accounted for, and the Internet is spreading to more rural areas and cell phone networks. These faster and more widespread connections allow people to use their computers and other devices to access the cloud.
“Other devices” is a key cloud computing phrase. One of the reasons businesses love the cloud is that it isn’t limited to one operating system or device. An Android app can’t run on a Mac without being changed, but both platforms can access the cloud via a web browser. This means that companies can now pay their programmers to make one website instead of making a separate program for PCs, Macs, iPhones, other smartphones and other devices. In addition, consumers can access the same data from all of their devices.
But in addition to platform independence, cloud computing offers elasticity. This principle is best explained by the example of Netflix. Demand for Netflix’s movie streaming is very prone to peaks and lulls. Netflix hosts its streaming on Amazon’s Elastic Compute Cloud which, come Sunday afternoon (peak viewing time), allocates more of its computers to dealing with the increased number of users. It would be too expensive for Netflix itself to buy and maintain the servers needed for peak times, but Amazon can afford to do this because its other clients have peaks during Netflix’s lulls.
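At its core, elasticity is just a rule that maps current demand to a number of servers. The sketch below is a deliberately simplified stand-in for that idea; the per-server capacity, the headroom margin, and the function itself are invented for illustration, and real cloud services expose far richer scaling policies.

```python
import math

# Assumed figure: concurrent streams one server can handle (invented).
SERVER_CAPACITY = 1000

def servers_needed(current_users, headroom=0.2):
    """Provision enough servers for current demand plus a safety margin."""
    target = current_users * (1 + headroom)
    return max(1, math.ceil(target / SERVER_CAPACITY))

print(servers_needed(300))     # quiet weekday morning: one server suffices
print(servers_needed(50_000))  # Sunday afternoon peak: many more servers
```

Run continuously, a rule like this scales the fleet up for the Sunday peak and back down afterward, which is why nobody has to buy peak-sized hardware that sits idle the rest of the week.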
Amazon’s cloud is good for the little guy as well. The myriad small startups can all run their websites on a single Amazon computer for a low price and be sure that the system will automatically add the needed computing power should their site go viral.
You might be thinking that an entire computer is probably too much for one small startup website, and you’re right. Amazon takes advantage of a technology called virtualization, which allows one physical computer to act like, or “emulate,” multiple computers, each with its own operating system, hard drive, etc.
Using virtualization and the scale of the cloud, a developer gets the experience of using one personal server, whether they are actually sharing a physical machine with other people or spreading a huge website or data-intensive program across many servers.
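One way to picture virtualization is a physical host carving its resources into slices, each slice presented to a customer as an independent machine. The toy model below illustrates only that bookkeeping; the class, the resource numbers, and the VM names are all invented, and real hypervisors do vastly more.

```python
class PhysicalHost:
    """Toy model: one physical machine hosting several virtual machines."""

    def __init__(self, cpu_cores, ram_gb):
        self.free_cpu = cpu_cores
        self.free_ram = ram_gb
        self.vms = []

    def create_vm(self, name, cpu_cores, ram_gb):
        # Each VM gets a dedicated slice of the host's resources.
        if cpu_cores > self.free_cpu or ram_gb > self.free_ram:
            raise RuntimeError("not enough physical resources left")
        self.free_cpu -= cpu_cores
        self.free_ram -= ram_gb
        vm = {"name": name, "cpu": cpu_cores, "ram": ram_gb}
        self.vms.append(vm)
        return vm

host = PhysicalHost(cpu_cores=16, ram_gb=64)
host.create_vm("startup-a", cpu_cores=2, ram_gb=4)
host.create_vm("startup-b", cpu_cores=2, ram_gb=4)
print(len(host.vms), host.free_cpu, host.free_ram)  # 2 12 56
```

Each startup sees only its own slice, so many small websites can share one physical machine without knowing about each other.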
Ben Schwalb is a member of the Class of 2012 who majored in computer science. He can be reached at Benjamin.Schwalb@tufts.edu.