
What Moore’s Law Means for the Future of the Cloud

Something has been bothering me about cloud computing for two years now. Whenever I read a study like the most recent one from CSC, which finds that the shift to cloud is not actually saving most IT departments very much money, I think to myself, “Yes, this is exactly right. Cloud should be more expensive for most enterprise applications, given that transistors keep getting cheaper while prices for cloud services hold steady. Moore’s Law means that in 18 months it will make more sense to own the block of transistors that you’re currently renting as an hourly service.”
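To make that rent-vs-own intuition concrete, here is a toy model in Python. All the numbers are illustrative assumptions (the hourly rental rate, the purchase price of the hardware, and a clean 18-month price halving), not real cloud or hardware prices; the point is only the shape of the comparison: a flat rental rate accumulates linearly, while the cost of owning the same capacity falls geometrically.

```python
# Toy rent-vs-own model under Moore's Law.
# Assumed, illustrative numbers: $0.10/hour to rent a fixed block of
# compute, $2,000 to buy the equivalent hardware today, and the price
# of that hardware halving every 18 months.

HOURS_PER_MONTH = 730  # average hours in a month

def cumulative_rent(hourly_rate, months):
    """Total paid renting the capacity at a flat hourly rate for `months`."""
    return hourly_rate * HOURS_PER_MONTH * months

def hardware_price(initial_price, months, halving_months=18):
    """Price of the same capacity `months` from now, halving per Moore's Law."""
    return initial_price * 0.5 ** (months / halving_months)

rate, price_today = 0.10, 2000.0
for m in (0, 18, 36):
    print(f"month {m:2d}: rent so far ${cumulative_rent(rate, m):,.0f}, "
          f"hardware now costs ${hardware_price(price_today, m):,.0f}")
```

Under these assumed figures, by month 18 you have paid more in rent ($1,314) than the hardware then costs to own ($1,000), and the gap only widens, which is exactly the dynamic the steady rental price versus cheaper transistors argument describes.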

The CSC survey suggests that cloud services don’t save most businesses much money, and in some cases they raise costs because companies must hire new staff with the relevant cloud expertise. On the other hand, there are some very prominent counterexamples, like Netflix, which claims that using Amazon’s public cloud saves the company a ton of money.

So what’s going on here? Is renting transistors in the cloud cheaper than owning them, or is it more expensive? And if cloud is cheaper for some users and not for others, then why is that the case? Finally, how does the private cloud fit into this picture, and does it pose any threat to the seemingly unlimited growth potential of Amazon?

