I was rightly reminded that I didn’t finish my cost comparison on cloud computing; I am duly reprimanded, and I will continue it here.
I’m not going to make any guarantees about stability or reliability; that was covered last post, so I’ll only be dealing with cost here. As far as I can tell, cloud computing is pricier up front than renting individual servers, but it may pay off as it scales. A look at LayeredTech’s offerings reveals that their cheapest grid layer system starts at $1696/month. That includes the resources of four servers, each with a 3.0 GHz processor, 2 GB of RAM, and a 500 GB hard drive, along with each server’s bandwidth allotment.

Let’s price that out by ordering four individual servers at the same specs. A comparable dedicated server seems to start at $169/month, so four of them come to $676/month, less than half the price of the grid.

The major draw of the grid, however, is far easier scalability and much better interoperability. It comes with four nodes but can scale up to 48 as needed. I can’t find pricing information on the website for scaling beyond four nodes, but the wording seems to imply that the majority of the price comes from the setup. The grid layer also comes equipped with managed firewalls and load balancers, which help keep your system available and running efficiently.
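To make the break-even question concrete, here’s a minimal sketch of the comparison using the prices quoted above. Since LayeredTech doesn’t publish a per-node scaling price, the `grid_cost_per_extra_node` figure below is a purely hypothetical assumption, not their actual rate:

```python
# Rough monthly cost comparison using the figures from this post.
# The grid's incremental per-node price is NOT published anywhere
# I can find, so grid_cost_per_extra_node is a placeholder guess.

GRID_BASE_MONTHLY = 1696   # 4-node grid layer, per month
DEDICATED_MONTHLY = 169    # single dedicated server, per month

def dedicated_cost(nodes: int) -> int:
    """Monthly cost of renting `nodes` individual servers."""
    return nodes * DEDICATED_MONTHLY

def grid_cost(nodes: int, grid_cost_per_extra_node: int = 100) -> int:
    """Monthly grid cost: the base price covers 4 nodes; each node
    beyond that is charged at an assumed incremental rate."""
    extra_nodes = max(0, nodes - 4)
    return GRID_BASE_MONTHLY + extra_nodes * grid_cost_per_extra_node

for n in (4, 8, 16, 24, 48):
    print(f"{n:>2} nodes: dedicated ${dedicated_cost(n):>5}/mo"
          f" vs grid ${grid_cost(n):>5}/mo")
```

Under that made-up incremental rate, the grid would cross over somewhere past 16 nodes ($4056 vs. $3696 at 24 nodes). The real break-even point depends entirely on what LayeredTech actually charges to scale, which, again, I couldn’t find.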
I would be interested in paying the premium simply to avoid the headaches of adding and configuring more servers; manually maintaining load balancers and firewalls isn’t the way I’d like to go. There you have it, though: the cost comparison of the cloud versus the dedicated server. Whether the benefits of the system outweigh the price is, of course, entirely up to you (I’d almost be inclined to say yes, but that depends on your situation).