|
Google just published its Q3 financial results. So, what is Google spending on IT, and how many servers would that buy? This is one of their best kept secrets. In this post I give ballpark estimates based on back-of-the-envelope calculations, similar to the 'guesstimates' I made 5 years ago. Some quotes from Google's statement:
Other cost of revenues, which is comprised primarily of data center operational expenses, amortization of intangible assets, content acquisition costs as well as credit card processing charges, increased to $747 million, or 10% of revenues, in the third quarter of 2010
and
In the third quarter of 2010, capital expenditures were $757 million, the majority of which was related to IT infrastructure investments, including data centers, servers, and networking equipment.
So let us modestly assume that half the capital and half the operational expenses are server-related, $400 million each. Let us assume a cheap Google server costs $1000, and the associated networking, datacenter facilities and such, another $1000. The run cost of the datacenter (power, cooling, etc.) could match that. This leads to an investment pattern of 200,000 servers per quarter, or 800,000 per year. With an average lifetime of 3 years, this puts the ballpark estimate of the size of Google's server farm at 2.4 million servers. There are entire countries that do not have that many servers. There are entire countries that do not have that many PCs.
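The arithmetic above can be sketched in a few lines. All the inputs are the post's own assumptions (the $400M server-related share, the $1000 + $1000 per-server cost, the 3-year lifetime), not figures Google has disclosed:

```python
# Back-of-the-envelope estimate of Google's server fleet, Q3 2010.
# All inputs below are assumptions from the post, not disclosed figures.
server_capex = 400e6             # assumed server-related half of quarterly capex ($)
cost_per_server = 1000           # assumed cost of a cheap commodity server ($)
overhead_per_server = 1000       # assumed network + datacenter share per server ($)
lifetime_years = 3               # assumed average server lifetime

capex_per_server = cost_per_server + overhead_per_server
servers_per_quarter = int(server_capex / capex_per_server)
servers_per_year = servers_per_quarter * 4
fleet_size = servers_per_year * lifetime_years

print(f"{servers_per_quarter:,} servers/quarter -> "
      f"ballpark fleet of {fleet_size:,} servers")
# -> 200,000 servers/quarter -> ballpark fleet of 2,400,000 servers
```

Halving any one assumption (say, a $4000 all-in cost per server) halves the fleet estimate, which is why this is a ballpark rather than a measurement.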
Since 2004, the server farm has increased in size by a factor of 16, while revenue increased 10-fold (per my 2005 estimates). Once more, Google increases the amount of compute power that goes into a dollar of revenue, Moore's law notwithstanding.
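The compute-per-revenue-dollar claim follows directly from those two growth factors, both of which come from the post's estimates rather than reported figures:

```python
# Ratio of fleet growth to revenue growth since 2004 (both per the
# post's estimates, not reported figures).
fleet_growth = 16    # server farm grew ~16x
revenue_growth = 10  # revenue grew ~10x over the same period

servers_per_dollar_growth = fleet_growth / revenue_growth
print(f"Servers per dollar of revenue grew ~{servers_per_dollar_growth:.1f}x")
# -> Servers per dollar of revenue grew ~1.6x
```

So even with Moore's law making each server far more capable, Google is deploying roughly 1.6 times as many physical servers per revenue dollar as in 2004.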