An interesting read for those with an interest in, but not much background in, Grid computing.
Grid computing is the name for the resource-sharing technologies being used by many scientific organizations, and a few companies, to make the resources of many smaller computers work together as one larger system.
If you have run SETI@home or one of the anthrax-sequencing programs on your computer, you've participated in a primitive grid.
IBM has recently been discussing its belief that big computer users will call on utility-style storage and compute servers provided by organizations (companies) that follow standards allowing the customer to move from one utility to another (similar to how you can, in theory, switch electrical-power providers in some states). In fact, IBM has published a Redpaper on the topic, which you can download in PDF form from their site, and they have also published a strategy statement on the subject.
Recently, much of the focus has been on the Globus Project, an effort to build the software necessary to make use of such an infrastructure. The software they are creating is funded partially by the government and appears to be licensed as freeware (don't sue us and you can use it).
Further, my old stomping grounds, NCSA (the NSF-funded National Center for Supercomputing Applications), has been involved in grid computing in order to establish the TeraGrid, a research and scientific grid network aimed at providing 20 teraflops of computing power and 1 petabyte of data storage over a 40-gigabit-per-second backbone between 5 sites (which look to be mostly the old NSF supercomputing centers).
Food for thought.