I saw a headline: "Microsoft Calls for Open Cloud Standards". I love standards. The problem is writing standards before there are enough really good examples to honestly base a good standard on; it's the chicken before the egg, or the cart before the horse.
I have actually changed my stance on this over the years. I used to be all about a standards body, with everyone getting together and hashing out the details. The problem with doing this before there are really good real-world examples, more than a couple, is that there are a lot of details we don't actually know: the needs, requirements, use cases, and so on simply cannot be thrown together or formulated up front. Our industry always assumes we know more about all of the above than we do.
The needed information only comes with real application experience, and that will not come from developers in an office working purely on infrastructure who never have to use what they design in real implementations; I have seen it way too many times.
There are two big things at work here. First, you must have the infrastructure and the developers to build it. Part of what comes from this is knowing what will work (think programming and low-level details, not end-user use cases), how the pieces fit together to provide the different features of the system, and what can actually be used; this is really iterative.
Next, you have to have people building solutions with that system. It must be put through its paces. On top of that, those end developers will add their own features, provide feedback, and tell you what does and does not work. We are really just talking about agile processes, but this seemingly gets lost in the drive for standards.
Sure, we have many examples of grid computing and other very similar technologies. There are also clouds that have been available for a while now, and virtualization has been around for a good long while too.
OK, I concede all of that ahead of time. But too many standards bind a project to implementing underused features, consume resources, and then change so much over time that they cause some of the very fuss and confusion they are trying to avoid, all before everyone involved, including the users, knows the real details of what is being standardized.
Let innovators do what they do and innovate. Then, only after a market has formed around multiple implementations of similar technologies, try to form a consensus on a standard. The standard would be more complete up front, and groups wouldn't have a hard time getting behind it or accepting it, because input would be available from those with their noses to the grindstone, giving the standard the respect something we call a standard deserves.