San Francisco — Confusion over pricing models and a lack of industry-wide standards may delay the adoption of utility computing among some CIOs, vendors admitted at the Intel Developer Forum.
This year’s conference has highlighted a number of technologies, including virtualization software and dual-core processors, that could help organizations build an IT infrastructure in which computing resources are activated only when needed. That form of resource allocation — which some vendors call utility computing and others call on-demand or organic computing — has been aggressively marketed over the past two years by vendors including IBM, HP and Veritas. But in a panel discussion hosted by the chipmaker, several manufacturers told the audience they had yet to come up with a software licensing scheme that satisfied the bulk of their customers.
Bits and bytes
“It seems easy to create a data centre — you buy a lot of different bits and pieces, put it all in there and it’s done,” said Vadim Rosenberg, director of technical marketing at BEA. “Things get expensive when you start paying for a licence.”
The ideal, Rosenberg said, would be for software vendors to take a lesson from cellular carriers, who demand a flat fee for basic services as well as per-use fees for specific additional services. “If we could offer that kind of pricing, that would be great,” he said.
Not all organizations see it that way, however. Nick van der Zweep, director of virtualization at HP, pointed out that the monitoring and metering necessary to offer utility computing would require enterprises to transmit information about what they’re using, and government customers in particular have not wanted to do that.
“Our customers couldn’t handle the variability (in per-use pricing) because they didn’t want a sudden spike in their cost,” he added.
De facto spikes
Pricing may have to vary from customer to customer, but unless the industry can agree on common ways to enable utility computing it will be hard to see real benefits, said Roger Reich, senior technical director at Veritas. “De facto standards emerge among companies with the closest business relationship,” he said. “We have to figure out to what degree we can develop interoperable standards in advance of private industry swaps.”
One problem with utility computing is that enterprise IT managers sometimes don’t know what to ask for, said Tom Kucharvy, president of research firm Summit Strategies. That lack of vocabulary remains a key issue. “(Vendors) are using different terms to say the same thing,” he said. “But that’s because many companies want to avoid comparisons with their competitors.”
Billboard’s hot 101
Rosenberg said it’s important that vendors committed to utility computing separate the reality from the fluff. “You can drive along the 101 and see all these billboards advertising grid computing or virtualization,” he said. “When you start digging, you see that’s all it is — just billboards.”
Some enterprises have already started on the path to utility models whether they recognize it or not, added van der Zweep. “Our server consolidation customers are clearly virtualizing,” he said. “They’ve stepped into that realm and they’re calling it consolidation, but they’re incorporating virtualization into that.”
Intel is also using the Intel Developer Forum to launch its Cross-Platform Manageability Program, which will seek to create standard ways of supporting common and consistent manageability capabilities, interfaces and protocols across all Intel platforms, from cell phones to servers.
Paul Otellini, Intel president and COO, said he expects the company to release a public industry specification, with input from developers, at the spring 2005 Intel Developer Forum.