Bandwidth, software and other issues
Simply migrating to multicore systems does not guarantee efficiency, however, cautions Gartner's Reynolds. IT managers must do some upfront planning to ensure that they have enough network bandwidth to keep the additional processor cores fed, and they must make sure their applications are optimized to take full advantage of multicore environments.
"IT managers don't get fired because the electricity bill is too high," Reynolds said. "They get fired because they can't deliver the computing requirements of the organization."
Businesses also need to thoroughly evaluate the effects of software licensing as they move to servers with multicore processors, he explained. While Microsoft has already promised that it will continue to base its licenses on sockets and not the number of processing cores, the licensing path for other applications is not as clear.
"Each business needs to ensure they are not going to get hit with a big license upgrade fee as they move from two-core to four-core systems," Reynolds said.
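Reynolds' warning is easy to quantify. The sketch below compares the two licensing models he is describing for the same two-socket server; the prices are hypothetical, chosen only to show how a per-core model turns a chip upgrade into a license upgrade, and `license_cost` is an illustrative helper, not any vendor's pricing tool.

```python
# Illustrative only: per-socket vs. per-core licensing for one server.
# PRICE_PER_UNIT is an assumed figure, not any vendor's actual list price.
PRICE_PER_UNIT = 3000  # USD per licensed unit (hypothetical)

def license_cost(sockets, cores_per_socket, per_core=False):
    """Total license cost under per-socket or per-core terms."""
    units = sockets * cores_per_socket if per_core else sockets
    return units * PRICE_PER_UNIT

# Two-socket server, moving from dual-core to quad-core processors:
print(license_cost(2, 2))                 # per-socket, dual-core:  6000
print(license_cost(2, 4))                 # per-socket, quad-core:  6000 (unchanged)
print(license_cost(2, 4, per_core=True))  # per-core,  quad-core: 24000
```

Under per-socket terms the quad-core upgrade costs nothing extra; under per-core terms the same hardware move quadruples the bill, which is exactly the "big license upgrade fee" Reynolds cautions about.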
In response, the chip makers say that licensing costs are not going to be a huge deal. "The big hump was in going from single to dual core, but now we have a pretty solid understanding of most licensing strategies in the market," said Pat Patla, director of Opteron marketing at AMD.
Stori Waugh, senior manager at Dell's server product group, said the company has been working closely with all major application and operating system vendors to advance the "license by socket, not by core" strategy. As many as 90 percent of software vendors will adopt a per-socket licensing model, she said.
Another question is how efficient software can be on multicore processors when the applications were designed for earlier generations of hardware. The processor manufacturers maintain that the bulk of the application optimization for multicore environments is done; they say it was completed during the transition from single- to dual-core systems.
Brookwood agreed that the "heavy lifting" for migrating software to multicore environments was completed in the dual-core transition, but he noted that generational fine-tuning will be needed to get optimum performance out of the new processors.
"It will always be dependent on the specific software package," Brookwood said. Virtualization is one example of the ongoing work between third-party software vendors and the chip makers. AMD and Intel have rolled out dual-core x86 processors with hardware-assisted virtualization features within the past year, he said, and companies such as VMware Inc. and Microsoft Corp. continue to optimize their virtualization software to make the most of the latest processors.
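The hardware-assist features Brookwood refers to are advertised by the processor itself: on Linux, Intel's VT-x shows up as the `vmx` flag and AMD-V as the `svm` flag in `/proc/cpuinfo`. A minimal sketch of checking for them, run here against a sample flags line rather than a live system:

```python
# Illustrative check for hardware-assisted virtualization support (Linux).
# Intel VT-x appears as the "vmx" CPU flag, AMD-V as "svm".
def has_hw_virt(cpuinfo_text):
    """Return True if any 'flags' line in /proc/cpuinfo-style text
    advertises vmx (Intel VT-x) or svm (AMD-V)."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
    return bool(flags & {"vmx", "svm"})

sample = "flags\t\t: fpu vme de pse tsc msr vmx lahf_lm"  # sample data
print(has_hw_virt(sample))  # True
# On a real Linux system: has_hw_virt(open("/proc/cpuinfo").read())
```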
"The ultimate test is always whether it works for IT professionals and makes their applications better," Brookwood said.
Markus Levy, an analyst who serves as president of the Multicore Association and the Embedded Microprocessor Benchmark Consortium, said an increasing number of applications will require higher-level optimization efforts. In other words, he said, it won't be enough to simply run existing software on next-generation processors with higher core counts.
"Even when Intel goes to 16 cores, there will be a need for additional acceleration technologies," Levy said. "As we add more and more cores, we'll also see that general purpose processors can only do so much for some tasks, and the need for specialized acceleration technology will increase."