The 64-bit question

Back in the day when dinosaur microcomputers roamed the Earth, their brains could handle only 8 bits of data at a time, and their memory capacity was a paltry 64 kilobytes. Then Intel came along with a chip, the 8086, that doubled the amount of data a computer could chew on at once, and expanded the amount of memory a program could access to a whopping 1 megabyte. Deciding to upgrade was a no-brainer.

Now we’re at another historic juncture. Intel, Advanced Micro Devices (AMD), and Apple/IBM have introduced new 64-bit processors that will power the next generation of high-performance computing systems. The challenge for everybody is figuring out which processor will deliver the best price/performance for their computing needs, and educating both the channel and the public about the new options.

The new 64-bit chips are the Itanium from Intel, the Opteron from AMD, and the PowerPC G5, based on IBM’s design and starring in Apple’s new G5 PowerMac. In theory, all three offer substantial raw performance improvements over 32-bit processors.

In practice, the true performance will depend on many factors, including the amount of memory a company is willing to purchase and whether applications are optimised to take advantage of the processor’s capabilities. Additionally, if the 64-bit systems are part of a cluster, the efficiency and performance of the interconnection software and switches in the cluster will also affect performance.

That said, the main advantage of using a 64-bit system is that much more data can be put into memory, which means faster results.

A 32-bit system can access only 4GB of memory. (There are ways around this limitation, but these solutions are typically complicated and expensive.) In many common applications — searches of a huge database, for example — this 4GB limit can be a real performance stopper.
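To see where that 4GB figure comes from, the short C sketch below (an illustration added for this discussion, not vendor code) simply counts addresses: a 32-bit pointer can name only 2^32 distinct bytes, or 4GB. Built as a 32-bit binary (for example with gcc -m32, where the compiler supports it) it reports 4-byte pointers; built as a 64-bit binary, 8-byte pointers.

    /* Rough illustration of the 32-bit addressing ceiling (assumes a C99
       compiler such as gcc; try building with -m32 and -m64 to compare). */
    #include <stdio.h>

    int main(void)
    {
        unsigned long long addresses = 1ULL << 32;   /* 2^32 byte addresses */
        double gigabytes = addresses / (1024.0 * 1024.0 * 1024.0);

        printf("Pointer size in this build: %zu bytes\n", sizeof(void *));
        printf("32-bit address space: %llu bytes (%.0f GB)\n",
               addresses, gigabytes);
        /* A full 64-bit address space is 2^64 bytes (16 exabytes) in theory;
           real systems expose a much smaller practical limit. */
        return 0;
    }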

Working with a large database or other big application then requires virtual memory, which sets aside a portion of a hard drive’s space to temporarily store additional data for the application. But accessing and manipulating data held in this hard-drive-backed virtual memory takes about 40 times longer than when the data are stored in random access memory (RAM).
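That 40-times figure is easy to sanity-check with back-of-the-envelope throughput numbers typical of hardware of this era; the values below are assumptions chosen for illustration rather than benchmark results.

    /* Rough throughput comparison using assumed, era-typical figures. */
    #include <stdio.h>

    int main(void)
    {
        double ram_mb_per_s  = 3200.0;  /* DDR400-class memory, about 3.2 GB/s */
        double disk_mb_per_s = 80.0;    /* fast IDE/SCSI disk, sustained reads */

        printf("RAM-to-disk throughput ratio: about %.0f to 1\n",
               ram_mb_per_s / disk_mb_per_s);
        return 0;
    }

Random access is even less flattering to the disk, since each seek costs milliseconds against nanoseconds for RAM.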

A 64-bit system circumvents this obstacle because it can accommodate a much larger amount of data in memory — up to 16 terabytes (TB). By loading an entire database into memory, any calculations, searches, and manipulations can be performed more quickly than when accessing the same data residing in virtual memory on a hard disk drive. Grabbing a piece of data from RAM is like calling it up in your own brain; snagging it from a hard disk is like reaching over and pulling the information off a bookshelf.

This is one of the most compelling reasons many businesses are eyeing off 64-bit systems.

The old compatibility issue

Beyond the memory support for much larger databases, two other factors — compatibility and pricing — are guiding choices in the life sciences when it comes to 64-bit systems.

A recent study commissioned by AMD and carried out by Gartner found that compatibility with existing 32-bit applications is a major concern when moving to 64-bit technology. More than 80 per cent of the IT managers surveyed said the capability to run both 32-bit and 64-bit applications while migrating to 64-bit systems was important.

At the heart of this compatibility issue is whether the performance boost realised when running existing applications justifies the higher costs of a 64-bit system. Interestingly, AMD and Intel have taken vastly different approaches to this situation.

The first Itanium-based systems introduced last year ran many 32-bit programs slower than these same programs ran on existing high-end Pentium systems — and the Itanium systems were significantly more expensive.

Such early results, widely reported in computer trade publications, left a bad taste in many IT managers’ mouths.

“We realised the Itanium was a new architecture, one that offered some interesting benefits for the future,” said the manager of a US company who did not want to be identified.

“But the high costs and poor performance with current applications caused me to wait. It was as if someone introduced a new high-performance car, but it didn’t run well on gasoline.”

Intel has addressed this issue with a software emulation program that runs 32-bit applications at speeds comparable to Pentium systems.

On the other hand, AMD’s Opteron runs both 32-bit and 64-bit applications in their native modes. This, and the fact that Opteron systems are priced only slightly higher than high-end Pentium systems, has led many to consider the AMD CPU.

“Right now, the Opteron-based systems look like a no-brainer for any company adding computational power for common bioinformatics tasks like performing BLAST runs or alignment programs,” independent IT consultant, David O’Neill, said.

The Opteron systems “give a company a migration path to take advantage of new 64-bit applications as they become available,” he said. “And these systems are just about the same price as [higher-end] versions of many existing servers.”

Who needs 64 bits?

The new 64-bit systems will have an impact ranging from the desktop for the individual researcher all the way up to the core of an entire enterprise’s computational efforts. Other applications include large-scale data mining (and searching), knowledge management, and visualisation.

While all these applications could run faster using the increased memory afforded by 64-bit systems, not all 64-bit systems are suitable for every application. Selecting the right system depends on the mix of applications that must be supported. It also depends on where an application runs today — on a single server or workstation, on a departmental cluster, or as part of an enterprise computing facility.

Itanium systems are commanding a lot of attention from companies interested in building an infrastructure to support life science research for as far out as a decade. In this scenario, the Itanium is seen more as a long-term investment.

“We’re looking for architectures that will last three to five years, maybe even 10 years,” the director of high-performance computing at the Ohio Supercomputing Center, Al Stutz, said. “We think the Intel architecture is that architecture.”

The research collaborative called the TeraGrid is standardising on distributed clusters of Itanium servers running Linux. The TeraGrid, once completed, will have more than 20 teraFLOPs (20 trillion floating-point operations per second) of processing power and about 1 petabyte of storage capacity distributed over five sites.

Itanium and Opteron (to a lesser degree) are also drawing attention as substitutes for high-powered but proprietary systems. For example, some industry experts believe Itaniums will, in many cases, replace high-end machines from Sun and HP, which have been using chips based on 64-bit reduced instruction set computer (RISC) architectures. These RISC processors include Sun’s UltraSPARC and HP’s Alpha.

RISC-based servers are used to run modelling, simulation, and data-mining workloads on extremely large data sets. Because the new 64-bit processors can support memory in excess of 4GB, servers with the newer chips could compete with the RISC-based systems.

RISC workstations have dominated the data visualisation market. Usually, such machines are quite pricey — often costing more than $US10,000. The Apple Power Mac G5 is a likely, less expensive candidate to replace some of these high-end systems.

Keys to adoption

Many IT managers are sticking with Pentium systems for now, given the relatively early development of 64-bit processors and the lack of software tuned to take advantage of their capabilities. But there are many areas where systems based on the Itanium, Opteron, and PowerPC G5 make sense. Indeed, early adopters are leaning on their systems vendors and the developer community to optimise their applications to perform better.

On the systems side, most major players (IBM, Dell, HP, and so on) are selling products based on Itanium and Opteron. And other companies with expertise in both the life sciences and high-performance computing, such as Linux NetworX, are offering 64-bit systems.

Of course software is a big part of the equation. Applications have to be engineered to take advantage of the faster hardware. Groups such as Gelato.org are dedicated to advancing the development of optimised Linux applications that run on Itanium-based systems. In September, the US Department of Energy’s Pacific Northwest National Labs, a member of Gelato, upgraded its centre to include a 2000-Itanium Linux cluster capable of delivering 11.8 teraFLOPs — among the top 10 most powerful supercomputers in the world.
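A quick arithmetic check shows those two numbers hang together (the clock speed and per-cycle figure below are assumptions about the hardware, not details given by the lab): 11.8 teraFLOPs spread across roughly 2000 processors is just under 6 GFLOPS apiece, in line with an Itanium 2 retiring four floating-point operations per cycle at around 1.5GHz.

    /* Back-of-the-envelope check of the quoted cluster figure. */
    #include <stdio.h>

    int main(void)
    {
        double cluster_tflops  = 11.8;    /* figure quoted for the PNNL cluster */
        double processors      = 2000.0;  /* approximate processor count        */
        double clock_ghz       = 1.5;     /* assumed Itanium 2 clock speed      */
        double flops_per_cycle = 4.0;     /* Itanium 2 peak FLOPs per cycle     */

        printf("Per-processor share: %.1f GFLOPS\n",
               cluster_tflops * 1000.0 / processors);
        printf("Itanium 2 peak:      %.1f GFLOPS\n",
               clock_ghz * flops_per_cycle);
        return 0;
    }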

Such efforts by developers and systems companies will likely speed the adoption of 64-bit systems by a broad cross-section of the IT industry.

But many companies will need to see clear price/performance benefits for their specific applications before large-scale deployment occurs.

