
Cleaning up the server house

The concept of server consolidation in the data centre isn’t as black and white as it sounds. In fact, getting everything in the one place can prove very difficult. There are risks and challenges — including increased complexity within the remaining servers once the numbers have been reduced — that have to be resolved.

This is particularly true when the organisation is dealing with multiple departments and business groups. How systems interconnect and how work flows between them are also crucial elements.

According to Gartner, there is tremendous pressure to optimise costs and to show ongoing improvement in IT operations while still enabling new business capabilities.

“Configuring and deploying servers used to be an infrequent project. Now it is a daily or weekly requirement, taxing system administration resources when they are already stretched,” a Gartner paper reports, adding that server consolidation remains one of the most important issues for data centre executives.

And while the latest buzz is around server consolidation — or what Meta Group analyst, Kevin McIsaac, calls adaptive infrastructure — the overall goal is to align the IT infrastructure with current and future business requirements. A server foundation needs to be built for new solution deployment including e-business and business intelligence.

“It’s about aligning the infrastructure: taking into account capacity, agility and service levels to the appropriate economics,” McIsaac said.

The concept involves more than consolidating hardware — software and services must also be factored into the equation and blended together with systems management disciplines and best practices.

“The problem with server consolidation is many people have the idea that you take 10 boxes and reduce them to one,” he said. “Everybody is interested in that, but not exactly sure how to get there.”

But there was help on the horizon, McIsaac said, once several factors were taken into account including how to simplify servers, storage, databases, applications, networks, systems management and processes.

Indeed, as organisations attempt to squeeze the most out of IT resources, there are a host of considerations before consolidating the server environment — and that is where a reseller can help with analysis and come up with a data centre game plan.

Resellers could help with capacity planning — analysing peaks in demand — along with deciding which applications were suitable candidates for consolidation, HP Australia’s marketing manager for industry standard servers, Angus Jones, said.

“When deploying this type of strategy, other questions to ask include: ‘Are you going to change your operating system? Are you going to go to the latest version of the application?’”

Organisations needed to sort out the back-end systems and the risk perspective, he said.
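To make that concrete, here is a rough sketch of the kind of peak analysis Jones describes: flag servers whose peak utilisation stays below a threshold as consolidation candidates. The server names, figures and 30 per cent threshold are all invented for illustration.

```python
# Hypothetical capacity-planning sketch: flag servers whose peak CPU
# utilisation stays below a threshold, marking them as consolidation
# candidates. Names and figures are illustrative only.
peak_cpu = {                # peak CPU utilisation observed over a month (%)
    "mail-01": 22,
    "file-02": 18,
    "erp-db": 74,
    "intranet": 12,
}

THRESHOLD = 30              # assume boxes peaking under 30% are under-used

candidates = [name for name, peak in peak_cpu.items() if peak < THRESHOLD]
print("Consolidation candidates:", ", ".join(candidates))
```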

Capturing the bigger picture — and pushing through the resistance to change the data infrastructure — was key, McIsaac said.

“You need to look at the whole server environment and unify it — and the path that we see starts by reducing the number of data centres,” he said.

It was a balancing act, requiring organisations to trim down the fat to a size that made sense, while not losing sight of business continuity and disaster recovery planning, McIsaac said.

“You have to look at why you have all of this stuff,” he said. “Certainly collocation into one or two or an appropriate number that your organisation might need is something to consider.”

The next obvious move, McIsaac said, was to consider storage consolidation.

“This is centred around SAN-type principles where you grab the storage and put it into one large pool,” he said. “There are cost benefits because it can increase utilisation.”

And since storage was moved into one pool, organisations were better able to allocate spare resources for new applications or new capabilities like replication or disaster recovery, McIsaac said.
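A back-of-the-envelope sketch of that utilisation argument, using invented capacity figures, shows how headroom stranded on individual boxes becomes usable spare capacity once the same disks sit in a single pool.

```python
# Illustrative only: compare stranded capacity in per-server storage
# against a single SAN-style pool. All numbers are made up.
servers = [
    {"capacity_gb": 500, "used_gb": 180},
    {"capacity_gb": 500, "used_gb": 420},
    {"capacity_gb": 500, "used_gb": 90},
]

total_capacity = sum(s["capacity_gb"] for s in servers)
total_used = sum(s["used_gb"] for s in servers)

# Direct-attached: each box keeps its own headroom, whether needed or not.
per_box_util = [s["used_gb"] / s["capacity_gb"] for s in servers]
print("Per-box utilisation:", [f"{u:.0%}" for u in per_box_util])

# Pooled: the same disks serve one shared pool, so spare capacity can be
# allocated to whichever application needs it.
print(f"Pooled utilisation: {total_used / total_capacity:.0%}")
print(f"Spare pool capacity: {total_capacity - total_used} GB")
```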

Product manager of enterprise solutions at Optima Technology, Ole Mortensen, said that in the SMB space — where consolidation was pursued primarily for cost-saving reasons — a typical approach was to centralise storage.

“When moving this way, keep the back-up requirements in mind,” Mortensen said. “The central back-up unit has many merits, but it is often associated with relatively high costs.”

He also suggested looking at the current system load and the expected development of the organisation. “Consolidation is often implemented in combination with a migration — from an NT 4.0 to a Windows 2003 environment — and onto new and stronger hardware,” he said. “So make sure that this is scaled properly.”

Another important step on the consolidation path was to rationalise the vendor selection, McIsaac said, and to be aware it was not just about fewer servers, but rather using fewer kinds of servers and databases.

“Recognise it’s not the number of boxes that kills you, but actually the diversity,” he said. “So if I have 20 boxes and they are all the same manufacturer running the same OS, the same rev levels, and databases and apps, then the complexity is low. But the problem is each box is a unique beast built by somebody different.”

Once vendor selection was taken into account, the benefits were obvious, McIsaac said. By having fewer vendors (and contracts) to deal with, it lowered support costs and offered operational benefits. For example, when you had to add a patch due to security or performance problems, the process was much less complex.
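One hypothetical way to put a number on that diversity — a sketch, not McIsaac’s method — is to count the distinct vendor/OS/patch-level combinations in the server inventory, since each unique build is one more thing to patch and support.

```python
# Hypothetical sketch: complexity tracks the number of *unique* builds,
# not the raw box count. The inventory data below is invented.
inventory = [
    ("HP", "Windows 2003", "SP1"),
    ("HP", "Windows 2003", "SP1"),
    ("IBM", "AIX 5.2", "ML3"),
    ("Dell", "Windows NT 4.0", "SP6a"),
    ("HP", "Windows 2003", "SP1"),
]

unique_builds = set(inventory)
print(f"{len(inventory)} servers, {len(unique_builds)} distinct builds to patch and support")
```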

But once the clean-up process begins, don’t go to extremes or jump in head first, he said. “Think about limited, very specific server consolidation. Don’t adopt the Highlander there-can-be-only-one approach in the data centre and say, ‘Let’s just have one big Unix box or one NT box or one big mainframe’,” McIsaac said.

“Instead, what you need to do is reduce the complexity as much as possible, and in the process of taking down the numbers, ensure you haven’t made it more tricky or expensive.”

And while there was no single solution for server consolidation, HP’s Jones said that, in the quest for simplification, one way users could consolidate was to scale up — taking a number of two-way boxes and replacing them with eight-way machines.
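A rough, hypothetical sizing exercise shows the arithmetic behind that scale-up approach: add up the load carried by the small boxes, then work out how many larger ones would carry it with sensible headroom. All the figures below are placeholders.

```python
# Illustrative scale-up sizing: replace two-way boxes with eight-way boxes.
# All capacity figures are invented placeholders.
import math

two_way_boxes = 10
avg_load_per_two_way = 0.4        # average utilisation of each two-way box
eight_way_relative_capacity = 4   # assume one eight-way ~ four two-ways
target_headroom = 0.7             # don't plan to run the new boxes past 70%

total_load = two_way_boxes * avg_load_per_two_way          # in "two-way units"
capacity_needed = total_load / target_headroom
eight_way_boxes = math.ceil(capacity_needed / eight_way_relative_capacity)
print(f"{two_way_boxes} two-way boxes -> {eight_way_boxes} eight-way boxes")
```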

“Making things simpler is the main goal,” Jones said. “Things get out of control thanks to a company’s rapid growth or bad planning, and they end up with a network infrastructure that has a huge number of servers from different manufacturers, along with old equipment.”

Once the consolidation process was underway, Jones said the monitoring, testing, operational and forecasting procedures of the IT environment — which was now a shared environment rather than an isolated one — must change.

Upping the ante

Given the sweeping performance improvements in server technology, organisations can adopt a long-term management approach.

Intel-based servers — two-way, four-way and eight-way — have become good enough and scalable enough for 90 per cent of what most people do, McIsaac said.

“We’re advocating that people strongly consider when the old hardware comes up for a refresh, why not move to Intel or an AMD,” he said. “Frankly, whether you go to a traditional x86 architecture or Opteron or Itanium, it really doesn’t matter. It’s all kind of like doilies on the edge of the border. What we’re talking about here is the infrastructure being good enough, not the specific server.”

AMD’s John Robinson said the push towards 64-bit computing was encouraging the server consolidation trend, particularly in the government space.

“With server consolidation, people are looking for the next round of performance and longevity,” he said.

Enhancements in Opteron processor technology, including the rollout of the Direct Connect architecture, which beefs up memory performance, along with enhanced security features (thanks to built-in virus protection), are boosting the overall server environment.

Optima’s Mortensen said Opteron and Xeon-based servers would continue to dominate the local market for the second half of 2004, with enough performance in reserve to enable cost-saving measures such as server consolidation in small to medium organisations.

“Sixty-four-bit computing is here and is the next logical step up for the server market, but the process involves heavy investments in applications at a time when companies are looking to consolidate,” Mortensen said. “This could trigger a jump up to 64-bit platforms as rewritten applications become available that allow companies to consolidate their server farms without any loss of processor power or bandwidth.”

Vital role

Server platforms based on the new Intel Xeon processor were slated for release in the coming months.

In addition to processing power, memory was another key consideration in server consolidation — a process that could increase performance and reliability while reducing the complexity of data management and storage, Australian country manager for Kingston Technology, Keith Hamilton, said.

When evaluating consolidation, the inclusion of adequate memory was critical to success, he said.

“Memory plays a vital role in server performance, yet it is a component frequently overlooked,” Hamilton said. “While Intel’s Itanium and AMD’s Opteron-based servers compete to deliver 64-bit technology, adding memory remains one of the best methods to optimise server performance.”

Microsoft Australia product manager, Michael Leworthy, said server virtualisation — which had largely come into favour in the past 12 months — let users run different operating systems on a single hardware server.

Essentially, it is a way to isolate misbehaving applications from one another.
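As a minimal sketch of the idea — using invented workload sizes and a generic host rather than Virtual Server 2005 itself — consolidation by virtualisation amounts to packing isolated guest workloads onto as few physical hosts as their capacity allows.

```python
# First-fit packing of guest workloads onto virtual-server hosts.
# Workload names, memory sizes and host capacity are hypothetical.
workloads = {"nt4-legacy-app": 1.0, "print-server": 0.5,
             "intranet": 1.5, "test-db": 2.0}   # memory needed (GB)
HOST_RAM_GB = 4.0

hosts = []                                       # each host is a list of guests
for name, mem in sorted(workloads.items(), key=lambda kv: -kv[1]):
    for host in hosts:
        if sum(workloads[g] for g in host) + mem <= HOST_RAM_GB:
            host.append(name)                    # fits on an existing host
            break
    else:
        hosts.append([name])                     # start a new physical host

for i, host in enumerate(hosts, 1):
    print(f"Host {i}: {', '.join(host)}")
```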

“Server consolidation is becoming a mainstream focus in the data centre,” Leworthy said, given the advances in hardware, the uptake in storage and subsequent changes in the underlying infrastructure.

Microsoft is set to launch its Virtual Server 2005 software, which will help customers migrate from NT 4.0 systems to Windows Server 2000 and 2003.

It is expected to launch two editions of the technology — enterprise (up to 32 processors) and standard (up to four processors) — at its Canberra TechEd show in August.

With virtualisation, partners can help organisations boost operational efficiency and drive down costs, Leworthy said. Enhanced security was another consolidation driver.

IBM is also cranking up the heat in the virtual server space. It recently released virtualisation engine micro-partitioning technology for the Unix space, which powers the eServer p5 series.

