
Networking's greatest debates in Management

Classic debates include outsourcing vs. keeping it in-house, industry standards vs. proprietary technologies, and frameworks vs. point products, among others.

Outsourcing vs. keeping it in-house

To outsource or not to outsource, that was the question. While it can still be an unpopular move for corporations to take, most large companies - and increasingly small-to-midsize firms - outsource work at some point or another. And for good financial reason: IT shops that outsource infrastructure management and application services can expect to save 12% to 17% annually on average, which means U.S. companies are sitting on about US$10 billion in potential savings, according to a recent Forrester Research report.

Savings are certainly the main draw of outsourcing, but PricewaterhouseCoopers research recently noted that other reasons for outsourcing are on the rise. For instance, more than 40% of respondents to its recent outsourcing study said they would outsource to improve customer relationships. Another 37% said outsourcing could help them develop new products or services, and about one-third said outsourcing would be important in helping companies expand into geographies they couldn't otherwise enter.

The survey also revealed that executives increasingly are willing to outsource functions considered core to the business. "Many respondents (53%) indicated they outsource activities that they consider to be core," the report stated. Although IT remained the most outsourced activity, with about 60% sending those duties outside their companies, 70% of respondents outsource one or more of what could be considered strategic functions. More than half outsource production or delivery of core products and services, about one-third outsource sales and marketing, and another 32% outsource R&D activities.

There are functions companies are reluctant to outsource, financial and security applications among them. But at least on the security side, that is changing too. Outsourced security services have become more popular in the last few years, as companies look to outsource everything from monitoring of intrusion detection and firewall devices to watching access control lists and handling e-mail security, says Andreas Antonopoulos, senior vice president and founding partner of Nemertes Research and a Network World columnist.

What has changed over time is that companies don't outsource all functions but rather pick and choose. For example, while some billion dollar mega-deals are still getting signed by companies such as General Motors and Johnson & Johnson, most deals are not that large. The average size of the billion-plus contract in the first quarter of last year was US$9.6 billion, but in the third quarter of 2007, it was down to US$2.4 billion, researchers at TPI said recently. The total contract value of outsourcing contracts signed in the third quarter was down 16%, TPI said. -Michael Cooney

Industry standards vs. proprietary technologies

Standards have won over corporate buyers, not that proprietary technologies don't have their benefits.

It's hard to imagine now, but there used to be a rigorous debate about which strategy was best for corporate IT buyers: industry standards or proprietary technology. Standards have won this debate, but that doesn't mean there weren't advantages to buying proprietary technology.

Proprietary technology is often first to market with innovative features. But going the proprietary route locks buyers into a single vendor, which is too risky for most CIOs given the topsy-turvy nature of the network industry.

Proprietary technology still rules the desktop, where Microsoft's Windows and Office software dominates. But standards-based software has made huge inroads.

The Internet is essentially a collection of standards that makes everything from the Web to e-mail work. The open source movement takes standards one step further by creating communities of developers willing to work together to add features and fix bugs in software such as Linux.

One drawback is that creating standards is a slow, messy process with many technical tradeoffs, especially in open communities such as the Internet Engineering Task Force. Sometimes, the standards bodies can't develop an industry standard fast enough. That's what happened in instant messaging, where attempts to corral industry leaders AOL, Yahoo and Microsoft failed to result in a single, unified standard.

Other times, it's hard to tell whether a technology is proprietary or standard. The PC industry's decision to "standardize" on Intel chips and Microsoft Windows software fueled the market for years and made Apple a niche player. But that decision made virtual monopolies out of Intel and Microsoft, which isn't what standards are all about.

For the foreseeable future, the network industry is likely to remain a mix of standards and proprietary technology. -Carolyn Duffy Marsan


X.500 vs. LDAP

This architectural argument would pack networking conference sessions, divide the room and ignite heated shouting matches in the early-to-mid-1990s. It was a case of the student overtaking the mentor as the Lightweight Directory Access Protocol was at first a simple alternative to X.500's Directory Access Protocol (DAP). LDAP was used for accessing X.500 directories via the TCP/IP protocol. With the advent of the Internet and its reliance on TCP/IP, X.500 faded into the background even though it was later modified for use over TCP/IP.

X.500 didn't have that native fit with TCP/IP. In addition, X.500, developed in the 1980s with input from telecom firms, required an OSI stack and an X.500 server. To go with the client protocol, LDAP directory servers soon popped up that had vestiges of X.500 still lurking in their depths. But like villagers in the comedy classic "Monty Python and the Holy Grail," X.500 is not dead yet.
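
To illustrate how lightweight an LDAP exchange is over plain TCP/IP, here is a minimal search sketch using the third-party Python ldap3 library; the server address, bind credentials and base DN are hypothetical placeholders, not details drawn from the article.

    # Minimal LDAP v3 search sketch using the third-party Python "ldap3" library.
    # The host, bind DN, password and base DN below are hypothetical placeholders.
    from ldap3 import ALL, Connection, Server, SUBTREE

    # LDAP speaks directly over TCP/IP (port 389 by default); no OSI stack needed.
    server = Server("ldap.example.com", port=389, get_info=ALL)

    # Bind (authenticate) to the directory, then search for person entries.
    with Connection(server, user="cn=reader,dc=example,dc=com",
                    password="secret", auto_bind=True) as conn:
        conn.search(search_base="dc=example,dc=com",
                    search_filter="(objectClass=person)",
                    search_scope=SUBTREE,
                    attributes=["cn", "mail"])
        for entry in conn.entries:
            print(entry.entry_dn, entry.cn, entry.mail)

The same kind of lookup against a full X.500 DAP service would have required an OSI protocol stack underneath, which is a large part of why LDAP won out on Internet-connected networks.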

Some of its supporting protocols remain important directory security constructs, namely the X.509 authentication framework that is the cornerstone of PKI-based certificates. And LDAP has had its own evolutionary issues: LDAPv3, the latest iteration of the protocol, lacks widely adopted access control and back-end integration extensions, namely replication, a shortcoming that has kept the protocol largely behind the firewall. -John Fontana

Agent-based vs. agentless

When it comes to software agents, most IT managers would rather live with the little gremlins on their machines than opt for the alternative.

The small pieces of software code work with network management software to collect information from and take action on managed devices. But configuring, deploying and updating thousands of agents across client and server systems isn't appealing. And in some cases, performance and security degrade when machines become overloaded with agent software from multiple vendors.

But without agents, IT managers would have to physically visit desktops and servers to carry out simple tasks such as software updates. That's why many IT managers choose to place a few hand-picked agents on managed machines, reducing manual effort and helping secure the machine with antivirus tools.
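
As a rough sketch of the agent model described above, the following toy check-in loop gathers basic inventory from a managed machine and posts it to a management server; the endpoint URL, reporting interval and payload fields are assumptions for illustration only, not any vendor's actual agent protocol.

    # Toy agent check-in loop: collect basic host inventory and report it to a
    # management server. The URL, interval and payload schema are hypothetical;
    # real agents (patching, antivirus, configuration) do far more than this.
    import json
    import platform
    import socket
    import time
    import urllib.request

    MANAGEMENT_SERVER = "https://mgmt.example.com/api/checkin"  # hypothetical endpoint
    CHECKIN_INTERVAL_SECONDS = 300

    def collect_inventory() -> dict:
        """Gather a small snapshot of the managed device's state."""
        return {
            "hostname": socket.gethostname(),
            "os": platform.system(),
            "os_version": platform.version(),
            "arch": platform.machine(),
            "timestamp": time.time(),
        }

    def report(inventory: dict) -> None:
        """POST the inventory to the management server as JSON."""
        request = urllib.request.Request(
            MANAGEMENT_SERVER,
            data=json.dumps(inventory).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request, timeout=10) as response:
            response.read()  # a real server reply might carry pending actions

    if __name__ == "__main__":
        while True:
            try:
                report(collect_inventory())
            except OSError as err:
                # Network or server errors shouldn't kill the agent; retry next cycle.
                print(f"check-in failed: {err}")
            time.sleep(CHECKIN_INTERVAL_SECONDS)

Agentless tools typically gather equivalent data by logging in remotely (over SSH, WMI or SNMP, for instance), which avoids deploying code to every endpoint but generally works only while the device is reachable on the corporate network.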

"There are risks in putting too many agents on any one device, so I've had to set hard limits on how many agents we send out to our endpoints," said William Bell, director of information security at CWIE, an Internet-based Web-hosting company in Tempe, Ariz. "Some people will tell you agents are botnets waiting to happen, but if you have ever tried to patch thousands of machines without agents, you know agents have their place. It's a judgment call."

For now at least, it seems the judgment will usually fall in favor of agents, despite the headache of keeping them up to date.

"Agents offer value. They allow you to extend your policy outside of your network and to control activities on endpoints no matter where they are. But there is a need to reduce the complexity of agents," says Charles Kolodgy, a vice president at IDC. "You have to be diligent and vigilant with the agents that are required. Vendors must provide smart management with their agents."

Going forward, some industry watchers expect that multiple functions will be consolidated into a universal agent of sorts, while others predict the capabilities will become embedded into operating systems and loaded onto equipment. Management vendors today have started work on standardizing agent technology across products to reduce the administrative burden agents put on customers, while preserving the capabilities agents provide on the managed machine.

"Because endpoints are changing to include handheld devices, vendors know that an agent on each device is not feasible in the long term," says George Hamilton, director of Yankee Group's enabling-technologies enterprise group. -Denise Dubie


Frameworks vs. point products

Everyone loves an underdog story, but this isn't one.

BMC, CA, HP and IBM came to power in the management software market in the 1990s with tools to manage network devices and software designed to keep mainframe systems humming along. Being among the few choices at the time, the vendors dominated the market and customers endured product implementations that could run up to 18 months and spent well into the millions of dollars to get management software in place. But in too many cases, the technology didn't deliver.

As stories of network management framework buyer's remorse echoed throughout the industry, newcomers such as Concord, Micromuse, RiverSoft and SMARTS emerged to offer customers easy-to-install, low-cost alternatives to the monolithic, cumbersome products on which the big vendors built their software businesses.

And while the innovative players put up a fight and scrapped their way into some customer accounts, the smaller companies no longer exist on their own and their technologies live on inside the management Goliaths, who remain the market leaders.

Over the past several years, CA acquired Concord (which had acquired onetime framework contender and Cabletron spin-off Aprisma Management Technologies); IBM acquired Micromuse; HP acquired a license for RiverSoft technology (and Micromuse acquired RiverSoft before becoming part of Big Blue); and EMC picked up SMARTS. According to industry watchers, the start-ups never had a chance for long-term success.

"Many IT organizations have become adverse to risk. Unless there is a compelling reason to acquire technology from a small and unproven vendor, IT organizations will favor viability over technology," Jean-Pierre Garbani, a research vice president at Forrester Research, said in 2003. But the battles fought by the start-ups weren't all for naught. The foundation of their business models -- less expensive, lower cost software that actually works -- resonated too much with customers who carry on the battle cry with framework vendors to simplify their software and offer it at reasonable prices.

Framework has become a dirty word in the network management industry, and vendors now prefer to call their massive product portfolios integrated suites or management platforms. And in most cases, the integrated applications now offered by one vendor are in fact best-of-breed products collected over the years from the industry's more innovative start-ups.

"What IT rightly seems to be moving toward is a central point of automation and integration for multiple products from multiple vendors. That's a lot different than the frameworks' total philosophical and monetary commitment to a single brand that offered very little in the way of real integration and almost no satisfactory levels of automation," Dennis Drogseth, research vice president at Enterprise Management Associates, said in 2006. -Denise Dubie


Enterprise control of mobile devices vs. employee ownership

A recent survey on mobile workforce security confirmed what we've known for a long time. Giving your mobile employees notebooks and smartphones is like giving your teenaged kids the keys to the car: once they're out the door, there's nothing you can do about what they do.

A majority of the 450 IT managers surveyed by management software vendor BigFix say they believe the mobile workforce makes their enterprise networks more susceptible to malware and other threats. And in some cases, these IT managers think their existing systems management tools have contributed to mobile devices falling victim to a worm or virus.

Even more personal than the personal computer, handheld computers and smartphones (and even iPods) are also more dependent on both the corporate network and intervening provider networks. So far, enterprises are ill-prepared to secure and manage the devices themselves, the data on them, or their access to corporate networks.

The issues in the argument are complex: end-user behaviors and habits, securing data on the devices, protecting the devices from malware infections, protecting the data on them if they're lost or stolen, monitoring data copying or file transfers to USB devices, streamlining access to the corporate network, and integrating them with Network Access Control (NAC) products. Right now, companies have to stitch together an array of third-party software products and appliances to address this complexity.

This is one argument where the answer seems to be finding the common ground that makes a comprehensive solution possible. -John Cox

Read Networking's greatest debates in Security
Read Networking's greatest debates in Software
Read Networking's greatest debates in the Data Center
Read Networking's greatest debates in LANs and WANs