X.500 vs. LDAP
This architectural argument packed networking conference sessions, divided rooms and ignited heated shouting matches in the early-to-mid-1990s. It was a case of the student overtaking the mentor: the Lightweight Directory Access Protocol began as a simple alternative to X.500's Directory Access Protocol (DAP), providing access to X.500 directories over TCP/IP. With the advent of the Internet and its reliance on TCP/IP, X.500 faded into the background, even though it was later adapted to run over TCP/IP.
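Part of LDAP's "lightweight" appeal was its simple, string-based search filters. As a loose illustration (not drawn from the article), RFC 4515 defines the string representation of LDAP filters and requires a handful of special characters to be escaped as backslash-hex sequences; a minimal Python sketch, with hypothetical function names:

```python
def escape_filter_value(value: str) -> str:
    """Escape a value per RFC 4515 for safe use inside an LDAP filter."""
    value = value.replace("\\", r"\5c")  # backslash first, so escapes aren't re-escaped
    for ch, esc in (("*", r"\2a"), ("(", r"\28"), (")", r"\29"), ("\x00", r"\00")):
        value = value.replace(ch, esc)
    return value

def equality_filter(attribute: str, value: str) -> str:
    """Build a simple (attribute=value) equality filter string."""
    return f"({attribute}={escape_filter_value(value)})"
```

For example, `equality_filter("cn", "Babs (Admin)")` yields `(cn=Babs \28Admin\29)`, a filter a client could send to any LDAP server over TCP/IP.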
Native TCP/IP support was LDAP's trump card; X.500 didn't have it. In addition, X.500, developed in the 1980s with input from telecom firms, required a full OSI stack and an X.500 server. To go with the client protocol, LDAP directory servers soon popped up with vestiges of X.500 still lurking in their depths. But like the villagers in the comedy classic "Monty Python and the Holy Grail," X.500 is not dead yet.
Some of its supporting protocols remain important directory security constructs, namely the X.509 authentication framework that is the cornerstone of PKI-based certificates. And LDAP has had its own evolutionary issues. LDAPv3, the most recent iteration of the protocol, lacks widely adopted extensions for access control and back-end integration, notably replication, shortcomings that have kept the protocol largely behind the firewall. -John Fontana
Agent-based vs. agentless
When it comes to software agents, most IT managers would rather live with the little gremlins on their machines than opt for the alternative.
These small pieces of software code work with network-management software to collect information from, and take action on, managed devices. But configuring, deploying and updating thousands of agents across client and server systems isn't appealing. And in some cases, performance and security degrade when machines become overloaded with agent software from multiple vendors.
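At its simplest, an agent's job is to gather facts about its host and report them to a central management server. A loose illustration (not any vendor's actual agent, and with hypothetical field names), sketching just the collection and serialization steps in Python:

```python
import json
import platform
import socket

def collect_inventory() -> dict:
    """Gather a few basic facts about the local machine."""
    return {
        "hostname": socket.gethostname(),
        "os": platform.system(),
        "os_version": platform.release(),
        "python": platform.python_version(),
    }

def report(inventory: dict) -> str:
    """Serialize the inventory; a real agent would send this to its server."""
    return json.dumps(inventory, sort_keys=True)
```

A production agent adds the hard parts the article describes: secure transport, scheduling, self-update and coexistence with other vendors' agents on the same machine.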
But without agents, IT managers would have to physically visit desktops and servers to carry out simple tasks such as software updates. That's why many IT managers choose to place a few hand-picked agents on managed machines, reducing manual effort and helping secure the machine with antivirus tools.
"There are risks in putting too many agents on any one device, so I've had to set hard limits on how many agents we send out to our endpoints," said William Bell, director of information security at CWIE, an Internet-based Web-hosting company in Tempe, Ariz. "Some people will tell you agents are botnets waiting to happen, but if you have ever tried to patch thousands of machines without agents, you know agents have their place. It's a judgment call."
For now, at least, it seems the judgment will usually fall in favor of agents, despite the headache of keeping them up-to-date.
"Agents offer value. They allow you to extend your policy outside of your network and to control activities on endpoints no matter where they are. But there is a need to reduce the complexity of agents," says Charles Kolodgy, a vice president at IDC. "You have to be diligent and vigilant with the agents that are required. Vendors must provide smart management with their agents."
Going forward, some industry watchers expect that multiple functions will be consolidated into a universal agent of sorts, while others predict the capabilities will become embedded in operating systems and preloaded onto equipment. Management vendors have started work on standardizing agent technology across products, aiming to reduce the administrative burden agents place on customers while preserving the capabilities agents provide on the managed machine.
"Because endpoints are changing to include handheld devices, vendors know that an agent on each device is not feasible in the long term," says George Hamilton, director of Yankee Group's enabling-technologies enterprise group. -Denise Dubie