Computer Associates is making a splash in the network management arena this month with the planned arrival of its Neugents agent technology, which utilises artificial intelligence to determine future network performance. IDG's Paul Krill recently spoke to Steve Mann, CA vice president of product strategy, at CA's headquarters about critical networking issues.
IDG: Is the Neugents technology a step toward automatic pilot networks, in which networks manage themselves?
Mann: Neugents are definitely a step toward self-healing environments. The first step in any self-healing environment is the capability to predict an event before it happens, and that's exactly what Neugents do.
I don't think that you'll ever take the administrator out of the loop. I trust technology to a great extent, but I trust human beings more.
How much of an impact has the year 2000 problem had at CA?
Well, it's had two impacts, really. The first was to make us go back and make sure that all of the 500 solutions that we provide to our clients are Y2K-compliant, and, if they weren't, to make sure that they are as soon as possible.
The second issue is that we understand that there is a refocus of client budgetary dollars on solving the Y2K problem. And from talking to CIOs, what they've told us is that it's not necessarily going to slow down their purchases of new solutions, but it will slow down their implementation of solutions.
With the growing popularity of SANs [storage area networks], do you see network and storage management converging?
Yes. You've got large distributed environments that have remote pools of data, and that data needs to be as secure as any central repository of information. SANs help you do that, and you need to manage the availability of data. That's what the network management component does.
What impact do you think outsourcing has on network management? For example, what about a company hiring a third party to manage its network?
Well, we provide the solution, and whether or not a client chooses to get a separate group to do the actual management has less of an impact on us than you would think.
Our issue, though, is to make sure that the client is satisfied with the solution that we're providing. So if the outsourcer they choose is not a strong outsourcer, we'll tell them that we feel they should use a different organisation. We'll let them know that we feel one organisation is stronger than another at providing that service, because we want to focus on providing them with optimal availability, and the product does that. But you also need the intelligence behind the product in order to implement it effectively.
So is outsourcing becoming more prevalent?
It's interesting. We are not seeing outsourcing where we originally thought that we would see it. For example, smaller environments and even medium environments are bringing these things in-house, and that's good to see. That's why there's been such success with the IQ products that we have. We are seeing outsourcing to a larger degree in Fortune 500 environments.
But even there, one of the issues is that organisations consider the use of these technologies to be such a competitive advantage that they could even consider outsourcing to another organisation a breach.
It's like opening the door and inviting the competition in sometimes. So some organisations that we deal with are very, very cautious about which technologies they will outsource and which they won't, and they really outsource only the peripheral technologies. Enterprise management is not a peripheral technology; it's a core component.
Do you see IP eventually overtaking all the other network protocols, to the point where you will need to develop only IP products?
You mean in the same way as today's programming languages are overtaking Cobol? Yes, IP is predominant in most organisations these days, but organisations also have such significant investment in technologies that are 15 to 20 years old that I don't see those going away any time soon. And we're going to continue supporting them.