Network modelling tools help plan for the future. We looked at three, only one of which is currently distributed in Australia. While good, none knocked our socks off.
In Greek mythology, Zeus appointed the King of Troy's son to be the official judge of a beauty contest among three goddesses. In doing so, he inadvertently set off a chain of events - including sex, bribery, and other ancient Greek pastimes - that eventually led to the Trojan War. If only he'd known.
Unlike Zeus, you have access to tools that can help you predict the future. As a network administrator, it's your job to make sure your users can access your business applications quickly. On many networks, the quickness of these applications depends on where in the enterprise your SQL server is located.
A modelling tool can help you place your SQL server in the best possible location. It takes a picture of your network and lets you simulate different configurations to find out which one performs best.
To compete in this comparison, a solution had to provide an easy way to either automatically discover our network topologies and devices or let us manually re-create them. And naturally, because these are modelling tools, we expected them to simulate different network conditions, such as error rate, utilisation, and bandwidth. Four solutions met these criteria. Two of them - one from Make Systems and another from Optimal Networks - make recommendations based on real data gathered from your network. The remaining two solutions - from Zitel and ImageNet - work from a model of your network that you create by hand. Unfortunately, we had only enough time to compare one of these do-it-yourself tools. So we chose to include Zitel's NetArchitect 1.1.5 because we believed it was the best representative of such tools. But at the last minute, we had a chance to review ImageNet's CANE, Version 2.
No stroll in the park
The computer industry has a penchant for acronyms and buzzwords. This comparison isn't fraught with acronyms, but the topic's close cousin - capacity planning - is the buzzword du jour. Capacity planning is understanding user traffic patterns and managing usage to ensure that your users have fast access to the data they need. It also means anticipating user traffic so that you can plan upgrades and purchases to meet the demand for resources.
It's not good enough to discover where to place your server. For these products to be useful, they must let you run experiments on your network. In short, capacity planning must be done in conjunction with server placement. Capacity planning played a large role in our tests by allowing us to try different configurations of network hardware.
We ran these tools through a battery of tests that included installation and configuration, modelling and simulation, accuracy, and graphs and reports. To accommodate each solution's approach to solving our problem, we made modelling and simulation a catch-all category, evaluating everything from tariff databases to network discovery and planning tools.
To test accuracy, we recorded a typical client/server conversation on our local Ethernet network. We then moved that conversation across our WAN and recorded the delay time over the WAN link, which we used as the baseline delay time. For each solution, we modelled the conversation on the simulated Ethernet network, drawing the packet delay from the simulation. The last step was to take the same conversation, model it over our WAN connection, and record the simulated delay. We scored how closely the simulated delay compared with the baseline delay.
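Our scoring boiled down to comparing each simulated delay against the measured baseline. A minimal sketch of that comparison, with purely illustrative figures (the function name and delays are ours, not any vendor's):

```python
# Hypothetical sketch of how we scored accuracy: the percentage error
# of a simulated delay against the measured WAN baseline.
# All names and figures are illustrative, not from any vendor's tool.

def accuracy_score(baseline_delay_s, simulated_delay_s):
    """Return the percentage error of a simulated delay against the
    measured baseline delay; lower is better."""
    return abs(simulated_delay_s - baseline_delay_s) / baseline_delay_s * 100

# Example: a 12.0-second measured WAN delay versus a 15.0-second simulation
print(round(accuracy_score(12.0, 15.0), 1))  # 25.0 per cent error
```

A perfect simulation scores zero; the further the simulated delay drifts from the baseline in either direction, the worse the score.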
Any modelling and simulation tool worth its salt can generate graphs and reports - ideally, ones that are executive-ready.
We also evaluated documentation, support, and pricing. These products aren't exactly easy to use, so along with the actual price of the software, our price score includes a basic training class (per the vendor's recommendation) and a one-year maintenance contract. Tech support and training opportunities may vary in Australia, and where appropriate, these details are also included.
NetArchitect 1.1.5 - Zitel
NetMaker XA 2.6 - Make Systems
Optimal Performance 2.03
Optimal Surveyor 2.03 - Optimal Networks
Testing strategy
The question: Your company's business depends on your SQL server application, but you're not sure where to put it to get the best response times. Which modelling tool is best for server placement?
The issues: Ease of implementation, simulation and modelling capabilities, accuracy of the recommendation, graphing and reporting capabilities, documentation, support, and price.
The options: NetArchitect solution, NetMaker solution, Optimal solution
The answer: Both the Optimal and NetMaker solutions are good for server placement. But the best one for you depends entirely on your criteria.
Results at a glance
We put three network-modelling tools through their paces, and the results were a draw - primarily because both winners had very distinct strengths and weaknesses. In the end, we found that the best tool for server placement depends entirely on your criteria. For a medium-size Windows shop, Optimal Networks' Optimal solution is a good choice. But if your environment is enterprise-wide and heterogeneous, Make Systems' NetMaker solution is a better choice. Keep in mind that both the NetMaker and Optimal solutions are capable of a lot more than we tested. Both have capacity planning capabilities, which we hope to test more thoroughly in the near future.
In the end, none of the solutions were impressive enough to bowl us over. Rather, the ideal network-modelling tool would have included key features from each of the solutions: the Optimal solution's accuracy; the NetMaker solution's geographical network views and accounting module; and the NetArchitect solution's end-to-end modelling capabilities.
The fact that many network-modelling tools are available only in a 16-bit environment proves that these products are still in their infancy. And before they have a chance to mature, the market will probably be well on its way to integrating modelling tools with RMON tools such as Bay Networks' Optivity Enterprise, as well as management tools such as Hewlett-Packard's OpenView. To be sure, such a merger will be a welcome relief to the network manager, because administering today's enterprise network still requires many different tools from many different vendors.
Bottom Line 6.3
Of the three solutions we tested, NetMaker is the most complete network-modelling tool - a good choice for the enterprise network. NetMaker far outperformed the competition in representing our network. Both a geographical view and a campus view are available. Plus, every device can be assigned geographical coordinates, such as longitude and latitude, which ensured that our LAN and WAN were accurately represented.
Unfortunately, NetMaker turned in the most inaccurate application-response times of the bunch. Make Systems says this is because of the way NetMaker models small traffic conversations. The company adds that accuracy will improve as the number of conversations increases.
Finally, NetMaker supports the use of tariff databases and comes with an accounting module that helps greatly when figuring out the cost of a particular upgrade.
PROS:
Provides valuable geographical and campus-level views of the network
Supports tariff databases
Accounting module is available
Generates helpful, detailed reports
CONS:
Inaccurate response times
Makes no "quick fix" recommendations
Bottom Line 6.3
The Optimal solution tied with the NetMaker solution because of its excellent performance in our accuracy category, coming within three seconds of our baseline for a typical conversation. The Optimal solution was also the only one to make recommendations on a simulated model, helping network managers make quick fixes to improve network performance. Like the NetMaker solution, the Optimal solution has a traffic-import tool, plus it had the most complete device library of the tools we compared. Because its discovery engine retrieves information on clients, it also has a head start on NetMaker in providing end-to-end modelling. Several times, though, when we were about to give Optimal credit for a well-implemented feature, we found an annoying feature - or lack of a feature entirely - that cancelled it out. For example, the Optimal solution was easy to install, but it lacked wizards or help features to make the process easier. Also, it accurately modelled our network, but working with the model was difficult because the user interface was buggy. Finally, it could generate some cool reports, but they lacked detail.
PROS:
Excellent results in our accuracy test
Makes recommendations on simulated models
Most comprehensive device library
Includes a network traffic-import tool
Generates useful reports
CONS:
Difficult to use, buggy user interface
Mediocre technical support and poor support policies
Lacks context-sensitive help
Bottom Line 5.2
Zitel's NetArchitect isn't for the faint of heart. It doesn't have a discovery engine, so you must input every device on your network by hand - a daunting task if you have an enterprise-wide network. It also lacks a traffic importer, which means you have to understand and interpret your traffic patterns so NetArchitect can use them to generate an accurate model.
NetArchitect, however, was the only end-to-end network-modelling tool we compared. So, unlike the others, it factors the client and server components into the model you create - not just the network. Of course, these components, too, must be entered by hand, so one could argue that such a feature is a disadvantage in NetArchitect's case. Obviously, the more information required by the tool from the user, the more room there is for human error.
NetArchitect currently runs in a 16-bit environment. According to Zitel, a discovery engine and a 32-bit version are in the works.
PROS:
End-to-end network-modelling capabilities
Inexpensive and easy to install
Produces exceptional dynamic graphs
CONS:
Lacks a network-discovery engine
Lacks a traffic-import tool
Cannot generate reports
Requires the greatest understanding of network theory
Must manually derive traffic characteristics to create networks
NetArchitect solution
NetArchitect 1.1.5
Zitel's NetArchitect is the only solution we compared that is entirely isolated from the network. Unlike the Optimal and NetMaker solutions, which automatically discover your network and can import live traffic data, the NetArchitect solution requires you to input every device on your network manually - from client CPU to server I/O throughput. In short, for the tool to be effective, you must have a better-than-average understanding of your network infrastructure and traffic patterns. You will also need a reliable calculator to derive the convoluted traffic characteristics that NetArchitect needs to create an accurate model.
Furthermore, NetArchitect currently runs in a 16-bit Windows environment, which limits its appeal and capabilities. According to Zitel, future releases will include a much-needed discovery engine and will be available for a 32-bit environment. For now, though, it's suited for small networks with predictable traffic patterns. The fact that it is by far the least expensive package we compared makes it even more appealing to smaller shops.
At first glance, NetArchitect looks like nothing more than an out-of-date network-drawing program that should have been retired years ago. To NetArchitect's credit, however, it is one of the few end-to-end modelling tools on the market (and the only one in this comparison). This means that the solution will factor not only the network but also the client and server components into the model you create. Although this can work to your advantage, because you've accounted for every component of your enterprise, the sheer quantity of information required to do a model or simulation can be a disadvantage. Clearly, the more variables there are for you to calculate by hand, the more room there is for human error. Neither the NetMaker nor Optimal solutions depends on the user to this degree, because they both gather information from Network General's Sniffer traffic sessions.
Going the distance
Unlike the NetMaker solution, the NetArchitect solution has no tariff database, so tracking the real-world cost of your models is time-consuming and tedious. This minimalism carries over to its device libraries as well. As with both the Optimal and NetMaker solutions, the NetArchitect solution's device libraries leave much to be desired. Only the most popular devices, such as those from Cisco, Bay Networks, and 3Com, are supported.
Despite the end-to-end capabilities of the NetArchitect solution's simulation engine, we tested only the network portions in order to compare results more fairly with the other solutions in this comparison.
To our surprise, the NetArchitect solution's performance numbers were on par with NetMaker's, despite the greater risk of making a mistake with NetArchitect. Both solutions sport unique features that make it easier to determine accuracy: the NetMaker solution breaks down each link's packet delay, and the NetArchitect solution offers a comprehensive breakdown of the delay itself. In particular, the NetArchitect solution breaks down the delay into both media-contention time and media time. And, because it is an end-to-end modelling tool, it can break down the delay even further into CPU time, I/O time, connector time, and so on, making it easy to find bottlenecks.
Of the three solutions, the NetArchitect solution produced the best graphs, though it lacked the capability to produce reports. We found the graphs created during simulations to be relevant and useful; unfortunately, the solution limits you to views on a segment-by-segment basis. Its competitors made up for a lack of graph capabilities by providing excellent reports that support views of multiple segments at a time.
NetMaker XA 2.6
Make Systems' NetMaker is a nearly complete network-modelling tool that is well-suited for the large, geographically dispersed enterprise. It comprises three primary components, and three add-on tools are available separately.
The NetMaker solution was the only solution we compared that supports the use of tariff databases and provides an accounting module to total up the real-world cost of any changes you model. Its only obvious missing feature is end-to-end modelling capability. It was also the only Unix-based solution in this comparison; we ran it on a Sun SPARCstation running Solaris.
NetMaker has a lot going for it. We particularly liked its ability to provide a bird's-eye view of the network; the tool lets you change those views on the fly, so we were able to switch from a geographical (macro-level) view to a campus view (a micro-level view of each LAN and router) without any problems.
Adding it up
We really appreciated NetMaker's ability to generate detailed expense reports for every WAN link, providing monthly charges and inter- or intra-LATA (Local Access Transport Area) charges, among others. By using the area code and prefix of the number instead of the WAN distance, we were able to come up with very accurate charges.
As with the Optimal solution, the discovery engine in NetMaker is "smart". The discovery begins with a seed router that tells the program about other existing networks and WAN links. However, even though the discovery process is similar to the Optimal solution's, the NetMaker solution took twice as long to catalogue the same topology.
As with its competitors, the device library in NetMaker was severely lacking: only popular devices, such as those from Cisco, Bay Networks, and 3Com, are predefined. Obviously, these vendors can't support all the devices in their libraries, but we were disappointed by just how few devices each product did maintain. And even though each product allows you to create your own device, you have to know the performance characteristics of the device. This is not exactly an easy task to accomplish. For example, do you know what the packet-forwarding rate and packet latency of your routers are? None of the vendors offer help for such information.
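To illustrate what defining your own device entails, here is a hypothetical sketch of the kind of performance figures such a definition needs. The record shape, field names, and numbers are our own invention for illustration, not any vendor's actual format:

```python
# Hypothetical shape of a user-defined router entry for a modelling tool.
# Every figure here is made up; real tools each use their own formats.
from dataclasses import dataclass

@dataclass
class RouterModel:
    name: str
    forwarding_rate_pps: int   # packets per second the router can forward
    latency_us: float          # fixed per-packet latency in microseconds

    def service_time_s(self):
        """Approximate time to handle one packet: per-packet service
        time at the forwarding rate, plus the fixed latency."""
        return 1.0 / self.forwarding_rate_pps + self.latency_us / 1e6

r = RouterModel("generic-branch-router", 10_000, 150.0)
print(round(r.service_time_s() * 1e6))  # 250 microseconds per packet
```

The point of the sketch is that both figures must come from somewhere - a vendor data sheet or your own benchmarking - before a hand-built device can contribute anything accurate to a model.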
It's all in the definition
NetMaker has a traffic importer that we used to bring expert Sniffer files into our model; importing is a two-step process. Before we could bring network traffic into our model, the product first had to translate the .CSV files into a file type it recognises.
The drag-and-drop capabilities of NetMaker are noteworthy. The IXI Desktop used by Make Systems allowed us to drag a device or link onto any module and have all of its characteristics filled in automatically, which saved us a lot of time.
Also, NetMaker's reports are incredibly detailed. For example, not only was it easy to view the delay to and from a client, but every segment and stage was broken down so we could see exactly where the biggest delay would be.
Optimal Performance 2.03
Optimal Surveyor 2.03
There are two components within the Optimal solution: Optimal Surveyor and Optimal Performance. We used both products for our tests. The Optimal solution is a highly accurate modelling tool capable of performing all of the most common modelling and simulation tasks. As long as features such as geographical placement and network-costing are not essential criteria for your needs, the Optimal solution is an ideal modelling tool for you. It works particularly well modelling small to moderately large networks.
A difficult-to-use interface is the single quickest way to turn modelling software into shelfware. Each solution's user interface tortured us in one way or another, but only the NetMaker solution's interface was better than the Optimal solution's. We were prepared to overlook the fact that the Optimal solution's components are 16-bit applications that run in Windows 95 - a 32-bit environment. But we couldn't overlook the bugs we found, which affected basic functionality. For example, we could not mass-delete clients on our networks. Nor could we simply select a node from a model. If our hand wasn't completely still as we selected a particular device, the interface wouldn't pick it up. However, we did like the Optimal solution's collapsed views of the network.
Unlike the NetMaker solution, the Optimal solution doesn't have tariff information for WAN connectivity. Tariff information is incredibly valuable when quickly ascertaining the cost of modelled changes. For example, if your modelling tool discovered that your 64Kbit/sec WAN link would be saturated at the modelled traffic levels, you could double or triple the speed of the link to find the best cost-per-performance increase. Because the Optimal solution lacks this feature, you'll have to get price information from your local and long-distance carriers.
However, the Optimal solution's device library was the most complete of the three solutions. Unfortunately, that's not saying much. The solutions we compared support only the more popular devices in their libraries. So if your hardware isn't from 3Com, Cisco, or Bay Networks, you'll have to create your own device or use a comparable one and hope the results are similar.
Smart is good
Unlike the NetArchitect solution, the Optimal solution uses its own SNMP discovery engine, which probes all of your network links and pulls any information it can for use in a model. We found a known bug in the version of Optimal Surveyor that we compared. It caused the product to crash every time we tried to save the topology after discovering the WAN links. Optimal Networks quickly supplied an upgraded version, which fixed the problem.
The discovery engines of the Optimal and NetMaker solutions are "smart" in that they query a seed router, pull the routing and Address Resolution Protocol tables from it, and then use only those segments and clients in the discovery. In contrast to the NetMaker solution, which does not discover clients, the Optimal solution has a head start on any future end-to-end modelling capabilities, at least visually.
Like the NetMaker solution, the Optimal solution has a traffic-import feature. Before testing, we thought such a feature was essential to accurate application modelling. But NetArchitect proved in our modelling and server-placement test that traffic-import capabilities do not always equal accuracy. They do, however, equal convenience.
Unlike the other solutions, the Optimal solution makes recommendations based on a simulated model. This "intelligent" reporting is exactly what most network managers need to provide quick fixes for improved performance. However, the Optimal solution can only make its suggestions in increments, one simulation run at a time.
For example, after modelling our LAN application across the WAN, the Optimal solution suggested increasing the WAN-link speed to at least 125Kbit/sec.
Once we made the change and ran the simulation again, the Optimal solution suggested increasing the speed again, this time to 260Kbit/sec. Only then did the utilisation on our WAN link drop below 80 per cent.
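The arithmetic behind such a suggestion is straightforward. As a sketch, assuming an offered load of 200Kbit/sec (our illustrative figure, not one reported by the tool):

```python
# Back-of-the-envelope check of a link-speed suggestion.
# The offered load below is an assumed figure, not a measured one.

def utilisation(offered_kbits, link_kbits):
    """Fraction of link capacity consumed by the offered traffic."""
    return offered_kbits / link_kbits

def min_speed_for(offered_kbits, target_util):
    """Smallest link speed that keeps utilisation at the target level."""
    return offered_kbits / target_util

load = 200.0  # assumed application traffic in Kbit/sec
print(round(utilisation(load, 64), 2))  # 3.12 - a 64Kbit/sec link is saturated
print(round(min_speed_for(load, 0.8)))  # 250 Kbit/sec to reach 80 per cent
```

Under this assumed load, any speed above about 250Kbit/sec brings utilisation under the 80 per cent threshold, which is consistent with the stepwise suggestions the tool made.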
Test methodology questioned
Make Systems believes that the methodology used in these tests is flawed. The company is currently in discussions with the test lab, and ARN will bring you more on this in a later issue. However, Make Systems also believes the same bias has been applied in the testing of each product, so InfoWorld Test Labs believes the results can stand for the moment, as the scores should not change. Australian Reseller News is happy to send the full Make Systems comments and a related white paper to anyone who requests it, by e-mail to Helen Cousens at firstname.lastname@example.org
Web connection
If you'd like more information on the solutions we compared, check out each vendor's Web site: