Remote users find the Net

IS managers faced with swelling ranks of remote users have sighed with relief lately for two reasons: outsourcing and the Internet. It should come as no surprise that IS staff members are warmly embracing anything that gets them out from under hefty telephone-company line charges and the burden of supporting off-site users.

But lest anyone grow complacent, pulling the plug on that remote access concentrator solves only part of the problem. The tough choices about pricing, security, reliability, and quality of service don't disappear just because a corporation's remote users are dialling into an ISP instead of the company modem pool. And with the effects of telephone deregulation looming on the horizon, some of those choices promise to become moving targets.

For now, the issues facing IS are whether and how much to outsource, and when the Internet will become secure and robust enough to serve as the access method of choice.

Security

Although companies can outsource or use the Internet to minimise cost and reduce infrastructure and support headaches, the pivotal issue of security remains firmly in the IS manager's lap no matter how remote users are connected to the corporate network.

Most communication can be protected from prying eyes via encryption, and everything from link-layer connections to individual data files and messages can be encrypted. However, knowing with certainty who is accessing what and who is talking to whom is the key to secure remote access.
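
As a concrete illustration only (the article names no particular product), here is a minimal Python sketch of message-level encryption using the widely available cryptography package's Fernet recipe; link-layer or session-level encryption works on the same principle, just lower in the stack.

    # Minimal sketch of encrypting an individual message or file payload.
    # Assumes the third-party "cryptography" package (pip install cryptography);
    # how the key is exchanged and stored is out of scope here.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # shared between the remote user and the office
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"draft pricing memo")   # what travels over the wire
    plaintext = cipher.decrypt(ciphertext)               # only a key holder can recover it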

There are three types of authentication beyond the static passwords typically used in-house. The first involves encoded ID cards and biometric devices; this approach requires the end user to possess a physical ID card, or the system to recognise a physical characteristic of the user, such as a fingerprint. The second is dynamic passwords generated by hardware- or software-based "tokens". The third, certificate systems, requires a central authority such as an IS manager to digitally tag users and objects for identification. Certificate systems avoid the expense of ID-card readers and biometric scanners and are easier to use than dynamic password systems, but they are relatively new to the market.
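
To make the "dynamic password" idea concrete, the sketch below (an assumed illustration, not any vendor's actual token) derives a short-lived code from a shared secret and the current time, in the spirit of the HOTP/TOTP one-time-password standards; the authentication server runs the same calculation and compares results.

    # Hypothetical software token: a time-based one-time password (TOTP-style).
    import hmac, hashlib, struct, time

    def one_time_code(secret: bytes, interval: int = 30, digits: int = 6) -> str:
        counter = int(time.time()) // interval             # both ends agree on the time step
        msg = struct.pack(">Q", counter)                   # counter as 8-byte big-endian
        digest = hmac.new(secret, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                         # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(one_time_code(b"seed-provisioned-at-enrolment"))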

The Remote Authentication Dial-In User Service (RADIUS) standard, which allows multiple authentication servers to share a common directory, is helping IS shops scale up their remote-access programs. But the lack of interoperability among authentication systems could soon emerge as a problem for large multivendor shops and for IS managers looking to build extranets.
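
For readers unfamiliar with the mechanics, the following sketch (assumed for illustration, following RFC 2865 rather than any product mentioned here) shows one piece of the protocol every dial-in access server and RADIUS server must agree on: hiding the user's password with a secret the two devices share before the Access-Request goes to the central directory.

    # Sketch of RFC 2865 User-Password hiding between an access server and a RADIUS server.
    import hashlib, os

    def hide_password(password: bytes, shared_secret: bytes, authenticator: bytes) -> bytes:
        padded = password + b"\x00" * (-len(password) % 16)   # pad to a multiple of 16 octets
        hidden, previous = b"", authenticator
        for i in range(0, len(padded), 16):
            mask = hashlib.md5(shared_secret + previous).digest()
            block = bytes(p ^ m for p, m in zip(padded[i:i + 16], mask))
            hidden += block
            previous = block                                   # chain each block into the next
        return hidden

    request_authenticator = os.urandom(16)                     # random per Access-Request
    attribute = hide_password(b"dial-in-password", b"secret-shared-with-the-nas", request_authenticator)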

"We'll move ultimately to certificate-based security," says Martin Smith, director of information services for the US International Trade Commission. "We want to have an open network where we don't rely on a big wall and moat."

Meanwhile, security is one of the primary IS concerns that has kept the Internet from becoming an instant cure-all for the problems posed by remote access. Security, performance, and reliability were the concerns cited by network managers in a recent survey conducted by Forrester Research. The survey found that only 4 per cent of the network managers said they use the Internet for remote access today.

Equipment vendors are gearing up to provide ISPs with carrier-class systems so that ISPs can address these concerns. However, security doesn't stop at the corporate firewall, nor in the cloud of a potentially more secure Internet. A remote user's base must be secure, too, according to Aberdeen's Brooks.

Open or closed

To a certain extent, the decision whether to go with an ISP, an outsourcer, or to keep remote access in-house depends on the IS manager's view of the future of corporate computing. The intranet trend, with its firewall technology, has heightened the perception of the corporation as a fortress. To IS managers who hold this view, or whose businesses require it, the very concept of remote access is problematic because it amounts to a series of necessary breaches of security.

On the other hand, the remote access trend itself has led some pundits to declare that the age of the virtual corporation is upon us. This view holds that, with everyone connected to the Internet at home and on the road, those shiny glass office towers around the world will soon be emptied of everyone but CEOs and IS staff members.

The reality is likely to fall somewhere in between, perhaps with tiers of security that don't necessarily correspond to the physical limits of a company's property.

Inside or outside

Another debate, over the shape of the remote-access infrastructure, harks back to the old question of where in the network the intelligence should reside. Should it be concentrated in the core of the network, at the periphery, or distributed throughout?

Most of the equipment vendors subscribe to the distributed model or profess neutrality. Given that the larger vendors such as Cisco Systems, Bay Networks, and 3Com have acquired so many niche vendors, they sell virtually every component of the remote access infrastructure.

But there are several indications that, for the near future at least, the intelligence will remain in the core of the network.

Some industry observers are predicting the emergence of the remote access version of the so-called God box, a single device that will incorporate the functions of routers, remote-access servers, authentication servers, and firewalls. Some analysts have dubbed these devices "extranet routers".

The major networking vendors already support many of these features in their routers and will likely add the rest in the next six to 12 months as standards such as IPsec and L2TP emerge. Cisco officials liken adding these functions to upgrading the operating system on a desktop computer. However, routers by definition have to work together in large numbers.

"The problem with building the intelligence into the boxes is they all have to work with each other," Lopez says. "That's where you have the issues of features and what kind of authentication you have and does it work across platforms within the Internet."

Another consequence of adding these functions is the increased performance requirement. Encryption, authentication, and tunnelling are handled today mostly in software emulation rather than ASICs, notes Mike Paratore, a senior product marketing manager at Bay Networks.

"You need to have a pretty high-performance product in order to do all this and still be able to forward packets," Paratore says.

Meanwhile, the adoption of inexpensive, easy-to-manage thin clients is affecting remote access.

A multibillion-dollar manufacturer based in Chicago is rolling out a remote-access program to connect 2000 users in 70 plants around the world over a frame-relay network. The users work on diskless 486 PCs connected to a Citrix server running virtual Windows NT systems in the home office.

"You can see how the cost of ownership has dramatically decreased," says an IS staffer who requested anonymity.

However, few people see the world returning to the host-terminal paradigm. Given different forms of communications such as e-mail, secure database access, and videoconferencing and their varying security, performance, and quality-of-service needs, some industry observers are predicting downright fat clients.

Deregulation

Planning for different types and speeds of networks is difficult enough, but at least the variables are known. The effects of telephone-company deregulation remain largely a mystery.

"Once competition from deregulation starts happening and you get aggressive pricing, it changes the whole concept of tariff management and outsourcing and who has what rates," Lopez says.

"It makes it really difficult to have a tariff management argument as a remote-access vendor."

