TECHNOLOGY: Top 5 technologies pegged for the future

There has always been plenty of tension between information technologists and end users. But in spite of that, new technologies continue to make their mark in the world of business. Here's a look at the state of information technology's most promising bets: Web services, Extensible Markup Language (XML), 802.11 wireless LANs, peer-to-peer computing and 64-bit systems.

Web services

There was nowhere to hide from the impact of Web services last year. With the technology industry in recession, Web services built to tsunami strength, defining the next wave of distributed enterprise computing and offering a way to overcome years of isolated enterprise stovepipe solutions without requiring companies to rebuild and reboot in order to interoperate.

Spurred by the demand for ROI and lower integration costs, Web services promise benefits not only for in-house integration but also for the collaborative B2B and B2C relationships that are crucial to streamlining enterprise activities.

By reducing the requirements for communication to a lowest common denominator - specifically, self-describing XML-based protocols - Web services provide a layer of interoperability that allows applications to be described, published, located and invoked irrespective of the underlying architecture employed by each party to the transaction. With Web services, companies can restore order to the corporate tower by affordably bridging the gaps between disparate IT systems.
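
To make the idea concrete, here is a minimal sketch - with purely hypothetical details - of what invoking a Web service boils down to: posting a self-describing XML (SOAP) envelope over plain HTTP and reading back an XML reply. The endpoint URL, the GetQuote operation and its namespace are invented for illustration; a real service would publish them in its WSDL description.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Scanner;

    public class SoapCallSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical endpoint; a real service would advertise this in its WSDL.
            URL endpoint = new URL("http://example.com/services/StockQuote");

            // A self-describing SOAP envelope: the message carries its own structure.
            String envelope =
                "<?xml version=\"1.0\"?>" +
                "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
                "  <soap:Body>" +
                "    <GetQuote xmlns=\"urn:example:stockquote\">" +
                "      <Symbol>IBM</Symbol>" +
                "    </GetQuote>" +
                "  </soap:Body>" +
                "</soap:Envelope>";

            // Post the envelope over ordinary HTTP - no shared middleware required.
            HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
            try (OutputStream out = conn.getOutputStream()) {
                out.write(envelope.getBytes("UTF-8"));
            }

            // The response is itself XML that any platform can parse.
            try (Scanner in = new Scanner(conn.getInputStream(), "UTF-8")) {
                while (in.hasNextLine()) {
                    System.out.println(in.nextLine());
                }
            }
        }
    }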

Another factor driving acceptance of Web services has been the relatively low risk of adoption. With widespread grassroots support propelling many of the toolkits and standards, companies have been able to begin working with the technology on in-house solutions with minimal expense and training.

Web services are continuing to gain momentum as vendors address shortcomings in areas such as security and messaging. Vendors will build on their recent efforts and tighten their focus on strengthening underlying architectural support for service-oriented architectures. Major players such as Microsoft and Sun Microsystems are solidifying their standing in the Web services arena, helping to better define the roles of Windows and Java as service platforms. Furthermore, Web services networks - third-party production platforms for service deployment, monitoring and management - are piquing the interest of early adopters seeking to stabilise operational environments.

This new model for distributed computing is not the answer to every enterprise integration problem. But in coming years, the service-oriented architecture model will help stimulate new market opportunities and will become the cornerstone of most new application deployments.

XML: e-business's lingua franca

New XML-capable development tools, typified by IBM's WebSphere Studio and Microsoft's Visual Studio .Net, have lightened developers' workloads considerably. New tools and APIs have reduced editing XML files, creating new XML schema definitions and parsing XML data to routine tasks. Instead of seeing XML strictly as an intermediary format, developers in all languages, from Java and C++ to Perl and PHP, are using powerful XML APIs as their applications' primary data storage engine.
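
As a rough illustration of how routine this work has become, the short Java sketch below parses an XML document and walks its elements using only the DOM APIs bundled with the JDK (javax.xml.parsers and org.w3c.dom). The file name and element names are invented for the example.

    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class OrderReader {
        public static void main(String[] args) throws Exception {
            // Parse a (hypothetical) purchase-order document into a DOM tree.
            DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse("order.xml");

            // Walk the tree: print every <item> element and its sku attribute.
            NodeList items = doc.getElementsByTagName("item");
            for (int i = 0; i < items.getLength(); i++) {
                Element item = (Element) items.item(i);
                System.out.println(item.getAttribute("sku") + ": "
                    + item.getTextContent().trim());
            }
        }
    }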

In other important XML-related developments, XSLT (Extensible Stylesheet Language Transformations) has grown beyond its original purpose - formatting XML documents for display, usually by conversion to HTML - to become the most common means of translating XML documents from one vocabulary to another. Web services, encompassing XML-based standards including SOAP (Simple Object Access Protocol), WSDL (Web Services Description Language) and UDDI (Universal Description, Discovery and Integration), are rapidly gaining ground as a means of providing remote access to applications' data and functions. Some see Web services as a death knell for the bulky, expensive middleware that currently ties enterprise applications together.
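
In Java, for instance, such a vocabulary-to-vocabulary translation takes only a few lines with the standard TrAX API (javax.xml.transform); the stylesheet and file names below are placeholders.

    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class InvoiceTranslator {
        public static void main(String[] args) throws Exception {
            // Compile a (hypothetical) stylesheet that maps one XML vocabulary
            // to another, then apply it to an input document.
            Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("supplier-to-internal.xsl"));
            t.transform(new StreamSource("supplier-invoice.xml"),
                        new StreamResult("internal-invoice.xml"));
        }
    }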

This year is crucial for XML. Emerging World Wide Web Consortium (W3C) standards cover canonicalisation, signatures, queries and security. Canonicalisation will solve the problem of comparing documents that are identical except for minor syntactic differences. The XML Signatures standard will enable the embedding of a digital signature that verifies the origin and authenticity of a document; it relies on canonicalisation to ensure that a received signed document, despite intermediate processing, is the same as the original. XQuery (XML Query) opens up the possibility of running complex queries against a large XML document much as you would against a relational database. It will also enable queries across data sources that can expose their data as XML.
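
Full XQuery engines are still emerging, but XPath - the simpler path language that XQuery builds on - already gives a feel for querying XML the way you would query a database. Below is a small sketch using the JDK's javax.xml.xpath API, with the document and element names invented for the example.

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathConstants;
    import javax.xml.xpath.XPathFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;

    public class CatalogueQuery {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse("catalogue.xml");

            // Select every product priced above 100, much as a SQL WHERE
            // clause would filter rows in a relational table.
            XPath xpath = XPathFactory.newInstance().newXPath();
            NodeList expensive = (NodeList) xpath.evaluate(
                "/catalogue/product[price > 100]/name",
                doc, XPathConstants.NODESET);

            for (int i = 0; i < expensive.getLength(); i++) {
                System.out.println(expensive.item(i).getTextContent());
            }
        }
    }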

W3C work on security standards is not as far along as the efforts mentioned above. However, there is a W3C XML encryption working group, and Microsoft has published its WS-License and WS-Security extensions. XML document and Web services security are important pieces of the puzzle, but software vendors and developers are forging ahead without them, confident that standards will eventually provide a solution.

802.11 networks

Some organisations, particularly in healthcare and manufacturing, have implemented 802.11b networks (which provide up to 11Mbps in the 2.4GHz band), looking to give workers freedom and mobility as well as to realise cost savings by not running wires to every area requiring networking. But many others have shied away, concerned about the numerous security risks that have come to light.

One driving factor behind the growth in 802.11b installations has been the rapidly falling cost of Wi-Fi equipment. Prices for access points and wireless network cards have fallen dramatically - by 50 per cent or more with some vendors. Driving these price decreases have been increased market competition and the introduction of products supporting the 802.11a standard (which provides up to 54Mbps in the 5GHz band).

Although 802.11 networks have many positives, they also have drawbacks, chief among them security. Numerous security issues with the 802.11 standard have surfaced, many of them focusing on WEP (Wired Equivalent Privacy) encryption. The forthcoming 802.11i standard will shore up WEP by adding 128-bit keys and support for AES (Advanced Encryption Standard). Another standard, 802.1x, will also help improve Wi-Fi security by handling authentication.

The price of 802.11 networks is continuously falling and organisations are coming to terms with the security issues. As 802.11i, 802.1x and other security solutions - such as VPNs or proprietary systems from vendors like Agere Systems and Cranite - come to market, those companies that passed on deploying wireless networks in the past may decide to finally make the move. The increased productivity WLANs (wireless LANs) can provide will be too great to ignore.

The availability of the speedier 802.11a equipment will be another driving force, as well as the introduction of dual-band receivers that allow 802.11b and 802.11a networks to interoperate. Additionally, products supporting the newly ratified 802.11g standard (54Mbps and interoperable with 802.11b) may come to market by the end of this year.

Moving beyond the early adopter stage, WLANs have just started to become a necessary component of the enterprise network. The shift will continue this year as users demand even faster transmission rates and stronger security.

Peer-to-peer

Peer-to-peer technology has taken a bad rap due to an association with music piracy. The controversy over the likes of Gnutella and Napster still colours most people's perception of P2P - even those who should know better. Nevertheless, P2P has become a business strategy, and one that vendors are rushing to support.

One reason P2P is possible is the ever-expanding power of the ordinary desktop, to say nothing of the emerging handheld space. Top-of-the-line desktop PCs have as much horsepower as last decade's supercomputers. The explosion of local storage also enables the local replication of multigigabyte databases, something unthinkable only five years ago. Finally, the expansion of network bandwidth and the increased mobility of computing resources through the adoption of wireless networking mean that people and their devices connect faster than ever before, from locations inside and outside the enterprise.

P2P business solutions are appearing around the world, from Groove Networks' eponymous product for workflow enhancement to more specific customer-support solutions such as Quiq Connect. The challenge many of these face is a sceptical business culture that must be convinced that sharing some of the company's data doesn't mean exposing all of it. Many companies are still struggling to implement secure client/server environments, and P2P brings a whole new set of problems in its train.

Although valid security and management concerns exist, the future of P2P technologies is bright. Peer-to-peer is still struggling to find its place in the wider world of enterprise computing, but its effects will be felt for years to come.

Intel's 64-bit CPU

Intel's broad line of 32-bit processors powers many corporate servers, but for the big jobs it's widely accepted that 64-bit systems from Sun Microsystems, IBM and Hewlett-Packard are better equipped to handle the load. The capability to accommodate huge quantities of memory, combined with far greater efficiency at managing high-bandwidth I/O, gives 64-bit systems scalability advantages that 32-bit technology can't match. If you wonder whether the PC chip giant's enterprise technology matters, consider this: IBM and HP plan to forsake their internally developed Unix CPUs in favour of Itanium. In a few years, Intel's 64-bit processors will be the chief competitor to Sun's Sparc line.
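
The memory argument is simple arithmetic: a 32-bit address can reach at most 2^32 bytes (4GB), while a 64-bit address can reach 2^64 bytes. A throwaway calculation makes the gap plain:

    public class AddressSpace {
        public static void main(String[] args) {
            // A 32-bit pointer can address 2^32 bytes: 4 gigabytes.
            long bytes32 = 1L << 32;
            System.out.println("32-bit address space: " + (bytes32 >> 30) + " GB");

            // A 64-bit pointer can address 2^64 bytes: roughly 16 million
            // terabytes, far beyond any physical memory of the day.
            double bytes64 = Math.pow(2, 64);
            System.out.printf("64-bit address space: %.0f TB%n",
                bytes64 / Math.pow(2, 40));
        }
    }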

Nevertheless, the mid-2001 introduction of the first generation of Itanium systems was a quiet event. Four-processor servers and dual-processor workstations, based mostly on Intel's reference system designs, rolled out from IBM, HP and a few others. These first machines were hot, power-hungry beasts compared with systems built on Intel's latest cool-running, energy-efficient 32-bit CPUs. But the servers debuted with enterprise-grade management, reliability and availability features that showed Intel understands the corporate market's needs.

Recognising that early Itanium systems weren't as effective as competing models for general computing, Intel narrowed its initial target markets to science, engineering and academia. The first batch of systems also gave more software developers a chance to target Intel's new architecture. That effort is now paying off. Already, Linux distributions from Red Hat, Caldera, SuSE, and TurboLinux are running on Itanium. HP, which partnered with Intel on Itanium's development, was first out of the gate with an Itanium version of its HP-UX Unix operating system. And Microsoft has entered the game as well. The 64-bit version of Windows XP Professional is shipping now, and Beta 3 of Windows .Net Server for Itanium is available to developers.

Intel's patient, methodical marketing of Itanium will prove to be a wise approach. The majority of Linux and Windows applications have been written for 32-bit systems, and it will take time for software vendors and open-source projects to rework their apps for a 64-bit environment. Fortunately for Intel, the predominance of Java and the emergence of .Net, both architecture-adaptive platforms, will migrate thousands of applications to Itanium in a flash. IBM is previewing its Java SDK (software development kit) for Itanium now, while Microsoft's .Net Framework will land on Itanium with the release of .Net Server. Borland has also committed to making its Java tools and servers available on Itanium.

Competing successfully with Sun requires a critical mass of available software. Intel was far from that objective at Itanium's launch, but it should come within reach this year as new, faster Itanium processors are aggressively developed. Intel's published road map promises the shipment of two new Itanium architectures: one targeted around the end of this financial year, and another for early 2003. Faster, more scalable CPUs, combined with a much larger library of applications and broad OS coverage, should make this a breakout year for Itanium.

