New life for old databases

Old databases don't usually die. Unlike old soldiers, they don't even fade away. More often than not, they just keep chugging along in the background as new technologies are deployed around them to extend their lives. Alan Radding looks at several ways in which this has been done recently.

At one time, the new technology was client/server computing, which gave users a friendly, graphical interface to connect with the legacy database on the back end. Then, the data warehouse emerged to offload copies of the data for decision-support, analytic and business-intelligence purposes, freeing the old database to continue processing transactions unhindered by resource-hungry user queries. Today, Web technology promises to extend access to data contained in legacy databases to anyone with a browser.

There are several ways to bring legacy databases into the modern era. The following is a look at four examples of how they work in real life.

Business intelligence: creating a new database staging area and adding a business-intelligence layer

"If it ain't broke, don't fix it" is the guiding principle at Aqua-Chem, a leading US boiler manufacturer. The company runs old mainframe-based financial software packages in conjunction with a host-based Computer Associates International CA-Datacom database management system. Generally satisfied with the performance of the mainframe system, Aqua-Chem is reluctant to change.

The company's only complaint with the software is its inability to perform multidimensional financial analysis - viewing data from a variety of angles to uncover otherwise overlooked trends - but that wasn't enough to make the company switch to a new database.

Instead of changing systems, Aqua-Chem augmented its core financial systems with a new database and business-intelligence tools, according to Chuck Norris, Aqua-Chem vice president and CIO. It opted for Microsoft's SQL Server DBMS to create a financial server running Cognos' business-intelligence tools. Manufacturing, sales and other data are regularly extracted, transformed and loaded into SQL Server using Informatica data warehouse products. At the SQL Server database, Aqua-Chem managers access, analyse and manipulate the data through online analytical processing (OLAP) cubes created with Cognos tools.

Using the Cognos tools, managers can analyse products or customers. The mainframe system continues to handle all the core financial processing.
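The idea behind an OLAP cube, which is what makes this kind of slice-and-dice analysis fast, is to pre-aggregate a measure across every combination of dimension values. A minimal sketch of that idea follows; the product, region and quarter dimensions and the sales figures are invented for illustration, and a real cube would be built by a tool such as Cognos from the extracted data:

```python
# Toy fact table: (product, region, quarter, sales) rows.
# All names and numbers are invented for illustration.
facts = [
    ("boiler-A", "US", "Q1", 120),
    ("boiler-A", "Canada", "Q1", 80),
    ("boiler-B", "US", "Q2", 200),
    ("boiler-B", "Canada", "Q2", 50),
]

def build_cube(rows):
    """Aggregate sales over every combination of dimension values,
    using '*' as the 'all' member of each dimension."""
    cube = {}
    for prod, region, quarter, sales in rows:
        for p in (prod, "*"):
            for r in (region, "*"):
                for q in (quarter, "*"):
                    cube[(p, r, q)] = cube.get((p, r, q), 0) + sales
    return cube

cube = build_cube(facts)
print(cube[("*", "US", "*")])        # all US sales: 320
print(cube[("boiler-A", "*", "*")])  # boiler-A everywhere: 200
```

Once the cube is built, any "angle" on the data is a single dictionary lookup rather than a scan of the fact table, which is why managers can pivot between products, customers and periods interactively.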

The business-intelligence system has let Aqua-Chem virtually eliminate paper. In the past, the company generated dozens of greenbar reports, split them up for distribution to various managers and sent them to its 11 plants across the US and Canada. "Now, we do not have even one report," Norris boasts. Managers who want a printed report access their OLAP cube and print out the parts they want.

Norris says the company recently ordered a new IBM OS/390 mainframe. Norris claims he doesn't mind the old CA Datacom database except for one thing: he says he feels captive to CA, which he says has been uncooperative in the areas of Datacom licensing and pricing. IBM is wooing him with low pricing and flexible terms for DB2, and it's tempting. "I'd love to convert from Datacom to DB2 just to get CA out of my hair," Norris says, but his 18-person IT staff doesn't have the manpower to handle a database migration, at least not until something breaks.

Analytic applications: making information directly accessible to managers without the bother, delay and expense of a conventional data warehouse

Top management at Blue Cross/Blue Shield understands the value of information in the high-pressure health care industry. But the operational data that managers most need sits on a mainframe system running IBM IMS and CICS, which prevents them from quickly getting at the information. And because different applications and systems are used to capture and process different pieces of data, top management wanted a centralised place where executives could view and analyse all the data, notes Twanna Swift, operations project consultant of Blue Cross/Blue Shield's Management & Evaluation (M&E)/Quality Program.

Both the M&E group and corporate IT separately picked up the challenge. After looking at various options, the M&E group decided to build what amounts to a logical data warehouse using SAS Institute's analytical applications and data tools. The IT group took the conventional route, building a physical data warehouse from scratch using Oracle's tools. The race was on.

M&E also considered building a conventional data warehouse using a relational database, but quickly realised that "the conventional data warehouse was going to take too long and be too much work", Swift says. Instead, it started to explore less IT-intensive options. "We studied a lot of literature on the tools and options out there," Swift explains. SAS consistently ranked high in every published tool evaluation the group reviewed, so the health insurer finally opted for the SAS approach after SAS created a prototype for M&E, Swift says.

In phase one, which is rolling out now, senior vice presidents and directors are using the new system to analyse staffing and budget models. In phase two, the system will be extended to everybody via the corporate intranet.

The project is SAS end to end, from the analytical applications and stored data sets to the graphical user interface and, eventually, the Web interface. The analytical system will hold 18 months of production data.

In the race against the IT group, M&E claimed a clear win: it brought in the SAS approach for less than $US1 million, while Swift says the IT group's Oracle-based data warehouse, which is barely out of the starting gate, is projected to cost more than $US10 million. The M&E project went live in about six months, and the IT group's data warehouse is still years away from providing useful information.

Data warehouse: building a conventional data warehouse using an incremental approach to deliver results fast

US-based Owens & Minor knew in early 1997 that it had to gain better access to its production data in legacy databases. What it didn't realise at the time was that by gaining access to its legacy data, it was creating a new revenue opportunity.

The company maintains most of its data on the mainframe in a series of IBM DB2, IMS, and VSAM databases and transactional flat files. But managers at the $3 billion surgical supplies company were frustrated when they needed information. "All these databases are independent, which makes it hard to do queries," says Don Stoller, director of information management. The company used a few tools, such as Datawatch's Monarch, which creates and delivers electronic versions of printed mainframe reports to a PC, but it wasn't sufficient. "We wanted a more multidimensional view of the data," he says.

Deciding it needed a full data warehouse, Owens & Minor installed an Oracle database on a Unix server as the central data warehouse repository. It gave managers Business Objects SA's BusinessObjects as the query tool on the front end.

Taking a subject-by-subject incremental approach, the company built the data warehouse quickly, bringing up the first subject area - sales data - in five months, beginning in early 1997. About every two months, it adds a new subject area. Today, the data warehouse contains data covering six subject areas, with work underway on a seventh and two more in the planning stages. The Oracle database contains 120GB of data, including more than three years of sales history.

The company uses Informatica's PowerCenter to extract, transfer and transform data from the legacy production systems and load it into Oracle. Today, Owens & Minor is using the data warehouse to bring its legacy data to internal staff at headquarters and those in the field.
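The extract-transform-load step that products like Informatica's PowerCenter automate can be sketched in miniature. In this hedged example, two in-memory SQLite databases stand in for the legacy source and the Oracle warehouse, and the table names, column names and figures are all invented:

```python
import sqlite3

# In-memory databases standing in for the legacy source and the
# warehouse; schemas and data are invented for illustration.
source = sqlite3.connect(":memory:")
warehouse = sqlite3.connect(":memory:")

source.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER, region TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, 19900, "east"), (2, 45000, "west"), (3, 1200, "east")])

warehouse.execute("CREATE TABLE sales_fact (order_id INTEGER, amount_dollars REAL, region TEXT)")

# Extract from the source, transform (cents -> dollars, normalise
# region names), and load into the warehouse.
for oid, cents, region in source.execute("SELECT id, amount_cents, region FROM orders"):
    warehouse.execute("INSERT INTO sales_fact VALUES (?, ?, ?)",
                      (oid, cents / 100.0, region.upper()))

total = warehouse.execute("SELECT SUM(amount_dollars) FROM sales_fact").fetchone()[0]
print(total)  # 661.0
```

A production ETL tool adds scheduling, error handling and mappings between many heterogeneous sources, but the extract, transform and load stages are the same shape as this loop.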

Web front end: giving end users Web access to a data mart while avoiding the desktop support hassles of client/server-based data access

There are 1.1 million public school students in the city of New York. They generate an enormous volume of data, which is stored in a variety of legacy databases: CA's IDMS, VSAM and others. "These legacy systems perform very well," reports Kamal Kumar, director of the office of students systems development, but they're difficult to access for the purposes of gathering management data.

Though the IT operational people are happy with the legacy systems, the school system's executives, administrators, managers, superintendents and 1200 school principals are stymied. Under pressure to boost students' performance at their schools, they need fast, easy access to the data in those systems.

Making legacy data accessible to school system managers was a two-part challenge. First, the city had to get the data out of myriad legacy databases and into an intermediate-level database that people can access. Second, the city had to provide the users with easy access tools.

Complicating the problem was the city's reliance on IBM's SNA as its host networking standard. That forced it to rely on mainframe tools only. Based on benchmarking it conducted, Kumar's group selected Information Builders' EDA middleware and Focus programming tools and IBM's DB2 as the new intermediate-level database.

To solve the first challenge, Kumar's group used Focus on the mainframe to access, extract and move the older mainframe data into the DB2 database.

The second part of the challenge proved trickier. The schools are on a Windows NT network. To access DB2 on the SNA network, Kumar's group had to install client software on each user's desktop - a classic client/server solution that proved to be a desktop support nightmare.

A better solution turned out to be the Web. Using Information Builders' WebFocus, Kumar's group crafted a solution that allows administrators to go directly to DB2 via the Web. The use of WebFocus along with Focus on the mainframe also allowed Kumar's group to reuse much of its initial code.

In the end, easy Web access to information is expected to result in better performance at the schools. Because data has been so hard to get to, "most users don't use data. They make decisions off the cuff," Kumar explains. Now, they'll be able to make well-informed, information-based decisions that the school department expects will improve the schools.

IBM preps DB2 Universal Database for AS/400

By Juan Carlos Pérez

DB2 Universal Database is the next generation of IBM's DB2 database line. Unlike its predecessors, which were strictly relational databases, DB2 Universal Database is capable of handling data stored both in relational - that is, in rows and columns - and object formats. This object-relational design allows DB2 Universal Database to handle so-called complex or unstructured data, like pictures, and audio and video clips.

The first DB2 Universal Database release for the AS/400 platform counts the following among its highlight features:

Support for SQLJ, which allows developers to embed SQL calls in a server-based Java application.

A query optimiser with a new indexing technique called "encoded vector indexes" which lets the database solve queries faster. Thanks to this indexing technique, the DB2 Universal Database for AS/400 solved in 35 seconds a complex query that took the regular DB2 for AS/400 about two hours to complete. This indexing feature hasn't yet been added to the versions of DB2 Universal Database on other platforms.

Enhanced JDBC support to make it easier for developers to create Java applets that can access data stored on the back-end database server.
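The details of IBM's encoded-vector-index implementation aren't given here, but the general idea, replacing a column's values with small integer codes so that a query becomes a fast scan of one compact array, can be sketched as follows. The class, column data and names are invented for illustration:

```python
# Sketch of the idea behind an encoded vector index: each distinct
# key value is assigned a small integer code, and the column is
# stored as a compact array of codes that can be scanned quickly.
# The data and names are invented for illustration.

class EncodedVectorIndex:
    def __init__(self, column):
        self.codes = {}     # distinct value -> small integer code
        self.vector = []    # one code per row, in row order
        for value in column:
            code = self.codes.setdefault(value, len(self.codes))
            self.vector.append(code)

    def rows_matching(self, value):
        """Return the row numbers whose column equals `value`."""
        code = self.codes.get(value)
        if code is None:
            return []
        return [i for i, c in enumerate(self.vector) if c == code]

states = ["NY", "CA", "NY", "TX", "CA", "NY"]
idx = EncodedVectorIndex(states)
print(idx.rows_matching("NY"))  # [0, 2, 5]
```

Because the vector holds small fixed-size codes instead of the original values, it is far more cache- and scan-friendly than the base table, which is the kind of effect that lets a query drop from hours to seconds.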

The new version of the database for the AS/400 cannot be bought separately from the operating system upgrade and will not be able to handle complex data until September, when the required modules will become available.

IBM's online analytical processing engine for data analysis, called the DB2 OLAP Server, is also being ported to the AS/400 platform and is slated to ship in the second half of this year.



Novell integrates databases

By Rebecca Maxwell

The keys to Novell's new SQL Integrator are its ease of use and the role it plays in lowering the skill set needed to integrate data from multiple databases, according to industry analysts.

SQL Integrator is a data integration application that provides a multiple-tier, multiple-database enterprise environment for connecting ODBC and Java Database Connectivity (JDBC) applications between client systems and database servers. According to Novell, the product builds on functionality present in NetWare and Novell Directory Services (NDS) to transparently integrate and access data, offering the ability to read, write and integrate data from a variety of databases.

SQL Integrator provides a common database join engine for a number of third-party database products, including those from Oracle, Sybase, Informix, Microsoft and IBM, running on platforms such as Windows NT, Unix, Digital VAX and IBM VMS. By offering enhanced access to information, SQL Integrator also allows for rapid application development, with its time to deliver cited as the product's biggest advantage.

In addition, industry sources claim "what database administrators will appreciate is the capability to make the database location transparent to the application, because it gives administrators the freedom to optimise data distribution across the network. Also, the Dictionary Manager includes the capability to create test tables and an interactive SQL query processor that developers can use to test their applications."

According to Novell, SQL Integrator is targeted towards "any customer who has multiple database structures potentially, but not requiring it to be on multiple execution platforms and who need to easily, effectively and quickly do database joins".
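How SQL Integrator performs its joins internally isn't documented here, but the core idea of a common join engine, pulling rows from independent back-end databases and joining them in a middle tier so the application never sees where the data lives, can be sketched like this. The two in-memory SQLite databases, their schemas and their data are invented stand-ins for, say, an Oracle and a Sybase server:

```python
import sqlite3

# Two independent databases standing in for separate back ends;
# schemas and data are invented for illustration.
db_a = sqlite3.connect(":memory:")
db_a.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db_a.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])

db_b = sqlite3.connect(":memory:")
db_b.execute("CREATE TABLE invoices (customer_id INTEGER, total REAL)")
db_b.executemany("INSERT INTO invoices VALUES (?, ?)",
                 [(1, 250.0), (1, 100.0), (2, 75.0)])

def federated_join():
    """Join rows across the two databases in the middle tier,
    so neither back end needs to know about the other."""
    names = dict(db_a.execute("SELECT id, name FROM customers"))
    joined = []
    for cust_id, total in db_b.execute(
            "SELECT customer_id, total FROM invoices ORDER BY rowid"):
        joined.append((names[cust_id], total))
    return joined

print(federated_join())
# [('Acme', 250.0), ('Acme', 100.0), ('Globex', 75.0)]
```

Making the join happen in a middle tier is also what makes database location transparent to the application: either back end can be moved or replaced, and only the integration layer's connection details change.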



SAP to compete more directly in database market

By Michael Lattig and Stannie Holt

Looking to kill two birds with one stone, enterprise resource planning (ERP) vendor SAP AG will later this year extend its reach into the database market, strengthening its position with midsize customers while possibly slowing the flow of SAP-related database sales to ERP - and now database - rival Oracle.

According to SAP, the company has acquired the rights to German database vendor Software AG's Adabas D database and will begin selling the database under a different name later this year. SAP has been selling the database since 1994, but in the past two years has only been selling it to existing SAP customers in an effort to keep the technology alive.

"The goal is for SAP customers to be able to benefit from a more competitive database offering in the R/3 environment," said an SAP representative. "SAP is not trying to become a database vendor; it is selling the database only with its software, not independently."

Industry pundits believe the move is most likely to challenge Microsoft and its SQL Server 7.0 database, rather than Oracle's high-end offering.

Despite this fact and SAP's claim that it is not set on joining the ranks of database vendors, the move does signal a more aggressive approach to selling beyond the company's traditional strengths. By purchasing the rights to the technology, SAP will now be able to package the Adabas D technology with its ERP software, offering customers a complete solution that would leave Oracle or other database vendors out of the equation.


