LEAD FEATURE: Disarming Y2K at the desktop

The hype about the year 2000 problem was bound to cause a stir in the computer industry. However, reacting to outrageous claims and statements made by self-appointed Y2K gurus, many users have simply tuned the whole thing out. Some Y2K projects are past last year's funding and preparation stages and are now into implementation and testing. But because big companies and government agencies were the first to attempt to fix their upcoming problems, Y2K projects have become associated with mainframe shops. PC users haven't bothered much with Y2K, believing themselves immune because they don't use big iron and Cobol.

PC users are in for a shock - a big one!

It won't just be that some of the BIOS hardware in a PC will fail in 2000. It won't just be that the PC operating system from Microsoft will fail or that the applications on the PC may fail. When local processing fails, it corrupts operational output - even when correct input is provided. If that input is corrupted somewhere else due to previous processing failures, the number of overall errors starts compounding. An entire system can be corrupted. Y2K problems are not just hype or alarmist tripe. There are all sorts of problems awaiting processing in 2000, most of which are not known because no one is looking for them. Most PCs built before the later Pentiums have a Y2K error built into their hardware. And even if you have a new machine, simply kicking back and waiting for your software vendors to fix the next version of their products won't get those fixes installed.
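The compounding-error pattern described above starts with a simple arithmetic failure. A minimal sketch in modern Python (the function name is illustrative, not from any real legacy system) shows how a record that stores only two-digit years produces garbage the moment the century rolls over:

```python
# The classic two-digit-year bug: a record stores only the last two
# digits of a year, so arithmetic across the century boundary breaks.

def age_from_two_digit_years(birth_yy: int, current_yy: int) -> int:
    """Naive age calculation, as many legacy programs did it."""
    return current_yy - birth_yy

# In 1999, a person born in 1965 appears 34 years old - correct.
print(age_from_two_digit_years(65, 99))   # 34

# In 2000, the same record yields -65. Feed that downstream and the
# errors start compounding, exactly as described above.
print(age_from_two_digit_years(65, 0))    # -65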

Integrators must be aware of date dependencies in the overall business system. Just assuming that they're not present will ensure the failure of that system. Even when the date dependencies are eliminated from a local system, the integrator has to be aware of the second-order effects of other people's work. Could bad processing on other systems affect your customer's system? What data must be ignored so that you can generate at least partial correct results?

The first small step

To implement a "surviving 2000" plan, start local; then - and only then - go global. For most integrators, that means starting at one localised central point that is both controllable and verifiable and then working radially out from there. Your own PC is a good place to start.

There are two ways to check if a PC's BIOS chip has date-dependent code. The first way is to set the PC's date to December 31, 1999, at 11:58 p.m. and wait two minutes. If you still have access to your computer when you restart after letting it roll to January 1, 2000, then you'll have no problem.

However, another method - automated testing - is perhaps a better alternative. You can still access existing files if you fail the test - something the other method can't guarantee. UK-based Computer Experts sells a software tool kit that performs this kind of hardware testing. The company's site offers a demo tool kit that performs eight of the full version's 10 tests on your PC.

You'd be wise to get this 85K download and check out your BIOS, the most local part of your system and the simplest element to verify. The suite from RighTime is another good way to check the clock and also examine the CMOS parameters.

Both RighTime and Computer Experts offer software solutions if a hardware replacement can't be done.

Next comes the system software on your computer. If you have a Macintosh, the system software is already configured to handle the year 2000 correctly.

Microsoft assurances

Microsoft's system software is not Y2K ready, despite assurances from the company. Although current OSs and applications might handle Y2K correctly, Microsoft's installed base of software will, without question, require patches or fixes to become Y2K compliant. That installed base comprises much of the PC software currently in service.

Also, all of the custom-developed applications that used 16-bit development tools like Visual Basic 4.0 (as well as custom macros in Excel that use two-digit year codes) will fail if they were developed using the default modes of those tools. What's worse, MS-DOS, Windows 95, Windows 3.1, Windows 3.11, and Windows NT 3.51 (prior to Service Pack 4) will also fail in their default state.

Even though it's possible to reset and reconfigure the default state in Windows, MS-DOS will require users to input - every time - the full four-digit year to function correctly. Applications like FoxPro also assume the default century to be the twentieth. In addition, the OS must handle the fact that February 29, 2000 is a leap year.
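Both defaults mentioned above - the assumed century and the unusual leap-year status of 2000 - can be illustrated in a few lines of modern Python (used here purely to demonstrate the rules; the original tools were MS-DOS, FoxPro, and friends):

```python
import calendar
from datetime import datetime

# 2000 IS a leap year: the "divisible by 400" rule overrides the
# "century years are not leap years" rule that tripped up some software.
print(calendar.isleap(2000))   # True
print(calendar.isleap(1900))   # False

# Two-digit years are inherently ambiguous. Python's %y, like many
# runtimes, applies a pivot (00-68 -> 20xx, 69-99 -> 19xx); legacy
# tools that assumed 19xx for every two-digit year fail here.
print(datetime.strptime("01/02/00", "%m/%d/%y").year)   # 2000
print(datetime.strptime("01/02/75", "%m/%d/%y").year)   # 1975
```

Any system that hard-codes "19" as the century fails both checks at once on January 1, 2000.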

Y2K means business

Let's assume you find and apply patches for all the different OS versions in use in the enterprise. That leaves the core business applications unexamined and vulnerable to date-dependency headaches. Sure, core applications can be evaluated for Y2K susceptibility, but what about the custom overlays developed for those applications? Do custom spreadsheet macros invoke two-digit or four-digit years? Does a guaranteed-clean version of the macro source even exist? Has anyone completed a data integrity test, even if there are no date dependencies? These are the kinds of questions integrators must ask in order to evaluate, and potentially eliminate, Y2K dependencies in the business flow. Each element of the flow must be examined in concert with the other enterprise elements it touches. This methodology becomes critical in the networked environment found in most businesses.
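The macro-audit question above is mechanical enough to automate. Here is a toy scanner in the spirit of the commercial code scanners of the era; the regex, function name, and sample macro lines are illustrative assumptions, not any vendor's actual product:

```python
import re

# Flag macro or formula lines that appear to hard-code two-digit years
# (e.g. "12/31/99"), while letting four-digit dates pass.
TWO_DIGIT_DATE = re.compile(r'\b\d{1,2}/\d{1,2}/\d{2}\b(?!\d)')

def flag_suspect_lines(source_lines):
    """Return (line_number, line) pairs containing two-digit date literals."""
    return [(n, line) for n, line in enumerate(source_lines, 1)
            if TWO_DIGIT_DATE.search(line)]

macro = [
    'due = DateValue("12/31/99")',      # suspect: two-digit year
    'due = DateValue("12/31/1999")',    # fine: four-digit year
]
for n, line in flag_suspect_lines(macro):
    print(f"line {n}: {line}")
```

A real audit would also have to catch computed dates and serial-number arithmetic, which no pattern match can find - which is why the data integrity test the text asks about still matters even when a scan comes back clean.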

It's easy to see that a malfunction of the central server can affect everything on the network. What may not be so obvious is that testing a networked program in isolation cannot give reliable results. The program must be run on a network to give reproducible results.

This is all very well in theory, but where do you start with a real, live business? In his book The Year 2000 Software Crisis co-authored with William Ulrich, Ian Hayes advocates beginning compliance enforcement with "a group of functionally related systems or subsystems. An example is accounts payable or all accounting systems." This core group then becomes the basis for the upgrade. To expand the upgrade, Hayes then includes the surrounding systems and subsystems that:

- Feed data directly into the core system;
- Receive data directly from the core system;
- Are on the same hardware/software platform as the core system;
- Have minimal interfaces with shared data stores, except those used by the core system;
- Share a common end-user base and test environment with the core system; and
- Do not require significant APIs or bridge interfaces.

This kind of approach may work best in batch systems, which characteristically fan in and out around a single core system. Additionally, Hayes warns about the possible effect of information infrastructures on the Y2K effort. Infrastructure issues may come to a head due to the additional stress of Y2K correction and validation. Don't confuse one with the other.

It would be best, of course, to work in an environment where the infrastructure doesn't interfere. But resources in a modern business are always limited, and the typically intense Y2K effort can be a resource hog. Infrastructure issues can't be ignored, but the solutions to this class of problem may have to be more creative than simply throwing money or people at them.

Current conditions

The solution might require going back to the original assumptions on which the infrastructure is based and re-examining them based on current conditions. For example, a cluster of ASR33 paper-tape readers may no longer be needed and may be taking up ports that could be used to increase the bandwidth of other transactions. Making sure that the infrastructure serves the computing mission - rather than the reverse - should be your goal.

If a significant Y2K effort is not already under way in the enterprise, the first order of business for the integrator suddenly involved in Y2K is to develop an action plan for non-compliance failure. Parallel efforts on correcting Y2K dependencies should, of course, go forward, but their timely success is not ensured. Compliance means identifying the dependency problem, correcting it, and testing any introduced corrections for proper function. It takes time to accomplish this, and it is the integrator's responsibility to make the customer aware of this fact. Y2K is going to be a bigger mess than most people think. The time for planning has passed; it's time to take action.

An embedded problem

Y2K tentacles reach into all parts of a business - even into embedded systems like cash registers and ATMs. Here's one likely fix.

Embedded computing systems will be a problem for everyone involved in business, because these devices are so pervasive. Verifone, to its credit, acknowledges in a brochure that certain models of its ubiquitous point-of-sale terminals will fail without updates.

Certain cash registers have already shown data corruption when tested. This machinery requires attention and maintenance if it is to be functional in the upcoming millennium. Everything from gas pumps that accept credit cards to ATMs needs to be checked out by its owners.

Affected systems

Though you must assume that there will be problems due to embedded systems, most non-compliance will not produce corrupt data, because the affected systems will fail outright. The system programming most likely has error checks of some sort for data, and wide data swings (or dividing by zero) could invoke a system shutdown due to errors encountered.

Of course, getting that embedded device up and running again will depend on the actual device, but programmable-read-only-memory (PROM) updating with new programs may end up being the solution for a lot of these devices. On the other hand, most standard computer peripherals like disk drives, laser printers, and modems should make the changeover without incident because they have no century-dependent code in their hardware controllers.

Overcoming denial

Wishing the problem didn't exist is the unfortunate tactic of some customers. A simple demo can help change this thinking.

When it comes to Y2K problems, denial is rampant. Microsoft is being disingenuous on the Y2K subject, probably hoping to avoid a raft of lawsuits. Its FAQ on the subject (year2000faq.htm) expressly states that Microsoft will not provide any warranties for Y2K issues. However, be sure to check y2k_issues & solutions.htm for the company's legacy program Y2K patches.

It should be considered axiomatic that almost all legacy Microsoft products will have to be changed or patched in some manner before they can be Y2K compliant. Microsoft users are typically in denial about the potential problem. It sometimes takes strong measures to break through. One of my favourite demos to CIOs who think "we have no Y2K problems here; we use PCs" is really quite simple. Type "TODAY - Jan. 2, 2000" into Excel's command line. You will see a "divide by zero" bomb if Excel is set to default settings. For people using a lot of macro-embedded worksheets to do business, this experience can be very scary.

Let's hope it will open their eyes to how pervasive Y2K compliance needs to be in the enterprise. Incidentally, New Art Technologies (www.newarttech.com) sells a code scanner for Excel worksheets called NA-Excel that identifies date dependencies.

This product might help calm that CIO hopping around the room mumbling gibberish.
