Vying for Top Billing

March 1, 2003

Nimble PC-based Windows NT has made it into the OS center ring, and it's stealing the limelight from a muscular old favorite.

Electrical distributors everywhere are experiencing tremendous pressure to upgrade their computer systems to better respond to market needs and stay current with the latest technologies. This constant pressure to evolve is driven by escalating customer expectations for improved performance, endless new software releases and the need to keep at bay the ever-present danger of hardware and software obsolescence.

To complicate matters, many electrical distributors are burdened by "legacy" systems-existing, outdated systems that contain critical data but lack the power or the agility to meet current business needs or the sophistication to support new technology. Since these older systems are by nature resistant to modification, it's clear why the choice is often to scrap the legacy system in favor of a completely new information system-ideally, one that won't become a legacy itself in a few years. So what is the right system for electrical distributors today?

Until recently, personal computers simply could not provide the processing power and subsystems to drive graphics applications. UNIX mainframes did have the power, and in many venues they are still the masters of high-end graphics applications.

But the advent of the Pentium II processor and new high-end peripheral devices has placed PCs in the ring with the big boys. Today, PCs can display graphics more quickly and smoothly than any computer other than an expensive graphics workstation. Faster multiprocessing technology for PCs allows a network full of users to access a single program seemingly at once. PCs can be configured to blow the socks off traditional workstations for about the same amount of money. So the question becomes which type of operating system to go with: a centralized system such as UNIX, or a distributed, PC-based Windows NT system?

Why an operating system matters. Every computer has an operating system. An operating system is the program that controls all the other parts of a computer system, both the hardware and the software. Most importantly, it is what lets applications make use of the facilities provided by the system's core program.

Operating systems determine which applications will work, what those applications will look like, and how they will work together. For example, if you want to run Microsoft's Office applications suite (Word, Excel, Access, etc.), you're out of luck with a Linux operating system. Those applications won't run on Linux. They will run on OS/2 for now, but only as long as IBM keeps that operating system going; Microsoft abandoned IBM's OS/2 in 1991.

Where do I start? It's a huge decision to change the information system within a company, because that system essentially runs a distributor's business. And it's no small investment, especially if a distributor feels he should change out his old "dumb" terminals in favor of newer PCs.

According to Jay Walther, vice president of marketing for Eclipse, Inc., Shelton, Conn., the reason to change is not because of the hardware; your current hardware may be just fine. "The reason to change is because your business systems are not giving you the functionality that you need to compete in the market or work with your vendors in the supply-chain arena," he explains. "If your vendors are requiring you to send their purchase orders to them by EDI, or don't want their rebate reports faxed to them anymore because they want them EDI'd, then your system should be able to do that." If your system can't do it, then you should start looking at your software options and see to what hardware those options lead.

Eclipse is now working with an industrial distributor in California to install a 36-user system throughout the company's seven branches. The price tag is $222,000 for the system. "The hardware component of that purchase is $42,000, so it's less than 25% of the cost of the system," he says. "If you could get that same hardware for $35,000 and save $7,000, you're saving 3% on the whole package. It's not a big deal. The hardware should be the last consideration, not the first."
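To put those figures in perspective, the arithmetic behind Walther's point is simple enough to write down. The short C sketch below is purely illustrative; the dollar amounts are the ones quoted above, and the program itself is not part of any vendor's system.

    /* Back-of-the-envelope arithmetic for the system purchase quoted above.
       The dollar figures come from the article; everything else is an
       illustrative sketch, not part of any vendor's software. */
    #include <stdio.h>

    int main(void)
    {
        double system_cost   = 222000.0; /* total price of the 36-user system */
        double hardware_cost = 42000.0;  /* hardware portion of that price */
        double cheaper_hw    = 35000.0;  /* hypothetical discounted hardware */

        /* Hardware as a share of the whole package: roughly 19%. */
        double hw_share = hardware_cost / system_cost * 100.0;

        /* Saving $7,000 on hardware trims only about 3% off the package. */
        double savings = (hardware_cost - cheaper_hw) / system_cost * 100.0;

        printf("Hardware share of total package: %.1f%%\n", hw_share);
        printf("Savings from cheaper hardware:   %.1f%%\n", savings);
        return 0;
    }

Run against the quoted numbers, the hardware works out to under 19% of the package, and the $7,000 hardware discount shaves only about 3% off the total, which is the heart of Walther's "hardware last" argument.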

Evaluate your company's needs. Many organizations plan to "migrate" their legacy systems to newer system environments or a single product line of systems. But many of these efforts are less than successful because they concentrate on a narrow set of software issues without fully considering a broader set of company-wide, or "enterprise," management and technical issues. A dangerous pitfall in determining the needs of your new enterprise system is to let processes founded in the legacy system define how you will do things with your new system and its architecture.

Deciding on a system must begin with the people that will use it. Les Johnson, information technology manager for North Coast Electric Co., Bellevue, Wash., suggests having everyone who will use the system in his or her daily processes put together a wish list. That includes the people doing the buying, warehouse personnel, inside and outside sales people, operations and branch managers, the finance people. "Everyone needs to have a voice in it, and it needs to be fairly democratic," he says. "Take that functional description of what people feel they need to have, and from that make a functional request for proposal. Then match it against the systems that are out there."

Once you have decided on the application software and determined which operating system will let you do the most of what you want to do, choose the hardware that will allow the system application to run most efficiently.

Eenie, meany, miney, moe. . . Which OS is the way to go? "UNIX" has become the de facto standard for building large-scale application servers such as Internet services, database-management systems and transaction processing systems for a very simple reason: UNIX solutions are capable of handling the load. UNIX encompasses a variety of vendors and an assortment of technologies, but as a group, they generally adhere to a standard operating environment, and applications are available for multiple platforms.

Then there is the upstart NT operating system from Microsoft Corp. that has begun to make computer system vendors and users alike really sit up and take notice. Is Windows NT able to handle everything large-scale UNIX environments can-and do it cheaper and more simply? NT's strength is its price over performance ratio, where increasingly powerful and inexpensive hardware is beginning to give it the edge over proprietary, high-end UNIX platforms.

However, Windows NT does have its weaknesses. Current versions of NT still show signs of immaturity, such as a less than stellar record on fault tolerance (its ability to stay up and running). Other notable problems include its lack of scalability (its inability to continue to function well as it grows in size or volume to meet a user's needs) and its limited directory services (a directory service identifies the resources on a network, such as e-mail addresses, computers and peripheral devices like printers, and makes them accessible to users and applications). NT directory services are limited to NT domains, which don't tie into non-NT networking.

Choosing Windows NT over UNIX means gambling not only on NT's present capabilities, but also its future performance and Microsoft's commitment to its improvement.

Evaluate the operating systems. Until Bill Gates got into the mid-range computer business with his NT operating system, most distributors had a fairly cut-and-dried decision where information systems were concerned. A centralized, mainframe system, with an assortment of dumb, or "green screen," terminals tied to it, running under some type of UNIX-based operating system, was the vehicle of choice for storing data and producing reports. Sun Microsystems, Inc., Hewlett-Packard Co., and IBM Corp. among others, all offered their own proprietary versions. Each vendor had its own UNIX flavor, but the processing methodology and data handling were pretty much the same. PC-based systems were out of the question because they didn't have the processing power, speed or data storage capabilities needed to run the average business. But with the advent of Windows NT and the NT server, a client/server based system is a viable and often less expensive alternative to the big mainframe processing of the UNIX-based systems.

Which system makes the most sense depends on what the wholesale distributor is doing and what he wants to get out of it in the future, according to Joel Kremke, vice president of marketing for NxTrend Technology, Inc., Colorado Springs, Colo. In both systems the software applications are the same. Most people in the distribution software business agree that if the systems are configured correctly, the speed will be about the same. You can link PCs and present graphically from a UNIX system, but with its user-friendly interface, NT can present data graphically much more easily. Security, which used to be a concern with PC-based systems, is no longer an issue either.

According to Kremke, the fundamental reason why security is not an issue any more is that database companies, such as Oracle and Progress, are really providing the bulk of the security in any application, along with the application vendor. "The operating system is still going to have to have some level of security," he says, "but the forward-thinking vendors, both from a database standpoint and from an application standpoint, are running their applications on NT, and a lot of their former concerns over security are just going away."

Kremke says both systems are very solid. However, he finds most of his customers are still favoring UNIX, and the primary reason is fault tolerance. He says NT has made tremendous strides in that area in the last 24 months. "The fundamental difference now is about 95% to 96% uptime with NT systems vs. 98% to 99% with UNIX systems," he says. "That doesn't sound like that big a difference, but if you extrapolate that over the course of a year, that difference is maybe a day or a day and a half of uptime, which is not insignificant when you're taking orders. If you have an electrical distributor with potentially tens of thousands of line items to be processed every day, that extra day and a half could make the difference between profitability and non-profitability."

It's easy to see why the larger the company, the more of an issue this becomes. But Kremke is quick to add that many people are still considering NT systems because Microsoft has made such terrific strides in improving the fault tolerance of NT in the last 24 months. "If I'm going to be investing in applications system technology in 1999, looking at what Microsoft has done in the last 24 months, maybe it's a pretty safe bet that they're going to continue to improve it," he says. "Then I'm going to derive all the other benefits that I would get out of an NT system."

Some of those benefits include being able to easily integrate Microsoft applications, such as office automation tools, which is, of course, cleaner with an NT system. "This is an obvious statement but one that needs to be put out," says Kremke. "Most people are running Microsoft Office, most people have Word, most people have Excel. Many people use Microsoft Outlook for their e-mail system and calendar system. And that's all interactive with NT. UNIX can be, too, but the integration is obviously a lot tighter when it's all out of one family."

"It's sort of like the beauty is in the eye of the beholder," says Mike Wentz, vice president of marketing for Trade Service Systems, Inc., Blue Bell, Pa. "If you're a PC-oriented person, the NT looks to be a very appealing way to go. If you have no background with PCs or LANs, the old traditional UNIX way of direct connecting terminals rather than PCs is probably a little easier to digest initially for a first-time user. But I think the writing is on the wall that NT seems to be the dominant future for computing for electrical wholesalers we would serve."

Scott Deutsch, vice president of marketing for Prophet 21, Yardley, Pa., agrees with Wentz. He says it's in the eye of the beholder and in which direction the company's management team is pointed. "To typically walk into an NT environment, you will need information technology expertise," he says. "If you have an installation of any size, you will need a full-time, dedicated staff of some size to manage your NT-based IT infrastructure. A distributor will have to have someone who understands how NT BackOffice works, who understands how SQL Server works or DB2, or whatever the database of choice is."

Deutsch says another element is the distributor's client strategy. In an NT environment, a person can walk into a CompUSA and get any PC application software that he feels will help him do his job better. "I don't have that same freedom in a dumb-terminal world," he says. "I think it limits your horizons on how your organization can use technology to your advantage and to benefit your staff."

Flexibility is the reason Michael Fromm, president of Fromm Electric Supply of Reading, Reading, Pa., is convinced NT is the way to go. Currently running MPE, Hewlett-Packard's proprietary operating system, on a centralized HP system, he says the SQL Server technology of NT puts the computing power in the hands of the people using the computers rather than forcing users to ask for reports to be written when they need them, as is necessary in a UNIX environment. It allows much easier access to business information.

"In the NT environment, we can get more computing power into the hands of more users for less money because the SQL Server PC-based environment is far more adaptive," he says. "When you're faced with requests from internal users, customers or suppliers that may require you to either transmit information or produce information in a way different from normal, the mainframe environment requires hard-core programming with proprietary codes, whereas the SQL Server environment is a very open architecture and allows you to essentially customize anything around the needs of the requestor. And that's why we're migrating in that direction as fast as possible."

Another fundamental issue is scalability. UNIX is still more scalable than NT. Why? "UNIX is just longer in the tooth," NxTrend's Kremke explains. "You've got something that has been around since the '60s and in that period of time it has become very scalable." The UNIX product has just had more time to work out the bugs, so it's not unusual to have thousands and thousands of users on a UNIX system, and you just don't see that with NT. But NT is really coming up fast, and that's why you see a lot of big organizations taking a hard look at it today.

Stan Kuruvilla, Fromm Electric Supply's MIS director, says, "We are a $35-million company and we're not looking at UNIX. It all comes down to economic factors. Getting the software, installing the software, maintaining the software-all are cheaper in the PC environment." He believes a company would need to be approaching the $100-million mark before it might start feeling the constraints of a PC-based system.

So why would anyone consider going with a high-dollar UNIX system instead of the more flexible and less costly NT system? "Part of it is that NT has received some bad raps in the market on scalability and performance," says Deutsch of Prophet 21. "And to a degree, they are absolutely correct. There is sort of this size company that would feel comfortable and have no scalability problems with NT 4.0 or with the pending NT 5.0 (aka NT2000). It's not an issue for the marketplace that we service, for companies that are up to about $150 million to $250 million. NT scalability will only go further up, and failsafe performance of that system will only get stronger and more bulletproof every year. What UNIX has over NT is maturity and time."

That's why Elliott Electric Supply Co., Inc., Nacogdoches, Texas, went with UNIX. Elliott Electric currently runs UNIX on three IBM RS6000 servers that support about 300 users. "Scalability and reliability are definitely the main reasons we went with a UNIX platform, plus there's a lot of software for wholesale distribution that runs on it, whether it's third-party or you develop it yourself," says Phil Hale, the company's director of computer services. "It's a very fast machine and flexible. Even though we were also in the process of converting from dumb terminals to PCs, we still felt it was the best platform.

"The reason is that there are still some advantages to UNIX." He says. "The speed is very fast if you keep all the data in one place and everybody's running their programs off the UNIX system. We are starting to do some development in client/server, which has a lot of advantages as well, but our database will still be on a UNIX platform just because it's so reliable."

Many companies are opting for a mixed environment-a centralized UNIX-based operating system for their database and mission-critical information, such as accounting and inventory data, with PCs hooked up to various NT servers for e-mail and office-automation-type applications. Computer-system vendors have recognized this need and have developed software that will allow the mixed environment, with the capability of easily moving the entire application to a total NT environment at a later date if desired.

Software companies also have developed Internet-based PC software that will interface with UNIX systems.

What about price? There is a common perception that the NT environment costs less than a UNIX environment. Trade Service's Wentz says he's not sure that's necessarily true if you add up all the costs. But he thinks the difference in price is irrelevant anyway. "I think in the end a lot of people won't base their decision on cost," he says. "It will be more the user interface and the owner's perception of what the future trend is going to be."

For most enterprises, UNIX is typically more expensive. It has to be obtained from a vendor, and most of these versions tend to be proprietary. IBM has its own version of UNIX, as do Sun Microsystems, Hewlett-Packard and others. So even though a common thread links all of them, you can't use Sun's system on IBM's hardware. If you go with IBM, you have to use IBM hardware, IBM software, and so on. "That automatically constrains me in terms of the choices I have," says Kuruvilla of Fromm Electric Supply of Reading. "And prices are exorbitant from some of these vendors. That's not true of the PC market. And not only are the prices absurd from some of these (UNIX) vendors, they often have service contracts that escalate the price even further."

But there are other cost considerations besides the hardware and software price. Distributors that have a lot of PCs installed might find an NT system a little easier because everyone would already be familiar with the Windows operating system. And if they have a lot of PCs, they probably already have a network in place and are familiar with the ins and outs of local area networks (LANs). If you have no background with PCs or LANs, the traditional UNIX way of direct-connecting terminals rather than PCs is probably going to be more comfortable.

Another thing a distributor should look at is his communications environment. A company with multiple locations would need a wide area network (WAN) for an NT solution. For most distributors that would mean an upgrade, according to Dean Jester, a regional manager and partner in Trade Service Systems. "Most distributors running in a UNIX environment have a very low-cost, very efficient branch network in place that would not be considered a WAN," he says. "It probably wouldn't have the necessary bandwidth."

Wentz says there is no question that NT is a more complex operating environment. UNIX is mature and very well developed; all the tools to operate it are very clean and require very little operator interaction. "With NT it's more complex to set up a user and bring a new person onto the system," he explains. "So operationally it requires a higher level of expertise to support, and you do need a person with that skill set."

Michael Croxten, vice president of marketing for Software Solutions, Inc., Duluth, Ga., also has noted that the data processing professionals today are developing a lot of skills along the way that are Microsoft-centric, especially if they're working on the client side of the market. "If these professionals go to a mixed environment, that is Microsoft clients and UNIX server, they're going to have to have a different, specialized skill set," he says. And, of course, specialized skills almost always come at a higher cost.

The operating system on which you choose to base your IS may even affect your ability to recruit new talent. "You bring someone in who just graduated from college today and sit them down at a PC, they will be very productive because PC tools are very standard," says Jester. "Even though they may be sitting down to an application they've never used before, all the tools that surround it, the way they do things and access data, are the same as they learned in school."

As the dust settles. . . There seems to be a sense of inevitability among vendors and information systems professionals that Microsoft will in time succeed in making NT the dominant network operating system. A glance at its track record in the office automation race will reveal plenty of evidence that Microsoft knows how to take an idea to market and make its competition run for cover.

"Twenty-four to 36 months ago the people who were buying NT were on the vanguard, on the cutting edge," notes Kremke of NxTrend. "Today NT is reaching Main Street. What's that going to mean to us 36 months from now? Will it slow down, will it stabilize, or will it overtake the mid-range market? Our assumption is that Microsoft is going to continue to get more and more market share from traditional UNIX and other midrange system-type vendors such as AS400."

Kremke is not alone in his assumptions. Trade Service's Wentz says most people who are closely following Microsoft and the computer industry realize that it's putting tremendous resources behind NT. Industry watchers expect that Microsoft will boost NT's scalability significantly with NT 5.0 (otherwise known as Windows 2000) and its shaky fault tolerance will become less of an issue. Analysts from the GartnerGroup, Stamford, Conn., predict that by the year 2002 a single NT server will be able to support up to 1,500 users on a network.

"It's clear that this is the direction that everyone is going," says Wentz. "Microsoft is the largest software company in the world, and it's setting the direction in the marketplace."

UNIX was born in 1969, when a joint project between AT&T's Bell Laboratories, General Electric, and the Massachusetts Institute of Technology died. Multics, the project in question, was an experimental operating system on the GE 635 computer. Ken Thompson and Dennis Ritchie, both from Bell Labs, had been exposed to the Multics project. They wanted to port a game from the GE 635 computer, which was running Multics, onto a Digital Equipment Corp. (DEC) PDP-7 computer. To help with the porting, Thompson wrote a simple file system and some utilities for the PDP-7. This code, nicknamed UNIX as a pun on Multics, would later be expanded into the UNIX operating system.

In 1970, Bell Labs purchased a PDP-11/20, and UNIX became an official Bell Labs private research project. The goal of the project was to design an operating system (OS) that was simple in form, written in a high-level language rather than assembly language, and able to provide interactive access to AT&T personnel who were doing word processing. Typical vendor operating systems of the time were extremely large and hard to understand, written in assembly (machine) language, used a fairly rigid "input-process-output" format and were not portable from one hardware platform to another.

The only part of UNIX written in assembly language was the kernel portion, which is the core of a computer operating system and provides basic services for all other parts of the operating system. The shell, the outermost part of an operating system that interacts with user commands, was written in a high-level language called C-a language designed specifically for writing an operating system.

Partly due to U.S. trade restrictions on AT&T, from 1974 through 1977 UNIX source code was distributed free to universities, where it became so widely used that it is now the accepted standard operating system within the academic community. It was during this time that a team from the University of California at Berkeley began working to improve UNIX. Meanwhile the AT&T version was also developing. In 1979 AT&T announced its intention to commercialize UNIX and released a more portable version of the program (Version 7). From Version 7, three major UNIX versions emerged: BSD (Berkeley System Distribution) UNIX, XENIX, and AT&T's System V. The DARPA (Defense Advanced Research Projects Agency)-sponsored development of the Internet was done on BSD UNIX, and most of the early commercial vendors of UNIX (Sun's SunOS, DEC's Ultrix, etc.) were largely based on BSD UNIX.

By the early 1980s, commercial interest in UNIX was growing. Because it was rewritten in the C language designed for operating systems, users could write systems applications in C and easily make use of all of its operating system facilities. With a UNIX operating system, applications programmers could quickly write the sophisticated programs needed to access those OS facilities that supported the more advanced and complex applications for network access, multi-tasking and interprocess communications.

In 1983 Sun Microsystems produced a UNIX workstation. That same year, AT&T released UNIX System V, its first commercial UNIX operating system, and the Berkeley team released version 4.2 of BSD. By 1987 major hardware vendors such as Hewlett-Packard and IBM were feeling enough business pressure to develop a "UNIX" OS for their hardware. This was also the year that AT&T and Sun Microsystems jointly agreed to cooperate on UNIX development to merge and unify AT&T's System V and BSD. In 1990, AT&T issued System V Release 4 as a new standard unifying UNIX variants. The following year, freely distributed Unix clones such as Linux and FreeBSD began to appear, returning the operating system to its early, freely shared roots.

The "official" trademarked UNIX is now owned by the The Open Group, an industry standards organization, which certifies and brands UNIX implementations. Shareware versions of Unix such as Linux have drawn attention as companies announced their support for the platform, in some cases as a protest against the dominance of Microsoft. As a naming convention, any system that is certified by this body styles the UNIX name in all-uppercase, while generic, shareware and uncertified versions of the OS are written in title case: Unix.

The history of Windows NT goes back to the early 1980s. The original Windows system was developed at Microsoft to run on top of the MS-DOS operating system. Microsoft joined forces with IBM to create a more powerful DOS that would run on the Intel (x86) platform. The resulting operating system was to be known as OS/2. At the same time OS/2 was being developed, Microsoft decided to build a processor-independent operating system that would sit in roughly the same market position as UNIX. They planned to accomplish this by writing the OS in the C programming language, a language that is easily moved from platform to platform. Microsoft isolated the part of the operating system that had to be written for specific hardware in something called the Hardware Abstraction Layer (HAL). When Microsoft wanted to move NT to different platforms, all they had to do was recompile the source code for the new hardware and create a new HAL.
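Since the paragraph above describes the HAL only in general terms, the short C sketch below shows the basic idea of isolating hardware-specific code behind a table of function pointers. The names used here (hal_ops, hal_x86, kernel_boot) are invented for illustration; they are not Microsoft's actual HAL interfaces.

    /* A minimal sketch of the hardware-abstraction-layer idea: portable code
       calls through a table of function pointers, and only the table's
       implementation changes from one hardware platform to the next.
       All names here are illustrative, not Microsoft's real HAL API. */
    #include <stdio.h>

    /* The portable part of the OS calls hardware only through this table. */
    struct hal_ops {
        void (*enable_interrupts)(void);
        void (*write_console)(const char *msg);
    };

    /* One implementation per platform; supporting a new CPU means writing a
       new table like this one and recompiling the portable code. */
    static void x86_enable_interrupts(void) { /* would issue an STI instruction */ }
    static void x86_write_console(const char *msg) { printf("x86 console: %s\n", msg); }

    static const struct hal_ops hal_x86 = {
        .enable_interrupts = x86_enable_interrupts,
        .write_console     = x86_write_console,
    };

    /* Portable "kernel" code: nothing platform-specific appears here. */
    static void kernel_boot(const struct hal_ops *hal)
    {
        hal->enable_interrupts();
        hal->write_console("kernel up");
    }

    int main(void)
    {
        kernel_boot(&hal_x86); /* swap in a different hal_ops for another CPU */
        return 0;
    }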

In 1988, Microsoft hired David Cutler, an operating systems engineer from Digital Equipment Corp. (DEC) to help them design their "New Technology" OS, which was to be called OS/2 NT. The architecture of Windows NT would closely resemble that of UNIX, with other features derived from DEC's VMS operating system.

Meanwhile, Microsoft launched Windows 3.0 in May 1990, and it overwhelmed IBM's fledgling OS/2 desktop market. The decision was made shortly thereafter to base NT on Microsoft's current Windows system, version 3.0, and not IBM's OS/2. The OS/2 NT name was dropped, and it became Windows NT. On Sept. 17, 1990, Microsoft and IBM announced a split in the continued development of OS/2. IBM would work alone on the 1.x and 2.x versions, and Microsoft would work on the more advanced 3.x version, which was specified as a 32-bit, portable, multiprocessing, multitasking operating system with advanced security features. IBM continued development of its operating system, which ultimately became the first version of OS/2 Warp, a 32-bit operating system that ran only 16-bit Windows applications and offered neither portability nor multiprocessing capabilities.

With IBM more or less out of the picture, Microsoft threw its efforts into further development of NT. Windows NT version 3.1, so named because its graphical user interface (GUI) was grafted from Windows 3.1, was released in July 1993. It was a pure 32-bit operating system.

While sales of Windows 3.1 skyrocketed, the market for Windows NT 3.1 failed to meet expectations. Prospective customers were appalled by Windows NT's monstrous resource requirements: 16M of RAM and about 100M of fixed-disk space just to run the operating system. Plus, this "memory hog," as it was dubbed, required hardware vendors to write new 32-bit device drivers for their products, limiting support for existing adapter cards and peripherals. Not surprisingly, according to industry reports total sales of Windows NT 3.1 were in the low hundred thousands, most of which were only for evaluation, not production use.

Then, in 1994 fate seemingly intervened on behalf of Microsoft. The cost of PCs that could run Windows NT fell dramatically during 1994. The price of dynamic RAM (DRAM) was in a free fall, as was the price per megabyte of fixed-disk storage, all of which minimized the hardware cost penalty for adopting Windows NT. To capitalize on this turn of events, Dave Cutler's Windows NT development group went to work to reduce the minimum RAM requirement (by about 4M), enhance networking features, improve overall performance, and port Windows NT to Digital's Alpha AXP RISC (Reduced Instruction Set Computing) processor. The result of this effort was the September 1994 introduction of Windows NT 3.5.

Microsoft has continued to refine its operating system over the years. The fourth version, made from a staggering sixteen million lines of C and C++ code, was released to production on July 31, 1996. In April 1998, NT 5.0 went into beta testing. It promises to include 64-bit processing support, as well as a host of other new technologies.

"Client/server" describes the relationship between two computer programs in which one program, the client, makes a service request from another program, the server, which fulfills the request. For example, to check your bank account from your computer, a client program in your computer forwards your request to a server program at the bank. Typically, multiple client programs share the services of a common server program.