Rob Buckley – Freelance Journalist and Editor

A problem of scale

Conventional wisdom says that Microsoft software does not scale well enough to be truly considered “enterprise class”. But as Windows 2000 gains popularity, is it time to re-assess Microsoft's high-end claims?

In the technology business, extravagant claims for new products are as commonplace as they are misleading. And the claims made in 1993 by Microsoft marketing staff about the scalability and reliability of the company's Windows NT Server operating system are a classic case in point. Unfortunately for Microsoft, it soon became clear that NT could neither compete with Unix and other high-end operating systems nor live up to its myriad other marketing promises.

The company's latest enterprise operating system, Windows 2000 (launched in early 2000), now occupies the high end of the Microsoft product line, but IT directors still believe that promises of enterprise-class performance from the company are untrustworthy. But are they right to be sceptical, or is it time to erase the bad memories associated with early NT deployments, and consider Windows 2000 as a serious high-end alternative to Unix?

Analysts at IT market research company Forrester Research seem to think so. The company recently completed a report into the adoption of Windows 2000 as a serious server platform, which concluded that it is more than capable of doing the “heavy lifting” necessary. Moreover, Forrester cites some real-life success stories: the USA Today web site, a six-server system capable of delivering up to 100 million pages a day; and retail chain JC Penney's site, which last year served on average three million pages and performed $1 million of ecommerce transactions each day. Similarly, surveillance applications at the Nasdaq technology stock exchange, after a migration to Windows 2000 Advanced Server and Datacenter Server, can now monitor up to 800 transactions per second.

These two server systems are Microsoft's true high-end offerings. Advanced Server supports computers with up to eight processors and 8GB of memory, has two-node failover clustering (where one computer takes over if another fails), and 32-node load-balancing clustering (distributing tasks between servers so that none are over-worked). It also includes a series of packages for systems management and improved performance and availability.

Datacenter Server, meanwhile, is Microsoft's ultra-high-end package, available only through specific partners. It supports 64GB of memory, 32 processors, enterprise storage area networks (SANs), server-partitioning (so one server can look like several) and four-node failover clusters — all on top of Advanced Server's capabilities.

Microsoft is trying to ensure stability by making Datacenter Server only available on certain hardware configurations that have been pre-tested and certified by its partners. According to Paul Sinton-Hewitt, a manager at Unisys, which is one of Microsoft's Datacenter Server partners, “nearly 40% of all errors in Windows NT were from badly-produced drivers from third-party device makers”. Now, before a Datacenter Server is installed, every element of the system that could bring down the servers is stress-tested for 14 days. “It's a process that's equivalent to the kind of testing that IBM does on its mainframes,” says Sinton-Hewitt.

Mark Smith, Dell's product manager for enterprise servers in the EMEA region, says that customers, after an initial delay where they evaluated the product, are now adopting Windows 2000 even faster than they took up Windows NT. “It's a function of cost of deployment and a more open architecture. Those coming from the mainframe world are astounded at the cost savings.

“But customers really have to decide the level of reliability they need. Do they really need 'five nines' availability (99.999% up-time) or is four nines or three nines good enough?” Choice, via the three server levels of Windows 2000, has increased sales, says Smith: the option of moving from Server to Advanced Server to Datacenter Server means companies know they can scale up as their needs grow, without the loss of investment in applications that a move to a “closed or proprietary architecture” (Unix or a mainframe) would entail.
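Smith's “nines” shorthand translates directly into permitted downtime. A quick calculation of how much outage each availability level allows per year:

```python
# Annual downtime permitted at each availability level ("nines").
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for nines, availability in [(3, 0.999), (4, 0.9999), (5, 0.99999)]:
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{nines} nines ({availability:.3%}): ~{downtime:.0f} minutes/year")
```

Roughly 8.8 hours a year at three nines, under an hour at four, and about five minutes at five – each extra nine cuts the permitted downtime tenfold, which is why it costs so much more to deliver.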

Indeed, a report conducted by DH Brown Associates – and commissioned, ironically, by Sun Microsystems, creator of the Solaris version of Unix – backs up Smith's views. The report compared the relative merits of Windows 2000 and Solaris; while Solaris won on scalability, reliability and ease of use, Windows 2000 came out on top in terms of the choice of hardware on which it will run and the number of applications that will run on it.

But in terms of overall performance, can a Microsoft product really compete with high-end Unix systems running on top-of-the-range hardware? (Sun's Solaris, for instance, has supported 64 processors, fault-tolerance and load-balancing for a number of years.) Al Gillen, an analyst at IDC, points out that Unix vendors have a head start on Microsoft. “That experience will translate into some sort of a core competitive advantage,” he says. And the price of a Datacenter Server implementation – approximately £180,000 – means that Microsoft's traditional price advantage is not so obvious at the high end.

The performance benchmarks, however, are clear: at the medium to high end, Windows 2000 has a clear lead over its rivals, putting paid to the scalability accusations. In the standard Transaction Processing Performance Council benchmarks (TPC-C) – which measure transaction-processing performance – databases running on Datacenter Server occupy top and third place, while Advanced Server takes second place. (Fourth to tenth place are all taken by various Unix incarnations.) Furthermore, in the TPC-W tests, which measure web serving performance, the top five places for servers processing up to 10,000 items per second were taken by Windows 2000.

However, at the ultra high-end, Unix still has an edge. In the TPC-W tests for 100,000 items, Windows 2000 comes in third. In the decision-support arena, the TPC-H tests for databases of 100GB show Windows 2000 in second place behind Linux; worse still, it comes in a lowly eighth for databases of 300GB. And in the Standard Performance Evaluation Corporation SPECweb99 benchmarking test conducted in December 2000, a Linux-based web server proved to be three times faster than Windows 2000.

Benchmarks can be deceiving, though. Michael Long, a senior developer at IT consultancy Hampton-Tilley Associates who has written several studies on transaction processing, says that the near-linear scaling of Windows 2000 as extra servers are added to configurations is suspicious. “If I saw a trend like this in one of my benchmarks, it would raise serious questions about my environment. Near-linear almost never translates into better per-server numbers.” He says that a closer examination of the benchmarks shows that Microsoft Transaction Server, a high-end feature of Advanced Server and Datacenter Server for distributed transactions, had been switched off.
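Long's scepticism has a simple arithmetic basis: if any fraction of the work is serialised (locking, coordination, a shared transaction log), per-server throughput must fall as nodes are added. A sketch using an Amdahl-style model (the 5% contention figure is purely illustrative, not drawn from the benchmarks):

```python
# Amdahl-style scaling model: even a small serialised fraction of the
# workload erodes per-server efficiency as servers are added.
def speedup(n, serial_fraction=0.05):
    """Throughput multiple from n servers if a fixed fraction of
    each transaction's work contends on a shared resource."""
    return n / (1 + serial_fraction * (n - 1))

for n in (1, 4, 16, 64):
    s = speedup(n)
    print(f"{n:3d} servers: speedup {s:5.1f}, per-server efficiency {s/n:.0%}")
```

Even at 5% contention, 16 servers deliver barely nine servers' worth of throughput – which is why genuinely near-linear results invite questions about what has been switched off.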

And, naturally, Oracle CEO Larry Ellison accuses Microsoft of skewing the tests. “We have no concerns about Microsoft. They still can't scale. They have this benchmark that works only in the laboratory. It has a three-hour mean time to failure and they chopped up the database into 10 separate little databases. If any one of those fails, it brings down the whole system. You are going to get a major system outage or wrong results every three days. It is a preposterous benchmark,” Ellison asserts.

Those who choose Windows 2000 can indeed have problems. Online brokerage Goinvest.com, which removed its Linux servers and replaced them with Windows 2000 in exchange for “six figures” worth of free software and services from Microsoft, is building a Java-based online trading system for customers. According to Goinvest.com CTO Mohammed Rashid, “Microsoft can come in and provide you with a low-cost solution, and you definitely have a leg up, [compared to] when you had to buy your own transaction-processing monitor and message queue [software]. It's great for companies that are starting up to have that integrated environment.”

But, Rashid adds, the need to customise Microsoft products to work with customers' Solaris and Linux environments “holds you back as you grow”. By contrast, Dell's Smith claims that none of his customers has complained to him about the Windows 2000 installations they have put in.

At the absolute high end where mainframes rule, Windows 2000 still has a long way to go to convince potential buyers they can dispense with their existing set-up. As with most Microsoft products, the Windows server operating system becomes more scalable and powerful with each new release. But while Windows 2000 is making greater in-roads into the high end than NT 4.0 or NT 3.51 did, it may take until the delivery of Windows 2002 or its successor for Microsoft to really demonstrate mainframe-class scalability and reliability.

Interested in commissioning a similar article? Please contact me to discuss details.