Windows 2000 is the most exciting and important product that Microsoft has ever created: a family of operating systems aimed at the corporate server and desktop markets that combines the latest technology with ease of use and simplicity. As the first major upgrade to Windows NT since version 4.0 shipped in July 1996, Windows 2000 represents the next generation of the reliable, scalable, and secure operating system we've all come to know and respect. Originally titled Windows NT 5.0, Windows 2000 was renamed in late 1998 when it became clear that this release would extend past its NT roots to embrace markets currently dominated by mainstream Windows products such as Windows 98.
Today, each edition of Windows 2000--including the desktop-based Professional Edition as well as the various members of the Server family--targets a specific portion of the business market, making a compelling case for complete Windows 2000-based solutions. Windows 2000 Professional now supplants Windows 98 as the obvious choice for mobile users, with its advanced power management features, offline file capabilities, and integration with Windows 2000 networks. At the high end, Windows 2000 Datacenter Server scales to heights previously unknown to Windows NT, offering support for up to 64 GB of RAM, as many as 32 processors, and advanced clustering. Windows 2000 scales up--and down--into markets that are new for a Windows NT product. In this way, Microsoft has greatly increased the number of possible usage scenarios for Windows 2000.
But Windows 2000 Server is the true high point of the Windows 2000 product line. Encompassing a family of products that includes the standard Server edition, Advanced Server, and Datacenter Server, Windows 2000 Server scales from small office/home office duty to the largest data warehouses on the planet. In this chapter, we'll take a look at the history, design, and usage scenarios of Windows 2000 Server to gain a better understanding of where we've been and where we're headed.
The Evolution of Windows NT and Windows 2000
In 1988, Microsoft Corporation wooed David Cutler away from Digital Equipment Corporation, where he had spearheaded the development of the popular VMS operating system. Given carte blanche to pursue his dream of a next-generation operating system, Cutler and a small group of coders began work on NT ("New Technology"), a microkernel-based operating system that was architecturally similar to VMS. NT was to be a fully 32-bit operating system sporting a modular design, memory protection, and preemptive multithreading in a day and age when PC operating systems offered none of these features. Microsoft had just two requests: NT had to run applications designed for the then command line-based OS/2 operating system, which Microsoft was co-developing with IBM, and it had to achieve POSIX compliance, a key feature for US government use.
Windows NT/2000 timeline (Table)
In the beginning, NT had little in common with any other products at Microsoft, aside from the module that would run OS/2 applications. Cutler and his team designed NT to be modular so that OS/2 and POSIX compatibility could be simply tacked onto the system. Indeed, even the Win32 subsystem that eventually won out is nothing more than a software module on par with its OS/2 and POSIX siblings.
Then Microsoft shipped Windows 3.0 in May 1990. And everything changed: As millions of copies of Windows flew off store shelves, Microsoft realized that the future of computing was Windows, not OS/2. The company cut its ties with IBM and OS/2 and directed Cutler to make NT at least look like Windows. So Windows NT was born, with a graphical user interface that was almost identical to that of the regular Windows product. And even though Windows NT had nothing in common with Windows under the hood, Cutler's team also built in the ability to run Windows applications so that users could easily upgrade when needed. The plan for marketing Windows NT was finally coming together.
Finally, in July 1993, David Cutler and the Windows NT team shipped their first product, Windows NT 3.1, which was given the same version number as the then-current version of Windows. Windows NT 3.1 included the new NTFS file system, the domain-based networking scheme that NT would use until Windows 2000, and a 32-bit programming interface. Windows NT 3.1 landed with a collective thud, however, because of its lofty hardware requirements and lack of native software. Still, the idea was sound, and as hardware finally caught up with the design of Windows NT, future versions began to sell well. In September 1994, Windows NT 3.5 was released with faster performance and better administration tools. The watershed release, however, was Windows NT 3.51, which arrived in mid-1995, just a few months before Microsoft's pseudo 32-bit Windows 95, the next major upgrade to Windows 3.x. Windows NT 3.51 included the first version of what was to become COM, the Component Object Model, Microsoft's vision for distributed computing that is finally realized with Windows 2000. It also included Windows 95 application compatibility, which became extremely important when Windows 95 went on to sell more copies than any operating system ever created.
The release of Windows 95, however, with its new user interface, caused a bit of a problem for Windows NT users, who were still stuck with the clunky old Windows 3.x-style interface. Responding to user concerns, Microsoft began working on a version of Windows NT that would look and feel like Windows 95. Windows NT 4.0--featuring the Windows 95 user interface--shipped in July 1996, after only six months in beta. Though largely a re-skinned version of NT 3.51 with few other real-world improvements, Windows NT 4.0 was a resounding success and quickly sold millions of copies (indeed, Microsoft has sold over 30 million copies of Windows NT 4.0 as of this writing). Reliable, secure, and scalable, Windows NT 4.0 was just about everything the corporate market was looking for. And with the rise of the Web, NT's prowess as a Web server soon made it the platform of choice for an entire generation of systems. Windows NT quickly outpaced and then replaced UNIX on all but the highest-end systems. Its only competition on the low end was Windows 95, which ran better on older desktops and mobile computers, and later Windows 98, the successor to Windows 95. Microsoft had captured the minds--and pocketbooks--of virtually the entire computer industry.
The success of Windows NT 4.0 caused a bit of a conundrum for the NT team at Microsoft, then still led by the demanding David Cutler. The future of computing--as seen from a late-1996 standpoint--was clearly Web-based, but there were other issues as well. Computing had become too complicated. Windows NT required too many reboots, such as when networking components were reconfigured. UNIX pundits were beginning to complain that NT's popularity made little sense when UNIX systems were known to scale higher and run more reliably. And the rise of open source solutions, such as the UNIX-like Linux, caused IT managers to begin exploring alternatives.
During the entire lifetime of Windows NT 4.0--basically 1996 to 2000--the operating system came under constant attack from hackers as new security flaws were found. Critics saw this as evidence of NT's vulnerability, but it can more accurately be described as a rite of passage: As NT became far more popular, it became a far more logical target for attackers. In the 1970s and 1980s, VMS and UNIX came under constant attack themselves, as they were virtually the only connected systems available. But the rise of Windows NT brought a new target to the mix. Microsoft responded with several service packs--designed to supply bug fixes in a single collection--and hundreds of hotfixes that addressed individual issues as they arose. Windows NT had its trial by fire and emerged as a more secure system as a result. Indeed, the fact that Microsoft hasn't had to make any drastic changes to the design of Windows NT attests to its fundamental strength and adaptability.
When it came time to design Windows 2000, the NT team knew it had its work cut out for it. Windows 2000 would need to be more secure, reliable and scalable than Windows NT 4.0. But it would also need to be easier to use and more manageable, despite the fact that NT 4.0 was already outpacing the competition. And Windows 2000--then known simply as Windows NT 5.0--would have to scale from a typical desktop or laptop to the largest Intel and Alpha systems in the world. In short, it would have to be the greatest version of NT ever created.
In September 1997, Microsoft released the first beta of Windows NT 5.0. They should have waited: The first beta was almost completely unusable and included none of the exciting new features that Microsoft was promising for the final release. Almost a year passed before Beta 2 was finally released in August 1998, offering testers a glimpse of the future. Finally, on April 29, 1999, Microsoft released the feature-complete Beta 3, offering it up to an unprecedented 500,000 testers in the largest beta test Microsoft has ever conducted. And Beta 3 was a huge success: Most users found it acceptable for production environments, and Microsoft itself quickly moved all of its production servers to the new OS. In late 1999, Microsoft announced that Windows 2000 would become available on February 17, 2000. Windows NT, finally, had come full circle as a mass-market phenomenon. And though the NT name dies with this latest release, its spirit lives on in the tag line for Windows 2000: Built on NT Technology.
Meanwhile, Microsoft is busy working on 64-bit versions of Windows 2000 and the next Windows. David Cutler stepped down as the Windows 2000 project lead so that he could focus on the 64-bit version of Windows 2000 instead, which he sees as the true future of the product.
Design Goals for Windows 2000
When the Windows NT team sat down in late 1996 to determine the features that should be included in Windows 2000, they took a hard look at the changing landscape of the PC industry and the way that computers are used, managed, and administered. The set of design goals for Windows 2000 was honed over time with the input of key customers and partners. And when Microsoft surveyed its top customers, it discovered that two improvements were overwhelmingly requested of Windows 2000:
- Windows 2000 systems must be more reliable and available.
- Windows 2000 must be easier to manage while lowering the Total Cost of Ownership (TCO).
But Microsoft didn't stop there: The NT team also wanted Windows 2000 to be more scalable and resistant to security violations. Let's take a mile-high view of the ways Windows 2000 achieves these goals.
Windows 2000 Is More Reliable
Though Windows NT is overwhelmingly considered far more reliable than other Microsoft operating systems, it falls short of the industrial-strength UNIX systems from vendors such as Sun Microsystems. A "reliable" system should stay up and running 24 hours a day, 7 days a week. Network disruptions should cause backup systems to kick in automatically, so that the user doesn't notice a problem. Windows NT 4.0, in particular, was vulnerable to the so-called "Blue Screen of Death" (BSOD), which usually brought the system to a halt when a poorly written device driver overran the protected memory area of the system kernel.
In Windows 2000, reliability is increased in a number of ways. The kernel and other key OS subsystems are now protected so that applications and device drivers cannot violate system integrity. And a new feature called Windows File Protection (WFP) prevents applications from overwriting key system files with their own, often older, versions. A new version of NTFS--Windows 2000's file system--is more fault-tolerant than previous versions, so the system can be returned to a known state in the case of a hardware failure. In the case of a complete system failure, Windows 2000 reboots much more quickly than Windows NT 4.0 and, if desired, does so without user intervention. This maximizes system uptime.
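Windows File Protection does its work silently in the background, but an administrator who suspects that protected files have been damaged can also trigger a manual scan with the System File Checker utility that accompanies it. A typical invocation--just a sketch, assuming sfc.exe is available on the system path--looks like this:

    sfc /scannow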
Windows 2000 also supports a new Safe Mode Boot so that problem hardware and software can be isolated and removed if needed. The number of actions that require reboots has been reduced dramatically as well: Microsoft has reduced the number of required reboots in Windows 2000 to 16, compared to over 75 in Windows NT 4.0.
Windows 2000 Is More Available
"Availability" refers to a client's ability to access that system. A system that is available is online, running properly, and accessible to others. Windows 2000 Server is classified as "high availability," meaning that a properly configured Windows 2000-based system should be available 99.9% of the time. And other members of the Server family achieve even higher availability with such features as clustering and fail-over support. If you're looking for truly stellar availability, you'll want to look into Advanced Server or DataCenter Server.
Windows 2000 Is More Manageable
Windows NT 4.0 offered an almost ridiculous number of management tools, each designed to handle a single administrative task. While the number of tools available to the Windows 2000 administrator isn't any smaller, Microsoft has at least provided a central shell for admin tools--the Microsoft Management Console (MMC)--that gives each of these tools a consistent look and feel.
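Each of these tools is implemented as a snap-in that loads into a saved console document (an .msc file). Assuming the standard file names, for example, the Computer Management console can be opened from the command line with mmc compmgmt.msc, and running mmc with no arguments opens an empty console into which an administrator can load a custom set of snap-ins.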
Microsoft has also made old-style batch files largely superfluous in Windows 2000 with the addition of the Windows Script Host (WSH), a VBScript- and JScript-based scripting environment that makes it easy to automate management tasks.
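As a minimal sketch of the kind of chore WSH can automate--the server and share names below are placeholders, not real resources--the following VBScript reports who is logged on to which machine and then maps a drive letter to a shared folder. Saved as mapdrive.vbs, it can be run from the command line with cscript mapdrive.vbs:

    ' mapdrive.vbs -- report the logged-on user, then map a network drive
    Set net = WScript.CreateObject("WScript.Network")
    WScript.Echo "User " & net.UserName & " is logged on to " & net.ComputerName
    ' Map drive letter Z: to a shared folder (placeholder server and share names)
    net.MapNetworkDrive "Z:", "\\server1\public"
    WScript.Echo "Drive Z: has been mapped."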
For system administrators, one of the biggest headaches in Windows NT 4.0 was managing a Microsoft domain-based network. In Windows 2000, Microsoft has finally provided a true directory service, Active Directory, which gives administrators a single point of administration for every resource on the network. Using a single data store, Active Directory employs a simple "tree of trees" organizational model for network design, replacing the old Windows NT SAM database. And the Primary Domain Controller (PDC)/Backup Domain Controller (BDC) model of old has been replaced by a simpler Domain Controller (DC) model in which all DCs are peers: A change made to any DC is automatically replicated to every other DC on the network.
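To make the "tree of trees" idea concrete, imagine a purely hypothetical company whose tree is rooted at the domain example.com, with child domains such as sales.example.com and europe.example.com beneath it. Because Active Directory is built on DNS-style naming and LDAP, a user account in the sales domain can be identified by a distinguished name along these lines:

    CN=Jane Smith,OU=Field Reps,DC=sales,DC=example,DC=com

Here the OU (organizational unit) containers give administrators a way to delegate authority within a single domain without having to create additional domains.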
Everything in Windows 2000 is easier to manage: Users (Figure), groups, security (Figure), and other resources (Figure).
Windows 2000 Is More Scalable
In the Windows NT 4.0 world, there were initially two target machines, workstations and servers, which were served by the Workstation and Server editions, respectively. As Windows NT matured, however, other markets opened up, and Microsoft introduced Enterprise Edition for high-end servers with limited clustering capabilities and Terminal Server Edition for networks of legacy machines that needed an easy way to run newer applications through a graphical terminal emulation system similar to the X Window System on UNIX.
With Windows 2000, the range of target systems has grown dramatically. Windows 2000 now scales from a standard corporate laptop or desktop system with the Professional Edition all the way up to multi-machine, multiprocessor server giants with Datacenter Server. As a scalable operating system, Windows 2000 has moved both up and down into new markets for NT, providing support for a wide range of systems. This means that a relatively humble single-processor system can be easily upgraded in the future without the need to reinstall the operating system. It also means that Windows 2000 can exist at virtually any tier of an enterprise network. And with a 64-bit version of Windows 2000 expected sometime in 2000, the circle will be complete. Windows everywhere, indeed.
Windows 2000 Is More Secure
The security subsystem in Windows 2000 has been improved at virtually every level. A single secure logon to a Windows 2000 network can be protected with Kerberos v5 (which replaces the old NTLM authentication from NT 4) or even public key authentication. A new Encrypting File System (EFS) can be used to encrypt data stored on an NTFS volume. And system and network security can be easily managed with group policy objects in Active Directory.
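As a sketch of how EFS is typically put to work--the folder name here is just a placeholder--a sensitive directory on an NTFS volume can be encrypted from the folder's Properties dialog or from the command line with the cipher utility:

    cipher /e /s:D:\Payroll

Running cipher with no arguments reports the encryption state of the files in the current directory, and cipher /d reverses the operation.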
If desired, Windows 2000 systems can even be protected with smart card logon, which provides hardware-assisted security: The user inserts a card into a reader and then supplies a PIN to gain access to the network.
What's New in Windows 2000 Server?
If you're familiar with Windows NT 4.0, the number of new features in Windows 2000 might seem somewhat overwhelming. The Windows 2000 Server family represents a stunning upgrade to the previous generation of NT, with an extensive list of features that are covered throughout this book. The following sections summarize the new features in Windows 2000 Server.
Setup and Configuration
Administrators who need to perform multiple identical installations of Windows 2000 will be pleasantly surprised by the new scriptable unattended Setup option, which requires no interaction during installation. Once a server is installed, you can choose from a list of preset server configurations, each of which automatically installs any required services. Available configurations include Active Directory Server (Figure), Networking Server, File Server, Print Server, Web Server, and Clustering Server.
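As a minimal sketch of how an unattended installation might be driven--the file name, source path, and values below are placeholders, and the exact set of supported keys should be verified against the deployment documentation--an administrator creates an answer file and passes it to Setup on the command line:

    winnt32 /s:D:\i386 /unattend:unattend.txt

    ; unattend.txt -- minimal sample answer file (placeholder values)
    [Unattended]
    UnattendMode = FullUnattended
    TargetPath = \WINNT

    [UserData]
    FullName = "IT Department"
    OrgName = "Example Corp."
    ComputerName = FILESERV01

    [GuiUnattended]
    AdminPassword = "changeme"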
One huge improvement in Windows 2000 is the new Recovery Console, a command-line utility that allows administrators to access NTFS volumes from the Setup boot floppies or CD-ROM so that system recovery can be performed more easily.
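A typical recovery session is built from a small set of built-in commands. As a sketch--the service name below is a placeholder, and the available command set should be confirmed in the product documentation--an administrator troubleshooting a machine that won't boot might run:

    listsvc                  (list the installed services and drivers)
    disable badservice       (prevent a suspect service from starting)
    chkdsk c: /r             (check the system volume for errors)
    fixboot c:               (write a new boot sector)
    exit                     (leave the console and restart the machine)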