Windows 2000 Server Beta 3 Reviewed
With the introduction of Windows 2000 Server Beta 3 (Figure
1), Microsoft raises the bar for mid-level departmental servers,
home/small office servers, and Web servers, offering a reliable
and scalable upgrade to Windows NT 4.0 Server that retains the
best features of its predecessor while improving some of the more
problematic areas. And like Windows 2000 Professional (see my
review), Windows 2000 Server is a compelling upgrade.
A Quick Introduction to the Windows 2000 Server Family
Windows
2000 Server "Standard Edition" sits at the bottom of a new
family of Windows 2000 servers. It supports two microprocessors
(four if you upgrade a Windows NT 4.0 Server box) and up to 4GB of
physical RAM. The next step up is Windows 2000 Advanced Server,
which supports up to 64GB of physical RAM, four microprocessors (8
for upgraders), and comprehensive clustering. Advanced Server, which
will replace Windows NT 4.0 Server, Enterprise Edition when it is
released, is ideal for SQL Server 7.0 database servers and high-end
Web and file/application serving. At the top of the server food
chain is Windows 2000 Datacenter Server, which supports up to
16 microprocessors (or up to 32 processors from select hardware
makers) and 64GB of physical RAM. Datacenter Server is ideal for
high-end clustering, load balancing, data warehousing, and the like;
it's optimized for over 10,000 simultaneous users and high
transaction counts.
This review focuses on the so-called Server, Standard Edition, which
is commonly identified simply as Windows 2000 Server. Note that all
of the features in Server can also be found in the higher-end
editions. And, of course, each member of the Server family includes
all of the features in Windows 2000 Professional.
Job 1: Increased Reliability, Availability, and Scalability
When it came time to design the feature set for Windows 2000 Server,
Microsoft knew that the number one request was going to be for
improved reliability, availability, and scalability. In other words,
Windows 2000 Server has to work 24/7, require few if any reboots,
self-heal when problems occur, and scale from a single-processor PC
to the mightiest multiprocessor server monsters used at ISPs
like Best Internet and Data Return.
The first task, then, was to tackle the nasty memory leak issue
that's dogged Windows NT for years. The problem, as it turns out,
isn't NT per se but rather certain poorly written applications that
are commonly run on NT servers. Administrators know exactly what
I'm talking about: We've all set up schedules to automatically reboot
NT servers at specific times, perhaps once weekly or even once daily,
because of this. In Windows 2000, memory management improvements
prevent applications from leaking, largely eliminating this problem.
And a new kernel memory write protection feature removes the number
two server reliability problem: memory access violations, which
result in the infamous blue screen of death (BSOD). In my own
informal tests, a single server box running SQL Server 7.0, Proxy
Server 2.0, IIS 5.0, and Terminal Services has never needed a reboot:
It's as steady as a rock. While I don't expect Windows 2000 to
completely eliminate the problem, I suspect it will fare much better
than Windows NT 4.0.
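To see why those scheduled reboots became standard practice, consider
a minimal sketch of the kind of leak I'm describing (hypothetical
code, not taken from any real NT application): a long-running service
that allocates memory for every request but never releases it.

    // Hypothetical sketch of a leaky request handler in a
    // long-running NT service. Each call allocates a buffer
    // and never frees it.
    #include <cstddef>
    #include <cstring>

    void HandleRequest(const char* data, std::size_t len)
    {
        char* copy = new char[len + 1];  // one allocation per request
        std::memcpy(copy, data, len);
        copy[len] = '\0';
        // ... process the request ...
        // Bug: no "delete[] copy" here, so every request leaks
        // len + 1 bytes. Over days of traffic, the server's memory
        // footprint grows until someone (or a scheduled task)
        // reboots the machine.
    }

Multiply a few hundred bytes by millions of requests and you can see
why a weekly reboot schedule starts to look attractive.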
The next goal was to reduce the number of reboots needed when
configuration changes are made. In Windows 2000, configuring Plug
and Play devices, changing the size of the page file or adding a new
page file, increasing the size of an NTFS partition, adding or
removing network protocols, installing SQL Server 7.0, or changing
the mouse requires no reboot at all. Microsoft estimates that this
will cut Windows 2000's downtime by 20 percent compared with Windows
NT 4.0. And
indeed, I've witnessed this minor miracle firsthand: It works and it
works well.
If you do encounter problems, the system restarts more quickly, and
a new kernel memory dump option reduces the time required to create
a dump file because it no longer writes the entire contents of RAM
to disk. When you want to debug the problem, a Safe Mode boot option
(similar to what you'd see in Windows 98, actually) lets you boot
into a clean Windows 2000 environment so you can isolate the
offending application or service.
And Windows 2000's Check Disk (chkdsk) is three times faster than it
is in Windows NT 4.0 SP4. Check Disk (similar to ScanDisk in Windows
98) scans the hard drive for errors after a crash or improper
shutdown, and it was a source of frustration in Windows NT 4.0
because of the amount of time it took to complete. I haven't seen
this feature in action yet in Windows 2000, however, because my
server has been extremely reliable so far.
Windows 2000 Server supports up to two CPUs on a clean install (no
previous OS) or four CPUs if you're upgrading from a Windows NT 4.0
system with four CPUs. Why Microsoft has backed off from full
four-CPU support is unclear, but I suspect it has a lot to do with a
desire to spread the Windows 2000 Server family out a bit more.
There were probably few four-CPU systems around anyway, and those
customers would be better off with Advanced Server.
One of the most exciting new
features in Windows 2000, believe it or not, is the new file system,
NTFS 5.0. This little wonder supports all of the features of NTFS
4.0--compression, per-file security settings, and the like--while
adding performance gains and a host of new features. Perhaps the
most important of these is disk quotas, which let you manage storage
usage on a per-user basis, much as UNIX does (Figure 2). You can set
quota limits and warning thresholds for all users or separately for
individual users. And Microsoft has even added the capability to
monitor and report disk space usage on a per-user basis.
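For administrators who'd rather script this than click through the
UI, Windows 2000 also exposes quotas programmatically through a COM
interface, IDiskQuotaControl (declared in dskquota.h). Here's a
minimal C++ sketch, assuming a C: volume, that turns on quota
enforcement and sets default limits; the 500MB and 400MB figures are
purely illustrative:

    // Minimal sketch: enable quota enforcement on C: and set default
    // per-user limits via the Windows 2000 disk quota COM API.
    // Link with ole32.lib. Values are illustrative only.
    #include <windows.h>
    #include <dskquota.h>

    int main()
    {
        CoInitialize(NULL);

        IDiskQuotaControl* quota = NULL;
        HRESULT hr = CoCreateInstance(CLSID_DiskQuotaControl, NULL,
                                      CLSCTX_INPROC_SERVER,
                                      IID_IDiskQuotaControl,
                                      (void**)&quota);
        if (SUCCEEDED(hr))
        {
            hr = quota->Initialize(L"C:\\", TRUE);  // open read/write
            if (SUCCEEDED(hr))
            {
                quota->SetQuotaState(DISKQUOTA_STATE_ENFORCE);
                quota->SetDefaultQuotaLimit(500 * 1024 * 1024);
                quota->SetDefaultQuotaThreshold(400 * 1024 * 1024);
            }
            quota->Release();
        }

        CoUninitialize();
        return SUCCEEDED(hr) ? 0 : 1;
    }

Per-user overrides work through the same interface:
IDiskQuotaControl::FindUserName hands back an IDiskQuotaUser object
whose limit and threshold you can set individually.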