Date: Fri, 5 Jul 1996 19:20:47 -0400
References: <833058917.18622.0@melech.demon.co.uk> <4qg9a5$cdh@goanna.cs.rmit.EDU.AU>
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
Rex Ballard - Director of Electronic Distribution
Standard & Poor's/McGraw-Hill
Opinions expressed do not necessarily reflect
the Management of the McGraw-Hill Companies.
http://cnj.digex.net/~rballard
On 28 Jun 1996, Daniel A. Taylor wrote:
> On Thu, 27 Jun 1996 06:44:30 GMT, Anthony D. Tribelli wrote:
> >David Whysong (dwhysong@dolphin.physics.ucsb.edu) wrote:
> >
> >: I really can't believe I'm getting into this thread, but the
> >: cluelessness of some posters is really getting to me...
> >
> >Perhaps you should elaborate, I suspect you missed something.
> >
> >I was not discussing linux "hanging", I was discussing the necessity of
> >multiuser in an OS. The Ctrl-Alt-Del failure is merely an example of where
> >multiuser is used in linux as the primary recovery mechanism for some
> >problems. Problems there are addressed without multiuser in WinNT.
One of the fundamental differences between NT and Unix in their ORIGINAL
designs is that NT was ORIGINALLY intended to be a single-user machine, a
workstation, until the customer base balked at the prospect of buying
32-64 megabytes of RAM and Pentium CPUs for everyone at the same time. NT
had abandoned not only the 8086 and 80286 markets, but the 80386 and
80486 markets as well. You needed processing power equivalent to a
Cray 1 to run NT with anything resembling normal response.
As a way of "saving face", Microsoft rushed to the strategy of calling NT
a SERVER platform to compete with NetWare and with SQL servers such as
those running under OS/2.
Because multiuser was an afterthought, NT proponents tried to get around
the issue by relying on multithreading (running several connections under
a single unprotected process). The end result was that if the server
process went down, the entire network went down with it - big deal.
> Addressed by the Big Red Button for all the evidence that
> has been presented in this forum. That works for Unix too
> you know.
The difference is that UNIX spent the last 20 years in an environment
where hitting the "Big Red Button" could cost several thousand dollars.
Any Unix administrator worthy of the title will do everything possible
before hitting that big red button. Admins wrote special tools and came
up with ways to back up and rescue crippled systems without actually
rebooting them. Diagnostic routines are kept in a "Safe Place", and user
applications have "rescue" strategies (vi -r).
Perhaps, after 3 or 4 years, NT will have the same type of bullet-proofing
that comes from doing battle in the trenches.
> The Unix multi-user ability serves in error recovery to
> allow the user or administrator to obtain sufficient
> distance from a misbehaving application to be able to
> kill it without taking off body parts. Metaphorically
> speaking of course. (though who knows with some of
> those programmers out at MIT?)
Windows NT and UNIX also have very different views of the "role" of a
server. Windows NT assumes a very sophisticated client that just "needs a
little help" getting to extra disk space and SQL databases.
Unix comes from a history of 200 users pounding away at terminals, often
programming and controlling 5-20 processes each (for pipelines, filters,
merges, and analysis).
From this perspective, there are jobs that NT does do well, mostly "smart
puppy" functions like "fetch my file for me".
When you want to perform a series of transforms, UNIX is designed for
that type of load. For example:

  ls -l | sed 's/dwr/xwr/' | awk '{sum = sum + $3}' ... | more
With proper coupling and a minimal amount of coding, one can perform very
complex tasks in a very short time.
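As a slightly fuller sketch of the same idea (assuming the usual ls -l
column layout, where field 3 is the owner and field 5 is the size in
bytes), this pipeline totals up how many bytes each user owns in the
current directory and pages through the result, largest first:

  ls -l |
  awk 'NR > 1 { bytes[$3] += $5 }
       END    { for (u in bytes) print bytes[u], u }' |
  sort -rn |
  more

No special tooling, no compile step, and each stage can be tested by
itself before it is bolted onto the rest of the pipeline.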
> --
> Daniel Taylor
Rex Ballard