
The Windows Default

It's easier than you think.



There's a lot of talk today about why Windows is leaky. There's a lot of talk because of the ongoing attacks by Zeus, by the Chinese. There's a lot of talk because small to medium-size businesses, especially in the US, are being gutted by a combination of botnets, trojans, and command and control servers. People are losing their identities and their life savings - and to the very last one they're running Windows. So naturally there's a lot of talk about why.

Most of this talk should end up on snopes.com. There's the perennial claim that any platform with a 90% market share is going to get clobbered. There's even a belief that with the right security suite (or suites) Windows can be safe to use.

There's talk about how Apple's OS X can't be trojanised because that would require the user to submit the admin password. It's all a candidate for snopes.com.

The simple fact of the matter is Windows is not safe, has never been safe, and will never be safe. And the reason is just as simple to grasp.

Old Days

Windows grew out of MS-DOS for the IBM PC. This 'system' was fully permissive. Programmers coming to the IBM PC after working on other platforms such as Unix, VMS, and MVS noted they could do anything. They could write directly to hardware ports, directly modify RAM, directly modify disk sectors. There was no thought for security. It was a wide open system and was built to be that way.

There were no accounts on the PC. Whoever got to the box and flipped the power switch owned it. There was no way to lock down files. Adding a 'read-only' attribute to files mattered not one iota - all it did was provide a speed bump. Anyone at any time could remove the attribute because the system had no owner and whoever used it was the superuser. Root.
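
That 'protection' amounts to a single flag that any program can flip back off. A minimal illustration in Python, assuming a file on a FAT-style volume (the file name is hypothetical):

    import os, stat

    path = 'IMPORTANT.EXE'   # hypothetical target file

    # Set the read-only attribute - the only 'protection' FAT offers.
    os.chmod(path, stat.S_IREAD)

    # Any program run by anyone can simply clear the flag again and write away.
    os.chmod(path, stat.S_IREAD | stat.S_IWRITE)
    with open(path, 'r+b') as f:
        f.write(b'\x90')     # patch the first byte - nothing stops it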

Radsoft excelled at 'hacker tools' - direct-to-screen editors, direct-to-disk sector editors, and the like. Binary editors that hooked into executables, dumped them on screen, followed code jumps, followed BIOS interrupt vector calls. The system was wide open.

But the system was also wide open for viruses. The computer virus is an ingenious piece of code that attaches itself to existing program files and finds a way to propagate to other executables. Most of the early viruses were virtually harmless. They just propagated.

Viruses were able to get to ground on IBM PCs because the program files couldn't be protected.

The computer hardware couldn't be protected either, nor could the hard drive. Virus writers developed an ingenious way of attacking boot sectors: moving the real boot code to remote sectors on the hard drive, then going directly into the file allocation table and marking those sectors as defective so no one would touch them. Anything attaching to the system got infected.

All this was possible because the IBM PC was an almost completely open system. No thought had been given to security. None. Security wasn't even on the table.

Older Days

Computer systems before the personal computer weren't personal - they were shared. They helped run many of the most important governmental, military, and financial tasks of the day. These were critical operations and they couldn't be left wide open.

That type of operating system starts by not allowing anything at all. It puts the system firmly in the driver's seat. Permission to do anything at all is granted on an individual basis. By default the user can do nothing at all.

Those operating systems have accounts. The system's administrators, with physical access to the computer, might be able to run it in a mode resembling how IBM PCs were used but ordinary users can't get at the physical box itself and can only access the system by successfully logging in. The administrators control who has access to the system, whether certain individuals are completely denied access to the system, and what each user will be able to do upon a successful login. The administrators, on behalf of the system itself, control all aspects of system use.
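
The default can be sketched in a few lines. The following Python toy model uses invented names - it is not any real operating system's interface - but it captures the principle: the answer is no unless an administrator has said yes.

    # Toy model of a default-deny system. Names are invented for illustration.
    grants = set()   # (user, resource, operation) tuples added by the administrators

    def grant(user, resource, operation):
        """An administrator explicitly allows one user one operation on one resource."""
        grants.add((user, resource, operation))

    def allowed(user, resource, operation):
        """The default answer is no. Only an explicit grant changes it."""
        return (user, resource, operation) in grants

    print(allowed('alice', '/etc/passwd', 'write'))             # False - nothing granted yet

    grant('alice', '/home/alice/report.txt', 'write')
    print(allowed('alice', '/home/alice/report.txt', 'write'))  # True - exactly what was granted
    print(allowed('alice', '/etc/passwd', 'write'))             # still False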

Flaws can be found even here. Tricks can be devised to fool the administrators. The term 'trojan horse' was used early on to describe the Unix 'login' trick: you put a fake login program on a terminal and have it hook through to the real login program after harvesting user account information.

Another cool trick was the 'land mine': you put fake versions of important system administration tools in your own home directory. Programs such as ls. These programs ultimately hooked through to the real system tools - but not before getting the opportunity to set up ghost root accounts for future use.

And there were the occasional bugs. Unix SVR4 had a nasty bug in its method of 'moving' files across physical hard drive boundaries. It was possible to copy out the all-important passwd file, doctor it to add stealth root accounts of one's own, and then 'move' (rather than copy) it back to its original protected location.

Flaws such as these are easily fixed because the basic design is correct and already in place. Simply excluding the current working directory from the $PATH variable completely circumvented the 'land mine' hack. Other techniques are employed today to thwart the first attack. A code fix took care of the third hack. And so forth.
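
The $PATH fix is easy to see in a sketch. The resolver below is a simplified Python stand-in for what a shell does when it looks up a command name:

    import os

    def resolve(command, path_entries):
        """Return the first executable named 'command' found along the search path."""
        for directory in path_entries:
            candidate = os.path.join(directory, command)
            if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
                return candidate
        return None

    # With '.' on the search path, a fake 'ls' planted in the current directory
    # would be found before the real /bin/ls the moment root cd's in.
    print(resolve('ls', ['.', '/usr/bin', '/bin']))

    # With '.' excluded, only the real system directories are ever searched
    # and the 'land mine' can never be picked up.
    print(resolve('ls', ['/usr/bin', '/bin']))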

Again: these are individual examples and often involve a type of social engineering or categorical programming error - the system itself isn't about to allow just anything. By default it allows nothing.

Today

Windows is the opposite. It started as a standalone system that was completely wide open, where anything was possible. It graduated to a messy semi-graphical environment where windows got stuck on screen in a single location and couldn't move around. It got better when the PC got more memory and underwent a sort of renaissance in the early 1990s with the release of versions 3.0 and 3.1. But it was still a totally standalone system - no ownership, no file protection, no nothing.

It's important to keep in mind that if a system can't protect its files then it can't protect itself either. Such a system could be (and often was) attacked at a very low level - through the boot sector or by compromising system files - because there was no way to protect anything at all. It was a standalone system.

David Neil Cutler brought VMS technology to Redmond. VMS is a secure, bulletproof system. It works the way those older operating systems work. It assumes ordinary users will not have physical access to the computer itself and that the administrators will grant and deny access rights as they see fit. VMS users can't do anything they want - only what they've been allowed to do. By default they can do nothing at all.

Cutler came to Redmond because Microsoft wanted him. And because - through a stroke of luck - Microsoft found out Cutler was at odds with DEC management. They offered him a way out and he took it. And he took his code with him.

Cutler's new OS wasn't supposed to supplant Windows. And it didn't. It wasn't even supposed to coexist with Windows. It was supposed to run on a separate computer with no Windows system at all. Cutler was building a file server, a LAN manager. How Microsoft wanted to connect to computers running his system was their business and not his concern. And he didn't care much about it either.

Microsoft ended up taking Cutler's otherwise excellent code and squishing it down on top of existing Windows/PC code. The basic PC architecture remained in place. The MS-DOS file system remained in place. Cutler had his own file system for use on his servers, but that wasn't considered something for the Windows 'terminals'. Computers running Cutler's new 'NT' were more sophisticated than anything Microsoft had ever had, but they weren't secure. No computer system Microsoft has ever released is.

The Windows Default

The basic difference between Windows and every other operating system on the planet, the one inescapable reason Windows is a mess and always will be, is this: the Windows default is to allow everything.

By definition, anything is possible for any user on a Windows system. That's the fundamental idea. Security is a mere afterthought. What Microsoft engineers are doing today - for all the billions they're spending - is trying to hunt down and root out all the myriad holes that exist on a system initially designed to have no restrictions - and they're doing this after the fact.

  • Real operating system: allow nothing by default and grant permissions only as needed.

  • Windows 'operating system': allow everything by default and try to stop bad things after the fact.
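
The contrast fits in a few lines of Python - a toy sketch with invented names, not any real security API:

    def real_os_allows(request, grants):
        # Default deny: the request fails unless it was explicitly granted.
        return request in grants

    def windows_style_allows(request, blocklist):
        # Default allow: the request succeeds unless something explicitly stops it.
        # Every hole nobody anticipated is therefore wide open.
        return request not in blocklist

    # A brand-new attack nobody has seen before:
    attack = ('trojan.exe', 'overwrite', r'C:\Windows\system32\drivers\etc\hosts')

    print(real_os_allows(attack, grants=set()))             # False - stopped cold
    print(windows_style_allows(attack, blocklist=set()))    # True - nobody thought to block it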

That's why you see astonishing things like system alerts telling you your file system has been compromised. Why couldn't the system just stop the compromise? Because Windows can't do that.

  • If you're still running the old MS-DOS (FAT) file system then you have no protection at all. Anything on your hard drive can be trojanised.

  • If you're running Cutler's NTFS file system as an administrator then you still have no protection - NT administrator accounts can assume ownership and control of anything on the system - anywhere in the Registry, anywhere on disk, any system file - without privilege escalation or user authentication.

Most Windows users don't even know what 'privilege escalation' and 'user authentication' mean.
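
The administrator case can be bolted onto the same toy model - again with invented names, not the real NTFS ACL interface - to show what 'without privilege escalation and user authentication' means in practice:

    # Toy model only: illustrates the claim above, not the actual Windows security API.
    class SecuredObject:
        def __init__(self, owner):
            self.owner = owner
            self.acl = {owner: {'read', 'write'}}

        def take_ownership(self, user, is_admin=False):
            # An NT-style administrator may seize any object outright:
            # no password prompt, no escalation step, no authentication.
            if is_admin or 'take_ownership' in self.acl.get(user, set()):
                self.owner = user
                self.acl.setdefault(user, set()).update({'read', 'write'})
                return True
            return False

    system_file = SecuredObject(owner='SYSTEM')
    print(system_file.take_ownership('Administrator', is_admin=True))   # True - nothing asked
    print(system_file.take_ownership('mallory'))                        # False - ordinary users are checked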

Copyright © Rixstep. All rights reserved.