Rixstep

Cookie Tin Tips I

This series is directed to potential switchers. It's about Unix. Remember when reading that Unix is considered more than an operating system: it's a way of thinking.



Unix History

Unix started at Bell Labs in the late 1960s and early 1970s. Ken Thompson put together a minimal OS kernel, written in assembler, for the DEC PDP-7. He also worked on porting FORTRAN from the old GE mainframes no longer in use over to the PDPs, but found the exercise futile and came up with a brand new programming language instead.

He called his language 'B'. This is typical of 'ken' as he is by his very nature 'cryptic'. His language was based on 'BCPL', the language used at Bell for work on the GE mainframes.

But the language was bulky and the PDP didn't have enough memory for it, so he 'truncated' its features. As he was in the business of truncating, he thought it a fitting thing to christen the new language with a truncated form of the name of the older one. Thus 'B'.

B was an interpretive language - it didn't transform 'English' into computerese until runtime. This necessarily means things go slow.

The name 'Unix' ('UNICS') was a crack by corridor mate Brian Kernighan. He and the others were summoned by Ken and new partner Dennis Ritchie to witness the first run of their as yet unnamed operating system. Ken logged in to the console - so far so good. Now it was up to Dennis: he was to try an ordinary 'user login'.

The system stood still. Using B, there wasn't enough processing power to take care of two logins simultaneously.

The operating system everyone had been working with up to then was called 'MULTICS'; Brian quipped after the disaster: 'One thing you can't call this baby is MULTICS - more like 'UNICS' - multiuser system for at most one user!'

The name stuck. Dennis was sent to the drawing board to come up with a better - and compiled - form of B. He based his work on his knowledge of parser engines - in particular on how LALR(1) parsers work.

LALR(1) stands for 'look-ahead LR' parsing with one token of lookahead: the parser scans left to right and can peek one token beyond where it is before deciding what to do. A token is the smallest meaningful unit of a programming language - a keyword, a name, an operator, a literal. Dennis made sure the syntax of his new language exploited this.

When he'd finished he showed it to Ken. Ken asked if he'd come up with a name yet.

'I was thinking of calling it New B', said Dennis.

'Too long', said Ken. 'You need a shorter name'.

Dennis thought about it for a few days and then suggested: 'Let's call it NB where N stands for New.'

'Still too long', said Ken.

So Dennis finally suggested C because C was the next letter in the alphabet. Ken didn't complain the name was too long this time, so it stuck.

von Neumann Assembler

When Ken and Dennis had rewritten the OS kernel in C, things took off. C is by its very nature von Neumann assembler: it does generically all that modern computers - 'instruction oriented computers' as defined by John von Neumann - must be capable of doing.

If there's something a CPU has to be able to do, C can do it. If there's something only some CPUs do which is not absolutely necessary, C won't do it. And so forth.
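
What that means in practice is easiest to see in code. The snippet below is only a sketch - nothing from the original Unix sources - showing how ordinary C expressions map more or less one-to-one onto the loads, stores, shifts, and ORs any von Neumann machine must provide:

    /* A minimal sketch: C as a thin wrapping around machine operations. */
    #include <stdio.h>

    int main(void)
    {
        unsigned int word = 0xC0DE;     /* a value sitting in memory          */
        unsigned int *p = &word;        /* its address - just a machine word  */

        *p <<= 4;                       /* shift left: one CPU instruction    */
        *p |= 0x7;                      /* OR in some bits: another           */

        printf("0x%X\n", word);         /* prints 0xC0DE7                     */
        return 0;
    }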

Thomas J Watson 1974

When Ken and Dennis introduced Unix at the IBM Thomas J Watson Research Centre in 1974, it caused a sensation. Up to then the 'gurus' of the age such as Tony Hoare and Edsger Dijkstra had been in the habit of travelling around the world to OS symposia, discussing ad nauseam what good systems should be like and what current systems lacked, with no one coming up with anything substantially better. Then Unix hit.

Unix put hierarchical file systems in the mainstream and it was the first system that regarded files as 'streams of bytes'. This latter point might be hard to grasp in our age but as Brian put it:

'Take a seat at your terminal and start a source code file. Write a program that copies one file to another. Build the program. Now instruct the program to copy your source code file. If your copy of the source can immediately be built to run again, you have a good system.'

Strangely enough, with all the 'records' and 'random access' back then, with all the 'punch card' thinking, this innocent test almost always failed. Unix was the first major system to make 'stream of bytes' happen.
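
Here, as an illustration only (not Brian's actual code), is a minimal C sketch of the copy program he describes - it treats a file as nothing but a stream of bytes:

    /* copy: copy one file to another, byte for byte. Illustrative sketch. */
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        FILE *in, *out;
        int c;

        if (argc != 3) {
            fprintf(stderr, "usage: copy source destination\n");
            return 1;
        }
        if ((in = fopen(argv[1], "r")) == NULL) {
            perror(argv[1]);
            return 1;
        }
        if ((out = fopen(argv[2], "w")) == NULL) {
            perror(argv[2]);
            return 1;
        }
        while ((c = getc(in)) != EOF)   /* a file is just bytes, one after another */
            putc(c, out);

        fclose(in);
        fclose(out);
        return 0;
    }

Build it, then point it at its own source; if the copy compiles and runs again, the system passes Brian's test.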

Unix also had 'pipes', a suggestion by Doug McIlroy, head of the CSRC at Bell Labs, who also set the tone with the principle that programs and computers should work with plain ASCII so they can always communicate with one another. In fact Doug is the main proponent of reusability and 'never reinventing the wheel' - things we take for granted today.

[ESR has excellent material online in his 'The Art of Unix Programming'. Ed.]

First expressed with a hat ('^') the pipe was a means, as Doug suggested, of redirecting output from one program into another as its input. Ken and Dennis didn't understand what it would be good for; Doug promised them that if they did make it work, it would be widely popular. And he was right.

The hat changed to the vertical bar ('|') - the same character C uses for OR - afterwards; the method used was to cache I/O between programs in a 4 KB buffer and manage the transfer from the 'shell'.
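
As an illustration of what the shell is doing with that buffer, here is a hedged sketch in C of the plumbing behind something like 'ls | wc -l'. The system calls - pipe(), fork(), dup2(), execlp() - are the real ones; the program itself is just an example, not the historical implementation:

    /* Sketch: run 'ls | wc -l' by hand, the way a shell wires up a pipeline. */
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int fd[2];

        if (pipe(fd) == -1) {               /* fd[0] = read end, fd[1] = write end */
            perror("pipe");
            return 1;
        }
        if (fork() == 0) {                  /* first child: the producer */
            dup2(fd[1], STDOUT_FILENO);     /* its stdout now feeds the pipe */
            close(fd[0]);
            close(fd[1]);
            execlp("ls", "ls", (char *)0);
            _exit(127);
        }
        if (fork() == 0) {                  /* second child: the consumer */
            dup2(fd[0], STDIN_FILENO);      /* its stdin now drains the pipe */
            close(fd[0]);
            close(fd[1]);
            execlp("wc", "wc", "-l", (char *)0);
            _exit(127);
        }
        close(fd[0]);                       /* parent keeps neither end open */
        close(fd[1]);
        while (wait(NULL) > 0)              /* wait for both children */
            ;
        return 0;
    }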

One Floor Up No Lift

The CSRC - Computer Science Research Centre - in Murray Hill is one floor up in a building with seven floors and no lift. You go in the front door, turn right down the corridor, walk all the way to the end, and take the stairs. The twenty-five or so PhDs working in the CSRC at the time were on the next floor.

Almost all the CSRC people were PhDs - with the notable exception of Ken, a mere BSc from Berkeley. PhDs back then were rarely in computer science - it still wasn't a known field - but were instead in mathematics. Both Brian and Dennis were PhDs in math - Brian from Princeton and Dennis from Harvard.

[Yes Dennis got a degree whilst another famous Harvard student dropped out. Ed.]

Once Dennis and Ken got Unix working on their new PDP-11 they set out to port it to another machine on the premises, an Interdata 8/32. To do this they had to convert their 'telex' tapes from one machine format to the other. The converting machines were at the top of the building on the seventh floor; the Interdata was on the ground floor; for Dennis it was a lot of climbing up and down.

The first thing you do when porting an operating system is port the language (the compiler) it's written in. Once you get programs to compile on the new system - if you've succeeded in porting the compiler - the rest is mostly downhill. So Dennis set about trying to port C to the Interdata.

His days were spent much like this.

  1. Make a new attempt.
  2. Export the code to tape.
  3. Take the tape and climb six flights of stairs.
  4. Convert the tape to Interdata format.
  5. Run back down seven flights of stairs.
  6. Fail miserably at making the code work on the Interdata.
  7. Go back up one flight of stairs and go to 1 above.

Steve Johnson from Cambridge University in the UK was a recent addition to the CSRC at the time. He'd poked around a bit on the PDPs and looked at Unix and C, and one day - according to an account by Dennis himself - he ran into Dennis on the stairs.

Steve said hello and asked what all the huffing and puffing on the stairs was about.

Dennis explained what he was up to and Steve suggested it was completely unnecessary. From what he had seen, C was a portable language, said Steve. The idea of portable languages was mostly untested at the time.

Steve suggested they work together on what he would call the 'pccm' - the Portable C Compiler Machine. It would start by filtering out all code that stepped outside the abstract 'von Neumann' definition - code that might break on another platform.

Steve said afterwards that his empirical evidence was that 94% of all C code is truly portable; it was the remaining 6% that had to be filtered out and taken care of separately.
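
For illustration only - these are not Steve's examples - here is the kind of C code that lands in that remaining 6%. It compiles everywhere but quietly assumes one machine's word size and byte order:

    /* Sketch: legal C that is nonetheless not portable. */
    #include <stdio.h>

    int main(void)
    {
        unsigned int n = 0x01020304;
        unsigned char *p = (unsigned char *)&n;

        /* Assumes 'int' has one particular width; the language only
           guarantees a minimum, not an exact size. */
        printf("an int is %d bytes here\n", (int)sizeof(int));

        /* Assumes one byte order: the first byte in memory is 0x01 on a
           big-endian machine, 0x04 on a little-endian one. */
        printf("first byte in memory: 0x%02X\n", *p);

        return 0;
    }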

Steve also wrote a new compiler. The compiler Dennis had written was recursive - not surprising inasmuch as his doctoral thesis was on 'Hierarchies of Subrecursive Functions'. But Steve had a better idea.

When parsing code input, Steve's new compiler would figure out every possible way to express things in 'machine code' and then compare: the one with the fewest memory accesses would win.

This was in effect a self-optimising compiler. Its tenet was based on a fact of the day - namely that memory access was the most time consuming thing you could do.

[Intel processors long used twelve clock cycles to access memory; the PowerPC used only one from the very beginning. See why Apple computers are so fast? Ed.]

Once Steve got his pccm running it was all downhill. Unix got ported all over the place and suddenly universities were interested in it, and Ken Thompson decided to take a year off and visit his mates at Berkeley.

BSD Unix

Ken took some Unix source with him on his trip and showed it to the people at Berkeley. This eventually turned into BSD Unix. FreeBSD Unix, the flavour used in OS X, is a 'free' version of BSD Unix.

Unix became very popular, with 'Baskin Robbins' flavours everywhere. Things got a bit chaotic and unorganised. AT&T, who officially owned Unix, were not in a position to market it until their breakup in 1984. In the meantime, licences for the source code were dirt cheap.

[Bill Gates got himself one of those dirt cheap licences. You know he did. Ed.]

The maintenance of Unix code left Bell Labs and moved down the state of New Jersey to Unix System Laboratories. Once the AT&T breakup was in place, AT&T began moving to assert their rights to Unix, but eventually the ownership was sold off.

Alan Kay

Alan Kay did a lot of research on how one teaches children with the aid of computers. A lot of his early work used the Logo language (the one with the turtle) and Kay began to see that this kind of interface could be good for adults too.

Doug Engelbart had shocked the world at the end of the 1960s with a demonstration of networking, windows, and mice. Kay was aware of this, and also began visualising it all in terms of almost living 'organisms' on screen (self-correcting too, he predicted), and when he moved to Xerox research ('PARC') he began work on what he was to call Smalltalk.

Smalltalk was at once a programming language (interpretive) and an environment. One built up things as one went along. It used windows and a very crude form of mouse.

Kay also had object orientation coming out the wazoo (he invented the term) and networking with electronic mail.

Steve Jobs, then head of Apple Computer, was looking for the 'next big thing'. IBM had entered the marketplace and Steve needed something to pull out in front of IBM. He heard about Kay and ended up paying, according to the story, a cool million for a tour of Kay's research centre at PARC.

Steve says that at the time he was so blinded by the GUI that he didn't notice the other fantastic things Kay and his team were doing. He supposedly tried to buy the Smalltalk system outright from Xerox but was turned down; he then offered the PARC team double salaries to come work for Apple instead.

[Kay would become an 'Apple fellow'; another member of the team would found Adobe in which Apple would invest heavily. Ed.]

Sugared Water

The Macintosh started by embracing the GUI metaphors of Smalltalk and then trying to figure out how to wire it all together under the bonnet. Initial attempts weren't all that successful - the first prototype of the file system, called 'MFS', did not result in anything promising and it was dropped after two years.

Work began on HFS - 'hierarchical file system' - which is the forerunner of the (supposedly) POSIX compliant file system in use today on OS X.

The Macintosh caused a storm when it was introduced in 1984 but sales were not good. It wasn't until Apple and Adobe literally invented the desktop publishing market that things started to take off.

But by then Steve Jobs was in a lot of trouble.

It was Steve who wooed Pepsico VP John Sculley to Apple with the famous quip 'Do you want to sell sugared water for the rest of your life or do you want a chance to really change the world?'

But Steve wasn't happy with John's performance. He saw John as more interested in lining his own pockets than contributing and learning the ins and outs of a computer company.

So Steve decided to stage a coup and oust John Sculley; unfortunately the coup backfired and the Apple board, now on John's side, voted to keep Sculley and relegate Steve to a back lot on the Apple property.

Steve wasn't too happy about that.

Officially Steve was still to be kept informed of things happening at Apple, but in reality he got little or nothing at all and soon tired of the charade. He took time off and started to think things through - a bleak period in his life, as he's said since.

One day he was knocking about with a university professor, discussing the uses for computers, and the prof told him that some things would be perfect for computing - if there were the right kinds of computers to do them on (which there were not).

This got Steve thinking, and his ideas gradually materialised into what would soon be NeXT Computer Inc.

NeXT

Steve has always believed (and always said) that the difference between the best of the best and the mediocre in computer science is on the order of 100 to 1, whereas in most other industries it's at most 2 or 3 to 1.

Obviously if he wanted to produce a space age product he needed the best of the best. So he went out looking for them.

Underlying operating system. It was obvious that Unix was the best choice. Not only was it the most stable and secure going, it was also the system academia traditionally worked with.

But add a Mach kernel to it and you get something really, really stable: Carnegie Mellon at the time were working on Mach.

So he picked up Avie Tevanian from the Mach group.

(Avie became chief of software at NeXT and is today head of software at Apple.)

Programming languages. Smalltalk wasn't fast enough. It was interpretive. But Brad Cox's Objective-C grafted Smalltalk-style messaging onto C and went through the C compiler, so it was not just 'faster' - it was 'very fast'.

So he entered into an agreement with Brad Cox to licence the language.

(He finally bought the rights to the language shortly before the 'merger' with Apple.)

Development tools. Steve had heard about something called SOS Interface for the Mac. Built in Lisp by the Frenchman Jean-Marie Hullot, it revolutionised the method of constructing GUI applications.

Steve invited Jean-Marie to his new NeXT offices in Redwood City to demonstrate SOS Interface. Completely sold on the idea, he told Jean-Marie 'I want that on my new computer', bought up the existing rights to Jean-Marie's program wherever they were held, and drew up a new contract whereby he and he alone would have rights to it.

So he hired Jean-Marie to make the application work on the NeXT computers.

OpenStep

By the time NeXT were paid to take over Apple, Steve Jobs was looking at possible annual revenues of about $250 million - not a lot compared with Apple but still and all a nice chunk. But he was also deeply in debt - to the tune of about $429 million - in part because he'd resolved never again to relinquish majority control of a company as he'd done with Apple: every time someone bought into NeXT, he bought more too.

By the end of 1996 Apple were in trouble and pundits speculated how long they'd hold out before declaring 'chapter 11'. Apple's product line had stagnated under Sculley; Sculley eventually left but none of his successors had much better luck. The company really needed Steve to pull them out of the crisis.

Apple had also understood the need to move to secure 32-bit multitasking computing, but their attempts (the abortive Copland) yielded no results. They had also been in the process of purchasing BeOS, but that too ran into the sand.

That left only Steve Jobs and NeXT Computer Inc - now called NeXT Software Inc - marketing the cross-platform OpenStep.

Apple paid off Steve's debt - all $429 million of it - and Steve came back as 'temporary' CEO for a nominal salary of $1 per annum. So did his source code and all his engineers. And Steve immediately began turning the company around.

'Macintosh' OS X

It's a wonder in retrospect that it took almost six years for the eminently portable OpenStep to finish its port to the Macintosh hardware platform, but it's a fact. Work started officially in the early spring of 1997, but 'OS X' wasn't really mature until the 24 August 2002 release of version 10.2 'Jaguar'.

What held up the show?

For one thing, Avie himself is sometimes credited with opposing attempts to port OpenStep. But the old MacOS was a pile of inedible spaghetti and something had to be done.

And what happens to all the old Maccies out there? Are they now abandoned? What happens to all the programs they've used all those years?

This is a classic question when moving from 16-bit to 32-bit. 16-bit apps cannot be secure; if you provide for backward compatibility you're opening a security hole in your operating system, something that can lead to crashes and exploits.

And 'thunking' code across boundaries like that is never easy.

Steve Jobs once got in trouble with Steve Wozniak for turning his back on Apple ][ users with the Macintosh. Perhaps he didn't want that to ever happen again. Whatever - NeXTSTEP/OpenStep, now called 'Mac' OS X, provided backward compatibility for all the old Maccies out there.

Steve Booed

And then there was the fact that the Maccies hadn't followed Steve to Redwood City. They stayed with John Sculley instead. Steve, Avie, Jean-Marie, and the rest were making computing history in Redwood City but the Maccies ignored it all. And when Steve came back and showed them the coming OS they booed him.

The warts in OS X are caused by Steve giving in to pressure from Maccies to implement things that shouldn't be there.

Yet OS X is still the most usable, most stable, and most dazzling distro of Unix going. Apple are responsible for their own hardware (with a 'second to none' rating for its quality) and, with only these machines to worry about, hardware compatibility is never a real issue.

Combine this with the space age technology of NeXTSTEP on top and you have something close to identical to what NeXT created in the late 1980s and which is still a dozen or so years ahead of what anyone else can dream of today.

  • PDF screens. NeXT used Display PostScript, which they developed with Adobe; OS X uses the successor PDF, which is an even hotter technology.

  • Vector based graphics. Not raster based. Not pixel based. Vectors.

  • Floating point screen stuff. From the RGB and alpha values to screen coordinates everything is in floating point.

  • Objective-C's dynamic binding. The entire system is capable of using messages in a way not conceivable on other platforms.

  • The NeXTSTEP API. This API, today called 'Cocoa', is about four times the size of the Windows API but eminently and incomparably more accessible. The work that's gone into this can neither be easily appreciated nor adequately praised.

Bottom Line

  • You have Unix under the bonnet - about as stable and secure as you can get.

  • You have a Mach kernel in there - which means it's going to be even more stable.

  • You have - by far - the flashiest and most user friendly GUI in the world.

You can't get much better than that.

Copyright © Rixstep. All rights reserved.