It ain't worth it.
Even if it's possible.
APPLE PARK WAY (Rixstep) — Apple's Unix has become ridiculouser and ridiculouser. What happened to the genius of NeXT?
Twenty-five years ago, when this was first news, good people hoped for a synergy between unlikely - seemingly incompatible - bedmates: Unix, the welcome newcomer to the OS world, and Apple, a breed apart that no one liked playing with.
Unix influenced everything. It spread to Bill Gates. Early versions of MS-DOS incorporated Unix ideas.
Over in Cupertino they were still using Pascal.
'To state my conclusions at the outset: Pascal may be an admirable language for teaching beginners how to program; I have no first-hand experience with that. It was a considerable achievement for 1968. It has certainly influenced the design of recent languages, of which Ada is likely to be the most important. But in its standard form (both current and proposed) Pascal is not adequate for writing real programs. It is suitable only for small self-contained programs that have only trivial interactions with their environment and that make no use of any software written by anyone else.'
Brian Kernighan wrote that about Pascal as far back as 1981, in 'Why Pascal is Not My Favorite Programming Language'.
Brian was kind. Actually Pascal is a lot worse. At a deep level Pascal makes no sense, and at implementation level it's hopelessly 'paraplegic' - a word, coincidentally, that Brian once used to describe the Macintosh mouse.
Standard Pascal allowed only a limited number of source code files to be used to build an application. It made a mess of file I/O - which, for some unfathomable reason, it builds into the language rather than letting it remain external as grownup languages do. It placed artificial constraints on programming style - its silly abuse of the semicolon being a case in point (pun not intended but nevertheless welcome). It was described by another guru as 'programming in a strait jacket'.
'The I/O design reflects the original operating system upon which Pascal was designed; even Wirth acknowledges that bias, though not its defects. It is assumed that text files consist of records, that is, lines of text. When the last character of a line is read, the built-in function 'eoln' becomes true; at that point, one must call 'readln' to initiate reading a new line and reset 'eoln'. Similarly, when the last character of the file is read, the built-in 'eof' becomes true. In both cases, 'eoln' and 'eof' must be tested before each 'read' rather than after.'
The creator of Pascal, Niklaus Wirth, was a Swiss programming teacher. The sad fact is that Pascal made the rounds before C did. Otherwise the world would have been spared a lot of pain.
'Pascal's built-in I/O has a deservedly bad reputation.'
Niklaus had to defend his monster, of course - and the one really cogent thing he said was that he never intended it to be used in production. In other words, the language wasn't meant for professional use - only for teaching. For it was Niklaus' opinion that his students were sloppy.
'A typical Pascal program reads from the bottom up - all the procedures and functions are displayed before any of the code that calls them, at all levels. This is essentially opposite to the order in which the functions are designed and used.'
Niklaus also implemented call by reference, a horrific bane. Algol had - in theory - call by name, whereas sensible languages like C had call by value. And if you don't know what that means, go visit Daring Fireball.
'Comparing C and Pascal is rather like comparing a Learjet to a Piper Cub - one is meant for getting something done while the other is meant for learning.'
We personally were tasked with revamping a group of eight Pascal programmers. Within three days we'd accomplished more than they had together in two full years. They were all reassigned within the company. On our 'day one' we'd asked one of these 'veterans' if he could tell us how to perform basic file I/O. The conversation went something like this.
'What do you mean by basic file I/O?'
'Well how to open and read and save a file.'
'Why do you want to open the file?'
And so forth. Another early goal was to create, in the interim, a preprocessor so people could write sensible, mature code that looked a bit like C.
'The language is inadequate but circumscribed because there is no way to escape its limitations. There are no casts to disable type-checking when necessary. There is no way to replace the defective run-time environment with a sensible one, unless one controls the compiler that defines the 'standard procedures'. The language is closed. People who use Pascal for serious programming fall into a fatal trap.'
A further goal, implemented after a lot of pushing and tugging, was the introduction of C on the platform. After that, things took off.
'I feel that it is a mistake to use Pascal for anything much beyond its original target. In its pure form, Pascal is a toy language, suitable for teaching but not for real programming.'
Upper management had been worried for some time that they had major expenses ahead because of their use of Pascal. The language was a memory hog. The purchase of memory cards for another 5,000 computers was in the works.
But by revamping key software components, first with our preprocessor and then with our new C compiler, we were able to show that such a purchase was not necessary, saving our company tens of millions and getting the hardware sales rep to storm out of a high-level conference in frustration.
Pascal sucks. As Brian said, Pascal did come first, but it's about as similar to C as German is to a Scandinavian language if you're a teenager growing up in the suburbs of Tokyo. (Brian is kind.)
Pascal is no more, essentially. It took the Royal Institute in Stockholm years to abandon it, but the institute in Linköping got on the Unix bandwagon right away.
Pascal is a good tool for the undisciplined mind, thought Niklaus. But it's also a good tool for the lazy mind. Reality is a crusher. And, once again, if you don't know what this is about, go read something else. The Daily Mail perhaps. Have a good time.
Why people chose Pascal way back is easy: it was the first widely available procedural language with context-free parsing. Why they continued using it as long as they did is another matter. There's no excuse.
By the mid-1970s Unix and C were causing a storm, all started at Bell Labs. By the mid-1970s Apple programmers were putting up Pascal posters in their 'offices'.
The arrival of NeXT in Cupertino was the first real confrontation with C and Unix. They weren't prepared. Gil Amelio paid $429 million to a company projecting a coming annual profit of only $300 million. Apple's programmers still thought code was all written in UPPER CASE. Yes, they had the step to Objective-C to contend with, but that's easy. The big hurdle - insurmountable, it seems - was to C.
Learning C properly is grueling. It takes years. But it can be done if you're the right person for the job. Unlearning all the silly aspects of Pascal is much much harder.
Top management at Apple in 1997 was mostly NeXTies. No problemo. But the rank and file? They were - still are - another matter.
NeXT helped save Apple. But what really saved Apple was iPhone. It saved so much that Apple moved most of their programmers over to the iPhone side. The idea may have been to maintain two completely sovereign operating systems, but that's not how it worked out. More and more the cornerstones of NeXT were eroded away by former Pascal programmers who didn't have a clue - not a clue about Unix, or about C, or about the intricate architecture that back in 1997 saved their sorry arses.
NeXT's OS was a work in progress in 1997. That same system, in the hands of Apple, by 2007 was a textbook example of disarray.
Things were in such disarray that the first three point updates to the version of OS X for iPhone ran everything as root. And legendary clown John Gruber quipped that if people didn't understand that they knew what they were doing at Apple, they had another thing coming.
Sure, John. Running everything as root is a good idea.
GNU had an Objective-C compiler. Good for them. But then LLVM came along, with a project leader who thinks C is just too difficult.
This has led to the introduction of so-called 'playgrounds' (the name itself should give it away). This in turn led to Sir Tim declaring that programming is for everyone, just like open-brain surgery evidently, and even though no one knows any longer what a 'computer' is.
Dark Mode comes along. Everybody's got it! And this can't be taken away from Apple: when it comes to graphics, they sweep the field, both in technology (mostly inherited from NeXT) and good taste.
Then they try to outlaw screen updates from background threads.
They'd been mumbling about this for almost twenty-five years. They never did anything about it, so they were mostly ignored.
The documentation says the method - performSelectorOnMainThread:withObject:waitUntilDone: - was introduced for 10.2 Jaguar. So it had not been present before that.
It looks easy enough, but the issue is with implementation and flexibility. You can always pass a sensible value for the object argument, but you can't pass anything that isn't an object pointer. The first implementations evidently were flexible here, but later implementations - coded by Apple's best - evidently test the argument and, if it can't be dereferenced, things go south. So passing a BOOL there won't work. Of course there are workarounds here as well, but that's the point - and it's a point in the simplest of all possible scenarios. Quite a few links can be found online discussing how to get past this impasse. And they're all pretty ugly.
One thought is that, as the system itself can detect which thread is making the call, system methods could automatically divert to the above call when needed with a 'waitUntilDone' of YES (and then provide another way to not wait on such calls).
What's blatantly apparent as time goes on is that code gets messy. The call itself, especially after being reduced to a macro as Rixstep do, is quite tidy. The messiness comes about with all the necessary workarounds. 80% of all code today only takes care of the window lickers. Now that figure is threatening to rise even higher.
Code has to be written so it can be maintained, as code will always be maintained if it's good code. Code should be self-explanatory. Code should be commented only when absolutely necessary, such as to explain why a seemingly weird algorithm is absolutely necessary. One doesn't clutter code like this, for example.
i = 0; // set variable i to the value 0
Code must be written so it's easy to get an overview - a condition programmers call 'discovery', as in 'AHA! So THAT'S what it's all about!' The less code, the less clutter, the better.
Needless to say, having otherwise decent code riddled with workarounds for childish stipulations cannot possibly help - especially when it's known that these stipulations are arbitrary, often used to hide the inner workings of what once had the ambition of being 'open source', and when it's not known, not now and likely not ever, what the fuss is all about.
'Pascal is a toy language.'
Rixstep FTP: Why Pascal is Not My Favorite Programming Language
You've obviously heard of us, otherwise you wouldn't be here.
We're known for telling the truth even if it's not in our interest.
We're now telling you to beware Apple's walled garden. Don't get locked in.
What you've seen so far may be only the beginning of something far far worse.
Download our Test Drive and at least check out our free Keymaster Solo.
That's the first step to regaining your freedom. See here.