Rixstep

Objective-C, Objectively

Not for them - for you.


There's quite a lot of talk going on around the web again about deficiencies in Apple technologies. It's written by the same type of people who used to complain that Apple wanted developers to make their application documents more readable on other platforms, who complained that Avie Tevanian had some hidden evil hatred of the old 'Mac OS', and who harboured a nostalgia for a 16-bit system that couldn't make it onto the Internet in any safe or secure fashion. In the most extreme and cowardly of cases they won't even dare link directly to the complainers but instead link to others who more courageously link directly to them.

There seems to be some sort of delusional idea propagated as always by so-called 'besserwissers' (people long on opinions but disastrously short on cred and chops) that Microsoft of all sorry corporations have something Apple sorely need.

One needs only look at current market and financial statistics to see just how poorly Apple are performing. They have the second biggest market cap in the US after Exxon, hold nearly 25% of the smartphone market where they once believed they'd hardly make an inroad at all, and have a new product (the iPad) so 'revolutionary' that it's in a market niche all its own - there's no one at all to compete with it (and likely never will be).

And everything Apple have on the software side is based on the same code that's based on the same code that's based on the same code... Apple are really hurting alright.

There's also been a claim - by these same besserwissers - that 'ordinary' programmers can't see the deficiencies precisely because they're programmers - in other words: you have to be a besserwisser to understand anything at all. That's quite the mouthful.

Damn Steve Jobs and his walled garden App Store. Damn him for his prissy attitude towards control of content on his new fleet of devices. But don't you dare damn him for putting together two of the most amazing products ever seen in electronics. And give credit to the hardware teams for pulling off the impossible but never forget the inordinate contribution of the software that runs in them.

And say what you want about cock-eyed security models but don't diss the software for that - security models are management decisions - they don't reflect on the software itself or its quality.

This nonsense talk is the same kind seen from time to time when Java programmers ask hilarious questions such as 'why doesn't C have string classes'. It's talk not to be taken seriously.

But for those of you who neither are programmers nor pretend to be: there's a way you can understand a bit more of what's going on.

Roots

Apple's iOS has its roots in OS X as you've probably heard. It's a honed-down version of OS X, based on FreeBSD and the Cocoa (NeXTSTEP) classes. The Unix on disk is only what the devices need to run - not every Unix tool known to man is present. This saves space.

The Cocoa application architecture is also streamlined. OS X directories that only serve to contain other directories are dismissed - back in the days of OS X design they were kept on board 'just in case'. They're not needed any longer.

The classic Cocoa classes are gone too - sort of. This is because there's a totally new input model. You don't have keyboard and pointing device events as before - you have 'tap tap tap' (and a whole new world of gestures). The Apple hardware picks up these gestures and turns them into events sent to the controlling classes, which now have new methods to deal with and respond to them.
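
A sketch of what that looks like in code - a hypothetical view subclass (TapView is made up for the example) overriding one of the new responder methods:

    #import <UIKit/UIKit.h>

    // Hypothetical view subclass: the new input model delivers touches,
    // not mouse and keyboard events, and the responder methods are new to match.
    @interface TapView : UIView
    @end

    @implementation TapView
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        CGPoint where = [touch locationInView:self];
        NSLog(@"finger down at %.0f, %.0f", where.x, where.y);
    }
    @end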

The classes might not be the same - not totally - but the legacy is there. The programming model permeates everything. And that type of model includes both event-driven programming and object orientation.

Event-driven programming is used on all modern platforms. Application software initialises on its own but then sits and waits for things to happen - hardware failures, disk events, user interaction, you name it. Even the most primitive graphical user interfaces (Windows, GNOME, KDE) have to use this method. But yes, they are extraordinarily primitive. It takes object orientation to raise vanilla event-driven programming out of the primordial mush and make it an erect land creature.
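
Stripped of any object orientation, the vanilla version looks something like this C sketch - the Event type and WaitNextEventBlocking are hypothetical, only there to show the shape of the thing: block, pull the next event, dispatch.

    /* A minimal, hypothetical event loop in plain C. */
    typedef enum { EventKeyDown, EventMouseDown, EventQuit } EventType;

    typedef struct {
        EventType type;
        int x, y;            /* pointer coordinates, if any */
        unsigned short key;  /* key code, if any */
    } Event;

    extern Event WaitNextEventBlocking(void);  /* hypothetical: blocks until something happens */

    static void RunEventLoop(void)
    {
        for (;;) {
            Event e = WaitNextEventBlocking();
            switch (e.type) {
            case EventKeyDown:    /* handle the keystroke */  break;
            case EventMouseDown:  /* hit-test and respond */  break;
            case EventQuit:       return;
            }
        }
    }

Object orientation replaces that central switch with independent objects that each receive only the events that concern them.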

Object orientation is the invention of Alan Kay. He even invented the term. It's not the most adequate of epithets ever but it's the one that stuck.

It might be more appropriate to call Kay's model 'organism orientation' because that's how he saw things - computer screens with dozens, hundreds of sovereign 'organisms' all interacting with one another. The antithesis of a single creator god thread ruling and running it all like an evil eye from Mordor. Scroll bars operate (and 'think') on their own, communicate with text views which in turn manage text cells for the individual glyphs that make up the letter to your grandma and so forth.

Kay's organisms were also 'self-correcting' in his vision: if someone sent them bogus data, asked them to perform the impossible, or told them to commit digital seppuku, then they'd be able to catch it in time - and in the worst of all possible scenarios, recover if any calamity did occur.

Application objects run document classes which run document windows which contain content views which contain text fields, check boxes, radio buttons, push buttons, and so forth. Menus contain submenus which contain menu items and so forth. All is neatly contained like Russian matryoshka dolls and all is sovereign.

Kay invented his own programming language to achieve his goals. He called this language Smalltalk. This was a research language - it wasn't meant to be used in production. Any more than his mouse prototype was. (Apple produced hundreds of prototypes of mice based on Kay's to finally get the one they wanted.)

Steve Jobs paid a lot of money to see what Alan Kay was doing. Some people at the Xerox Palo Alto Research Center (PARC) and working with Kay's Learning Research Group (LRG) didn't like the idea - they were worried Jobs and Apple would steal all their ideas. Xerox management overrode their objections. The rest is history.

But Jobs admits in retrospect that he didn't have much of a clue what was going on in the LRG labs - that he was blinded by the window-oriented graphical user interface they'd developed, telling himself that he was sure that one day all computers would work the same way. Jobs didn't notice Ethernet. And he didn't notice object orientation as Kay had started calling it. And that lack of insight was to plague Apple for years where programmers continued using the abortive Pascal and designers tried to duplicate and emulate the Kay desktop without realising the desktop was a reflection of the underlying technology, not the cause of it.

Jobs never stays in the dark for long. He fully understood what object orientation was by the time he was ready for his NeXT startup. Odds are the layman Jobs actually understood the essence of object orientation a lot better than the besserwissers do today. Everything built inside the NeXTSTEP and OpenStep systems was object-oriented. And the construction was at once both simple and complex.

The text system remains a case in point for showing just how complex yet elegant this programming idea is. There are dozens of code classes interacting with one another to give users full access to any character on the planet, Unicode or otherwise, written left-to-right or the other way around. Base classes hold character strings that can be converted from their internal representation into hundreds of localised encodings; classes built on these allow search and replace operations and so forth; classes built on those add paragraph styles and colour, alignment, kerning, and other attributes such as bold, italic, underline, superscripting, subscripting, and so forth; and classes built on these in turn become the 'storage' classes for what's eventually shown on screen.
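
A glimpse of how those layers stack up in code - the class names are the real Foundation/AppKit ones, but the function itself is made up and only illustrative:

    #import <Cocoa/Cocoa.h>

    static NSAttributedString *GrandmaGreeting(void)
    {
        // A plain string, then attributes layered on top of it.
        NSMutableAttributedString *fancy = [[[NSMutableAttributedString alloc]
            initWithString:@"Dear Grandma"] autorelease];

        // Add font and colour over the first four characters.
        [fancy addAttribute:NSFontAttributeName
                      value:[NSFont boldSystemFontOfSize:14.0]
                      range:NSMakeRange(0, 4)];
        [fancy addAttribute:NSForegroundColorAttributeName
                      value:[NSColor redColor]
                      range:NSMakeRange(0, 4)];

        // NSTextStorage, a subclass of NSMutableAttributedString, is the
        // 'storage' class a text view ultimately draws from.
        return fancy;
    }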

Kay also promoted the 'model/view/controller' (MVC) programming paradigm - the model is the internal representation of the application data, the view is what the user sees and interacts with, and the controller is what the other two use to communicate with each other. Cocoa classes (and Cocoa Touch classes) have hundreds of 'view' classes that take care of on-screen behaviour and communicate with client code (application code) and today they've also got a fair share of template classes for use as communicators/controllers.
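
A bare-bones sketch of that split in Cocoa terms - the Temperature model and the TemperatureController are made up for the example, and the outlet would normally be connected in Interface Builder:

    #import <Cocoa/Cocoa.h>

    // Model: pure data, knows nothing about the screen.
    @interface Temperature : NSObject {
        double celsius;
    }
    - (double)celsius;
    - (void)setCelsius:(double)value;
    @end

    @implementation Temperature
    - (double)celsius { return celsius; }
    - (void)setCelsius:(double)value { celsius = value; }
    @end

    // Controller: mediates between the model and the view.
    @interface TemperatureController : NSObject {
        Temperature *model;            // the model (created elsewhere)
        IBOutlet NSTextField *display; // the view, wired up in Interface Builder
    }
    - (IBAction)takeCelsiusFrom:(id)sender;
    @end

    @implementation TemperatureController
    - (IBAction)takeCelsiusFrom:(id)sender
    {
        [model setCelsius:[sender doubleValue]];  // view -> controller -> model
        [display setDoubleValue:[model celsius]]; // model -> controller -> view
    }
    @end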

There are a number of big advantages with this programming approach, not the least of which is the speed of quality application development. Cocoa developers normally need about 20% of the time they'd waste on other platforms.

The Cocoa classes also contain a lot of information never shown to the user, only seen by the programmer - information that tells the programmer ahead of time whether the application design is proceeding as intended or something is wrong. It's never necessary to put a well-written Cocoa application through a debugger - the programmer will see anything unexpected as the app is run through Apple's IDE.

Cocoa applications are far more stable than comparable applications written on other platforms. And the wonder of it isn't the depth of the API - perhaps four times the size of the Windows/GNOME/KDE APIs - but that it's so easy to use, so easy to find what you're looking for, so easy to predict what you're going to be looking for because of the standardised nomenclature, and above all that it has an almost nonexistent learning curve.

One wants to empathise (and weep) with Steve Jobs when he laments that it's taken Rip Van Adobe ten years to create their first Cocoa application. Cocoa can take a while to get comfortable with but the underlying language takes merely a few hours.

This of course applies only to 'real' programmers - and not the besserwissers who do things like Python and Java. Real programmers can learn the underlying language Objective-C in a matter of two to three hours. Learn it well enough to start using it right away. That's how simple it is - but again: only for those who already have the chops to do it.

Objective-C

Objective-C is based on C and, like its ugly duckling half-sibling C++, was originally implemented as a preprocessor pass in front of the standard C compiler. The key to Objective-C is C. The trick is that its object model is actually based on Alan Kay's Smalltalk.

C is a 'generic assembler' - as Brian Kernighan put it: 'C is a language that's supposed to do what the machine does and do it well'. There never was a language like C before C and there isn't going to be another one after either. C by its own definition is its own successor.

Generic 'von Neumann' machines - based on a design definition by John von Neumann - have to be capable of a number of basic operations. They all operate with primary memory (randomly accessed) and with secondary memory (tapes, disks, USB sticks) and they all have central processing units with arithmetic logic units. The central processing units (CPUs) have to be able to do impossible things like addition (with carry bits) and subtraction (by using complements) and multiplication and division (by shifting register values left and right). There are some things von Neumann machines don't have to do but which some CPU OEMs put in anyway - these aren't included in the C definition.

C has no rotate operator simply because von Neumann machines don't have to be able to rotate register contents. And so forth.
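
Need a rotate anyway and you compose it yourself from what the machines do guarantee - shifts and a bitwise OR. A plain-C sketch (rotl32 is just an example name):

    #include <stdint.h>

    /* Rotate a 32-bit value left by n bits: the bits pushed out at the
       top are OR'd back in at the bottom. A decent compiler collapses
       this to a single rotate instruction on CPUs that happen to have one. */
    static uint32_t rotl32(uint32_t value, unsigned n)
    {
        n &= 31;                /* keep the shift count in range */
        if (n == 0)
            return value;
        return (value << n) | (value >> (32 - n));
    }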

C goes as far down to machine level as it can without being hardware-specific. It also allows for 'inline assembler', a technique often used in device drivers today. Yet C can also attain high levels of abstraction through its macro preprocessor. It's possible, that is, to construct higher levels of program logic - complete with function-like arguments, as it were - that carry no additional runtime overhead whatsoever.
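
A trivial illustration of that kind of zero-cost abstraction - function-like macros with arguments that the preprocessor expands away before the compiler ever sees them (the macros here are just examples):

    #include <stdio.h>

    /* Higher level names, zero runtime overhead: these expand to
       plain expressions during preprocessing. */
    #define MAX(a, b)       ((a) > (b) ? (a) : (b))
    #define ARRAY_COUNT(a)  (sizeof(a) / sizeof((a)[0]))

    int main(void)
    {
        int values[] = { 3, 14, 15, 9, 2, 6 };
        int hi = values[0];

        for (size_t i = 1; i < ARRAY_COUNT(values); i++)
            hi = MAX(hi, values[i]);

        printf("highest: %d of %zu values\n", hi, ARRAY_COUNT(values));
        return 0;
    }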

The malleability of C together with its unparalleled efficiency and rich set of operators make it the 'assembler' of the new age where software development costs far exceed hardware costs. Programmers no longer need to learn new instruction sets for new processors - a small band of compiler writers do that and then everyone can create applications on the new platform.

[When Steve Jobs insists third party developers stick to official APIs, he's really ensuring that application software that works today will work tomorrow if Apple go and change their underlying hardware platform. It can seem harsh to insist on such a policy but it makes a lot of sense to strongly recommend it.]

Objective-C then adds object orientation onto C. C is a procedural language, much like CPUs can be considered procedural - with a definite order things are performed in. CPUs have instruction pointer registers and goto/gosub instructions, C has functions, and so forth. Object orientation needs a layer (a thin layer) on top of this.

Objective-C uses 'classes' of code with 'methods' (functions) - both class methods and instance methods - and instance variables. The variables are contained in 'instances' of the classes - the instances are allocated in memory and their variables are initialised (most often to zero by default). The variables are by default visible only to instance methods - functions meant to operate only within allocated instances of the classes.
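
A minimal sketch - the Account class here is made up, and the memory management is the manual retain/release of the day:

    #import <Foundation/Foundation.h>

    // A made-up class: one instance variable, one class method,
    // two instance methods.
    @interface Account : NSObject {
        double balance;                  // one copy per allocated instance
    }
    + (id)account;                       // class method: a convenience constructor
    - (void)deposit:(double)amount;      // instance methods operate on an instance
    - (double)balance;
    @end

    @implementation Account
    + (id)account
    {
        return [[[self alloc] init] autorelease];  // alloc zeroes the instance variables
    }

    - (void)deposit:(double)amount
    {
        balance += amount;               // only the instance's own methods see 'balance'
    }

    - (double)balance
    {
        return balance;
    }
    @end

Client code then reads as plain message sends: create an instance with the class method, then send it deposit: and balance messages.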

The classes themselves - the objects or organisms - are also 'event driven' - they mostly sit and wait and respond to news of things happening in the system. But the brilliance of the Objective-C/NeXTSTEP model is how they do this. This is the same model used both in OS X and iOS.

Buttons, scroll bars, text fields, radio buttons, tick boxes, et al all communicate with their controlling (delegate) classes with what's known as the target-action paradigm: they send Objective-C messages to designated methods with a single parameter: themselves. The recipient can use this parameter to query them about any additional information that's needed.
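
Spelled out in code it's two messages and one callback - normally Interface Builder does the wiring for you; the Controller class and its method here are made up:

    #import <Cocoa/Cocoa.h>

    // A hypothetical controller and a button wired to it by hand.
    @interface Controller : NSObject {
        NSButton *button;   // assigned elsewhere
    }
    - (void)wireUp;
    - (void)buttonPressed:(id)sender;
    @end

    @implementation Controller
    - (void)wireUp
    {
        [button setTarget:self];                       // who receives the message
        [button setAction:@selector(buttonPressed:)];  // which message gets sent
    }

    - (void)buttonPressed:(id)sender
    {
        // The single parameter is the sender itself - query it for anything more.
        NSLog(@"pressed: %@", [sender title]);
    }
    @end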

Objective-C is a harmless or 'gentlemanly' superset or add-on of C: it in no way changes the behaviour of the language it's based on and dependent on. It adds classes and other things that can be used to achieve object orientation. But it's the way Objective-C has been used by NeXT and Apple that sets the environment apart. And there exists no competitive technology anywhere on the planet that comes close to achieving the same thing.

Then of course one must add Jean-Marie Hullot's fabulous Interface Builder - a tool that makes interface development so intuitive and adds so much intelligent logic to what otherwise would be tedious programming projects that it's not funny.

It's no surprise that Sir Tim Berners-Lee claimed he'd never have been able to create the World Wide Web without this technology - the 'point and click' as he called it of Objective-C, the NeXTSTEP classes, and Interface Builder.

Think about that for a moment when you again encounter one of those silly meaningless diatribes proclaiming Apple's sky is falling. Think about the fact that the future of program development is something Apple and NeXT have had for the past twenty-five years - and the fact that no one anywhere has even to this day come close to duplicating it.

The Future

Could Apple be secretly working on something to supersede their current technology? Of course they could. Apple are secretive. All companies are. No one gains ground by standing still. But replacing what they currently have is nigh on impossible, even if it were in some twisted way desirable. A lot of Cocoa code was migrated out of the old NeXTSTEP classes and into procedural modules so as to accommodate the legacy Mac programmers who couldn't hack the three-hour Objective-C learning curve. This code could in theory be used in some way to afford such a transition.

But why? As in 'why would anyone want to do this?'. And exactly how? The Apple detractors like to talk about things like 'automatic garbage collection' as if it's the be-all and end-all of programming. But the sad fact of the matter is that a programmer is worthless if memory maintenance seems too challenging. Cocoa memory management is eminently simple and straightforward - there are easy rules as to when and how you free memory, just as there are for C programmers, who'd be laughed at if they complained about calloc/malloc and free.
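
The rules fit in a few lines - this is the manual retain/release model, and the function is only a sketch:

    #import <Foundation/Foundation.h>

    // The classic Cocoa ownership rules in miniature:
    // you own what you alloc, copy, or retain - and you release what you own.
    static void demo(void)
    {
        NSMutableArray *list = [[NSMutableArray alloc] init];  // alloc: you own it

        NSString *name = [NSString stringWithFormat:@"%d bottles", 99];
        // A convenience constructor: you don't own the result, so you don't
        // release it - retain it only if you need it beyond the current scope.

        [list addObject:name];  // the array retains what it stores
        [list release];         // you owned it, you release it;
                                // the array releases its own contents
    }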

The current generation of programmers - especially in the US - are graduating with immersion in Java and precious little else. Assembler is ignored. Understanding of von Neumann is never approached. C++ would be bad enough but today even C++ doesn't get much airplay. And the teachers - how in the name of all that is good did they ever get into such a career? Where they're so distanced from reality?

Who will create the operating systems of tomorrow? Who will write the drivers for tomorrow's peripherals? A world overpopulated with misplaced Java programmers can't do it.

The system and software of the iPhone wouldn't be possible without the exact tools the detractors claim are outdated and surpassed by Microsoft. The world wouldn't be close to its current smart-smartphone market without them. The iPad wouldn't exist without them. Apple have the tools to get the job done and to stay years ahead of the competition.

They might have secret plans for new things they're not ready to share with the world. But talking shit about the incredible things they've done, while all the also-rans stand by the wayside looking pathetic in comparison, is to write only for the sake of writing. Leave such discussions to the people who are actually engaged in the field on a professional basis. Let the besserwissers return to their articles about Apple finances and published Apple white papers.

Copyright © Rixstep. All rights reserved.