
You Don't Have to Wait for Leopard

Interest in OS X programming is growing - but is it the right kind of interest?


Of late the Microsoft developer blogosphere is rife with speculation about Apple's OS X Leopard, with many already taking a peek at the betas and planning to take the plunge when the final release becomes available. But what has been holding this brain trust back up to now?

It's no secret that Objective-C is the space age language for the space age and that the NeXTSTEP classes are the perfect continuation of Objective-C technology, with developers regularly claiming they cut their work times by 80%. And although NeXTSTEP's 'API' is extensive - by some estimates four times the size of the Windows API - it is more orderly and navigable, by an order of magnitude.

The comments by seasoned Microsoft developers enthusiastic about OS X are all the more alarming and seem to focus on a new language feature coming with 'Objective-C 2.0' in Leopard - automatic garbage collection.

'I was always running for mommy when seeing alloc and the other guys', writes 'StuFF mc'. 'I just couldn't stand to make myself do something that the computer should just do for you - like managing memory', writes James Welborn.

But the thought that serious professional programmers should get cold feet at managing their own memory allocations suggests they've never successfully used calls to calloc, malloc, and free - or the 'Virtual' calls (VirtualAlloc and friends) on the Windows platform. Managing memory has never been an issue up to now. Not for real programmers at any rate.

The Java language has automatic garbage collection, but it's no secret the implementation is shaky at best. And with the Objective-C language the rules of memory management are simple and easy to follow - and the programmer has complete control. It's all about the release method.

The release message is sent only to objects you own. You own objects you create with methods whose names begin with alloc, new, or copy - these return objects with an implicit reference count of 1. You also own (or share ownership in) objects you send retain messages, because retain increments the reference count. And each alloc, new, copy, or retain should be balanced with a release or autorelease so the object can eventually be deallocated.
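In code the rules look like this - a minimal sketch using Foundation classes:

NSMutableString *s = [[NSMutableString alloc] init];  /* yours: reference count 1 */
[s retain];                                           /* count 2 */
[s release];                                          /* count 1 */
[s release];                                          /* count 0 - deallocated */

NSString *t = [NSString stringWithFormat:@"%d", 42];  /* convenience constructor: autoreleased, not yours */
[t retain];                                           /* take shared ownership... */
/* * */
[t release];                                          /* ...and balance the retain */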

And it can't be simpler than that. In fact it's eminently simple - much simpler than what's found on other platforms.

The autorelease method is of particular interest. Much of the code in any GUI application is 'event driven': it's called by the system. And before the application code is invoked the system sets up an 'autorelease pool': this pool will later release all the objects autoreleased in response to the event. The same pool mechanism is also used (manually) when detaching new threads in an application, as shown below.

NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
/* * */
[pool release];
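In a detached thread, for example, the pool is managed by hand - a minimal sketch, with the thread entry point name assumed:

- (void)threadMain:(id)argument
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    /* everything autoreleased on this thread now belongs to this pool */
    [pool release];
}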

And it can't be simpler than that. Managing memory in 'Unix' applications is hardly any different.

char *cp = malloc(BUFSIZE);
/* * */
free(cp);

Windows also has a [cm]alloc/free API: much of the Windows underbody is borrowed from Unix. Windows developers have been managing their own memory for as long as Windows has been an operating system, so to speak. And before that developers were still managing their own memory. Developers have always managed their own memory. Real ones that is.
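And the 'Virtual' calls work the same way - a sketch of the Win32 equivalents, with BUFSIZE assumed defined:

#include <windows.h>

void *p = VirtualAlloc(NULL, BUFSIZE, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
/* * */
VirtualFree(p, 0, MEM_RELEASE);  /* the size must be 0 with MEM_RELEASE */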

There've been 'enhancements' to the C language of late as well. It's no longer necessary to declare one's stack variables at the start of a block - they can be placed anywhere. Not many stop to worry about what this costs in software efficiency.
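A sketch of what the relaxed rules permit - the function is invented for illustration:

int sum_of_squares(int n)
{
    int total = 0;                 /* classic C: declared at the start of the block */
    for (int i = 0; i < n; i++) {  /* C99: declared in the loop header */
        int square = i * i;        /* C99: declared mid-block */
        total += square;
    }
    return total;
}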

But this code won't be backward compatible with earlier compilers, and above all it's just an excuse to write sloppy code. Either you know what you want to do or you need more time to think about it. And when it comes to memory: either you have control over your program or you should go back to school and work on it some more.

But go back to school where? Aye, there's the rub.

IBM legend Fred Brooks teaches computer science at UNC these days. His students regularly complain he's too tough on them. Fred demands his students understand computers - and at least dabble in assembly. And preferably learn C inside and out too.

That's rare. All too rare. At least in the US. Where all the major ideas are supposed to come from.

It's been years since anyone attempted 'bare metal programming' on Windows. Most developers inside and outside Redmond aren't even acquainted with the API. They've gone from an intrinsic acquaintance with the systems they work with to tools such as MFC, ATL, and now the .NET framework with its Java offshoot.

One of the consultants for this website was called recently to London to teach a Microsoft programming course precisely because said consultant had skills far beyond what Microsoft deemed necessary - skills in the 'bare metal programming' API.

The waste concomitant with such tools is formidable. And even so Bill Gates talks about 'zero brains' technologies where programming is truly turned into a wannabe art.

But someone has to build the tools for these brainiacs to use; someone has to build the operating systems they'll run their fantastic software on; and none of these heroes would ever rely on automatic garbage collection even if they could.

Objective-C was far better than anything Microsoft could come up with already twenty-five years ago. And Objective-C wasn't even patented until twenty-four years ago. And Objective-C is still far better than anything Microsoft can come up with today.

Saying one's interested in Objective-C only when Leopard is released reveals too much about too little ambition and talent.

Copyright © Rixstep. All rights reserved.