The Longest Screed
Time to dig in again.
Happy new year everyone. Now's the time to dig in like never before. But where to begin? That's easy - with good old NSTableView. Before we get into it: a few extra comments.
We've been on about NSTableView for about as long as we've programmed on this platform. Which is about nine years now. We posted something called 'Table View Blues' back in the day to the Mac dev mailing list. We made a lot of friends among the newbies, who were very eager to help, but we also had our first encounter with the curmudgeons we otherwise refer to as the 'hens' or the 'landed gentry of Mac OS X development'. They're a very unpleasant, rude lot.
We were attacked left and right by some of the most incompetent - and simultaneously 'acknowledged' - developers on the platform. Most of these were not legacy NeXT people - they were people who'd stuck to Apple through thick, thin, and the Redwood City years, only nodding occasionally to Chesapeake Drive.
One of the great things to come out of the discourse: the suggestion that NSTableView is incapable of handling a situation with 4000 or more rows of data. One of those 'experts' blurted that there were no application domains which would ever require so much data. Talk about a moron. Several months later Scot Hacker protested to Apple that iTunes was too sluggish - again because of NSTableView. The assembled legion of Mac idiots fell silent.
Another cute one was how a Swiss twit - who thought he'd written the ultimate 'Go' game, which was nothing more than GNU Go with a Cocoa front end that he boasted was 'the fastest in the world' - came out of the woodwork when we posted code proving NSTableView was screwed up. He posted the unforgettable:
Don't blame Apple if you write crappy code.
Whereupon we dropped the ultimate bomb.
That isn't our code. That code belongs to Aaron.
Whereupon 'Mr OCD' now came out of the woodwork.
Aaron! Are you there? We know you're there, lurking! Come out! Come out and help us!
Aaron never came out. He didn't dare. His code was crappy and he knew it. And he also knew we were right.
End of first encounter. And then Scot Hacker came out and attacked Apple and suddenly everything was accepted. Without further acrimony. What a lovely group, these rude Apple developer assholes.
We mention this because each time we publish something exposing Apple programmers for the clowns they are, there's a flurry of flame mail, death threats, and the like. And nobody likes that. The answer isn't to keep silent - that's cowardice. So ultimately one pushes on anyway.
Then there's also the reason we stick with this platform. We've often said it's because they offer the best, most powerful, and most promising alternative to Windows - but that's not it. Not all of it.
Actually what we're after is to see C++ eat shit and die. Objective-C is the diametric opposite of C++ and it's a very good language. The only language for the application domains we deal in. C++ is a fucking mess.
There are a lot of things to be said about C++ but the most important is the following. C++ has damaged computer science more than any language ever. Even more than Pascal. And considering what Apple did with Pascal, that's saying a lot. And it's all true.
C++ is not an object oriented language. You need object orientation for GUIs because there is so much going on. But C++ is a terrible fit for the job. C++ is a derivative of Simula - Simula is what Bjarne Stroustrup took with him to Bell Labs.
It remains unfathomable that Bell Labs took that jerk on. He is so unpleasant. He feeds on his own self-adulation. And even if he were to turn out to be a 'nice guy' - which is theoretically impossible anyway - then one still couldn't like him because he unleashed this terrible C++ on the world.
Function names are mangled. Try debugging with mangled function names. Everything can be overridden. It's got no basis in computer reality. Pascal ideas are surreptitiously reintroduced - after the programmer world rejoiced that with C they were finally gone.
Worst of all: Bjarne built his pathetic language on C - and then turned around and talked shit about it.
C++ demands 'constructors' and 'destructors' - code snippets that have to be included even if you don't need them. He invented his own confused command line I/O which is just a bloody mess. And so forth. Worst of all: he's a terrible writer. Where Brian Kernighan is the best teacher in print ever, Stroustrup is easily the worst.
It was providence that moved Brian and Dennis to one division but put Bjarne in another when Bell Labs started splitting up, Lucent was founded, and so forth. The twain will never meet. Thank goodness for that.
Exactly why so many people used C++ is not adequately researched. For Objective-C is the clear winner. But Brad Cox took out a restrictive patent on the language, possibly at the behest of Steve Jobs who went on to sign an exclusive licence with Cox for NeXT. And in 1995 Jobs 'bought' the language from Cox, right in time to begin negotiations with Apple for his comeback.
The language should be free today but does anyone honestly know what the situation is? Then too: the NS classes are a big part of what's going on. We don't know who is responsible for the original architecture but whoever it was (Avie?) it was a gargantuan work on a quality level with Unix itself. That's how good things were - even back when few heard of the platform and fewer still used it.
And then NeXT came to Cupertino. And the rest, as they say, is history. The world's most open and well designed architecture suddenly became more closed and more closed and more closed... NeXT had to play with everybody and did it well. Apple hate their third party developers and show this more than ever today. They treat them like shit. With scorn. With disdain. With those tiny beady eyes the 'Mac community' people are so infamous for.
And then the code started changing. Jaguar was good and felt solid but it had a hint of a stink of being Apple. There were a lot of good things that had been removed - and for no other reason than that they were 'not Apple'. Code was removed because it couldn't be 'Carbonised'. Code was moved around so 'Carbon' could get at some of the good stuff in NeXTSTEP. There are layers of system code to this day that are not needed save for access by Carbon code. And when you realise that the learning curve to NeXT's Objective-C is one of the smoothest the industry has ever witnessed, you really have to ask yourself how lame those Carbon people are. Three hours. That's what we say and that's what the Apple microsite said: it takes three hours to learn Objective-C. Provided you're a professional of course and not just an Apple wanker.
But three hours was too much. As was File Viewer. As were cascading menus coming out of the left of the screen. Apple finally dispensed with all that underlying NS technology - submenus that rip off and float wherever you like and so forth - because the Carbon wankers couldn't get at the code. This is literally a case where the features of an OS and its GUI are downgraded to placate the wankers who can't be bothered to spend three hours studying.
There may have been a lot of NeXTies around to begin with. But they were overwhelmed by lots more Maccies who were resentful of them. The Maccies had tried to write their own 32-bit OS (in Pascal undoubtedly) and failed miserably, more miserably than outsiders could ever possibly imagine. Now here come the NeXTies, on the crest of a $429 million wave, they start taking over key management positions, they actually grasp how the code works...
And in the meantime: most Maccies had never written a line of C, much less Objective-C. Any number of stories were told of how they were sent on courses - and in typical snarky Maccie fashion began stumbling immediately. Sooner or later they were let at the code. And again: the rest, as they say, is history.
Just witnessing the wanton destruction of the dazzling OPENSTEP interface (which a friend of 'Mr OCD' described at this site as the most brilliant thing he'd ever seen - so far ahead that the Mac was left in the dust, gone forever and laughed at) down into that dinky Mac System 7 type of thing with 16-colour graphics and pathetic rendering - only to build it back up again? That should have meant war. It was the biggest travesty, the biggest disgrace, in the history of IT.
And we all lived with it. There was still potential.
The platform was doomed however - Jobs refused to license it. The dongle kept the market share down to single digits - the first time in the history of business anywhere that a CEO actually tried to make less money. And the fanboys were jubilant. Someone they know owns a Psystar? They can't sleep at night. They're sick - they're mentally ill. And computer science isn't supposed to be about abnormal psychology. But BA students specialising in Apple need courses in it. Oh do they ever.
Jobs' game plan sucks. Apple will go under again under his wayward leadership. But the iPod comes along. It's an idea and a device they buy up and absorb and then let Ive refine. In no time flat that relatively cheap iPod is bringing in over 50% of total corporate revenues. So much for Apple computers. And you can't get very far when their anal retentive CEO, as ignorant as they come when it comes to computer science and Internet security, declares the OS war for the desktop over and Microsoft the winner.
The computers aren't really stagnating yet but the iPod's forced time and again to jump the shark, the same old same old recycled ad nauseam. Nothing's happening in that company. Avie and Jon decide to leave together after twenty years - and make a statement in so doing: they resign the day before the 30 year anniversary of Apple. The door slams and the sound's heard around the world. The Mac fanboy media don't dare touch it. It's only Rixstep who mention it. Of course.
And by now the causes of platform fatigue are becoming apparent: the platform doesn't have any real developers (save us). There are no real developers to scream bloody murder at Apple and bitch slap them for being such screw-ups. There are no major corporations with programming teams who are going to put Apple (and Jobs) in their place. Everything goes dweeb, dweeb, and more dweeb.
The iPhone is a big secret. It's under development for two years. Not a word leaks out. It's based on perceptive pixels, great Ive design, and the kind of code that only Objective-C and the NS classes can achieve. Code is rewritten, honed and tweaked - but it's NS under the bonnet all the way. They might be changing the system of nomenclature but that's not Pascal - it's NS code all the way.
And so far the device is brilliant. Here's the hitch: the best programmers at Apple were put on the iPhone and a new litter came in to take care of the computer OS.
This new litter - they're not too good. Even today. Slowly but surely the computer OS is falling apart. Those guys working low level on Apple's beige coloured FreeBSD foundation: their work can be used on many Apple platforms - but up above that? It's new guys. It's inexperienced noobs who not only lack the ability to maintain the code but in most instances have precious little experience even using computers. They don't understand how GUIs are supposed to work. They don't understand - or miss completely - why twenty year old NeXT code has always been the way it's been.
And we've seen signs of this all along.
And now finally to NSTableView again. Today we added an eleventh column to Xfile. This is a column we'd previously decided would never be needed. A column for system/user flags. Which are normally (99.9999% of the time) zero.
But now Apple are really screwing around again with their totally undocumented (and unshared) compression technology, and what looks a bit weird but otherwise alright on SL looks like a fucking mess on any other OS version. What a shame: there are special APIs in NS for compressing and uncompressing files, but are they connected now, finally, after ten years? No. Fucking. Way. And where do these compression flags turn up? In the system/user flags? chflags()? O RLY? So they can be toggled through chflags()? Uh - NO. Not this time. You can try all you want but the flags won't stick. Another milestone for Apple.
Flags are being used. Most 'Unix' directories on 10.6 are full of them. Despite their being used in a really ridiculous way - no worse: a really 'Apple' way - they're still being used and we may want to see them.
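For the curious, the flags can be inspected - and the compression flag's refusal to budge reproduced - straight from C. A minimal sketch only, and macOS/BSD only, since st_flags and chflags() don't exist elsewhere; the choice of UF_HIDDEN as the 'well-behaved' flag to contrast against is ours, not anything from Xfile:

```objc
/* Sketch: reading and (attempting) to toggle BSD file flags.
   macOS/BSD only - struct stat has no st_flags member on Linux
   and there is no chflags() there either. */
#include <stdio.h>
#include <sys/stat.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    struct stat st;
    const char *path = argc > 1 ? argv[1] : ".";

    if (lstat(path, &st) == -1) { perror("lstat"); return 1; }

    /* print the raw system/user flags - normally zero */
    printf("%s: st_flags = 0x%08x\n", path, st.st_flags);

    /* an ordinary user flag like UF_HIDDEN toggles fine;
       the 10.6 compression flag is the one that won't stick
       no matter what you pass to chflags() */
    if (chflags(path, st.st_flags | UF_HIDDEN) == -1)
        perror("chflags");

    return 0;
}
```

Run it against anything under /usr on 10.6 and the nonzero flags turn up immediately - which is exactly why the column is worth having.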
Therefore the eleventh column.
Last time around (yesterday) we added a 'blue mode' feature so the file mode was rendered in an alternate colour if there were flags attached. It's not really pretty though, is it? So today we put in the eleventh column and made 'blue mode' an option disabled by default.
Now here's the trick: it's all in the NIB. The NIB has to get one more column - a new column right before the rightmost column (which is mode). This placement is necessary because the table view automatically accommodates the rightmost column, and that column has to remain visible most of the time.
The Xfile window is huge. It's 1,236 pixels wide. Making it any wider is out of the question. The 'flags' column has to be put in 'invisible' to begin with. So users can pluck it out if they want it.
It's right next to 'mode' and all you have to do is drag and there it is.
Columns have to have a minimum width of -3 to be allowed to be invisible. Now certainly there are Apple programmers everywhere who'd rather add megabloats of code to add and remove columns, and so forth - but it's very unprofessional. No worse: it's downright stupid. And when you have a file manager that's already listing 10,000 files in ten or eleven columns of data in a fifth of a second, you don't have to worry about speed. And the other (idiotic) method is way slower anyway.
Three pixels are used to paint the curvature and the border of the headers. The header borders go together and merge perfectly. This has always been so. Since the beginning of NeXT time, twenty years ago.
Now here comes the iPod. Then the iPhone. Then the new litter of programmers. Who honestly and truthfully don't have a clue and won't ever be able to buy one. And one of these idiots - for no good reason - decides that Interface Builder should not allow minus numbers for widths and things like that. Because to his feeble mind, it doesn't make sense!
You can still get minus widths with Interface Builder for Tiger - but only if you use the old 'archiving' format and only if you know how to work around the numerous bugs, for they're actually trying to thwart it there too - except their code is so fucking buggy. (And it wasn't like that with IB for Jaguar earlier.) And if you switch - even temporarily - to their 'new improved' format, negative widths are 'corrected' to zero.
NSTableColumn widths are expressed in 'CGFloats'. CGFloat is either a float or a double. And both float and double can assume negative values.
We've been in programming a long time. And we've seen a lot of dumb things. But we've never seen anything dumber than this. Or any number of incredibly dumb things these idiots come up with. It's only Apple.
We spent about three hours booting between platforms and reworking Xfile's new NIB, ferreting out all the idiocies perpetrated by these morons. And remember: after Tiger 10.4 we're all lost - we're going to have to do things like this in code, despite the fact that the NIBs themselves are perfectly capable of holding these values, despite the fact that these values are more than just legal - they're fucking necessary.
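Doing it in code is at least straightforward. A sketch only - the tableView outlet and the 'flags' column identifier here are hypothetical names for illustration, not Xfile's actual ones:

```objc
// Sketch: restore the sub-zero minimum width that newer Interface
// Builders refuse to save. Assumes an NSTableView outlet 'tableView'
// and a column given the identifier 'flags' in IB - both made up here.
- (void)awakeFromNib
{
    NSTableColumn *flags = [tableView tableColumnWithIdentifier:@"flags"];

    // NSTableColumn widths are CGFloats - a float or a double depending
    // on architecture - and negative values are perfectly legal.
    // Set minWidth first: setWidth: clamps to the current minimum.
    [flags setMinWidth:-3.0];
    [flags setWidth:-3.0];   // column starts out 'invisible'
}
```

The order matters: -[NSTableColumn setWidth:] clamps against minWidth, so the minimum has to go sub-zero before the width can.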
And as stated: this is not the first instance. Most of us remember the scandal they caused (and we uncovered) back in the day Tracker was being tested up here at this forum. Another fuckhead had stared perplexed at the fact that legacy NeXT code defined the path to open an application as the path to the application itself, and couldn't understand it and so he just chopped it the fuck out of there.
A year later Apple corrected the code - but they never backported the fix to Panther. Panther is still diseased in this regard.
There are so many examples of this.
We know things are no better in other camps. The Ubuntu programmers are morons too. The world at large seems fatally afflicted with idiot programmers.
Remember Steve's theory? That good programmers are several hundred times better than average programmers? Steve was right.
But that doesn't help us now. For the mediocre have taken over and more and more rule the world.