Much of the cutting edge today is based on Unix, which has been continuously evolving since 1969... if you solve problems well from the outset, you often don't have to re-solve them over and over.
There are few problems for which unchecked pointer arithmetic is a good solution, and even fewer at the application level. Computers are hundreds of times faster than they were in the NeXTstep days. We no longer need to risk undefined behavior just to get a GUI to keep up with a human. Hell, Eclipse is probably the most bloated Java IDE you could find, and it's only sometimes intolerable on a six-year-old laptop.
ObjC is basically Smalltalk weakened by some dangerous efficiency hacks we haven't needed for a decade.
You know: my heart agrees with this, but over and over I see 'managed' systems exhibiting substantially longer startup times, unexpected and sometimes-long freezes, and often visibly lower performance.
Similarly, virtual memory backed by paging to a swapfile intuitively strikes me as the beautiful, "right" solution for a memory architecture built from several storage classes with highly different access times; yet I have to admit that hourglasses/beachballs are rarely part of "mobile" systems but are, frustratingly, often part of the desktop OS experience.
In actual practice, there's very little unchecked pointer arithmetic in a Cocoa [Touch] app — you don't access the elements of an NSDictionary via direct (to you) pointer arithmetic but through well-tested methods that fail similarly to how out-of-bounds collection access fails in managed systems.
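As a rough illustration of that point (using Java to stand in for any managed runtime): a bad index into a bounds-checked collection fails loudly with a defined exception, much as `-[NSArray objectAtIndex:]` raises `NSRangeException` rather than scribbling over memory. This is only a sketch of the failure mode being compared, not a claim about any particular app.

```java
import java.util.List;

public class BoundsDemo {
    public static void main(String[] args) {
        List<String> items = List.of("a", "b", "c");
        try {
            // Index 10 is out of range for a 3-element list; a managed
            // collection detects this and throws a defined exception
            // instead of exhibiting undefined behavior.
            items.get(10);
        } catch (IndexOutOfBoundsException e) {
            System.out.println("caught out-of-bounds access");
        }
    }
}
```

The comparison in the comment is that Cocoa's collection methods occupy the same safety niche: the failure is detected and reported, not silently corrupting.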
Anyway, it's unfortunate that such things are, to my knowledge, essentially non-comparable in a scientific way; but my experience is not that ObjC apps tend to be more crash-prone or more likely to corrupt my data than the Java/C# apps I also use day-to-day.
But your laptop isn't your phone. I don't think Eclipse would run very well on my phone's relatively weak ARM core: it barely keeps up on my 2008 MacBook Air.
Unfortunately, Smalltalk is basically dead to mainstream programmers (and the image concept never played well with others anyway), so their choice is between safety (Java) and better method dispatch (ObjC).
Which would be a valid point if Objective-C had evolved since 1986. The C family has evolved since then: C++, Java, C# v1 through v4. Objective-C gained properties in v2, along with a host of ways to break encapsulation, but it's hardly evolved. The tools are even worse. Xcode fanbois rejoice at "new features" that I've had in SlickEdit since 1993.
You've hit the nail on the head with regard to my complaint.
Why make such a comment? Being so short you can hardly add to the discussion. Being so sarcastic you add no more than a negation. To make a contribution whose tone is negative is both easy and galling. Strive, instead, to find the things which are worth saying.
Because it needs to be said. PHBs look at the iPhone and its success, see that it uses Objective-C, and jump on the bandwagon. Someone who has actually used Objective-C and other languages needs to point out that Emperor Steve's new clothes were just stolen from a pantomime wardrobe in 1986.
You can make the same argument without detracting from it by dropping the unnecessary sarcasm. Your original comment says less than you imagine it did and is easily dismissed without consideration on the part of the reader.