Java vs Objective-C: The Smackdown
I’m in the process of translating an incomplete Java program I wrote into an Objective-C program. It’s a simple interpreter for a very simple kids’ programming language. At the time, it had been a while since I had dabbled in Java, and I thought it would be fun to get back into it. After all, it had “generics” and whatnot now, and other features that C++ programmers claimed it needed to be a “real” language. And what could be more fun than the overuse of angle brackets, incomprehensible error messages about type conflicts, and sounding all smart rambling on about “generics?”
Gooey Troubles
The Java programming went well until I had to start implementing the user interface for the interpreter. Now the language uses lots of graphics and such, because it’s supposed to be visual, and thus more fun for the kiddies. Although I remembered Java as being a nicely designed language, with a good runtime library, I had forgotten how bad the GUI programming was. Perhaps I was simply repressing the traumatic memories.
First off, there are a lot more choices for GUI programming in Java now than there were the last time I did Java programming. Back in the day, it was AWT (Abstract Window Toolkit) or nothing. AWT wasn’t horrible, assuming you were a masochist, but you couldn’t precisely position things, and the controls never looked native. Oh, and it was insanely slow too.
Fortunately, today you have more than one choice when it comes to UI toolkits. There’s still AWT, but Sun also introduced Swing (as in, swing from a rope) to replace AWT, and IBM came up with SWT, which apparently stands for “Swearing With Toolkits.” Now, when I say Sun replaced AWT with Swing, I actually mean they confused the hell out of me. Because Swing doesn’t actually replace AWT. It kinda supplements it in some places, while in others outright replaces AWT classes. But good luck figuring out which class does what. After about an hour with the Swing documentation I decided it’d be easier to just implement the entire UI in ASCII art. Oh, and if you were worried about the non-native control look in AWT, don’t worry, Sun kept that feature for Swing.
Then there’s SWT, which apparently stands for Surely the Wrong Thing. It was created by IBM for the Eclipse project, because AWT and Swing were so awful. But don’t worry, IBM bested Sun at its own game and managed to create something that was even worse. The Mac version of SWT was written by a Mac-hating chipmunk. It uses old-style Carbon controls and QuickDraw. That’s right, not even HIViews and Core Graphics, much less Cocoa. I’m hoping that since Adobe has an Eclipse plugin now, they’ll rewrite SWT/Mac to be something decent, much like what they did with Premiere.
There’s also the Java Cocoa bindings from Apple. They’re not all that up-to-date though, and Apple has said they’re not going to be supported any more. Furthermore, if I’m going to be writing to Cocoa, why am I writing this program in Java?
So that’s where I stopped with the Java project. I decided that I didn’t really need all this pain to do what should be basic GUI programming.
Learning from the pain
I have a confession to make: I like Java. Or, more precisely, I want to like Java. I like the simple language design. It’s easy to read and write. It comes with a great runtime library: built-in threads and sockets and all sorts of good stuff. I think there are a few things Objective-C can learn from Java, other than how to be tasty.
Probably the biggest thing I noticed when converting this Java program into Objective-C was the garbage collection. What used to be a one line assignment in Java ended up being three or four lines in Objective-C to make sure it was retained and released properly. I found that there was more Objective-C code than Java code, primarily because I needed to manually manage memory. When I write straight Objective-C code, I don’t really think about it, because in C++, where I started, memory management is even more manual. But doing the conversion really drove the point home: I’m wasting a lot of thought cycles and code on memory management. I also found that I suddenly had to worry about cyclic references, which I didn’t have to worry about in Java.
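To give a sense of what I mean, here’s what a properly retained and released assignment looks like in Objective-C (a made-up setter, not actual code from the interpreter):

    - (void)setName:(NSString *)newName
    {
        [newName retain];   // take ownership of the incoming object
        [name release];     // give up ownership of the old one
        name = newName;     // and finally, the actual assignment
    }

In Java, that entire method is name = newName; and you move on with your life.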
I know Objective-C 2.0 is going to have garbage collection, but it can’t get here soon enough for me. This conversion process only confirms how overdue the feature is, and how insufficient simple reference counting is.
Speaking of memory management, why is it that I have to manually call alloc and init? And inside my init, why do I have to manually call my superclass and check self for nil? Why do I have to call my superclass in dealloc? Maybe I’m delirious from all the reference counting I’ve been doing, but it seems to me like that’s all boilerplate code. Boilerplate that the compiler should be taking care of, not lazy ol’ me.
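For the record, here’s the boilerplate I’m whining about, using a throwaway class with a single instance variable as an example:

    - (id)init
    {
        self = [super init];    // call super myself...
        if (self != nil) {      // ...and check that it actually worked
            name = [[NSString alloc] init];
        }
        return self;
    }

    - (void)dealloc
    {
        [name release];
        [super dealloc];        // forget this line and enjoy the consequences
    }

And then every caller gets to write [[Whatever alloc] init], two messages just to get one object.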
Other than memory management, my other gripe is exceptions. Objective-C doesn’t unwind the stack per se; it essentially just restores the stack and registers to a previous location, much like a thread context switch. In other words, it does a setjmp() and longjmp(). That means things get left dangling, memory leaks (’cause, you know, no garbage collection), and it’s generally harder to clean things up and get back to a known, stable state. I know exceptions still work this way for historical reasons, but it has to move on at some point. Currently the exception handling is pretty much worthless, which is probably why Apple still recommends NSErrors instead of exceptions. Exceptions aren’t worthless, just the way Objective-C implements them is.
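Here’s a contrived example of the kind of dangling I mean (the interpreter method is made up, but the pattern isn’t):

    NSMutableArray *results = [[NSMutableArray alloc] init];
    NS_DURING
        [interpreter runProgram:program results:results];
        [results release];
    NS_HANDLER
        // If runProgram: raises, the release above never happens.
        // No garbage collector means results just leaks.
        NSLog(@"interpreter blew up: %@", localException);
    NS_ENDHANDLER

You can scatter extra releases into every handler, but that’s exactly the kind of cleanup that real stack unwinding, or garbage collection, is supposed to handle for you.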
No, it really was that painful
Now, lest you think I’ve gone all soft on Java, I have plenty of other gripes about Java. These didn’t come rushing back until I started the conversion to Objective-C.
Enums. Seriously, Sun, what’s up with that? One of the easiest programming concepts to understand, and just as easy to implement, and it’s completely missing. What, did you think I’d like to list out every constant, along with its type and value, instead of the compiler inferring the type and automatically generating a value for me? I may be a masochist, but I’m not that much of a masochist.
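Just to show what Java is missing out on, here’s the sort of C enum Objective-C gets for free (the token types are made up, but you get the idea):

    typedef enum {
        kTokenIdentifier,   // the compiler hands out 0, 1, 2, ... for me
        kTokenNumber,
        kTokenString,
        kTokenKeyword
    } TokenType;

The Java equivalent is a pile of public static final int declarations, each with a hand-numbered value that I get to keep in sync myself.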
I’m still trying to figure out the mule-headed reasoning behind the “no free functions” rule. Is the thought that it would force the users to write object-oriented code? As they should be well aware, you can write really bad non-object-oriented code using just classes. Just look at MFC. Furthermore, just because someone writes a free function doesn’t mean the code isn’t object oriented. This failure of the Java language just feels like a religious argument to me.
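For instance, nothing about a free function like this (a hypothetical helper, not real interpreter code) is any less object oriented than a method would be:

    // A plain C function that takes an object and returns an object.
    NSString *DisplayNameForProgram(Program *program)
    {
        return [NSString stringWithFormat:@"%@ (%d statements)",
            [program title], [program statementCount]];
    }

It’s still all objects and messages; it just doesn’t happen to live inside a class.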
Java really strives to be easy to read and understand. That led to the rule that object parameters are passed by reference, while scalars are always passed by value. While this is somewhat easy to understand (see kids, this arbitrary type is by reference, while this other arbitrary type is by value), it doesn’t help real programmers. Because sometimes, I just have the crazy urge to retrieve the results of a function that’s done some work for me. I had more than a few classes that were just wrappers around out parameters.
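In Objective-C (well, C), an out parameter is just a pointer, and the frameworks use them all over the place. NSScanner, for example:

    NSScanner *scanner = [NSScanner scannerWithString:@"42"];
    int value;
    if ([scanner scanInt:&value]) {
        // value is now 42, and no wrapper class was harmed
    }

In Java, getting that extra int back out means inventing a little holder class or returning an array, and neither one makes the code easier to read.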
And don’t even get me started on that stupid exception specification checking. Morons.
Ending the pain
Comparing Java to Objective-C is interesting to me because Java was influenced by Objective-C. All in all, almost every construct in Java has a direct equivalent in Objective-C, which makes the conversion process fairly straightforward. However, while Java has continued to evolve, Objective-C hasn’t. Well, Apple is pushing forward with Objective-C 2.0 improvements, but the language has been dormant for many years. Hopefully, Apple will continue the improvements, and Objective-C can catch up.
As for Java, Sun just needs to scrap all of its previous GUI attempts and start over. Until then, I don’t care how much support for “generics” or other modern language features Sun adds; I can’t take it seriously as a language or a platform.