This is the thirteenth installment of Cubist Threads, but ironically I'm feeling pretty darned lucky to be writing it. Who would have thought this blatantly self-aggrandizing auto-theoretica would survive a whole year?
Not me, though I must admit that time has really flown by. Every month, Alan's thoughtful e-mail reminder catches me a little off guard.
I've already used up most of my "brushes with greatness" stories, so I find myself at a bit of a loss as to just what to write about this month. For the life of me, I can't seem to get into "flowery verbiage mode" today, so please bear with me as I try to adopt a less formal tone. (After all, we've been through a lot together this past year, dear reader, so perhaps I should start addressing you less as an abstract audience of strangers, and more as the amalgamated "friend" you've become.)
I've been very busy at work lately. Schedules have a way of creeping up on me, and the project I'm currently working on is no exception. Compared to recent projects, the nice thing about this one is that it is absolutely thick with Java code. The not-so-nice thing, as usual, is the looming deadline. There's something about the word deadline that really twiddles my bits.
As I've told you before, my links to Java began tenuously. My work has mostly been about writing bits and pieces of the C++ "underpinnings" of our wonderful iSeries JVM, but lately I've been studying (and writing) a lot of Java code - good and bad - and have never been happier. Oh sure, writing C and C++ code is fun and all, but the relative drudgery of worrying about structure alignment and storage management really makes me appreciate the beauty of a garbage-collected language like Java.
Of course, part of me worries that having a garbage collector makes me a "namby-pamby" programmer. It just isn't hard enough to implement convoluted program logic when you know you have a good GC behind you, right? How can I maintain my illusion of guruery if I don't have to malloc() and free()? Shouldn't I mask the occasional sign bit, grok the double floating point format, or XOR repeatedly and recursively? "Proper" programming simply can't be simplified too much, or else it turns into some abominable exercise in visual button-pushing, doesn't it? I mean, you can't let just anybody be a programmer, can you? Egad!
Yeah, right...hogwash. Programming is about taming unruly slivers of etched silicon, using whatever best practices are available (at the current state of the art) to get the job done.
When I started programming, one of the first things I really sank my teeth into was chaos and fractal visualizations. It seemed like my (4.77MHz 8088-based) hardware was never even close to fast enough - especially for Mandelbrot set renderings - so I remember taking great pains (can you say ASM?) to implement an M-set iterator directly in the 8087 math coprocessor chip's fabulously expansive 8-register onboard stack.
The interface between Turbo Pascal 3.x and ASM was as clean as could be hoped for: the compiler pushed floating point arguments to my little assembler routine (creatively named "iter") directly into the 8087 stack. All my ASM code had to do was merrily FLD ST(0), FMUL, and FCOM to my heart's content. My "improved" iterator ran in just one-third the time taken by the high-level language algorithm (even fully optimized), and I was in programming Nirvana.
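For anyone who hasn't met the M-set up close: the whole iterator boils down to repeating z = z*z + c until z escapes a circle of radius 2 (or you give up). Here's a minimal Java sketch of that inner loop — names and iteration cap are mine, not the original Pascal/ASM:

```java
/**
 * Minimal Mandelbrot-set iterator sketch: repeats z -> z*z + c
 * until |z| > 2 (the point escapes) or maxIter is reached
 * (the point is assumed to be in the set).
 */
public class MSet {
    static int iter(double cr, double ci, int maxIter) {
        double zr = 0.0, zi = 0.0;  // z starts at 0 + 0i
        int n = 0;
        // |z|^2 <= 4 is the cheap form of |z| <= 2 (no square root needed)
        while (zr * zr + zi * zi <= 4.0 && n < maxIter) {
            double t = zr * zr - zi * zi + cr;  // real part of z^2 + c
            zi = 2.0 * zr * zi + ci;            // imaginary part of z^2 + c
            zr = t;
            n++;
        }
        return n;  // iteration count drives the familiar color banding
    }

    public static void main(String[] args) {
        System.out.println(iter(0.0, 0.0, 100));  // origin never escapes: 100
        System.out.println(iter(2.0, 2.0, 100));  // escapes immediately: 1
    }
}
```

Every multiply and compare in that loop runs millions of times per image, which is exactly why squeezing it onto the 8087's register stack paid off so handsomely back then.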
Of course, the real world reasserted its cantankerous nature when the next release of the compiler up and changed the ASM linkage. Suddenly, parameters were no longer getting pushed directly into the 8087 registers, so I had to rework the linkage, losing a good bit of my hard-won performance improvement.
One true beauty of programming in Java, in my not-so-humble opinion, is that we get to think about bigger pictures, without getting caught up in relative trivialities. Instead of ensuring correct argument alignment, chasing pointer bugs, and forgetting to delete our temporary structures, we can spend our time changing the world.
Okay, I've obviously swung my paradigm pendulum too far. I have a ton of respect and admiration for the bit-twiddlers and storage managers in the programming world. Frankly, it is on their vast shoulders that we stand, boldly new-ing where no programmer has new-ed before, and we must never forget that. Java's decidedly credible beauty of form is the result of incredible intellectual effort and the culmination-du-jour of programming practice.
What's next? "Computer, please cue up some Blue Oyster Cult and brew me a stout ale, will ya?" Cool.
Blair Wyman is a software engineer working for IBM in Rochester, Minnesota, home of the IBM iSeries. [email protected]