Java has emerged from its own hype relatively unscathed and is now showing itself capable of matching the lofty predictions made for it. The two main indicators of this rite of passage are standardization and evolving best practices, developments that are bringing corporations much nearer to achieving the productivity gains that Java can deliver.
Java's advantages are well known. In the spotlight from the outset has been the promise of WORA ("write once, run anywhere"). Even in an imperfect state, WORA offers corporations bottom-line benefits that accommodate past, present and future: cost-effective integration with legacy investments, relatively painless synchronization of heterogeneous business units and longer-term platform viability that protects against costly future changes in technology infrastructure.
Developing in Java also brings productivity benefits. Because it's object-oriented, Java offers more opportunity for reuse of blocks of code, allowing teams to amortize development costs over more projects and longer time periods. Java is also easier to learn and work in, which allows for faster development and condensed training.
Now that its advantages are widely recognized, Java is finally approaching its heyday in the enterprise. Having reached this point, though, how do we make sure it delivers?
While emerging technologies always tend to be accompanied by rosy expectations of cost savings and streamlining of processes, turning such expectations into reality is another matter. While other languages and platforms already have a history long enough to benefit from the emergence of best practices, Java development has remained pretty much a game without rules. But this is changing.
Standardization Is On Its Way
The trend toward standardization is evidence of this shift. While there's no shortage of companies peddling Java products, the shakeout has begun, driven by the desire of corporate purchasers to secure reputable, stable vendors in an insecure and immature market.
Pressure is growing to standardize on best-of-breed tools and components from preferred vendors. Good candidates for standardization include application servers and IDEs, as well as JavaBeans such as KL Group's JClass JavaBeans or Rogue Wave's StudioJ. These reusable GUI components allow developers to build graphical front ends quickly and easily. Acting as ready-made building blocks, JavaBeans provide functionality for requirements that tend to recur from project to project (e.g., graphs, charts and tables). By standardizing on one family of components, teams can ensure high-quality interfaces and protect themselves from inconsistencies in their code base. What's more, by purchasing ready-made components rather than dedicating in-house resources to building them from scratch, companies can save valuable development time and focus on core competencies.
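The building-block idea is simple: a good component turns raw application data into a working interface element in a line or two. As a minimal sketch, the standard Swing JTable stands in here for a commercial grid bean such as those in JClass or StudioJ; the order data and column names are purely illustrative.

```java
import javax.swing.JTable;

// A reusable grid component as a ready-made building block: one
// constructor call turns raw data into a working table, with no
// hand-rolled rendering code.
public class TableSketch {
    public static JTable buildOrderGrid() {
        Object[][] rows = {
            { "1001", "Widgets", Integer.valueOf(250) },
            { "1002", "Gadgets", Integer.valueOf(75) },
        };
        Object[] columns = { "Order #", "Item", "Quantity" };
        return new JTable(rows, columns);
    }

    public static void main(String[] args) {
        JTable grid = buildOrderGrid();
        // prints "2 rows, 3 columns"
        System.out.println(grid.getRowCount() + " rows, "
                + grid.getColumnCount() + " columns");
    }
}
```

A commercial bean adds the recurring extras (printing, sorting, charts) on top of the same pattern, which is exactly what makes standardizing on one family of components pay off.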
Equally important is a commitment to best-of-breed methodologies such as rigorous and timely code tuning. Again, other languages have a head start on Java. In Java development, early efforts concentrated more on the seemingly glorious capabilities than on the limitations. As Java becomes a serious contender for corporate application development, however, cost-conscious managers are increasingly focused on identifying and overcoming these limitations.
A key concern for Java has been performance. Software companies have risen to the challenge with solutions designed to enhance Java performance that include native compilers, VM improvements, and performance tuning and code analysis tools. Products such as JProbe Suite from KL Group and OptimizeIT! from Intuitive Systems are good examples of this. These tools offer some or all of the following functionality: performance profiling, memory debugging, thread analysis and code coverage.
Best Practices Are Evolving
How and when should these tools be used? Best practices associated with other programming languages may be useful, even when least expected.
Take memory debugging, for instance. With the zeal typical of early adoption, developers once heralded Java as the language that would rid applications of memory leaks once and for all, thanks to the garbage collector. Yet today developers are beginning to recognize that garbage collection isn't a panacea after all. Java has its own unique brand of memory leaks that, though quite different in nature from C++ leaks, can have an even more devastating effect on performance. Consequently, memory debugging is now recognized as a critical component of the Java development cycle.
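The Java flavor of leak comes from lingering references, not forgotten frees: the garbage collector can never reclaim an object that something still points to. A minimal sketch (the cache and request handler here are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

// A long-lived collection that is only ever added to keeps every object
// it has seen reachable, so the garbage collector can never reclaim them.
public class LeakDemo {
    // Lives for the life of the application, e.g. a cache or listener registry.
    static final List<byte[]> cache = new ArrayList<byte[]>();

    static void handleRequest(int id) {
        byte[] buffer = new byte[1024]; // per-request working data
        // Bug: the buffer is stashed "for later" but never removed, so it
        // stays reachable forever. This loitering object is Java's version
        // of a memory leak.
        cache.add(buffer);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10000; i++) {
            handleRequest(i);
        }
        // All 10,000 buffers (roughly 10 MB) are still live.
        System.out.println("Retained buffers: " + cache.size());
    }
}
```

No amount of garbage collection helps here; only a memory-debugging tool (or a careful code review) reveals that the cache is the object holding everything alive.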
Performance tuning is another best practice that can be applied equally well to Java. Donald Knuth once quipped that "premature optimization is the root of all evil." This 25-year-old quote is often taken out of context to imply that all performance tuning should be performed in the QA and acceptance phases; conveniently omitted is the first half of the quote: "We should forget about small efficiencies, say about 97% of the time." Knuth's maxim was written at a point when computing time was several orders of magnitude more expensive than it is now. Thus developers learned to use "coding tricks" to squeeze performance out of programs, often at the expense of maintainability. More often than not, the highly obfuscated code that resulted wasn't a performance bottleneck to begin with.
In the Internet age each new revision must be rolled out within a shorter time frame than ever before. These applications, with a large user base, have performance and scalability requirements that were unheard of 25 years ago. Today's applications are larger too, and more complicated, relying on a great deal of componentized code written by many different authors. Postponing performance tuning until the end of a project can result in problems that are difficult to diagnose and often require massive rework and redesign to address, which threatens the project schedule. No development team can afford to take those kinds of risks in today's environment, where time-to-market is paramount.
When should performance tuning be done? Begin tuning too early and developers run right back into the evils of premature optimization, even in the Java space. Experts recommend a risk management strategy that sees tuning begin immediately after proof of concept is safely in the bag. This strategy avoids profligate optimization efforts on mere prototypes while ensuring that performance problems aren't given the opportunity to accumulate prior to deployment. Consequently, thread analysis, memory debugging and performance profiling in Java are increasingly taking place at the functional unit level, typically earlier than in C++ development. This new best practice makes it easier for companies to deploy reliable Java applications on time and within budget.
Talk of Java on the server and in the enterprise is now increasingly underpinned by pragmatic technologies and methodological principles that are turning the promises into reality. Standardization and best practices are clear indicators of Java's coming of age, and the real bottom-line results will soon follow.
Ed Lycklama is the chief technology officer and cofounder of KL Group, with primary responsibility for overseeing the company's technology direction and intellectual capital.