The Java community members involved in defining the J2EE platform and its supporting standard enterprise APIs would be the first to admit that it was never their intent to create standards for every element of enterprise development. If there were only one right way to do everything, there wouldn’t be much room for creativity (or profit, for that matter). J2EE covers a critical mass of the capabilities needed when building and running enterprise applications. It standardizes what is required to make enterprise applications portable across application servers and support systems, to let vendors integrate their services into the Java environment in a standard way, and to keep enterprise developers from wasting valuable time re-creating utilities that really should be plumbing, freeing them to focus on the business logic and user interfaces of their systems.
In addition to the standard Java Enterprise APIs discussed earlier, an enterprise developer typically finds herself turning to a set of other support tools and extension APIs in order to be efficient and effective in building enterprise systems. These tools and APIs fill the gaps left by the standard enterprise Java APIs. Sometimes these gaps are intentional (e.g., “it doesn’t make sense to standardize an API for X”). Other times the gaps are temporary, because a standard hasn’t been defined yet for a particular area.
In the second part of this book, we provide tutorials on popular, effective tools and APIs that are typically used when building enterprise Java systems. Choosing the tools to include in this section was difficult in many cases, because so many viable, effective options are available in some areas. But we chose tools and APIs based on the size of their respective user communities, the effectiveness of their solutions, and the expected longevity of the tools and APIs relative to other options.
Readers will quickly realize that all of the tools chosen for this section of the book are open source tools (see http://opensource.org). That was an intentional decision on our part, for a few reasons. In nearly every functional area covered here, a family of solutions is available, with some commercial and some open source. The right tool for you depends on the context of your enterprise development scenario, but we wanted to give you practical exposure to the issues in each area, a sense of why extension tools exist in the first place, and what makes the tool effective and useful. In order to do that, we needed to choose tools that were accessible to all readers and that represented proven, effective solutions for the areas we cover here. For these reasons, open source tools were the obvious approach. While many good, effective commercial tools exist in virtually all of these areas, they would not necessarily be accessible to all readers to experiment with the examples provided here. We also realize that commercial tools are a necessary and effective part of many development environments that readers need to use. To that end, as we cover each of the tools included here, we’ve tried to highlight general concepts in each area that will help map the characteristics found in these open source tools back to their commercial counterparts and hopefully help you, in an indirect way, to understand and manage those tools.
Ant is used to manage the process of building and deploying code, among other things. Ant is similar in purpose to age-old tools like imake. It provides a way to codify the parameters needed to build your code (like dependent libraries, configuration files, and the like) and to define various tasks that can be done with the code (compile it, generate Javadoc pages from it, assemble it into a J2EE application archive, deploy it to an application server, and run unit tests on it, among other things).
Ant is an open source project managed by the Apache Software Foundation. It “ships” with a large set of core tasks that can be used to compose Ant “build scripts,” which are written in XML. An Ant build script consists of a set of targets that you define. Each target consists of a set of tasks that are executed when that target is requested by the user or invoked by another target. These tasks can be core tasks included with Ant (like “compile this set of Java code” or “copy these files to that directory”), they can be custom tasks that you define, or they can be tasks that are imported from a third-party library. Many J2EE application servers and tools (including many of the open source tools discussed in this book) now include their own Ant tasks, to allow you to easily integrate them into your project Ant scripts.
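For illustration, a minimal Ant build script might look like the following sketch. The project name, property names, and directory layout here are illustrative, not taken from any real project; only core Ant tasks (mkdir, javac, delete) are used.

```xml
<!-- A minimal Ant build script sketch; names and paths are illustrative. -->
<project name="myapp" default="compile" basedir=".">
    <property name="src.dir" value="src"/>
    <property name="build.dir" value="build"/>

    <!-- Each target groups a set of tasks. -->
    <target name="init">
        <mkdir dir="${build.dir}"/>
    </target>

    <!-- "compile" declares a dependency, so "init" runs first. -->
    <target name="compile" depends="init">
        <javac srcdir="${src.dir}" destdir="${build.dir}"/>
    </target>

    <target name="clean">
        <delete dir="${build.dir}"/>
    </target>
</project>
```

Running `ant` with no arguments executes the default target (here, compile, after its init dependency); `ant clean` invokes the clean target instead.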
Many Ant tasks and practices have been defined to help develop and manage large enterprise projects in the Java environment. Chapter 17 provides both an overview tutorial of Ant in general and some details on Ant-related utilities that are particularly useful when developing J2EE applications. The chapter also includes some best practices in terms of designing Ant build processes for multiple developers and multiple environments.
In any application development context, the practice of testing your code is a critical success factor. Testing enterprise applications involves a number of different dimensions. The user experience and the proper behavior of views (in the model-view-controller, or MVC, sense) can be tested using functional testing tools; the correct behavior of software at the code level, in terms of objects and components performing as expected, can be tested using unit testing tools; the throughput and overall behavior of a system when under heavy load (large numbers of users or transactions or both) can be tested using performance testing tools.
JUnit is an open source unit testing framework for Java. It is one in a series of such tools, built for a number of development environments using the same conceptual architecture. Other tools in this suite include PerlUnit for Perl and CppUnit for C++. JUnit includes a Java API that provides interfaces and base classes for defining and running unit tests as well as some tools that facilitate the configuration, running, and reporting of unit test suites. JUnit is used by developers to write suites of unit tests that exercise their code in critical ways and that verify the results to ensure that they are correct, according to the documented behavior of the code under test. These unit tests are themselves Java code, making it easy to use the same code management tools to manage tests for code along with the code itself.
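As a sketch of what such a test looks like, here is a simple JUnit 3.x-style test case (the API current when this book was written). It assumes junit.jar is on the classpath; the class under test (the standard java.util.Stack) and the test names are chosen purely for illustration.

```java
// A sketch of a JUnit 3.x test case; requires junit.jar on the classpath.
import junit.framework.TestCase;

public class StackTest extends TestCase {
    private java.util.Stack<String> stack;

    // Called before each test method to set up a fresh fixture.
    protected void setUp() {
        stack = new java.util.Stack<String>();
    }

    // Test methods must begin with "test" so JUnit can find them.
    public void testPushThenPop() {
        stack.push("hello");
        assertEquals("hello", stack.pop());
        assertTrue(stack.isEmpty());
    }
}
```

A test runner (textual or graphical) discovers the test methods by reflection, runs each one against a fresh fixture, and reports any failed assertions.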
Enterprise developers face unique challenges when unit testing their Java objects and components. The proper behavior of enterprise code can be tested only if a suitable simulation of its runtime environment can be achieved. A web component like a JSP tag handler, for example, can be tested only if the testing framework can operate within a web container, issue simulated requests to the tag handler, and interpret the generated responses to determine whether the test succeeded.
To facilitate the task of enterprise unit testing, the good people at Apache extended JUnit with a framework called Cactus. Cactus allows J2EE developers to write unit tests for full enterprise components, like servlets and EJBs.
Chapter 18 provides an introduction to the basic JUnit testing framework and also gives a tutorial on using Cactus to define and execute test suites for enterprise components.
The MVC paradigm for architecting UI-oriented applications was introduced more than 20 years ago in the Smalltalk environment. Since then, it’s become a popular and effective pattern for designing and managing web applications, and many tools have been created to support it.
In the MVC pattern, the model represents the data and business logic at the heart of the application, views represent interactive user interface elements, and the controller links the views to the model and keeps the application flow moving. In an MVC-based application, users interact with one or more views in the user interface. Their actions are passed to the controller, which is responsible for handling them. Based on the user’s actions, the controller can make changes to the application model, generate new views for the user, or all of the above. Changes in the application model can also generate notifications directly to views, causing them to change appearance in key ways or adjust their behavior, based on the new state of the model.
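These roles can be sketched in plain Java with a trivial counter application. The class names here are hypothetical and not taken from any framework; the point is only to show the controller mapping a user action onto the model, and the model notifying its views.

```java
// A minimal MVC sketch in plain Java (illustrative names, no framework).
import java.util.ArrayList;
import java.util.List;

class CounterModel {
    private int count;
    private final List<CounterView> views = new ArrayList<CounterView>();

    void addView(CounterView v) { views.add(v); }
    int getCount() { return count; }

    // Changing the model notifies registered views directly.
    void increment() {
        count++;
        for (CounterView v : views) v.modelChanged(this);
    }
}

class CounterView {
    String lastRendered = "";
    // Re-render from the model's new state.
    void modelChanged(CounterModel m) {
        lastRendered = "Count is " + m.getCount();
    }
}

class CounterController {
    private final CounterModel model;
    CounterController(CounterModel model) { this.model = model; }

    // The controller maps user actions onto model operations.
    void handleAction(String action) {
        if ("increment".equals(action)) model.increment();
    }
}

public class MvcSketch {
    public static void main(String[] args) {
        CounterModel model = new CounterModel();
        CounterView view = new CounterView();
        model.addView(view);
        CounterController controller = new CounterController(model);

        controller.handleAction("increment");
        System.out.println(view.lastRendered); // prints "Count is 1"
    }
}
```

In a web setting the "view" is a rendered page and the "action" arrives as an HTTP request, but the division of responsibilities is the same.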
Java servlets and JavaServer Pages provide the essential tools for building web interfaces in Java, using an object-oriented programming model or a “scripted” web page model, respectively. Struts is an Apache project built on top of both of these standard Java technologies, using servlets to implement the controller, both JSPs and servlets to implement views, and standard Java technologies (such as Java beans and EJBs) to implement the application model. Struts further facilitates MVC development by providing a configuration scheme for specifying the page flow of the application based on user actions and an API for defining action handlers and other key MVC elements.
It’s important to point out that the introduction of the JavaServer Faces standard provides many of the same elements that the Struts project covers. Put another way, Struts was created to fill a perceived gap between the Java standards at the time and the needs of enterprise developers. Since then, JavaServer Faces was defined to fill a part of that same gap. JSF was released in specification form in March 2004, and implementations are now readily available. How Struts will relate to Java servlets, JSPs, and JSF in the future remains to be seen, but in the meantime, Struts continues to be a popular and effective tool for MVC development.
A tutorial on Struts is provided in Chapter 19. Because of the functional overlap between JSF and Struts, the code examples in the Struts and JSF chapters parallel each other and have identical user interfaces. As a result, you can compare and contrast the implementations to help you decide which solution is right for your application.
Many enterprise application scenarios involve interactions with relational databases. When operating within an object-oriented environment like Java, the issue of mapping relational constructs, like tables and columns and rows, into object-oriented concepts, like objects and methods and data members, arises to some degree in any database-enabled application. JDBC provides a relatively low-level interface for interacting with databases—allowing you to establish connections to database engines, construct SQL statements and issue them to the database, and retrieve the results in the form of basic Java data types. The work of integrating these operations into the overall object model is left up to the developer.
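As a brief sketch of this low-level style (the JDBC URL, table, and column names here are hypothetical, and a suitable JDBC driver and database would be needed to actually run it):

```java
// A sketch of low-level JDBC usage; URL and schema names are hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class JdbcSketch {
    public static void main(String[] args) throws Exception {
        Connection conn =
            DriverManager.getConnection("jdbc:somedb://localhost/mydb", "user", "pass");
        try {
            // SQL is composed and issued by hand; results come back as
            // basic Java types that the developer must map onto objects.
            PreparedStatement ps =
                conn.prepareStatement("SELECT name FROM customers WHERE id = ?");
            ps.setInt(1, 42);
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getString("name"));
            }
            rs.close();
            ps.close();
        } finally {
            conn.close();
        }
    }
}
```

Every mapping decision (which table, which columns, which Java types) lives in code like this, which is exactly the coupling an ORM layer is meant to centralize.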
The approach taken to managing the object relational mapping (ORM) is an important one, since it can make the overall development, testing, and long-term management of your code either very easy or very difficult indeed. For example, suppose you have developed a complicated application that relies heavily on a given database with its own rich, complex schema. Now suppose that the database needs to change in some significant ways (existing tables are restructured or eliminated, new tables and columns are introduced, stored procedures are altered). If you have mapped the relational structures of the database into your application by sprinkling JDBC calls throughout your code in an ad hoc fashion, refactoring the application will be painful and costly. On the other hand, if you’ve used a consistent object relational mapping pattern that insulates the application from the database details, the refactoring job will be relatively straightforward, take less time, and introduce less instability into the system.
Since ORM mappings play such a potentially important role in enterprise development strategies, a number of tools and approaches have emerged to fill the gap between the low-level standard API provided by JDBC and the object-level persistence mapping needs of enterprise developers. One such tool is an open source project called Hibernate, currently managed by JBoss along with their open source application server and other related projects. Hibernate provides a framework for defining mappings from relational elements to Java object-oriented elements, for generating the SQL calls implied by these mappings, and for exposing the persistence capabilities to the rest of an enterprise application using a clean Java interface that effectively isolates the persistence details from the rest of the system. This makes database evolutions easier to manage, makes the overall application more portable (to different database schemas or vendors), and makes your life as an enterprise developer much happier in the long run.
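As a rough sketch of what this looks like in code, here is a Hibernate 3-style session in use. This assumes an existing hibernate.cfg.xml, a mapping definition for the (hypothetical) Customer class shown, and the Hibernate jars on the classpath; it is illustrative, not a complete working configuration.

```java
// A sketch of Hibernate usage; assumes hibernate.cfg.xml and a mapping
// for Customer exist on the classpath (Hibernate 3-style API).
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.cfg.Configuration;

// A plain Java object that a Hibernate mapping ties to a database table.
class Customer {
    private Long id;
    private String name;
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

public class HibernateSketch {
    public static void main(String[] args) {
        // Reads hibernate.cfg.xml and the mappings from the classpath.
        SessionFactory factory =
            new Configuration().configure().buildSessionFactory();

        Session session = factory.openSession();
        Transaction tx = session.beginTransaction();
        Customer c = new Customer();
        c.setName("Acme Corp");
        session.save(c); // Hibernate generates the INSERT from the mapping
        tx.commit();
        session.close();
    }
}
```

Note that no SQL appears in the application code; the mapping definition, not the Java source, carries the database details.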
It’s important to note that efforts are afoot in the Java community to define standard ORM approaches. The most notable of these are the EJB 3.0 specification work (JSR 220) and the Java Data Objects (JDO) standard (JSR 012). JDO came about as a standards effort after Hibernate and other ORM tools had already taken hold in the Java community, with the same conceptual goal as many other Java standardization efforts: to define a portable approach to ORM that tools could support independently, giving the enterprise developer the ability to choose solutions, and possibly switch solutions in the future, without a lot of rework. The 1.0 version of JDO took a slightly different technical approach to managing the mappings than Hibernate and other tools: it involves postprocessing of Java bytecodes to enable their persistence capabilities, using an ORM specification to drive the generation of persistence code at the bytecode level. This approach did not take significant hold in the Java community, for a variety of reasons, and a 2.0 version of the JDO specification is now being developed. Meanwhile, elements of the JDO and Hibernate approaches to object relational mapping have had a significant influence on the EJB 3.0 specification now in development. The newer versions of Hibernate implement preliminary versions of the EJB 3.0 model, and it’s expected that the relationship between the two will become even closer in the future.
Since this standards area is still in a state of flux, we chose to include an established but nonstandard ORM tool in this edition, to give you exposure to a tool that can be immediately applied in enterprise contexts and to provide a basis for understanding other ORM tools that you may encounter in the Java community.
Chapter 20 provides a tutorial on the use of Hibernate to create ORM mapping layers for your applications.
As enterprise applications and their corresponding development practices have matured and expanded their scope, the concept of code and artifact generation has become more and more important. In some cases, it can be more efficient to specify behavior using metadata and attributes rather than explicitly specifying the lines of code needed to perform a task. It’s also possible to infer configuration parameters from attributes of source code rather than having to manually map the intentions of the code to configuration details, like deployment descriptor entries. A good example of this is object relational mapping scenarios, as discussed in the previous section. In many cases, it can save you a lot of time and effort if you can simply describe a mapping from relational tables to Java elements, using annotations inserted into the Java code itself, and then let a tool generate the corresponding implementation artifacts: the Java code and SQL statements that implement the mapping described in the metadata. Another example of this concept that is familiar to any Java developer is Javadoc: tags inserted into your Java code are used by Javadoc to generate HTML documentation for your source code, saving you the trouble of writing the HTML yourself.
XDoclet and the newer Java annotations facility (introduced in Java 5.0) are tools that support attribute-oriented programming. Attribute-oriented development leverages this metadata-driven approach to generating code and other implementation artifacts, specifically by using metadata inserted into the Java code itself to drive the generation process.
XDoclet’s approach uses Javadoc extension tags to drive the process of generating source code, deployment descriptors, configuration files, or whatever other implementation artifacts are needed to deploy and run your code. The typical example used to explain XDoclet’s utility involves EJB components. Rather than writing several Java interfaces, Java classes, and a deployment descriptor, you can annotate a single Java class with special XDoclet metadata tags and use XDoclet to generate all the other required EJB artifacts for you.
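A sketch of this style follows; the bean name, JNDI name, and business method are illustrative, and the class is left abstract so the EJB 2.x callback methods can be omitted here.

```java
// A sketch of XDoclet-style EJB metadata (EJB 2.x model): tags in Javadoc
// comments drive generation of the home/remote interfaces and the
// deployment descriptor. Names and values are illustrative.

/**
 * @ejb.bean name="Account"
 *           type="Stateless"
 *           jndi-name="ejb/Account"
 */
public abstract class AccountBean implements javax.ejb.SessionBean {

    /**
     * @ejb.interface-method
     */
    public double getBalance(String accountId) {
        // business logic would go here...
        return 0.0;
    }
}
```

An XDoclet task, typically run from an Ant build script, reads these tags and emits the remaining interfaces and descriptor entries.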
The newer Java annotation features built into JDK 1.5/Java 5.0 provide similar capabilities through actual Java syntax extensions instead of Javadoc comments. The concept is the same (code is annotated with metadata that is used to generate other artifacts), but the annotation facility is a standard element of the Java language itself.
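As a self-contained sketch of the mechanism, metadata can be declared in the language itself and read back via reflection, as a generation or mapping tool would. The @Persistent annotation here is hypothetical, invented for this example, and not part of any standard API.

```java
// A self-contained sketch of Java 5 annotations. @Persistent is a
// hypothetical annotation, not a real API; a tool could use this
// metadata to drive code or artifact generation.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface Persistent {
    String column();
}

class Customer {
    @Persistent(column = "CUST_NAME")
    String name;
}

public class AnnotationSketch {
    // Read the metadata back, as a generation tool would.
    static String columnFor(Class<?> type, String fieldName) {
        try {
            Field f = type.getDeclaredField(fieldName);
            Persistent p = f.getAnnotation(Persistent.class);
            return p == null ? null : p.column();
        } catch (NoSuchFieldException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(columnFor(Customer.class, "name")); // prints CUST_NAME
    }
}
```

Because the retention policy is RUNTIME, the metadata survives into the compiled class file and is visible to reflection, which is what makes tool-driven processing possible.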
Chapter 21 provides a tutorial on the general metadata models used by XDoclet and Java annotations and also provides details on some specific enterprise Java use cases, such as developing EJB components and web services. We also describe the integration of XDoclet and Java annotations into Ant build scripts. This chapter is an exception in the book since it covers both standard (Java annotations) and de facto standard (XDoclet) tools for the same functionality. In this case, we felt that there were significant parallels between the two solutions and presenting them side by side in the same chapter was much more effective for providing a comparison. Struts and JSF, on the other hand, use very different programming models to achieve somewhat similar functionality, so we felt they would be better served by separate chapters.