Enterprise Java Development

Enterprise Computing with Java (605.784.31) Course Exercises

Revision: v2014-11-30

Built on: 2015-11-18 03:51 EST

Abstract

This book contains lab exercises for JHU 605.784.31


General Instructions
1. Lab Exercises
I. Enterprise Java (605.784.31) Development Environment Setup
Purpose
2. Java JDK Setup
2.1. Download and Install JDK
2.2. Verify your JDK is installed
3. Git Client Setup
3.1. Install Git Client
3.2. Get Class Repository
4. Maven Environment Setup
4.1. Maven Installation
4.2. Maven Configuration
4.3. Test Maven Build
4.4. Missing Dependencies
4.5. Build Local Site
5. (Optional!!!) Maven Proxy Setup
5.1. Nexus OSS Manual Setup
5.2. Integrate Proxy with Maven
6. JBoss (Wildfly) Setup
6.1. Download and Install Wildfly 9.0.1.Final
6.2. Configure JBoss Server
6.3. Add JBoss Admin Account
6.4. Enable JBoss Remote Debugging
6.5. Define JBoss Maven Properties in settings.xml
7. H2 Database Setup
7.1. Locate the h2*.jar file
7.2. Start Server
7.3. Access DB User Interface
7.4. Activate H2 Server Profile for Builds
7.5. Update JBoss to use Server Mode
8. Eclipse Setup
8.1. Download and Install Eclipse
8.2. Define JDK location
8.3. Setup Maven Eclipse Integration (m2e)
8.4. Setup EGit Eclipse Team Provider
8.5. Setup JBoss Eclipse Integration
9. Ant Setup
9.1. Install Ant
II. JavaSE: First Simple Module Exercise
Purpose
1. Goals
2. Objectives
10. Develop and Test Module using Command Line Tools (OPTIONAL!)
10.1. Summary
11. Automate Build and Testing with Ant (OPTIONAL!)
11.1. Summary
12. Adding Logging
12.1. Summary
13. Creating Portable and Repeatable Project Builds with Maven
13.1. Summary
14. Leverage IDE using Eclipse
14.1. Import a project into Eclipse
14.2. Setup Eclipse to be able to execute Maven project goals
14.3. Setup environment to enable interactive debugging
14.4. Summary
III. Java Persistence API: Entity Manager Exercise
Purpose
1. Goals
2. Objectives
15. Setup and Start Database
15.1. Summary
16. Create Core Project POM
16.1. Summary
17. Setup Database Schema
17.1. Summary
18. Add SQL Tuning
18.1. Summary
19. JPA Persistence Unit Setup
19.1. Summary
20. Setup JPA TestCase
20.1. Summary
21. Ready Project for Import into Eclipse
21.1. Summary
22. Test EntityManager Methods
22.1. Summary
23. Automatically Generate Database Schema
23.1. Summary
24. Create JPA Parent POM
24.1. Summary
IV. Java Persistence API: Entity Mapping Exercise
Purpose
25. JPA Entity Exercise Setup
25.1. Setup Maven Project
26. JPA Entity Class Basics
26.1. Create POJO Class using descriptor
26.2. Create POJO Class using annotations
26.3. Summary
27. Mapping Class Properties
27.1. Map Entity to Specific Table
27.2. Using JPA Property Access
27.3. Summary
28. JPA Enum Mapping
28.1. Mapping Enum Ordinal Values
28.2. Mapping Enum Name Values
28.3. Mapping Enum Alternate Values
28.4. Summary
29. Mapping Temporal Types
29.1. Mapping Temporal Types
29.2. Summary
30. Mapping Large Objects
30.1. Mapping CLOBS
30.2. Mapping BLOBS
30.3. Summary
31. Primary Key Generation
31.1. IDENTITY Primary Key Generation Strategy
31.2. SEQUENCE Primary Key Generation Strategy
31.3. TABLE Primary Key Generation Strategy
31.4. Summary
32. Mapping Compound Primary Keys
32.1. Using Embedded Compound Primary Keys
32.2. Using Compound Primary Keys as IdClass
32.3. Summary
33. Mapping Embedded Objects within Classes
33.1. Mapping an Embedded Object
33.2. Mapping Multi-level Embedded Objects
33.3. Summary
34. Objects Mapped to Multiple Tables
34.1. Mapping to Secondary Tables
34.2. Summary
V. Java Persistence API: Relationship Exercise
Purpose
35. JPA Entity Exercise Setup
35.1. Setup Maven Project
36. Mapping One-to-One Relationships
36.1. Setup
36.2. One-to-One Uni-directional Relationships
36.2.1. One-to-One Uni-directional Using a Foreign Key
36.2.2. One-to-One Uni-directional Using a Join Table
36.2.3. One-to-One Uni-directional Using a Primary Key Join
36.2.4. One-to-One Uni-directional Using MapsId
36.2.5. One-to-One Uni-directional Using Composite Primary/Foreign Keys
36.3. Mapping One-to-One Bi-directional Relationships
36.3.1. One-to-One Bi-directional Joined By Primary Key
36.3.2. One-to-One Bi-directional 0..1 Owner Relationship
36.3.3. One-to-One Bi-directional 0..1 Inverse Relationship
36.4. One-to-One EntityManager Automated Actions
36.4.1. One-to-One Using Cascades From Owner
36.4.2. One-to-One Using Cascades From Inverse
36.4.3. One-to-One Using Orphan Removal
36.5. Summary
37. Mapping One-to-Many Uni-directional Relationships
37.1. Setup
37.2. One-to-Many Uni-directional
37.2.1. One-to-Many Uni-directional with Join Table
37.2.2. One-to-Many Uni-directional using Foreign Key Join (from Child Table)
37.2.3. One-to-Many Uni-directional Mapping of Simple Types using ElementCollection
37.2.4. One-to-Many Uni-directional Mapping of Embeddable Type using ElementCollection
37.3. One-to-Many Provider Actions
37.3.1. One-to-Many Orphan Removal
37.4. Summary
38. JPA Collections
38.1. Setup
38.2. Entity Identity
38.2.1. Instance Id
38.2.2. Primary Key Id
38.2.3. Switching Ids
38.2.4. Business Id
38.3. Collection Ordering
38.3.1. Ordered Collections
38.4. Collection Interfaces
38.4.1. Using Maps with Collections
38.5. Summary
39. Mapping Many-to-One, Uni-directional Relationships
39.1. Setup
39.2. Many-to-One Uni-directional
39.2.1. Many-to-One Uni-directional Using a Foreign Key
39.2.2. Many-to-One Uni-directional Using a Compound Foreign Key
39.2.3. Many-to-One Uni-directional Using a MapsId
39.3. Summary
40. Mapping One-to-Many/Many-to-One Bi-directional Relationships
40.1. Setup
40.2. One-to-Many Bi-directional using Foreign Key Join
40.3. One-to-Many Bi-directional using Join Table
40.4. One-to-Many Bi-directional using Derived Composite Primary
40.5. Summary
41. Mapping Many-to-Many Relationships
41.1. Setup
41.2. Many-to-Many Uni-directional
41.3. Many-to-Many Bi-directional
41.4. Summary
VI. Java Persistence API: Query Exercise
Purpose
42. Exercise Data Model
42.1. Class Model
42.2. Database Schema
42.3. Object Instances
43. JPA Entity Exercise Setup
43.1. Setup Maven Project
44. Creating JPA Queries
44.1. Setup
44.2. Create/Execute Query
44.2.1. Multiple Results
44.2.2. Single Result
44.2.3. Single Result - NoResultException
44.2.4. Single Result - NonUniqueResultException
44.3. Query Parameters
44.4. Paging Query Results
44.5. Named Query
44.6. Value Queries
44.6.1. Retrieve Value
44.6.2. Retrieve Function Result Value
44.6.3. Retrieve Multiple Values
44.6.4. Encapsulate Row Values with ResultClass
44.7. Summary
45. SQL Queries
45.1. Setup
45.2. Create/Execute SQL Query
45.3. SQL Query Entity Result Mapping
45.4. SQL Result Set Mapping
45.5. Summary
46. Bulk Updates
46.1. Setup
46.2. Additional Setup
46.3. Using JPQL Bulk Update
46.4. Using Native SQL Bulk Update
46.5. Summary
47. Query Locks
47.1. Setup
47.2. Additional Setup
47.3. Using No Locks
47.4. Adding Lock Mode
47.5. Using Pessimistic Write Lock
47.6. Summary
VII. Basic EJB Development Exercise
Purpose
1. Goals
2. Objectives
48. Multi-Module JavaEE Project
48.1. Purpose
48.1.1. Goals
48.1.2. Objectives
48.2. Create Root Module
48.3. Create EJB Module
48.4. Manage Application Server
48.4.1. Application Server Setup
48.4.2. Standalone Application Server
48.4.3. Embedded Application Server
48.5. Summary
49. EAR Deployment
49.1. Purpose
49.1.1. Goals
49.1.2. Objectives
49.2. Create EAR Module
49.3. Create RMI Test Module
49.4. Deploy the EAR
49.5. Lookup and Invoke @Remote Interface
49.6. EJB Client
49.7. Summary
50. WAR Deployment
50.1. Purpose
50.1.1. Goals
50.1.2. Objectives
50.2. Create WAR Module
50.3. Add RMI Test
50.4. Embed EJB in WAR Module
50.5. Summary
51. Build Commands
51.1. Purpose
51.1.1. Goals
51.1.2. Objectives
51.2. mvn (phase)
51.3. mvn (phase) -rf :module
51.4. mvn (phase) -f (path to module)
51.5. mvn clean -Pundeploy
51.6. mvn clean -Dit.test=fully.qualified.ITPath#testMethod
51.7. Summary
52. Controlling JNDI Names
52.1. Purpose
52.1.1. Goals
52.1.2. Objectives
52.2. Eliminate Version# from EAR-based JNDI Name
52.3. Eliminate Version# from WAR-based JNDI Name
52.4. Summary
53. Debug Remote EJB
53.1. Purpose
53.1.1. Goals
53.1.2. Objectives
53.2. Running IT Tests in IDE
53.3. Debugging Deployment to Standalone Server
53.4. Debugging Deployment to Embedded Server
53.5. Summary
54. EJB Parent POM
54.1. Purpose
54.1.1. Goals
54.1.2. Objectives
54.2. Create Root POM
54.3. Summary
VIII. JNDI/ENC Configuration Lab
Topics
55. Configuring EJB Using Annotation Lookups
55.1. Setup
55.2. Inject Resources
55.3. Summary
56. Configure EJB using Injection of ENC Resources
56.1. Setup
56.2. Inject Resources
56.3. Summary
57. Configuring EJB using JNDI Lookups of ENC
57.1. Setup
57.2. Inject Resources
57.3. Summary
58. Configuring EJB using Injected Resources from XML
58.1. Setup
58.2. Inject Resources
58.3. Summary
59. EJB Injection
59.1. Setup
59.2. @EJB Inject No Interface EJB
59.3. @EJB Inject Local Interface Bean
59.4. @EJB Inject Remote Interface using Lookup
59.5. Summary
IX. Implementing JPA-based EJBs and Remote Interfaces Lab
Purpose
1. Goals
2. Objectives
60. Getting Started
60.1. Start the Application Server
60.2. Build Solution
60.3. Build Exercise
60.4. Import Exercise into IDE
60.5. Summary
61. Server-side Persistence Unit
61.1. EJB Persistence Unit
61.2. Imported EJB - WAR Deployment
61.3. Embedded EJB - WAR Deployment
61.4. Summary
62. Remote Interface Issues
62.1. Serializable DTOs
62.2. Provider (Hibernate) Proxy Classes
62.3. Lazy Loading Exception
62.4. Cleansed BO/DTO
62.5. Pre-Loaded Entities
62.6. Separate DTO Classes
62.7. Summary


You will need a copy of Java 8 SDK installed.

Keep 32/64-bit choices consistent

Keep the 32/64-bit choice consistent with what you download later for Eclipse.

You will use Git in this class to perform an initial checkout and get updates for source files. Any Git client should be able to perform that function. You can determine whether you already have a command-line Git client installed with a simple command such as git --version.


There are a number of options, and some will depend on your platform. Your basic options include the command line or an Eclipse plugin.

The class repository is located on github and can be browsed using the following http URL https://github.com/ejavaguy/ejava-student. With a cloned copy, you can receive file updates during the semester.

  1. cd to a directory where you wish to place the source code. Make sure the path to this directory contains no spaces.

  2. Clone the class repository using the following URL git://github.com/ejavaguy/ejava-student.git

    $ git clone git://github.com/ejavaguy/ejava-student.git
    Cloning into 'ejava-student'...
    ...
    Checking out files: 100% (1289/1289), done.
    ...
    
    $ ls ejava-student/
    ...
    
    $ cd ejava-student
    $ git branch -a    //list all branches -- local and remote
    * master
      remotes/origin/HEAD -> origin/master
      remotes/origin/master
    

    Note

    Git leaves you with all branches fetched and a local master branch referencing the class' master branch on github. You will be using the master branch for the duration of the semester. Other branches may show up, including my working branches where I am actively working on the next wave of updates. The master branch is usually updated the evening before or the day of class and should always be stable.

  3. Perform a mock update. This is what you will be doing several times this semester to get file updates.

    $ git checkout master //switches to master branch
    $ git pull            //downloads changes and attempts merge
    Already up-to-date.
    

    Note

    There are many modules within the class repository. Some are ready for use, some are still being worked, and some are not for use this semester. The ones ready for your use will be wired into the build and will be compiled during a follow-on section. The list will increase as the semester moves forward. Please ignore these extra modules. Keeping them within the baseline helps me keep related things centralized.

    Note

    If you ever make changes to the class examples and would like to keep those changes separate from the updates, store them in a new branch at any time using the following git commands.

    $ git checkout -b new-branch       #creates new branch from current branch 
                                       #and switches to that branch
    $ git commit -am "saving my stuff" #commits all dirty files to new branch
    $ git checkout master              #switches back to the master branch 
    

    If you simply want to throw away any changes you made, you can discard those changes to tracked files using the following git commands.

    $ git reset --hard master
    $ git clean -nd  //shows you what it would delete without deleting
    $ git clean -fd  //deletes untracked files and directories (files ignored by git are kept)
    

  1. Download Maven 3 http://maven.apache.org/download.html

  2. Unzip the contents into a directory with no spaces in its path.

    $ ls apache-maven-3.3.3/
    bin  boot  conf  lib  LICENSE.txt  NOTICE.txt  README.txt
    
  3. Add an environment variable for MAVEN_HOME and add MAVEN_HOME/bin to your PATH

    //my linux system -- should be done in .bashrc
    export MAVEN_HOME=/opt/apache-maven-3.3.3
    export PATH=$MAVEN_HOME/bin:$PATH
    
    //my windows system -- should be done in Advanced System Settings->Environment Variables
    set MAVEN_HOME=/apps/apache-maven-3.3.3
    set PATH=%MAVEN_HOME%\bin;%PATH%
    
  4. Verify maven is installed and in the path

    //my fedora system
    $ mvn --version
    Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06; 2015-04-22T07:57:37-04:00)
    Maven home: /opt/apache-maven-3.3.3
    Java version: 1.8.0_40, vendor: Oracle Corporation
    Java home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.40-21.b25.fc21.x86_64/jre
    Default locale: en_US, platform encoding: UTF-8
    OS name: "linux", version: "3.19.2-201.fc21.x86_64", arch: "amd64", family: "unix"
    
    //my windows xp system
    > mvn --version
    
  1. Add a skeletal settings.xml file that will be used to provide local overrides for the build. This is the place where you can customize the build for local environment specifics like directory locations, server address, server ports, etc.

    1. Add the following to the .m2/settings.xml file in your HOME directory.

      
      
      <?xml version="1.0"?>
      <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">

          <offline>false</offline>
          
          <profiles>
          </profiles>
          
          <activeProfiles>
          </activeProfiles>
      </settings>    
    2. You can test whether your settings.xml file is seen by Maven by temporarily making it an invalid XML file and verifying that the next Maven build command fails with a parsing error.

      $ mvn clean
      [ERROR] Error executing Maven.
      [ERROR] 1 problem was encountered while building the effective settings
      [FATAL] Non-parseable settings /home/user/.m2/settings.xml: only whitespace content allowed before start tag and not s (position: START_DOCUMENT seen <?xml version="1.0"?>\ns... @2:2)  @ /home/user/.m2/settings.xml, line 2, column 2
      
    3. Add a default specification for the database profile we will be using for class at the bottom of the .m2/settings.xml file in your HOME directory.

      
      
          <activeProfiles>
              <activeProfile>h2db</activeProfile>
          </activeProfiles>
    4. If your operating system HOME directory has spaces in the path (e.g., Windows XP's Documents and Settings) then add a localRepository path specification to the .m2/settings.xml file and have it point to a location that does not have spaces in the path. The path does not have to exist. It will be created during the next build.

      
      
          <offline>false</offline>
          <!-- this overrides the default $HOME/.m2/repository location. --> 
          <localRepository>c:/jhu/repository</localRepository>

There are a few cases where dependencies cannot be hosted in public repositories and must be downloaded and installed manually. Oracle DB Client is one example.


If the message is a warning (i.e., for site/javadoc documentation), it can be ignored. If you want to eliminate the warning, or it is coming up as an error, you can download the artifact directly from the vendor and manually install it in your local repository.

Note

This is only an example. You are not required to download the Oracle database driver for class if you do not wish. You can instead create a dummy file (touch dummy.jar) and register it using a dummy groupId, artifactId, and version.

  1. Download the driver jar from Oracle and accept the license agreement.

  2. Install it manually into your localRepository

    $ mvn install:install-file -Dfile=/home/jcstaff/Downloads/ojdbc6.jar -DgroupId=com.oracle -DartifactId=ojdbc6 -Dversion=11.2.0.3 -Dpackaging=jar
    [INFO] Scanning for projects...
    ...
    [INFO] --- maven-install-plugin:2.4:install-file (default-cli) @ standalone-pom ---
    [INFO] Installing /home/jcstaff/Downloads/ojdbc6.jar to /home/jcstaff/.m2/repository/com/oracle/ojdbc6/11.2.0.3/ojdbc6-11.2.0.3.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    ...
    

Artifacts are Best Placed in Network Repositories

This artifact would ideally be placed within a network cache/repository (i.e., Nexus) and not into individual localRepositories. In the next chapter you will be shown how you could set up a Maven Nexus for team use. If you go that route -- it would be better to upload the file to that repository.

Rogue Internet Repositories

It probably violates license restrictions, but the following network repository does contain the missing Oracle driver. One could add it as an upstream repository in Nexus or define it as an artifact repository in the pom to avoid having to manually import the artifact.

As you should have noticed in the previous section, you can function just fine as a developer connected to the Internet as long as the critical Internet resources are available when they are needed to populate your localRepository cache. However, if you were part of a larger development team you would not want each separate developer reaching out to the Internet to locate a common dependency. There are both speed and failure issues with that type of setup.

The steps listed in this chapter are not necessary for class because you work alone and you have a localRepository cache. In fact -- I urge you not to follow them unless you feel the strong desire to dig deeper into this enterprise development setup. However, it does provide you with some experience in setting up and using a realistic production Maven environment.

Still here? Okay -- let's get started...

In the previous section we installed and setup Maven in a stand-alone client mode. In that mode the individual environment communicated directly with the external Internet organizations to obtain missing artifacts. In this section we will add a proxy server between the developer and the Internet so that one or more developers can share what has already been downloaded and be completely isolated from network and remote outages. Ideally this proxy would be placed on a shared server somewhere on your local Intranet with access to the outside Internet. The instructions will act as though you are installing it locally. It is your decision whether to use it and where to install it. If you do not install and configure a proxy -- you will notice small delays in your builds while your local maven client checks Internet sources for new artifacts and updates to existing artifacts.


This procedure will take you through a manual download of the Nexus OSS software and have you go through a manual setup of all items. Use this approach if you want a modest amount of experience in setting up the server.

  1. Download the Nexus OSS Software from Sonatype http://www.sonatype.org/nexus/go

  2. Unzip the compressed file to your filesystem. Two directories will be created: one for the software (nexus-(version)) and another for the repository caches (sonatype-work)

    $ tar tzf ~/Downloads/nexus-x.x.x-xx-bundle.tar.gz 
    ...
    
    $ ls nexus-x.x.x-xx/ sonatype-work/
    nexus-x.x.x-xx/:
    LICENSE.txt  NOTICE.txt  bin  conf  lib  logs  nexus  tmp
    
    sonatype-work/:
    README.txt  nexus
    
  3. The server will listen on port 8081 on all interfaces (0.0.0.0) by default. Modify nexus-x.x.x-xx/conf/nexus.properties if you want something different.

    $ more nexus-x.x.x-xx/conf/nexus.properties                                                                                                                                       
    ...
    # Jetty section
    application-port=8081
    application-host=0.0.0.0
    ...
    
  4. Locate the startup script in the NEXUS_HOME/bin directory.

    $ ls nexus-x.x.x-xx/bin/
    jsw  nexus  nexus.bat                 
    

    Tip

    Linux users might want to register this script in /etc/init.d. If you also configure it to automatically start at boot -- supply a value for RUN_AS_USER to keep the server from running as root. I also needed to change "su - $RUN_AS_USER" to "su -m $RUN_AS_USER..." because of the way my legacy nexus account was set up for nologin.

  5. Use the script appropriate for your platform to start and perform other controls on the server.

    $ nexus-x.x.x-xx/bin/nexus
    Usage: nexus-x.x.x-xx/bin/nexus { console | start | stop | restart | status | dump }
    
    $ nexus-x.x.x-xx/bin/nexus start
    Starting Nexus OSS...
    Started Nexus OSS.
    

    Note

    The above command attempts to run Nexus as a service. If you do not have the permission on your system to register nexus as a service, you can optionally run it as an interactive command.

                        
    $ nexus-x.x.x-xx/bin/nexus console
    /etc/init.d/nexus console
    Running Nexus OSS...
    wrapper  | --> Wrapper Started as Console
    wrapper  | Launching a JVM...
    jvm 1    | Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
    jvm 1    |   Copyright 1999-2006 Tanuki Software, Inc.  All Rights Reserved.
    jvm 1    | 
    jvm 1    | 2012-04-26 12:14:38.531:INFO:oejs.Server:jetty-7.5.4.v20111024
    

    The specific error observed on Windows (when the service had not been installed) was

                        
    wrapper | The nexus-webapp service is not installed - The specified service does not exist as an installed service. (0x424)                            
    

  6. Access the Web UI for Nexus OSS http://localhost:8081/nexus

  7. Login with the default account of admin/admin123

  8. Change the admin password (or not) using Security->Change Password

  9. Add remote repositories

    1. Click on Views/Repositories->Repositories

    2. Click the +Add...Proxy Repository

    3. Fill in the data for the repositories in the following Repositories table. Use default values for anything not included in the table.

    4. Press Save


    Note

    If the provided list is missing any sources, check for a current list of repositories and plugin repositories in ejava-build/dependencies/pom.xml from the class source tree and add those as well.

    Note

    Nexus will check each repository as they are added. Try restarting if things get stuck.

    $ service nexus restart
    Stopping Nexus OSS...
    Stopped Nexus OSS.
    Starting Nexus OSS...
    Started Nexus OSS.
    
  10. Add the created repositories to the Public Repositories

    1. Select Public Repositories line towards the top of the repositories list.

    2. Select repositories from the right Available column and add them to the left column. Specify them last, in the following order

      1. JBoss Nexus

      2. Apache Repo

      3. Exoplatform

      4. Webdev Snapshots

      5. Webdev Releases

    3. Press Save

We will be using the JBoss/Wildfly Application Server this semester. This is a fully-compliant JavaEE 7 application server.

JBoss Application Server/Wildfly (AS) and JBoss Enterprise Application Platform (EAP)

JBoss has a community version (formerly called JBoss AS - renamed Wildfly ~2012) and a commercial version (JBoss EAP) of their JavaEE application server. Both are open source and built off the same code base. In theory, changes propagate through the community version first in daily changes and short iterations, and the commercial version is a roll-up of a stable version of the community version with the ability to purchase support on that specific version. With commercial support - you can receive patches for a specific issue prior to upgrading to the latest release. With the community version - you pretty much need to keep up with the latest release to get any patches. Of course, with either version you are free to perform your own support and code changes, but you can only get this commercially with the EAP release. There is a newsgroup post and slide show that provides a decent, short description of the two.

JBoss makes the EAP version available for *development* use from jboss.org but you will notice it has not yet caught up with Wildfly 9.x (at wildfly.org). We will be using the open source/Wildfly version of the server - which is fully JavaEE 7-compliant.

JBoss AS/Wildfly version numbers are ahead of JBoss EAP because not every community version becomes a commercial version. JBoss AS 6 was skipped entirely by EAP.

  1. Download Wildfly 9.0.1.Final http://www.wildfly.org/downloads/. The 'Quickstarts' examples are also helpful but class notes, exercises, and guidance may have simplified, generalized, or alternative approaches to what is contained in the guides.

  2. Install JBoss into a directory that does not have any spaces in its path.

    $ unzip ~/Downloads/wildfly-9.0.1.Final.zip                
                    
    $ ls wildfly-9.0.1.Final/                                                                                                                                        
    appclient  bin  copyright.txt  docs  domain  jboss-modules.jar  LICENSE.txt  modules  README.txt  standalone  welcome-content
    
  3. Test the installation by starting the default configuration installation.

    $ ./wildfly-9.0.1.Final/bin/standalone.sh 
    =========================================================================
    
      JBoss Bootstrap Environment
    
      JBOSS_HOME: /opt/wildfly-9.0.1.Final
    
      JAVA: java
    
      JAVA_OPTS:  -server -XX:+UseCompressedOops  -server -XX:+UseCompressedOops -Xms64m -Xmx512m -XX:MaxPermSize=256m -Djava.net.preferIPv4Stack=true -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true -agentlib:jdwp=transport=dt_socket,address=8787,server=y,suspend=n
    
    =========================================================================
    
    Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
    Listening for transport dt_socket at address: 8787
    01:04:45,733 INFO  [org.jboss.modules] (main) JBoss Modules version 1.4.3.Final
    01:04:47,438 INFO  [org.jboss.msc] (main) JBoss MSC version 1.2.6.Final
    01:04:48,207 INFO  [org.jboss.as] (MSC service thread 1-1) WFLYSRV0049: WildFly Full 9.0.1.Final (WildFly Core 1.0.1.Final) starting
    ...
    
    01:05:49,522 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0060: Http management interface listening on http://127.0.0.1:9990/management
    01:05:49,524 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0051: Admin console listening on http://127.0.0.1:9990
    01:05:49,525 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: WildFly Full 9.0.1.Final (WildFly Core 1.0.1.Final) started in 66294ms - Started 270 of 445 services (252 services are lazy, passive or on-demand)

    Note

    There are .sh version of scripts for *nix platforms and .bat forms of the scripts for Windows platforms. Use the one that is appropriate for your environment.

  4. Verify you can access the server (e.g., browse to the welcome page at http://localhost:8080)

  1. Shutdown the server using Control-C

  2. Copy over the class example server files from what you cloned and built from github earlier.

    $ cd wildfly-9.0.1.Final
    wildfly-9.0.1.Final]$ unzip -l .../ejava-student/servers/ejava-wildfly901/target/ejava-wildfly901-4.0.0-SNAPSHOT-server.zip 
    Archive:  .../ejava-student/servers/ejava-wildfly901/target/ejava-wildfly901-4.0.0-SNAPSHOT-server.zip
      Length      Date    Time    Name
    ---------  ---------- -----   ----
            0  09-02-2014 21:48   standalone/
            0  09-02-2014 21:48   standalone/configuration/
            0  09-02-2014 21:48   domain/
            0  09-02-2014 21:48   domain/configuration/
          575  09-02-2014 21:48   standalone/configuration/server.cer
         1747  09-02-2014 21:48   standalone/configuration/application-roles.properties
         1190  09-02-2014 21:48   standalone/configuration/server.keystore
        31118  09-02-2014 21:48   standalone/configuration/standalone.xml
         2490  09-02-2014 21:48   standalone/configuration/application-users.properties
         1747  09-02-2014 21:48   domain/configuration/application-roles.properties
         2402  09-02-2014 21:48   domain/configuration/application-users.properties
    ---------                     -------
        41269                     11 files
    $ unzip /home/jcstaff/workspaces/ejava-class/ejava-student/servers/ejava-wildfly901/target/ejava-wildfly901-4.0.0-SNAPSHOT-server.zip 
    Archive:  /home/jcstaff/workspaces/ejava-class/ejava-student/servers/ejava-wildfly901/target/ejava-wildfly901-4.0.0-SNAPSHOT-server.zip
      inflating: standalone/configuration/server.cer  
    replace standalone/configuration/application-roles.properties? [y]es, [n]o, [A]ll, [N]one, [r]ename: y
      inflating: standalone/configuration/application-roles.properties  
    ...
    
  3. Restart the server

  1. Use the add-user script (bin/add-user.sh or bin/add-user.bat) to add an admin user to the system. Note the password must have at least one digit and one non-alphanumeric character. If you run the application server on a remote machine or under a different account, please use the jboss.user and jboss.password supplied in build/dependencies/pom.xml. JBoss/Wildfly will bypass user credentials when the client executes on the same machine as the same user that started the server.

    $ grep jboss.user build/dependencies/pom.xml 
            <jboss.user>admin</jboss.user>
    $ grep jboss.password build/dependencies/pom.xml 
            <jboss.password>password1!</jboss.password>
    $ ./bin/add-user.sh 
    
    What type of user do you wish to add? 
     a) Management User (mgmt-users.properties) 
     b) Application User (application-users.properties)
    (a): 
    
    Enter the details of the new user to add.
    Using realm 'ManagementRealm' as discovered from the existing property files.
    Username : admin
    The username 'admin' is easy to guess
    Are you sure you want to add user 'admin' yes/no? yes
    Password recommendations are listed below. To modify these restrictions edit the add-user.properties configuration file.
     - The password should not be one of the following restricted values {root, admin, administrator}
     - The password should contain at least 8 characters, 1 alphabetic character(s), 1 digit(s), 1 non-alphanumeric symbol(s)
     - The password should be different from the username
    Password : 
    Re-enter Password : 
    What groups do you want this user to belong to? (Please enter a comma separated list, or leave blank for none)[  ]: 
    About to add user 'admin' for realm 'ManagementRealm'
    Is this correct yes/no? yes
    Added user 'admin' to file '/opt/wildfly-9.0.1.Final/standalone/configuration/mgmt-users.properties'
    Added user 'admin' to file '/opt/wildfly-9.0.1.Final/domain/configuration/mgmt-users.properties'
    Added user 'admin' with groups  to file '/opt/wildfly-9.0.1.Final/standalone/configuration/mgmt-groups.properties'
    Added user 'admin' with groups  to file '/opt/wildfly-9.0.1.Final/domain/configuration/mgmt-groups.properties'
    Is this new user going to be used for one AS process to connect to another AS process? 
    e.g. for a slave host controller connecting to the master or for a Remoting connection for server to server EJB calls.
    yes/no? no
    
  2. Retry logging into the Admin Application http://localhost:9990/console

Prepare your server for remote debugging for later use. The class server configuration already passes -agentlib:jdwp=transport=dt_socket,address=8787,server=y,suspend=n to the JVM (visible in the JAVA_OPTS of the startup output above), so a remote debugger can later attach to port 8787.

The application server and application clients used in class require a relational database. Application server vendors generally package a lightweight database with their downloads so that the server can be used immediately for basic scenarios. JBoss comes packaged with the H2 database. This database can run in one of three modes

  • Embedded/in-memory

  • Embedded/file

  • Server-based

File-based versus in-memory allows you to do post-mortem analysis of the database after a test completes. File-based also allows you to initialize the database schema in one process and use the database within another. Using server-based mode allows you to inspect the database while the application is running.
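
The hedged Java sketch below is only an illustration of the three modes; it opens a JDBC connection against each style of H2 URL. The database name "ejava-demo", the file location, and the user/password ("sa"/empty) are assumptions for illustration and may not match the values used by the class builds.

    import java.sql.Connection;
    import java.sql.DriverManager;

    // Sketch only: requires the h2*.jar on the classpath
    public class H2ModesDemo {
        public static void main(String[] args) throws Exception {
            // Embedded/in-memory -- contents disappear when the last connection (or the JVM) closes
            try (Connection c = DriverManager.getConnection("jdbc:h2:mem:ejava-demo", "sa", "")) {
                System.out.println("in-memory: " + c.getMetaData().getURL());
            }
            // Embedded/file -- contents written to ./ejava-demo.* files for post-mortem analysis
            try (Connection c = DriverManager.getConnection("jdbc:h2:./ejava-demo", "sa", "")) {
                System.out.println("file: " + c.getMetaData().getURL());
            }
            // Server-based -- requires a running H2 TCP server (default port 9092);
            // appending ;LOCK_MODE=0 to a URL allows dirty reads from another tool (see the LOCK_MODE note below)
            try (Connection c = DriverManager.getConnection("jdbc:h2:tcp://localhost:9092/./ejava-demo", "sa", "")) {
                System.out.println("server: " + c.getMetaData().getURL());
            }
        }
    }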

JBoss and the class examples come set up with embedded drivers. You can change the configuration at any time to a server-based configuration using the following instructions.

Choose Right Mode for Right Need

Using embedded mode requires less administration overhead in the test environment.

Using server mode provides access to database state during application execution -- which is good for debugging.

Note

This will create a database folder called "ejava" relative to where you started the database server.
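
As an aside, the class instructions start the server from the command line using the h2*.jar, but the same servers can be started programmatically. The following is a minimal sketch, assuming the h2*.jar is on the classpath; the ports shown are the H2 defaults.

    import org.h2.tools.Server;

    // Sketch only: starts the H2 TCP server (for JDBC clients) and the browser-based console
    public class StartH2Server {
        public static void main(String[] args) throws Exception {
            // TCP server for JDBC clients (H2 default port 9092)
            Server tcp = Server.createTcpServer("-tcpPort", "9092").start();
            // Web server hosting the H2 console (H2 default port 8082)
            Server web = Server.createWebServer("-webPort", "8082").start();
            System.out.println("TCP server: " + tcp.getURL());
            System.out.println("Web console: " + web.getURL());
            // As with the command-line start, database files are created relative
            // to the working directory of this process.
        }
    }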

Note

LOCK_MODE refers to how you want your connection impacted by other transactions in progress. A normal application would want some isolation between transactions, but it is useful to have the UI be able to watch in-progress transactions (i.e., perform dirty reads). The options include:

  • 0 - Read Uncommitted - transaction isolation disabled

  • 1 - Serializable - database is (read and write) locked until transaction commits

  • 3 - Read Committed (default) - read locks released after statement completes

The most current version of Eclipse is Mars. However, August/September is the worst time to switch to the newest Eclipse platform since they normally release in the summer and the previous release from last year has had a chance to receive bug fixes and performance patches. I am going to follow my own advice this year and go with Luna. You may use Mars or another IDE entirely.

  1. Download Eclipse IDE for JavaEE Developers https://eclipse.org/downloads/packages/release/Luna/R or latest from http://eclipse.org/downloads

  2. Unzip the downloaded archive.

    $ tar xzf ~/Downloads/eclipse-jee-xxxx-R-linux-gtk-x86_64.tar.gz
    
    $ ls eclipse
    about_files  about.html  artifacts.xml  configuration  dropins  eclipse  eclipse.ini  
    epl-v10.html  features  icon.xpm  notice.html  p2  plugins  readme
    
  3. Create a shortcut for starting Eclipse

  4. Start Eclipse

m2e is a plugin installed into Eclipse that configures Eclipse based on the Maven pom.xml configuration. When adjusting your builds, you should always define changes within the Maven pom.xml and rely on m2e to translate that into Eclipse. Any changes added directly to Eclipse will not be seen by the command-line build.

  1. Add the Java Package Explorer to the JavaEE Perspective. I find this easier to work with than the Project Explorer used by default in the JavaEE perspective.

  2. Import the class examples into Eclipse as a Maven Project

Add the following repository to your Eclipse instance and install the plugin

  1. Open the Eclipse Marketplace panel using Help->Eclipse Marketplace

  2. Type JBoss into the search field and press Go

  3. Click Install for the JBoss Tools (Luna)

  4. Complete the installation steps for JBoss Tools. There are many tools in the repository. Not all of them are needed for class, nor is it obvious how to use them all without more investigation. Choose the following suggested minimal set.

  5. Define a Server Instance for JBoss

Ant is used in class to wrap command lines and encapsulate the building of classpaths for stand-alone applications. Just download Ant and add it to your PATH.

Note

The latest version of Ant is 1.9.4. The latest version I have tested with is 1.8.4. Until told otherwise, please download and configure the 1.8.4 version from the Ant archive until I get a chance to try things with the newer release.

In this chapter you will be introduced to a standard module file structure that contains a class we intend to use in production and a unit test to verify the functionality of the production class. You will be asked to form the directory structure of files and execute the commands required to build and run the unit test.

Warning

This chapter is optional!!! It contains many tedious steps that are somewhat shell-specific. The intent is to simply introduce the raw data structure and actions that need to take place and then to later automate all of this through Maven. If you wish to just skim the steps -- please do. Please do not waste time trying to port these bash shell commands to your native shell.

Note

This part requires junit.jar. It should have been downloaded for you when you built the class examples and can be located in $M2_REPO/junit/junit/(version)/, where M2_REPO is HOME/.m2/repository or the location you have specified in the localRepository element of HOME/.m2/settings.xml.

  1. Set a few logical variables to represent root directories. For the purposes of the follow-on steps, PROJECT_BASEDIR is the root directory for this exercise. In the example below, the user has chosen a directory of $HOME/proj/784/exercises to be the root directory for all class exercises and named the root directory for this project "ex1". An alternative for CLASS_HOME might be c:/jhu/784.

    export CLASS_HOME=$HOME/proj/784
    export PROJECT_BASEDIR=$CLASS_HOME/exercises/ex1
    mkdir -p $PROJECT_BASEDIR
    cd $PROJECT_BASEDIR
  2. Create project directory structure. In this example, the developer used $HOME/proj/784 for all work in this class.

    $PROJECT_BASEDIR
        +---/src/main/java/myorg/mypackage/ex1
        +---/src/test/java/myorg/mypackage/ex1
        +---/target/classes
        +---/target/test-classes
        +---/target/test-reports
    mkdir -p src/main/java/myorg/mypackage/ex1
    mkdir -p src/test/java/myorg/mypackage/ex1
    mkdir -p target/classes
    mkdir -p target/test-classes
    mkdir -p target/test-reports
  3. Add the following Java implementation class to PROJECT_BASEDIR/src/main/java/myorg/mypackage/ex1/App.java

    package myorg.mypackage.ex1;
    
    
    public class App {
        public int returnOne() { 
            System.out.println( "Here's One!" );
            return 1; 
        }
        public static void main( String[] args ) {
            System.out.println( "Hello World!" );
        }
    }
  4. Add the following Java test class to PROJECT_BASEDIR/src/test/java/myorg/mypackage/ex1/AppTest.java

    package myorg.mypackage.ex1;
    
    
    import static org.junit.Assert.*;
    import org.junit.Test;
    /**
     * Unit test for simple App.
     */
    public class AppTest {
        @Test
        public void testApp() {
            System.out.println("testApp");
            App app = new App();
            assertTrue("app didn't return 1", app.returnOne() == 1);
        }
    }

    Note

    Make sure you put AppTest.java in the src/test tree.

  5. Compile the application and place it in target/ex1.jar. The compiled classes will go in target/classes.

    javac src/main/java/myorg/mypackage/ex1/App.java -d target/classes
    jar cvf target/ex1.jar -C target/classes .
    jar tf target/ex1.jar
    $ javac src/main/java/myorg/mypackage/ex1/App.java -d target/classes
    $ jar cvf target/ex1.jar -C target/classes .
    added manifest
    adding: myorg/(in = 0) (out= 0)(stored 0%)
    adding: myorg/mypackage/(in = 0) (out= 0)(stored 0%)
    adding: myorg/mypackage/ex1/(in = 0) (out= 0)(stored 0%)
    adding: myorg/mypackage/ex1/App.class(in = 519) (out= 350)(deflated 32%)
    
    $ jar tf target/ex1.jar
    META-INF/
    META-INF/MANIFEST.MF
    myorg/
    myorg/mypackage/
    myorg/mypackage/ex1/
    myorg/mypackage/ex1/App.class
  6. Compile the JUnit test and place the compiled tests in target/test-classes.

    export JUNIT_JAR="$HOME/.m2/repository/junit/junit/4.10/junit-4.10.jar"
    javac -classpath "target/ex1.jar:$JUNIT_JAR" src/test/java/myorg/mypackage/ex1/AppTest.java -d target/test-classes
  7. Verify you have your "production" class from src/main compiled into target/classes, your unit test class from src/test compiled into target/test-classes, and the Java archive with the production class in the target directory.

    target/
        +---classes/myorg/mypackage/ex1/App.class
        +---test-classes/myorg/mypackage/ex1/AppTest.class
        +---test-reports
        +---ex1.jar
  8. Run the JUnit test framework.

    java -classpath "target/ex1.jar:$JUNIT_JAR:target/test-classes" org.junit.runner.JUnitCore myorg.mypackage.ex1.AppTest
    JUnit version 4.10
    .testApp
    Here's One!
    
    Time: 0.005
    
    OK (1 test)
  9. Add a test that will fail, re-compile the test class, and re-run.

        //AppTest.java
    
        @Test
        public void testFail() {
            System.out.println("testFail");
            App app = new App();
            assertTrue("app didn't return 0", app.returnOne() == 0);
        }
    javac -classpath "target/ex1.jar:$JUNIT_JAR" src/test/java/myorg/mypackage/ex1/AppTest.java -d target/test-classes
    java -classpath "target/ex1.jar:$JUNIT_JAR:target/test-classes" org.junit.runner.JUnitCore myorg.mypackage.ex1.AppTest
    JUnit version 4.10
    .testApp
    Here's One!
    .testFail
    Here's One!
    E
    Time: 0.007
    There was 1 failure:
    1) testFail(myorg.mypackage.ex1.AppTest)
    java.lang.AssertionError: app didn't return 0
            at org.junit.Assert.fail(Assert.java:93)
            at org.junit.Assert.assertTrue(Assert.java:43)
            at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:26)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    ...
            at org.junit.runner.JUnitCore.main(JUnitCore.java:45)
    
    FAILURES!!!
    Tests run: 2,  Failures: 1

This chapter demonstrates the basics of automating the manual steps in the previous chapter using the Apache Ant build tool. If you just skim through this step, please be sure to take note of how everything gets explicitly defined in Ant. There are not many rules of the road or standard defaults to rely on. That will be a big contrast when working with Maven.

Note

All course examples and projects submitted will use Maven. Ant will be used to wrap command lines for Java SE clients executed outside the normal build environment. However, this exercise shows Ant only being used as part of the artifact build and test environment as a stepping stone to understanding some of the basic build and test concepts within Maven.

Note

If you do not have Ant installed on your system, it can be downloaded from http://ant.apache.org/

Warning

This chapter is optional!!! It contains many tedious steps to setup a module build using the Ant build tool -- which will not be part of class. It is presented here as an example option to building the module with shell scripts. If you wish to just skim the steps -- please do. Please do not waste time trying to get Ant to build your Java modules for this class.

  1. Create a build.properties file in PROJECT_BASEDIR. This will be used to define any non-portable property values. Place the most non-portable base variables (e.g., the M2_REPO location) towards the top and build lower-level paths from them. This makes the scripts much easier to port to another environment. If you still have your maven repository in your $HOME directory, you can make use of the built-in ${user.home} property rather than a hard-coded path.

    #ex1 build.properties
    #M2_REPO=c:/jhu/repository
    M2_REPO=${user.home}/.m2/repository
    
    junit.classpath=${M2_REPO}/junit/junit/4.10/junit-4.10.jar
  2. Create a build.xml file in PROJECT_BASEDIR. Note the following key elements.

    • project - a required root for build.xml files

      • name - not significant, but helpful

      • default - the target to run if none is supplied on command line

      • basedir - specifies current directory for all tasks

    • property - defines an immutable name/value

      • file - imports declarations from a file; in this case build.properties created earlier

      • name/value - specifies a property within the script

    • target - defines an entry point into the build.xml script. It hosts one or more tasks.

      • name - defines name of target, which can be supplied on command line.

    • echo - a useful Ant task to printout status and debug information. See Ant docs for more information.

    
    <?xml version="1.0" encoding="utf-8" ?> 
    <!-- ex1 build.xml 
    -->
    <project name="ex1" default="" basedir=".">
        <property file="build.properties"/>

        <property name="artifactId" value="ex1"/>
        <property name="src.dir"    value="${basedir}/src"/>
        <property name="build.dir"  value="${basedir}/target"/>

        <target name="echo">
            <echo>basedir=${basedir}</echo>
            <echo>artifactId=${artifactId}</echo>
            <echo>src.dir=${src.dir}</echo>
            <echo>build.dir=${build.dir}</echo>
            <echo>junit.classpath=${junit.classpath}</echo>
        </target>
    </project>
  3. Sanity check your build.xml and build.properties file with the echo target.

    $ ant echo
    Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
    
    echo:
         [echo] basedir=/home/jcstaff/proj/784/exercises/ex1
         [echo] artifactId=ex1
         [echo] src.dir=/home/jcstaff/proj/784/exercises/ex1/src
         [echo] build.dir=/home/jcstaff/proj/784/exercises/ex1/target
         [echo] junit.classpath=/home/jcstaff/.m2/repository/junit/junit/4.10/junit-4.10.jar
    
    BUILD SUCCESSFUL
    Total time: 0 seconds
  4. Add the "package" target to compile and archive your /src/main classes. Note the following tasks in this target.

    • mkdir - creates a directory. See Ant docs for more information.

    • javac - compiles java sources files. See Ant docs for more information. Note that we are making sure we get JavaSE 7 classes compiled.

    • jar - builds a Java archive. See Ant docs for more information.

    
        <target name="package">
            <mkdir dir="${build.dir}/classes"/>
            <javac srcdir="${src.dir}/main/java"
                   destdir="${build.dir}/classes"
                   debug="true"
                   source="1.7"
                   target="1.7"
                   includeantruntime="false">
                   <classpath>
                   </classpath>
            </javac>

            <jar destfile="${build.dir}/${artifactId}.jar">
                <fileset dir="${build.dir}/classes"/>
            </jar>
        </target>
  5. Execute the "package" target just added. This should compile the production class from src/main into target/classes and build a Java archive with the production class in target/.

    $ rm -rf target/; ant package
    Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
    
    package:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
    
    BUILD SUCCESSFUL
    Total time: 2 seconds

    Note

    You may get the following error when you execute the javac task. If so, export JAVA_HOME=(path to JDK_HOME) on your system to provide Ant a reference to a JDK instance.

    build.xml:26: Unable to find a javac compiler;
    com.sun.tools.javac.Main is not on the classpath.
    Perhaps JAVA_HOME does not point to the JDK.
    It is currently set to ".../jre"
    $ find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
  6. Add the "test" target to compile your /src/test classes. Make this the default target for your build.xml file. Note too that it should depend on the successful completion of the "package" target and include the produced archive in its classpath.

    
    <project name="ex1" default="test" basedir=".">
    ...
        <target name="test" depends="package">
            <mkdir dir="${build.dir}/test-classes"/>
            <javac srcdir="${src.dir}/test/java"
                   destdir="${build.dir}/test-classes"
                   debug="true"
                   source="1.7"
                   target="1.7"
                   includeantruntime="false">
                   <classpath>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement path="${junit.classpath}"/>
                   </classpath>
            </javac>
        </target>
  7. Execute the new "test" target after clearing out the contents of the target directory. Note that the target directory gets automatically re-populated with the results of the "package" target and augmented with the test class from src/test compiled into target/test-classes.

    $ rm -rf target/; ant
    Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
    
    package:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
    
    BUILD SUCCESSFUL
    Total time: 3 seconds
    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    
  8. Add the junit task to the test target. The junit task is being configured to run in batch mode and write TXT and XML reports to the target/test-reports directory. See the Ant docs (http://ant.apache.org/manual/OptionalTasks/junit.html) for more details on the junit task. Make special note of the following:

    • printsummary - produce a short summary to standard out showing the number of tests run and a count of errors, etc.

    • fork - since Ant runs in a JVM, any time you run a task that requires a custom classpath, it is usually required that it be forked into a separate process (with its own classpath).

    • batchtest - run all tests found and write results of each test into the test-reports directory.

    • formatter - write a text and XML report of results

    
            <mkdir dir="${build.dir}/test-reports"/>
            <junit printsummary="true" fork="true">
                   <classpath>
                       <pathelement path="${junit.classpath}"/>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement location="${build.dir}/test-classes"/>
                   </classpath>

                <batchtest fork="true" todir="${build.dir}/test-reports">
                    <fileset dir="${build.dir}/test-classes">
                        <include name="**/*Test*.class"/>
                    </fileset>
                </batchtest>

                <formatter type="plain"/>
                <formatter type="xml"/>
            </junit>

    Note

    The last time I sanity checked this exercise I got the common error below. I corrected the issue by downloading a full installation from the Ant website, exporting ANT_HOME to the root of that installation (export ANT_HOME=/opt/apache-ant-1.9.4), and adding $ANT_HOME/bin to the PATH (export PATH=$ANT_HOME/bin:$PATH). ANT_HOME is required for Ant to locate the junit task.

    BUILD FAILED
    
    /home/jcstaff/proj/784/exercises/ex1/build.xml:57: Problem: failed to create task or type junit
    Cause: the class org.apache.tools.ant.taskdefs.optional.junit.JUnitTask was not found.        
            This looks like one of Ant's optional components.
    Action: Check that the appropriate optional JAR exists in
            -/usr/share/ant/lib
            -/home/jcstaff/.ant/lib
            -a directory added on the command line with the -lib argument
    
    Do not panic, this is a common problem.
    The commonest cause is a missing JAR.
    
    This is not a bug; it is a configuration problem
  9. Execute the updated "test" target with the JUnit test.

    $ rm -rf target; ant
    package:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.143 sec
        [junit] Test myorg.mypackage.ex1.AppTest FAILED
    
    BUILD SUCCESSFUL
    Total time: 17 seconds
    $ find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.xml

    Note

    Note that the 17 seconds it took to run the tests seems excessive. I was able to speed that up to 0.001 sec by commenting out the XML report option (which we will not use in this exercise).

  10. Test output of each test is in the TXT and XML reports.

    $ more target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt 
    Testsuite: myorg.mypackage.ex1.AppTest
    Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.246 sec
    ------------- Standard Output ---------------
    testApp
    Here's One!
    testFail
    Here's One!
    ------------- ---------------- ---------------
    
    Testcase: testApp took 0.007 sec
    Testcase: testFail took 0.022 sec
            FAILED
    app didn't return 0
    junit.framework.AssertionFailedError: app didn't return 0
            at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:26)
  11. Add a clean target to the build.xml file to delete built artifacts. See Ant docs for details on the delete task.

    
        <target name="clean">
            <delete dir="${build.dir}"/>
        </target>
  12. Re-run and use the new "clean" target you just added.

    $ ant clean test
    Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
    
    clean:
       [delete] Deleting directory /home/jcstaff/proj/784/exercises/ex1/target
    
    package:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.123 sec
        [junit] Test myorg.mypackage.ex1.AppTest FAILED
    
    BUILD SUCCESSFUL
    Total time: 17 seconds
  13. Comment out the bogus testFail and rerun.

    $ cat src/test/java/myorg/mypackage/ex1/AppTest.java
    ...
        //@Test
        public void testFail() {
  14. $ ant clean test
    Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
    
    clean:
       [delete] Deleting directory /home/jcstaff/proj/784/exercises/ex1/target
    
    package:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.161 sec
    
    BUILD SUCCESSFUL
    Total time: 17 seconds

In this chapter we will refine the use of print and debug statements by using a "logger". By adopting a logger into your production and test code you can avoid print statements to stdout/stderr and instead redirect output to log files, databases, messaging topics, etc. There are several frameworks to choose from (Java's built-in logger, the Commons Logging API, the SLF4J API, and log4j to name a few). This exercise just happens to use the commons-logging API with a log4j implementation.

  1. Change the System.out.println() calls in App and AppTest from Part A to use the commons-logging API. The commons-logging Javadoc and guide will be helpful in understanding this interface. The guide goes into some detail about log4j configuration as well.

    package myorg.mypackage.ex1;
    
    
    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;
    public class App {
        private static Log log = LogFactory.getLog(App.class);
        public int returnOne() { 
            //System.out.println( "Here's One!" );
            log.debug( "Here's One!" );
            return 1; 
        }
        public static void main( String[] args ) {
            //System.out.println( "Hello World!" );
            log.info( "Hello World!" );
        }
    }
    package myorg.mypackage.ex1;
    
    
    ...
    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;
    public class AppTest {
        private static Log log = LogFactory.getLog(AppTest.class);
    ...
        @Test
        public void testApp() {
            //System.out.println("testApp");
            log.info("testApp");
            App app = new App();
            assertTrue("app didn't return 1", app.returnOne() == 1);
        }
    }
  2. Add a log4j.xml configuration file to the directory structure. Place this file in src/test/resources/log4j.xml. This file is used to control logging output. Refer to the log4j documentation page for information on how to configure and use log4j. However, the project sells a commercial text, so it is hard to find a good, detailed online reference that covers all options. It does not matter whether you use the log4j.xml format or the log4j.properties format; note, however, that their quick intro uses the properties file format.

    
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">

    <log4j:configuration 
        xmlns:log4j="http://jakarta.apache.org/log4j/" 
        debug="false">
       
        <appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
            <param name="Target" value="System.out"/>

            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern" 
                       value="%-5p %d{dd-MM HH:mm:ss,SSS} (%F:%M:%L)  -%m%n"/>
            </layout>
        </appender>

        <appender name="logfile" class="org.apache.log4j.RollingFileAppender">
            <param name="File" value="target/log4j-out.txt"/>
            <param name="Append" value="false"/>
            <param name="MaxFileSize" value="100KB"/>
            <param name="MaxBackupIndex" value="1"/>
            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern" 
                       value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L)  -%m%n"/>
            </layout>
       </appender>

       <logger name="myorg.mypackage">
          <level value="debug"/>
          <appender-ref ref="logfile"/>  
       </logger>

       <root>
          <priority value="info"/>    
          <appender-ref ref="CONSOLE"/>  
       </root>   
       
    </log4j:configuration>

    Note

    The log4j.xml file is placed on the JVM classpath, where log4j will locate it by default. However, it should not be packaged with the main classes (ex1.jar). Placing it in our JAR file would pollute the application assembler and deployer's job of specifying the correct configuration file at runtime. Our test classes and resources are not a part of follow-on deployment.

  3. Add the commons-logging.jar to the compile classpaths and the commons-logging.jar and log4j.jar to the runtime classpath. Also add a task to copy the log4j.xml file into target/test-classes so that it is seen by the classloader as a resource. Realize that your classes have no compilation dependencies on log4j. Log4j is only used if it is located at runtime.

    # ex1 build.properties
    commons-logging.classpath=${M2_REPO}/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
    log4j.classpath=${M2_REPO}/log4j/log4j/1.2.13/log4j-1.2.13.jar
    
        <target name="echo">
            ...
            <echo>commons-logging.classpath=${commons-logging.classpath}</echo>
            <echo>log4j.classpath=${log4j.classpath}</echo>
        </target>
    
            <javac srcdir="${src.dir}/main/java"
                   destdir="${build.dir}/classes"
                   debug="true"
                   source="1.7"
                   target="1.7"
                   includeantruntime="false">
                   <classpath>
                       <pathelement path="${commons-logging.classpath}"/>
                   </classpath>
            </javac>
    
            <javac srcdir="${src.dir}/test/java"
                   destdir="${build.dir}/test-classes"
                   debug="true"
                   source="1.7"
                   target="1.7"
                   includeantruntime="false">
                   <classpath>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement path="${junit.classpath}"/>
                       <pathelement path="${commons-logging.classpath}"/>
                   </classpath>
            </javac>

            <copy todir="${build.dir}/test-classes">
                <fileset dir="${src.dir}/test/resources"/>
            </copy>
    
            <junit printsummary="true" fork="true">
                   <classpath>
                       <pathelement path="${junit.classpath}"/>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement location="${build.dir}/test-classes"/>
                       <pathelement path="${commons-logging.classpath}"/>
                       <pathelement path="${log4j.classpath}"/>
                   </classpath>
            ...
  4. Test the application and inspect the reports. All loggers inherit from the root logger and may only extend its definition, not limit it. Notice that the root logger's "info" priority filter allows log.info() (and higher priority warn, error, and fatal) messages to be printed to the console. The myorg.mypackage logger's level filter allows log.debug() messages from the myorg.mypackage.* classes to appear in both the console and the logfile. This means that any Java classes not in our package hierarchy will only have INFO or higher priority messages logged.

    $ ant clean test
    Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
    
    clean:
       [delete] Deleting directory /home/jcstaff/proj/784/exercises/ex1/target
    
    package:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
         [copy] Copying 1 file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.127 sec
    
    BUILD SUCCESSFUL
    Total time: 17 seconds

    You won't see the output on stdout when running under Ant, but you can locate all output in the logfile appender's output, defined to be target/log4j-out.txt. This behavior will get a little better under Maven.

    $ more target/log4j-out.txt 
    INFO  26-08 22:59:23,357 [myorg.mypackage.ex1.AppTest] (AppTest.java:testApp:17)  -testApp
    DEBUG 26-08 22:59:23,361 [myorg.mypackage.ex1.App] (App.java:returnOne:11)  -Here's One! 

    Your project structure should look like the following at this point.

    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/test/resources/log4j.xml
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    ./target/test-classes/log4j.xml
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.xml
    ./target/log4j-out.txt
  5. Change the logger definition so that only the App class logs to the logfile. By extending the logger name all the way to the class, we further limit which classes this logger applies to.

    
        <logger name="myorg.mypackage.ex1.App">
          <level value="debug"/>
          <appender-ref ref="logfile"/>
       </logger>

    After re-running the build you should notice that DEBUG output is included for only the App class, because of the change we made to the logger configuration outside of the code.

    $ more target/log4j-out.txt 
    DEBUG 26-08 23:07:04,809 [myorg.mypackage.ex1.App] (App.java:returnOne:11)  -Here's One!
  6. Repeat after me: "I will never use System.out.println() in this class." Doing so will make it difficult to control and access the logs of your deployed components as they are exercised in unit testing, integration testing, and deployment environments.
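    As a minimal sketch of the preferred alternative (the Customer class and its id field below are hypothetical and not part of the exercise), any remaining println() diagnostics can be replaced with the same commons-logging calls used above. The isDebugEnabled() guard is optional but avoids building messages that the configured level would simply discard.

    package myorg.mypackage.ex1;

    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;

    // hypothetical class used only to illustrate the logging idiom
    public class Customer {
        private static final Log log = LogFactory.getLog(Customer.class);
        private long id;

        public void setId(long id) {
            this.id = id;
            // log.debug() replaces System.out.println() and honors the log4j.xml filters
            if (log.isDebugEnabled()) {
                log.debug("assigned customer id=" + id);
            }
        }
    }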

In this chapter you will automate the build using Maven by adding a simple project definition to your project tree. In the previous chapters you worked with a reasonable project tree that could have looked different in a number of ways and been accounted for by different path constructs. However, why be different? The project tree we put together -- accounting for production classes, test classes, resource files, archives, unit tests, test reports, etc. -- follows Maven's standard build tree almost exactly (with the exception of the name of the target/test-reports directory). We will be able to add a Maven project definition without much effort.

Tip

The Maven community has a tremendous amount of documentation, examples, and on-line discussions. This course has many examples that are more specific for the work you will be actively performing. Many of these resources are a quick google search away but know that I go out of my way to make sure you spend as much time as possible on design and JavaEE aspects in class. If you are stuck on Maven -- ask. I know what you are trying to do and can likely point you to an example that is relevant to what you are doing in class. If you are still stuck on Maven issues -- send it to me. I will fix it personally. There is nothing more irritating for you than to be fighting with the builds when you want to be spending more time understanding, designing, trying, and mastering the product of what is being built.

Note

Using Maven requires only an initial download and installation. Plugins and dependencies will be downloaded from remote repositories as needed. Connectivity to the internet is required until all dependencies have been satisfied.

Note

Maven will automatically go out and download any missing dependencies and recursively download what they depend upon. If you are running Maven for the first time, this could result in a significant amount of downloading and may encounter an occasional connection failure with repositories. Once a non-SNAPSHOT version is downloaded (e.g., 1.3), Maven will not re-attempt to download it. Maven will, however, go out and check various resources to stay in sync. If you know you already have everything you need, you can run in off-line mode using the "-o" flag on the command line or its equivalent entry within the settings.xml file. This can save you seconds of build time when disconnected from the Internet.

  1. Create a pom.xml file in project basedir. This will be used to define your entire project. Refer to the Maven POM Reference for details about each element.

    • modelVersion - yes; it's required

    • groupId - just as it sounds, this value is used to group related artifacts. groupId is a hierarchical value and the individual names are used to form a directory structure in the Maven repository (e.g., artifacts in the myorg.myproject.foo groupId will be located below the HOME/.m2/repository/myorg/myproject/foo directory).

    • version - Maven has a strong versioning system and versions appended with the word SNAPSHOT are handled differently. Projects with a version ending in -SNAPSHOT are thought to be in constant change, with no official release yet available. Projects with a version lacking the -SNAPSHOT ending are meant to be an official release, with no other variants available with the same tag.

    • dependency.scope - this defines the scope in which the dependency gets applied. It determines the visibility of the dependency within the project and whether it is carried along with the module through transitive dependency analysis. With open-source software, a typical JavaEE application could depend on 10s to 100s of individual modules, and the proper use of transitive dependencies makes this manageable.

      • scope=compile is the default and is used to describe artifacts that the src/main directory depends upon and will also be visible by classes in src/test. These dependency artifacts will be brought along with the module when transitive dependencies are evaluated.

      • scope=test is used to define artifacts which src/test depends upon. These will be made available during testing, but will not be visible to classes in src/main and will not be considered a dependency for downstream users of the module. Consult the maven documentation for other scopes, but one other that is commonly used in class is scope=provided.

      • scope=provided is similar to scope=compile in that the src/main tree can see it, however like scope=test, it is not carried forward. Each downstream module is required to know about the dependency and provide a replacement. This is common when using JavaEE APIs that have been packaged by different vendors used by different module developers.

      • maven-compiler-plugin - this declaration is not necessary for this exercise, but like our Ant script, it specifies the Java version to make sure we get what we need.

      • properties.project.build.sourceEncoding - this defines the default handling of file content for all plugins within a module. The default is platform-specific if left unspecified and we avoid an annoying warning by specifying it.

    Note

    Although the m2e Eclipse plugin reads the POM dependencies and creates a classpath within Eclipse, it does not honor the differences between the different scope values. All dependencies are blended together. The result is that something may compile and run fine within Eclipse and then report a missing class when built at the command line. If that happens, check for classes using artifacts that have been brought in as scope=test or for classes incorrectly placed within the src/main tree.
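    As a purely hypothetical illustration of the problem described above (this class is not part of the exercise), the following src/main class imports JUnit even though the POM below declares junit with scope=test; m2e will compile it happily inside Eclipse, but "mvn package" will fail with a missing org.junit symbol.

    // src/main/java/myorg/mypackage/ex1/BadMainClass.java -- hypothetical example
    package myorg.mypackage.ex1;

    import org.junit.Assert;   // test-scoped artifact referenced from src/main

    public class BadMainClass {
        public void check(int value) {
            // compiles inside Eclipse (scopes are blended) but fails under "mvn package"
            Assert.assertTrue("value must be positive", value > 0);
        }
    }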

    
    <?xml version="1.0"?>
    <project>
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.myproject</groupId>
        <artifactId>ex1</artifactId>

        <name>My First Simple Project</name>
        <version>1.0-SNAPSHOT</version>

        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        </properties>

        <dependencies>
            <dependency>
                <groupId>commons-logging</groupId>
                <artifactId>commons-logging</artifactId>
                <version>1.1.1</version>
                <scope>compile</scope>
            </dependency>

            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.10</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
                <version>1.2.13</version>
                <scope>test</scope>
            </dependency>
        </dependencies>

        <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>3.1</version>
                    <configuration>
                        <source>1.7</source>
                        <target>1.7</target>
                    </configuration>
                </plugin>
            </plugins>
        </build>
    </project>

    Your project tree should look like the following at this point.

    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/test/resources/log4j.xml
    ./build.properties
    ./build.xml
    ./pom.xml
  2. Note that the pom.xml file is not required to have an assigned schema. However, adding one does allow for XML editing tools to better assist in creating a more detailed POM. Replace the project element from above with the following declarations to assign an XML schema.

    
    <project xmlns="http://maven.apache.org/POM/4.0.0" 
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  3. Run the package "phase" and watch the project compile, assemble, and test. Maven has many well-known phases that correspond to the lifecycle of build steps that go into validating, preparing, building, testing, and deploying the artifacts of a module. You can find out more about Maven phases here; I refer to this page very often.

    $ mvn package
    [INFO] Scanning for projects...
    [INFO]                                                                         
    [INFO] ------------------------------------------------------------------------
    [INFO] Building My First Simple Project 1.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO] 
    [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ ex1 ---
    [INFO] Using 'UTF-8' encoding to copy filtered resources.
    [INFO] skip non existing resourceDirectory /home/jcstaff/proj/784/exercises/ex1/src/main/resources
    [INFO] 
    [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ ex1 ---
    [INFO] Changes detected - recompiling the module!
    [INFO] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
    [INFO] 
    [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ ex1 ---
    [INFO] Using 'UTF-8' encoding to copy filtered resources.
    [INFO] Copying 1 resource
    [INFO] 
    [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ ex1 ---
    [INFO] Changes detected - recompiling the module!
    [INFO] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
    [INFO] 
    [INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ ex1 ---
    [INFO] Surefire report directory: /home/jcstaff/proj/784/exercises/ex1/target/surefire-reports
    
    -------------------------------------------------------
     T E S T S
    -------------------------------------------------------
    Running myorg.mypackage.ex1.AppTest
    INFO  27-08 00:15:14,277 (AppTest.java:testApp:17)  -testApp
    DEBUG 27-08 00:15:14,283 (App.java:returnOne:11)  -Here's One!
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.383 sec
    
    Results :
    
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
    
    [INFO] 
    [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ ex1 ---
    [INFO] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1-1.0-SNAPSHOT.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 5.984 s
    [INFO] Finished at: 2014-08-27T00:15:14-04:00
    [INFO] Final Memory: 16M/174M
    [INFO] ------------------------------------------------------------------------
    
  4. > find . -type f
    ./build.xml
    ./build.properties
    ./pom.xml
    ./target/surefire-reports/TEST-myorg.mypackage.ex1.AppTest.xml
    ./target/surefire-reports/myorg.mypackage.ex1.AppTest.txt
    ./target/log4j-out.txt
    ./target/maven-archiver/pom.properties
    ./target/ex1-1.0-SNAPSHOT.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    ./target/test-classes/log4j.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/maven-status/...
    ./src/test/resources/log4j.xml
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/main/java/myorg/mypackage/ex1/App.java
    • src/main/java classes were built in the target/classes directory by convention by the maven-compiler-plugin that is automatically wired into JAR module builds. We didn't have to configure it because we structured our project using the Maven directory structure and used the default packaging=jar module type (since packaging=jar is the default, it could be left unspecified). Many of the standard features are enacted for modules with the packaging=jar type.

    • src/test/java classes were built in the target/test-classes directory by convention by the maven-compiler-plugin.

    • src/test/resources were copied to the target/test-classes directory by convention by the maven-resources-plugin that is automatically wired into JAR module builds.

    • test cases were run and their reports were placed in target/surefire-reports by convention by the maven-surefire-plugin that is automatically wired into JAR module builds.

    • The build.xml and build.properties files from our work with Ant are still allowed to exist. We could even delegate from Maven to Ant using the maven-antrun-plugin if we had legacy build.xml scripts that we wanted to leverage.

  5. For *fun*, let's add a README that could be used to describe something about your project and have it be processed as part of the documentation for the module. You do not need to do this for class projects, but walking through this may be helpful in understanding how the class website is created from the source you have on your disk. Maven supports a couple of documentation generation languages, but let's just use HTML to keep this simple. Place the following content in src/site/resources/README.html

    
    mkdir -p src/site/resources
    $ cat src/site/resources/README.html 

    <?xml version="1.0"?>
    <html>
        <head>
            <title>My First Project</title>
        </head>
    <body>
        <section><h1>My First Project</h1>
        <p/>
        This is my first project. Take a look at 
        <p/>
        <ul>
            <li>this ....</li>
            <li>that ....</li>
            <li>or <a href="./index.html">go home</a></li>
        </ul>
        </section>
    </body>
    </html>
  6. The above is enough to provide the page. Now add a link to it from the project menu. Add the following content to src/site/site.xml

    
    $ cat src/site/site.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <project name="${project.name}">
      <body>
        <menu name="Content">
            <item name="README" href="README.html"/>
        </menu>
      </body>
    </project>

    You must also specify a version number for the maven-project-info-reports-plugin. Maven is extremely version-aware.

    
                <plugin> 
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-project-info-reports-plugin</artifactId>
                    <version>2.7</version>
                </plugin>
    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/test/resources/log4j.xml
    ./src/site/resources/README.html
    ./src/site/site.xml
    ./build.properties
    ./build.xml
    ./pom.xml
  7. Build the site and open target/site/index.html in your browser. You should see a link to the README on the left side.

    $ mvn site                                                                                                                                                       
    [INFO] Scanning for projects...   
    ...
    [INFO] BUILD SUCCESS
    $ find target/site/ -name *.html
    
    target/site/plugin-management.html
    target/site/index.html
    target/site/mail-lists.html
    target/site/issue-tracking.html
    target/site/license.html
    target/site/project-info.html
    target/site/dependency-info.html
    target/site/README.html
    target/site/dependencies.html
    target/site/team-list.html
    target/site/source-repository.html
    target/site/integration.html
    target/site/distribution-management.html
    target/site/project-summary.html
    target/site/plugins.html

    Note

    If you use the posted firstSimpleModuleEx as a starting point for your work you will need to re-enable site generation under the maven-site-plugin. This was turned off since the posted examples do not contain enough information to be posted with the rest of the class examples.

    
    
                <!-- exclude this module from site generation -->
                <plugin> 
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-site-plugin</artifactId>
                    <version>3.4</version>
                    <configuration>
                        <skip>true</skip>
                        <skipDeploy>true</skipDeploy>
                    </configuration>
                </plugin>
  8. Okay, that was a lot of work just to copy an HTML file. Now let's add Javadoc to our project and create a link to it. Add the following contents to the bottom of the pom.xml file.

    
        <reporting>
            <plugins>
                <plugin> 
                    <artifactId>maven-javadoc-plugin</artifactId>
                    <groupId>org.apache.maven.plugins</groupId>
                    <version>2.9.1</version>
                    <configuration>
                        <detectLinks/>
                        <show>private</show>
                        <source>1.7</source>
                        <links>
                            <link>http://download.oracle.com/javaee/7/api/</link>
                            <link>http://download.oracle.com/javase/7/docs/api/</link>
                        </links>
                    </configuration>
                </plugin>
            </plugins>
        </reporting>
  9. We could create a link to the apidocs/index.html like we did with README.html, but that would be something we'd keep having to update each time we added a new report. Let's add a property to the site.xml menu so that links to Javadoc and other reports drop in automatically.

    
    <?xml version="1.0" encoding="UTF-8"?>
    <project name="${project.name}">
      <body>
        <menu name="Content">
            <item name="README" href="README.html"/>
        </menu>

        <menu ref="reports"/>

      </body>
    </project>
  10. Re-generate the site documentation with the site target. Open the target/site/index.html page and you should now see a menu item for "Project Reports" -> "JavaDocs". Our App class should be included in the Javadoc.

  11. Note

    The pom.xml file is the main configuration source for 99% of what you develop with Maven. There is an additional $HOME/.m2/settings.xml file where you can specify build site-specific properties. These will be available to all pom.xml files. You want to be careful not to over-populate the settings.xml file (taking advantage of its re-usable specification) since it will make your pom.xml files too dependent on a particular build site. Refer to the Settings Descriptor for detailed information on settings.xml. The following provides a step-wise generation of the settings.xml file you put in place during Development Environment Setup. Read through this for reference since you likely already have everything in place that you need.

    Let's start a settings.xml file to store properties that are specific to our build site. You can find details about each setting at the following URL.

    
    cat $HOME/.m2/settings.xml

    <?xml version="1.0"?>
    <settings xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">

    </settings>
  12. If your $HOME directory path contains spaces, you will want to provide an override for the localRepository. Provide a custom path that does not contain spaces. This value will default to a "$HOME/.m2/repository" directory.

    
        <!-- this overrides the default $HOME/.m2/repository location. -->
        <localRepository>c:/jhu/repository</localRepository>
  13. Add the following specification to either the settings.xml file or the local pom.xml file. If you specify it in the local pom.xml file -- it will only apply to that project. If you specify it in the settings.xml file -- it will be global to all projects in your area. More will be covered on this later. However, it should be noted that this profile is not active unless someone specifically asks for it (-Pdebugger) or the "debugger" system property is set (-Ddebugger=(anything)).

    
            <profile>
                <id>debugger</id>
                <!-- this should only be activated when performing interactive
                     debugging -->
                <activation>
                    <property>
                        <name>debugger</name>
                    </property>
                </activation>
                <properties>
                    <surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
                </properties>                                  
            </profile>
  14. Although not needed for this class -- at times you will need access to a dependency that is not available in a Maven repository. COTS libraries are generally not available at ibiblio.org. You must download the artifact and manually install it locally.

    This step will go through importing a stand-alone archive into the repository to resolve any dependencies. Start by declaring a dependency before we do the import. Note that a new scope property was added. See the Dependency Mechanism Intro Page for a discussion of scope, but in this case it indicates that the artifact should only be present on the compile-time classpath and not carried along to the runtime classpath.

    
            <dependency>
                <groupId>foo</groupId>
                <artifactId>bar</artifactId>
                <version>1.1</version>
                <scope>provided</scope>
            </dependency>
  15. Attempt to build the module with the missing dependency. The build should fail, but note that Maven attempted to download from all known external repositories.

    > mvn package
    [INFO] Scanning for projects...
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building My First Simple Project 1.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2/foo/bar/1.1/bar-1.1.pom
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2-snapshot/foo/bar/1.1/bar-1.1.pom
    Downloading: http://repo1.maven.org/maven2/foo/bar/1.1/bar-1.1.pom
    [WARNING] The POM for foo:bar:jar:1.1 is missing, no dependency information available
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2/foo/bar/1.1/bar-1.1.jar
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2-snapshot/foo/bar/1.1/bar-1.1.jar
    Downloading: http://repo1.maven.org/maven2/foo/bar/1.1/bar-1.1.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 1.437s
    [INFO] Finished at: Wed Feb 02 12:20:51 EST 2011
    [INFO] Final Memory: 2M/15M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal on project ex1: Could not resolve dependencies for project myorg.myproject:ex1:jar:1.0-SNAPSHOT: 
    Could not find artifact foo:bar:jar:1.1 in webdev-baseline (http://webdev.apl.jhu.edu/~jcs/maven2) -> [Help 1]
  16. The old error message provided by Maven 2 was much better if a manual install is what you really needed; the newer (Maven 3) message does not provide instructions. In this case, manually install a jar file that satisfies the declaration. Assign it a groupId of foo, an artifactId of bar, and a version of 1.1. Don't forget to add the -DgeneratePom=true or you will get a download warning every time you try to build. All we need is a valid .jar file. If you don't have one lying around, just create one with a valid structure.

    $ touch bar.jar
    $ mvn install:install-file -DgroupId=foo -DartifactId=bar -Dversion=1.1 -Dpackaging=jar -Dfile=bar.jar -DgeneratePom=true
    
    [INFO] Scanning for projects...
    [INFO]                                                                         
    [INFO] ------------------------------------------------------------------------
    [INFO] Building My First Simple Project 1.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO] 
    [INFO] --- maven-install-plugin:2.4:install-file (default-cli) @ ex1 ---
    [INFO] Installing /home/jcstaff/proj/784/exercises/ex1/bar.jar to /home/jcstaff/.m2/repository2/foo/bar/1.1/bar-1.1.jar
    [INFO] Installing /tmp/mvninstall5322334237902777597.pom to /home/jcstaff/.m2/repository2/foo/bar/1.1/bar-1.1.pom
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 0.975 s
    [INFO] Finished at: 2014-08-27T01:22:06-04:00
    [INFO] Final Memory: 6M/105M
    [INFO] ------------------------------------------------------------------------
    
  17. After successfully installing the dummy archive, you should be able to locate the JAR and other supporting files in the local repository. Be sure to look where you have directed localRepository in $HOME/.m2/settings.xml.

    
    $ find /home/jcstaff/.m2/repository2/foo/bar/
    /home/jcstaff/.m2/repository2/foo/bar/
    /home/jcstaff/.m2/repository2/foo/bar/1.1
    /home/jcstaff/.m2/repository2/foo/bar/1.1/bar-1.1.pom.lastUpdated
    /home/jcstaff/.m2/repository2/foo/bar/1.1/_remote.repositories
    /home/jcstaff/.m2/repository2/foo/bar/1.1/bar-1.1.jar.lastUpdated
    /home/jcstaff/.m2/repository2/foo/bar/1.1/bar-1.1.jar
    /home/jcstaff/.m2/repository2/foo/bar/1.1/bar-1.1.pom
    /home/jcstaff/.m2/repository2/foo/bar/maven-metadata-local.xml
  18. 
    $ more /home/jcstaff/.m2/repository2/foo/bar/1.1/bar-1.1.pom

    <?xml version="1.0" encoding="UTF-8"?>
    <project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <modelVersion>4.0.0</modelVersion>
      <groupId>foo</groupId>
      <artifactId>bar</artifactId>
      <version>1.1</version>
      <description>POM was created from install:install-file</description>
    </project>
  19. Now try running "mvn package" and it should successfully resolve the fake dependency on the bar.jar.

  20. One last thing... Maven pulls in definitions from many places in the build environment. If you ever want to know what the total sum of those sources is (the "effective POM"), execute the help:effective-pom goal.

    
     $ mvn help:effective-pom
    [INFO] Scanning for projects...

    ...

    <project xmlns...
      <modelVersion>4.0.0</modelVersion>
      <groupId>myorg.myproject</groupId>
      <artifactId>ex1</artifactId>
      <version>1.0-SNAPSHOT</version>
      <name>My First Simple Project</name>
      <properties>
        <jboss.home>/opt/jboss-eap-6.1</jboss.home>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      </properties>
      <dependencies>
        <dependency>
          <groupId>foo</groupId>
          <artifactId>bar</artifactId>
          <version>1.1</version>
          <scope>provided</scope>
        </dependency>
        <dependency>
          <groupId>commons-logging</groupId>
          <artifactId>commons-logging</artifactId>
          <version>1.1.1</version>
    ...

In this chapter we will be importing the project into the Eclipse IDE, running a few project goals, and demonstrating a debug session. IDEs provide very useful code navigation and refactoring tools, to name only a few features. However, one of the unique tools offered by the IDEs is the ability to step through the code in a debugging session. Please do not end this exercise before becoming comfortable with the ability to use the debugger.

Note

Maven/Eclipse integration is probably the most volatile aspect of the environment. Over the years the integration has progressed from a Maven plugin (maven-eclipse-plugin; integrating from the Maven side on-demand) to Eclipse plugins (m2e; continuously integrating from the Eclipse side dynamically). In the middle were phases where the Eclipse m2e plugin had to be manually installed after download. However, in recent versions, m2e comes pre-installed in Eclipse. It gets easier to get started every day but harder to keep instructions up to date.

Warning

The Maven/Eclipse integration is a Maven-first approach where the Eclipse project always follows the Maven pom.xml. That is one of the main reasons this exercise started you with a pom.xml file first and progressed later to the IDE. It is wrong (or at least non-portable) to manually adjust the build path of a project within Eclipse. You must adjust the build path of a project by editing the pom.xml and having Eclipse automatically detect and follow that change.

  1. Select File->Import->Maven->Existing Maven Projects, navigate to the directory with the project you have been working with and select OK.


  2. The project should successfully import. Note that Eclipse has imported the project configuration from the Maven POM and has done at least the following...


  1. Right-click on the pom.xml file or project folder and execute "Run As->Maven install". You can also get back to this window through the Run As option on the toolbar once you have the project selected. This mode runs the JUnit test you wrote within the context of the full Maven project. All pre-test and post-test setup and teardown you wired into the Maven command line build will be executed.


    Note that you can create a separate window for any of the Eclipse tabs. Using dual monitors, I commonly display the IDE on one monitor and the Console output on the other when using debug statements.


  2. Rerun the tests as a JUnit test. This mode runs the JUnit test raw within Eclipse. This is very efficient for making and testing Java code changes, but it will not run any Maven setup or teardown plugins (which are not always required or can often be avoided).


    Always Make Projects Eclipse/JUnit-Friendly

    Maven is a very useful and powerful tool. However, there is a point where the information from Maven has been captured by the IDE and we don't need to run full Maven builds (e.g., RunAs: Maven Install). As you saw from the RunAs: JUnit test, we were able to run the unit test exceptionally fast without Maven. I strongly recommend making your unit tests Eclipse/JUnit-friendly so that you can work efficiently in certain areas. That means hard-coding reasonable defaults without relying on the maven-surefire-plugin to pass in properties from the outside environment. Allow overrides, but code a usable default into the test.
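    One way to honor that advice (a sketch only -- the property name and default URL here are made-up examples, not part of the posted project) is to read configuration from a system property but fall back to a hard-coded value, so the same test runs unchanged under RunAs: JUnit and still accepts a surefire-supplied override.

    // hypothetical test fragment illustrating a JUnit-friendly default
    public class AppConfigTest {
        // surefire (or -D on the command line) may override; Eclipse JUnit runs use the default
        private static final String DB_URL =
                System.getProperty("jdbc.url", "jdbc:h2:mem:ejava");

        @org.junit.Test
        public void usesReasonableDefault() {
            org.junit.Assert.assertNotNull("no jdbc url resolved", DB_URL);
        }
    }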

There are two primary ways to use the debugger: separate/remote process and embedded (within Eclipse). The latter is much easier to use but is limited by what you can execute within the Eclipse IDE. The former takes a minor amount of setup but can be re-used to debug code running within application servers on your local and remote machines.

  1. Let's start with a remote debugging session by recalling the profile you were asked to add to either your pom.xml or settings.xml. If you have not done so, you can add it to either at this time.

        <profiles>
            <profile> <!-- tells surefire to run JUnit tests with remote debug -->
                <id>debugger</id>
                <activation>
                    <property>
                        <name>debugger</name>
                    </property>
                </activation>
                <properties>
                    <surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
                </properties>       
            </profile>            
        
        </profiles>
  2. Add a definition for the "surefire.argLine" within the maven-surefire-plugin declaration. Surefire is already being pulled into the project; this declaration just specifies the extra configuration along with a specific version. Maven will start complaining if you leave off the version.

                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-surefire-plugin</artifactId>
                    <version>2.17</version>
                    <configuration>
                        <argLine>${surefire.argLine}</argLine>
                    </configuration>
                </plugin>
    
  3. Uncomment (or re-add) your failure test in AppTest.java.

        @Test
    
        public void testFail() {
            //System.out.println("testFail");
            log.info("testFail");
            App app = new App();
            assertTrue("app didn't return 0", app.returnOne() == 0);
        }
  4. Execute a Run As->Maven test by selecting the project, right-clicking, and choosing the right target. You should see the following error in the console.

    Running myorg.mypackage.ex1.AppTest
    INFO  28-08 23:52:31,809 (AppTest.java:testApp:17)  -testApp
    DEBUG 28-08 23:52:31,821 (App.java:returnOne:11)  -Here's One!
    INFO  28-08 23:52:31,829 (AppTest.java:testFail:25)  -testFail
    DEBUG 28-08 23:52:31,831 (App.java:returnOne:11)  -Here's One!
    Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.409 sec <<< FAILURE!
    testFail(myorg.mypackage.ex1.AppTest)  Time elapsed: 0.016 sec  <<< FAILURE!
    java.lang.AssertionError: app didn't return 0
        at org.junit.Assert.fail(Assert.java:93)
        at org.junit.Assert.assertTrue(Assert.java:43)
        at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:27)
  5. Click on the hyperlink to one of the lines in the project source code in the failure stack trace. Place a breakpoint at that line by double-clicking on the line number. A blue ball should appear to the left of the numbers.

  6. Select Debug As->Maven build..., change the base directory to a re-usable ${project_loc} variable, assign the "test" goal, and activate the "debugger" profile. Click "Debug" when finished. It will automatically save.


    You should see the Maven build start but pause before executing the first JUnit test. Think of this as the "server-side" of the debugger session.

    [INFO] --- maven-surefire-plugin:2.17:test (default-test) @ firstSimpleModuleEx ---
    [INFO] Surefire report directory: /home/jcstaff/workspaces/ejava-class/ejava-student/javase/firstSimpleModuleEx/target/surefire-reports
    
    -------------------------------------------------------
     T E S T S
    -------------------------------------------------------
    Listening for transport dt_socket at address: 8000
  7. Start the "client-side" of the debugger session by clicking on the bug pulldown at the top of the window. Select debug configurations, double click on Remote Java Application, select your project, and notice the localhost:8000 that came up agrees with your server-side configuration. Press "Debug" when you are ready and answer the prompt to change you view.


  8. The IDE should switch to a debugger view and be waiting at the line you set the breakpoint on. From here you can look at the state of any variables (we don't have any) and step into the next call.


  9. Once you are done with the debugging session you can click continue (the green right arrow at top) or detach from the server (the red squiggly line at top).

  10. Terminate the debugger session and return to one of the JavaEE or Java-based views. Select a specific JUnit test, test method, package, or the entire application and click Debug As->JUnit Test.

  11. Note the IDE again switches to the Debug view and is stopped at the breakpoint. You may step into the call, look at the state of any variable, or terminate the program (red square at top of window).

This chapter will cover the setup required to start the development database in server mode. The database has at least three (3) modes.

  • In-memory mode

  • File-based mode

  • Server-based mode

The in-memory option manages all tables within the memory footprint of the JVM. This is very fast and requires no server-process setup -- which makes it a good option for automated unit tests. However, since the DB instance is created and destroyed with each JVM execution, it is a poor choice for modules relying on multiple tools to pre-populate the database prior to executing tests.

The file-based option stores all information in the filesystem. It is useful for multi-JVM (sequential) setup and post-mortem analysis. However, only a single client may access the files at one time. I have seen this used effectively when simulating external databases -- where an external setup script populates the database and the JVM running the unit tests just queries the information as it would in production. We will use this as an alternative to server-based mode since we are using separate plugins to initialize the database.

The server-based option requires a separate process to be activated, but it allows concurrent connections from a database user interface while the JVM is under test. This chapter will focus on the setup required to run the database in server mode.
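The three modes differ mainly in the JDBC URL presented by the client. As a rough sketch (the class name and the exact file path are illustrative assumptions), the same JDBC code can reach any of the three modes by switching the URL:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class H2ModeDemo {
        public static void main(String[] args) throws Exception {
            // in-memory: private to this JVM and discarded at exit
            //String url = "jdbc:h2:mem:ejava";
            // file-based: single client at a time, state survives the JVM
            //String url = "jdbc:h2:./h2db/ejava";
            // server-based: concurrent clients through the TCP listener
            String url = "jdbc:h2:tcp://localhost:9092/h2db/ejava";
            try (Connection conn = DriverManager.getConnection(url, "sa", "")) {
                // a trivial statement to verify the connection works
                conn.createStatement().execute("SELECT 1");
            }
        }
    }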

  1. Prepare your environment to run the database in server mode for this exercise by following the instructions defined in Development Environment Setup.

  2. Start the database and web server in a directory where you wish to create the database files. Your h2.jar file should be located in M2_REPO/com/h2database/h2/*/h2*.jar, to name at least one location. Another location is JBOSS_HOME/modules/com/h2database/h2/main/h2*.jar.

    cd /tmp
    
    java -jar h2.jar

    This should result in a database server process and access to a web-based database UI through the following local URL: http://localhost:8082/login.jsp

  3. Connect to the database server from the web-based UI.

    Driver Class: org.h2.Driver
    
    JDBC URL: jdbc:h2:tcp://localhost:9092/h2db/ejava
    User Name: sa
    Password: 

    If you use file-based mode, the connection information would look like the following where "./h2db/ejava" must point to the exact same path your JVM under test uses. This can be a relative or fully-qualified path.

    Driver Class: org.h2.Driver
    
    JDBC URL: jdbc:h2:./h2db/ejava
    User Name: sa
    Password: 

This chapter will put in place a couple of core constructs that will allow Maven and the IDE to recognize the elements of your source tree.

  1. Create a root directory for your project and populate it with a pom.xml file.

    
    <?xml version="1.0"?>
    <project
        xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.jpa</groupId>
        <artifactId>entityMgrEx</artifactId>
        <version>1.0-SNAPSHOT</version>

        <name>Entity Manager Exercise</name>

    </project>
  2. Define a remote repository to use to download hibernate artifacts

    
     <!-- needed to resolve some hibernate dependencies -->
        <repositories>
            <repository>
                <id>jboss-nexus</id>
                <name>JBoss Nexus Repository</name>
                <url>https://repository.jboss.org/nexus/content/groups/public-jboss/</url>
            </repository>
        </repositories> 
  3. Add property definitions for versions of dependencies and plugins we will be using.

    
    
        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <java.source.version>1.8</java.source.version>
            <java.target.version>1.8</java.target.version>

            <jboss.host>localhost</jboss.host>
            <db.host>${jboss.host}</db.host>

            <maven-compiler-plugin.version>3.3</maven-compiler-plugin.version>
            <maven-jar-plugin.version>2.6</maven-jar-plugin.version>
            <maven-surefire-plugin.version>2.18.1</maven-surefire-plugin.version>
            <sql-maven-plugin.version>1.5</sql-maven-plugin.version>        

            <commons-logging.version>1.1.1</commons-logging.version>
            <h2db.version>1.4.188</h2db.version>
            <hibernate-jpa-2.1-api.version>1.0.0.Final</hibernate-jpa-2.1-api.version>
            <hibernate-entitymanager.version>4.3.4.Final</hibernate-entitymanager.version>
            <hibernate3-maven-plugin.version>3.0</hibernate3-maven-plugin.version>
            <hibernate3.version>3.6.10.Final</hibernate3.version>
            <junit.version>4.12</junit.version>
            <log4j.version>1.2.17</log4j.version>
            <slf4j.version>1.7.12</slf4j.version>
            <wagon-ssh-external.version>2.9</wagon-ssh-external.version>
            <maven-deploy-plugin.version>2.8.2</maven-deploy-plugin.version>
            <maven-site-plugin.version>3.4</maven-site-plugin.version>
        </properties>
  4. Add a dependencyManagement section to define and configure the dependencies we will be using to work with JPA. These are passive definitions -- meaning they don't actually add any dependencies to your project. They just define the version and possible exclusions for artifacts so all child/leaf projects stay consistent.

    
     <dependencyManagement>
            <dependencies>
                <dependency>
                  <groupId>commons-logging</groupId>
                  <artifactId>commons-logging</artifactId>
                  <version>${commons-logging.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.hibernate.javax.persistence</groupId>
                    <artifactId>hibernate-jpa-2.1-api</artifactId>
                    <version>${hibernate-jpa-2.1-api.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.hibernate</groupId>
                    <artifactId>hibernate-entitymanager</artifactId>
                    <version>${hibernate-entitymanager.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                    <version>${slf4j.version}</version>
                </dependency>
                <dependency>
                    <groupId>junit</groupId>
                    <artifactId>junit</artifactId>
                    <version>${junit.version}</version>
                </dependency>
                <dependency>
                  <groupId>log4j</groupId>
                  <artifactId>log4j</artifactId>
                  <version>${log4j.version}</version>
                </dependency>    
            </dependencies>
        </dependencyManagement> 

    Note

    Knowing this exercise will always be a single module, we could do this more simply. However, it is assumed that you will soon take the information you learn here to a real enterprise project that will have many modules and could benefit from reusing a standard parent configuration. All definitions will be housed in a single module during the conduct of this exercise, but the properties, dependencyManagement, and pluginManagement sections we will build below can be lifted and moved to a parent pom in a multi-module project.

  5. Add pluginManagement definitions for certain plugins we will use in this module. Like above -- these are passive definitions that define the configuration for certain plugins when the child/leaf projects choose to use them. Let's start with a simple example and add a few more complex ones later. In this example, we are making sure all uses of the jar plugin are of a specific version.

    
     <build>
            <pluginManagement>
                <plugins>

                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-jar-plugin</artifactId>
                        <version>${maven-jar-plugin.version}</version>
                    </plugin>

                </plugins>    
            </pluginManagement>
        </build> 
  6. Add the src/main dependencies. This represents what your code depends upon at compile time and runtime.

    • scope=compile is used when your src/main code depends on the artifact to compile and you wish transitive dependency resolution to automatically bring this dependency along with the module.

    • scope=provided is used when your src/main code depends on the artifact to compile but you do not wish it to be automatically brought along when used by downstream clients. Normally this type of artifact is an API, and the downstream client will provide their own version of the API packaged with their provider (see the sketch at the end of this step).

    
     <dependencies>
            <dependency>
              <groupId>commons-logging</groupId>
              <artifactId>commons-logging</artifactId>
              <scope>compile</scope>
            </dependency>
        
            <dependency>
                <groupId>org.hibernate.javax.persistence</groupId>
                <artifactId>hibernate-jpa-2.1-api</artifactId>
                <scope>provided</scope>
            </dependency>
            ...
        </dependencies> 

    Note

    Notice how the definitions above are lacking a version element. The dependency declaration actively brings the dependency into the project and inherits the definition specified by the dependencyManagement section above.
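    As a minimal sketch of why the JPA API can be scope=provided (the Book entity and its package are illustrative only and not part of this exercise), src/main code needs nothing beyond the javax.persistence annotations to compile; the hibernate-entitymanager provider is only required when the tests actually run.

    // hypothetical src/main entity -- compiles against only the provided JPA API
    package myorg.entitymgrex;

    import javax.persistence.Entity;
    import javax.persistence.Id;

    @Entity
    public class Book {
        @Id
        private long id;

        public long getId() { return id; }
        public void setId(long id) { this.id = id; }
    }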

  7. Add the src/test standard dependencies.

    • scope=test is used for anything that your src/test code (but not your src/main code) depends upon, or that your unit tests need at runtime to operate the test. For example, a module may declare a scope=test dependency on the h2 database (later) to do some local unit testing and then ultimately be deployed against a postgres server by a downstream client. In this case we are picking JUnit 4 as the testing framework, log4j as the logging implementation for commons-logging, and hibernate as the JPA implementation for the jpa-api. We are also adding an extra dependency to allow hibernate slf4j logging to be integrated with log4j.

    
     <dependency>
                <groupId>org.hibernate</groupId>
                <artifactId>hibernate-entitymanager</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
                <scope>test</scope>
            </dependency>

            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
                <scope>test</scope>
            </dependency>    
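
    If you want to double-check which versions and scopes these declarations actually resolve to, the Maven dependency plugin can print the full transitive tree without any pom changes:

    
     $ mvn dependency:tree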
  8. Add a testResources definition to the build section to have src/test/resources files filtered when copied into the target tree. We do this so we have a chance to replace the variables in the persistence.xml and hibernate.properties files.

    
     <build>
            <!-- filtering will replace URLs, credentials, etc in the 
                 files copied to the target directory and used during testing.
                -->
            <testResources>
                <testResource>
                    <directory>src/test/resources</directory>
                    <filtering>true</filtering>
                </testResource>
            </testResources>

            <pluginManagement>
            ...
        </build> 
  9. Add a compiler specification to control the source and target Java versions. In this case we are picking up the specific values from property variables set above, which can be overridden in the developer's settings.xml or on the command line using system properties (see the example after the listing).

    
     <build>
            <pluginManagement>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-compiler-plugin</artifactId>
                        <version>${maven-compiler-plugin.version}</version>
                        <configuration>
                                <source>${java.source.version}</source>
                                <target>${java.target.version}</target>
                        </configuration>                    
                    </plugin>      
                ...      
                </plugins>
            </pluginManagement>
        </build> 
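
    For example, since -D user properties given on the command line take precedence over pom properties, a one-off build could (hypothetically) override the managed source/target versions without editing any file:

    
     $ mvn clean test -Djava.source.version=1.8 -Djava.target.version=1.8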
  10. Add a definition for the maven-surefire-plugin so we can set properties needed for testing.

    
     <build>
            <pluginManagement>
                <plugins>
                    ...

                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>${maven-surefire-plugin.version}</version>
                        <configuration>
                            <argLine>${surefire.argLine}</argLine>
                        </configuration>
                    </plugin>            

                </plugins>    
            </pluginManagement>
        </build> 

    Note

    At this point, we are just allowing the argLine defined elsewhere to be optionally specified (for debugging). We do not yet have a need for system properties, but if we did, the example below shows how -Dname=value values would be specified within the plugin configuration. The plugin (not pluginManagement) definition in the child pom will include any necessary system properties to be passed to the unit test.

    
     <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-surefire-plugin</artifactId>
                    <configuration>
                        <systemPropertyVariables>
                            <name1>value1</name1>
                            <name2>value2</name2>
                        </systemPropertyVariables>
                    </configuration>
                </plugin>            
            </plugins>    
        </build> 
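
    As a hypothetical use of the managed argLine above -- assuming surefire.argLine is declared as an (empty by default) property that can be overridden -- a developer could have the forked test JVM wait for a remote debugger to attach:

    
     $ mvn test -Dsurefire.argLine="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000"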
  11. Add a set of profiles that define H2 and Hibernate as our database and persistence provider. In the example below we are adding two database definitions (that happen to both be from the same vendor). One requires the server to be running and the other uses an embedded server and a local filesystem. The embedded version can be easier to test with. The server version can be easier to debug. Activate whichever one you want with either your settings.xml#activeProfile settings or a -Pprofile-name argument on the command line. If you already have a settings.xml#activeProfile setting, you can turn it off using -P\!deactivated-profile-name (bash) or -P!deactivated-profile-name (dos) and follow it up with -Pactivated-profile-name.

    
     <profiles>
            <profile> <!-- H2 server-based DB -->
                <id>h2srv</id>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:tcp://${db.host}:9092/h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>

            <profile> <!-- H2 file-based DB -->
                <id>h2db</id>
                <activation>
                    <property> 
                        <name>!jdbcdb</name>
                    </property>
                </activation>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:${basedir}/target/h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>
        </profiles> 

    Note

    Profiles can be used to control which options are enabled at build time to make the module more portable. I also use them to help identify which dependencies are brought in for what reason -- especially for profiles that are configured to always activate.
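
    For reference, the settings.xml#activeProfile approach mentioned in the previous step looks roughly like the following sketch in $HOME/.m2/settings.xml (here activating the server-based profile by default):

    
        <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
            ...
            <activeProfiles>
                <activeProfile>h2srv</activeProfile>
            </activeProfiles>
        </settings>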

  12. Perform a test of your pom.xml by issuing a sample build command. All should complete even though there is nothing yet in your source tree.

    
    $ mvn clean test

This chapter will take you through steps that will populate your database with a (simple) database schema. A database schema is required by any module that directly interacts with an RDBMS. The JPA provider can automatically generate a database schema, but that is generally restricted to early development and quick prototypes. A module within the data tier will ultimately be responsible for providing a separate artifact to create and/or migrate the schema from version to version. That is typically finalized by humans knowledgeable about particular databases and can be aided by the tool(s) we introduce in this exercise.

  1. Create a set of ddl scripts in src/main/resources/ddl to handle creating the schema, deleting rows in the schema, and dropping tables in the schema. Make sure each script has the word "create", "delete", or "drop" in its file name to match some search strings we'll use later. Have the database generate a value for the primary key. That value should not be allowed to be null.

    `-- src
        |-- main
        |   |-- java
        |   `-- resources
        |       |-- ddl
        |       |   |-- emauto_create.ddl
        |       |   |-- emauto_delete.ddl
        |       |   `-- emauto_drop.ddl
        `-- test
            |-- java
            `-- resources 

    Note

    We could actually skip this step and have the persistence provider create the table for us. That approach is great for quick Java-first prototypes. However, creating the schema outside of the persistence provider is a more realistic scenario for larger developments.

    # src/main/resources/ddl/emauto_create.ddl
    CREATE TABLE EM_AUTO (
        ID BIGINT generated by default as identity (start with 1) not null,
        MAKE VARCHAR(32),
        MODEL VARCHAR(32),
        COLOR VARCHAR(32),
        MILEAGE INT,
    
        CONSTRAINT em_autoPK PRIMARY KEY(ID)
    )
    
    # src/main/resources/ddl/emauto_delete.ddl
    DELETE FROM EM_AUTO;
    
    # src/main/resources/ddl/emauto_drop.ddl
    DROP TABLE EM_AUTO IF EXISTS;
  2. You can perform a sanity check of the above scripts by pasting them into the DB UI SQL area and executing them.
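
    If you prefer the command line over the DB UI, H2's RunScript tool can execute the same scripts -- a sketch, assuming the H2 server from the environment setup is running and the h2 jar path is adjusted to match your installation:

    
     $ java -cp h2*.jar org.h2.tools.RunScript \
         -url jdbc:h2:tcp://localhost:9092/h2db/ejava -user sa -password "" \
         -script src/main/resources/ddl/emauto_create.ddl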

  3. Add the standard database setup and teardown scripts. This allows us to create a legacy database schema and write classes that map to that schema. We will later have the persistence provider create the schema for us when we are in quick prototype mode. First create the reusable portion of the definition in the pluginManagement section. This will define the version, database dependencies, and property information for all to inherit.

    
        <build>
            <pluginManagement>
                <plugins>
                    ...
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>sql-maven-plugin</artifactId>        
                        <version>${sql-maven-plugin.version}</version>        
                    
                        <dependencies>
                            <dependency>
                                <groupId>com.h2database</groupId>
                                <artifactId>h2</artifactId>
                                <version>${h2db.version}</version>
                            </dependency>
                        </dependencies>
                    
                        <configuration>
                            <username>${jdbc.user}</username>
                            <password>${jdbc.password}</password>
                            <driver>${jdbc.driver}</driver>
                            <url>${jdbc.url}</url>          
                        </configuration>
                    </plugin>          

                </plugins>    
            </pluginManagement>
        </build> 
  4. Next add the potentially project-specific portion to a build-plugins-plugin section that would normally be in the child module. However, when you add this to the module, do so within a profile that is wired to always run except when the system property -DskipTests is defined. This is a standard Maven system property that builders use to build the module while bypassing both unit and integration testing. By honoring the property here, our module will only attempt to work with the database when we are not skipping tests (see the example after the listing below). Note the !bang-not character means "the absence of this system property".

    
         <profiles>
            ...
            <profile>
                <id>testing</id>
                <activation>
                    <property>
                        <name>!skipTests</name>
                    </property>
                </activation>
          
                <build>
                    <plugins>
                        <plugin>
                            <!-- runs schema against the DB -->
                            <groupId>org.codehaus.mojo</groupId>
                            <artifactId>sql-maven-plugin</artifactId>        

                            <executions>

                                <!-- place execution elements here  -->

                            </executions>
                        </plugin>          
                    </plugins>          
                </build>          
            </profile>
        </profiles> 
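
    For example, a builder could bypass both the database setup above and the unit tests with a single command:

    
     $ mvn clean install -DskipTests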
  5. Configure the sql-maven-plugin executions element to run any drop scripts in the source tree before running tests.

    
            <execution>
                <id>drop-db-before-test</id>
                <phase>process-test-classes</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>main/resources/ddl/**/*drop*.ddl</include>
                        </includes>
                    </fileset>
                    <onError>continue</onError>
                </configuration>
            </execution> 

    Note

    Note that we are controlling when the scripts are executed using the phase element. This names a well-known Maven lifecycle phase within the build.

  6. Configure the sql-maven-plugin executions element to run any scripts from the source tree to create schema before running tests.

    
            <execution>
                <id>create-db-before-test</id>
                <phase>process-test-classes</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>main/resources/ddl/**/*create*.ddl</include>

                        </includes>
                    </fileset>
                    <print>true</print>
                </configuration>
            </execution>
  7. Configure the sql-maven-plugin executions element to run any populate scripts from the source tree to add rows to the database before running tests.

    
            <execution>
                <id>populate-db-before-test</id>
                <phase>process-test-classes</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>test/resources/ddl/**/*populate*.ddl</include>
                        </includes>
                    </fileset>
                </configuration>
            </execution>
  8. Configure the sql-maven-plugin executions element to run any drop scripts after testing. You may want to comment this out if you want to view database changes in a GUI after the test.

    
            <execution>
                <id>drop-db-after-test</id>
                <phase>test</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>main/resources/ddl/**/*drop*.ddl</include>     
                            </includes>
                    </fileset>
                </configuration>
            </execution>
  9. Build and run the tests. The schema should show up in the DB UI.

    $ mvn clean test -P\!h2db -Ph2srv

    Note

    Remember to turn off (-P!profile-name) the embedded profile (h2db) if active by default and turn on the server profile (h2srv) if you wish to use the server and DB UI while the unit test is running. The DB UI can only inspect the embedded file once all active clients close the file. The backslash is only needed for commands from the bash shell.

In this chapter we are going to add tuning aspects to the schema put in place above. Examples of this include any indexes we believe would enhance query performance. This example is still quite simple and lacks enough context to determine what would and would not be a helpful index; simply treat this exercise as a tutorial in putting an index in place once one has been properly identified. Adding the physical files mentioned here could be considered optional if all schema is hand-crafted -- you control the contents of each file in a 100% hand-crafted DDL solution. However, for cases where auto-generated schema files are created, you may want a separate set of files designated for "tuning" the schema that was auto-generated. We will demonstrate using two extra files to create/drop database indexes.

  1. Add a file to add database indexes for your schema

    # src/main/resources/ddl/emauto_tuningadd.ddl
    
    CREATE INDEX EM_AUTO_MAKEMODEL ON EM_AUTO(MAKE, MODEL);
  2. Wire this new file into your SQL plugin definition for creating schema. Have it run after your table creates.

    
        <includes>
            <include>main/resources/ddl/**/*create*.ddl</include>
            <include>main/resources/ddl/**/*tuningadd*.ddl</include>
        </includes>
  3. Add a file to augment the drop script and remove indexes from your schema

    # src/main/resources/ddl/emauto_tuningremove.ddl
    
    DROP INDEX EM_AUTO_MAKEMODEL if exists;
  4. Wire this new file into your SQL plugin definition for dropping schema. Have it run before your table drops.

    
        <includes>
            <include>main/resources/ddl/**/*tuningremove*.ddl</include>
            <include>main/resources/ddl/**/*drop*.ddl</include>
        </includes>
  5. Build the schema for your module

    $ mvn clean process-test-classes
    ...
    [INFO] --- sql-maven-plugin:1.4:execute (drop-db-before-test) @ entityMgrEx ---
    [INFO] Executing file: .../src/main/resources/ddl/emauto_drop.ddl
    [INFO] Executing file: .../src/main/resources/ddl/emauto_tuningremove.ddl
    [INFO] 2 of 2 SQL statements executed successfully
    [INFO] 
    [INFO] --- sql-maven-plugin:1.4:execute (create-db-before-test) @ entityMgrEx ---
    [INFO] Executing file: .../src/main/resources/ddl/emauto_create.ddl
    [INFO] Executing file: .../entityMgrEx/src/main/resources/ddl/emauto_tuningadd.ddl
    [INFO] 2 of 2 SQL statements executed successfully

This chapter will add an entity class, the persistence.xml, and associated property file to define the persistence unit.

The entity class represents one or more tables in the database and each instance of the entity class represents a specific row in those tables.

The persistence.xml defines a JPA persistence unit (along with other related XML files and entity class annotations). An instance of a persistence unit is called a persistence context. Instances of the persistence unit are accessed through an EntityManager.
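
The following is a minimal sketch (not a step in this exercise) of how a later chapter will bootstrap the persistence unit by name; it is shown here only to make the relationship between the persistence unit, persistence context, and EntityManager concrete. It assumes the persistence unit name entityMgrEx defined below.

    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;
    
    public class BootstrapSketch {
        public static void main(String[] args) {
            //name must match the persistence-unit name in META-INF/persistence.xml
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("entityMgrEx");
            //the EntityManager provides access to a persistence context of managed entities
            EntityManager em = emf.createEntityManager();
            //... CRUD operations against the persistence context go here ...
            em.close();
            emf.close();
        }
    }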

`-- src
    |-- main
    |   |-- java
    |   |   `-- myorg
    |   |       `-- entitymgrex
    |   |           `-- Auto.java
    |   `-- resources
    |       `-- META-INF
    |           `-- persistence.xml
    `-- test
        |-- java
        `-- resources
            `-- hibernate.properties
  1. Create a Plain Old Java Object (POJO) class to represent an automobile. Use class annotations to provide the mapping shown below:

    package myorg.entitymgrex;
    
    
    import java.io.Serializable;
    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GenerationType;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Table;
    @Entity @Table(name="EM_AUTO")
    public class Auto implements Serializable {
        private static final long serialVersionUID = 1L;
        @Id @GeneratedValue(strategy=GenerationType.IDENTITY)
        private long id;
        private String make;
        private String model;
        private String color;
        private int mileage;
        public Auto(){}
        public Auto(int id) { this.id=id; }
        public long getId() { return id; }
        //more getter/setters go here
    }

    Note

    @Entity, @Id, and a default constructor are the only requirements on an entity class. The implementation of java.io.Serializable is not a JPA requirement but will be leveraged by a later example within this exercise.

  2. Add the remaining setter/getter methods to the class. If you are using Eclipse to author the class, right click->Source->Generate Getters and Setters will generate all of this for you. Since we are using generated primary key values, there is no immediate need for a constructor to set the id. If you add one later, remember to also keep a default constructor, since the compiler no longer generates one once you manually add your first constructor.

        public void setMake(String make) {
    
            this.make = make;
        }
        
        public int getMileage() { return mileage; }
        public void setMileage(int mileage) {
            this.mileage = mileage;
        }
        
        public String getModel() { return model; }
        public void setModel(String model) {
            this.model = model;
        }
        
        public String getColor() { return color; }
        public void setColor(String color) {
            this.color = color;
        }
  3. You may also want to add a public toString():String method to conveniently print your Auto objects. Eclipse can also generate that on demand, and its output is configurable.

        @Override
    
        public String toString() {
            StringBuilder builder = new StringBuilder();
            builder
                .append("id=").append(id)
                .append(", make=").append(make)
                .append(", model=").append(model)
                .append(", color=").append(color)
                .append(", mileage=").append(mileage);
            return builder.toString();
        }
  4. Create a META-INF/persistence.xml file to define the persistence unit for our jar file.

    • persistence-unit name: must match what we place in our JUnit test

    • provider: specify that this persistence unit is defined for the org.hibernate.ejb.HibernatePersistence provider.

    • define provider-specific properties that tell the provider how to obtain a connection to the database as well as some other configuration properties.

    Note

    The provider-specific properties include somewhat sensitive information, like user credentials. If we place them in the persistence.xml file within the src/main tree, these properties will become part of our deployed artifact. To avoid this, we will define them in a separate hibernate.properties file placed in the src/test tree.

    
    <?xml version="1.0" encoding="UTF-8"?>
    <persistence xmlns="http://java.sun.com/xml/ns/persistence"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">

        <persistence-unit name="entityMgrEx">
            <provider>org.hibernate.ejb.HibernatePersistence</provider>

            <properties>
               <!-- defined in src/test/resources/hibernate.properties -->
            </properties>
        </persistence-unit>            
    </persistence>
  5. Create a hibernate.properties file in src/test/resources to hold information we want to support testing, but may not want to be part of the deployed artifact. Leave the volatile values as variables so they can be expanded into the target tree during compile time.

    • the variables will be filled in during the build process using "filtering" and the resources plugin.

    • the show and format_sql options are only turned on during early development and debug.

    • the jdbc.batch_size property set to 0 is also used during debug. Setting it to this value will eliminate any batching of SQL commands, allowing errors about the commands to be better reported to the developer.

    hibernate.dialect=${hibernate.dialect}
    hibernate.connection.url=${jdbc.url}
    hibernate.connection.driver_class=${jdbc.driver}
    hibernate.connection.password=${jdbc.password}
    hibernate.connection.username=${jdbc.user}
    #hibernate.hbm2ddl.auto=create
    #hibernate.hbm2ddl.import_files=/ddl/emauto-tuningdrop.ddl,/ddl/emauto-tuning.ddl 
    hibernate.show_sql=true
    hibernate.format_sql=true
    #hibernate.jdbc.batch_size=0
  6. Make sure your project still builds. Your area should look something like the following after the build.

    $ mvn clean install -P\!h2db -Ph2srv
    |-- pom.xml
    |-- src
    |   |-- main
    |   |   |-- java
    |   |   |   `-- myorg
    |   |   |       `-- entitymgrex
    |   |   |           `-- Auto.java
    |   |   `-- resources
    |   |       |-- ddl
    |   |       |   |-- emauto_create.ddl
    |   |       |   |-- emauto_delete.ddl
    |   |       |   |-- emauto_drop.ddl
    |   |       |   |-- emauto_tuningadd.ddl
    |   |       |   `-- emauto_tuningremove.ddl
    |   |       `-- META-INF
    |   |           `-- persistence.xml
    |   `-- test
    |       `-- resources
    |           `-- hibernate.properties
    `-- target
        |-- classes
        |   |-- ddl
        |   |   |-- emauto_create.ddl
        |   |   |-- emauto_delete.ddl
        |   |   |-- emauto_drop.ddl
        |   |   |-- emauto_tuningadd.ddl
        |   |   `-- emauto_tuningremove.ddl
        |   |-- META-INF
        |   |   `-- persistence.xml
        |   `-- myorg
        |       `-- entitymgrex
        |           `-- Auto.class
       ...
        `-- test-classes
            `-- hibernate.properties
    
  7. You should also check that your hibernate.properties file was filtered by your build.testResources definition provided earlier. The variable definitions within your src/test/resources source file(s) should be replaced with property values from your environment.

    $ more src/test/resources/hibernate.properties target/test-classes/hibernate.properties 
    ::::::::::::::
    src/test/resources/hibernate.properties
    ::::::::::::::
    hibernate.dialect=${hibernate.dialect}
    hibernate.connection.url=${jdbc.url}
    hibernate.connection.driver_class=${jdbc.driver}
    hibernate.connection.password=${jdbc.password}
    hibernate.connection.username=${jdbc.user}
    #hibernate.hbm2ddl.auto=create
    hibernate.show_sql=true
    hibernate.format_sql=true
    #hibernate.jdbc.batch_size=0
    
    ::::::::::::::
    target/test-classes/hibernate.properties
    ::::::::::::::
    hibernate.dialect=org.hibernate.dialect.H2Dialect
    hibernate.connection.url=jdbc:h2:tcp://127.0.0.1:9092/h2db/ejava
    hibernate.connection.driver_class=org.h2.Driver
    hibernate.connection.password=
    hibernate.connection.username=sa
    #hibernate.hbm2ddl.auto=create
    #hibernate.hbm2ddl.import_files=/ddl/emauto-tuningdrop.ddl,/ddl/emauto-tuning.ddl 
    hibernate.show_sql=true
    hibernate.format_sql=true
    #hibernate.jdbc.batch_size=0

This chapter will create a JUnit test case that will instantiate an EntityManager, perform some basic operations, and log information about the tests.

`-- src
    `-- test
        |-- java
        |   `-- myorg
        |       `-- entitymgrex
        |           `-- EntityMgrTest.java
        `-- resources
            `-- log4j.xml
  1. Create a JUnit test case to hold your test code. The following is an example of a 4.x JUnit test case that uses @Annotations.

    package myorg.entitymgrex;
    
    
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;
    import javax.persistence.Query;
    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;
    import static org.junit.Assert.*;
    import org.junit.After;
    import org.junit.AfterClass;
    import org.junit.Before;
    import org.junit.BeforeClass;
    import org.junit.Test;
    public class EntityMgrTest {
        private static Log log = LogFactory.getLog(EntityMgrTest.class);
        @Test
        public void testTemplate() {
            log.info("testTemplate");
        }
    }
  2. Provide a setUpClass() method that runs once before all tests and creates the entity manager factory. This method must be static.

        private static final String PERSISTENCE_UNIT = "entityMgrEx";
    
        private static EntityManagerFactory emf;
        @BeforeClass
        public static void setUpClass() {
            log.debug("creating entity manager factory");
            emf = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT);
        }
  3. Provide a setUp() method that will be called before each testMethod is executed. Have this method create an entity manager for the tests to use.

        private EntityManager em;    
    
    
        @Before
        public void setUp() throws Exception {
            log.debug("creating entity manager");
            em = emf.createEntityManager();
            //cleanup();
        }
  4. Provide a tearDown() method that will be called after each testMethod. Have this flush all remaining items in the persistence context to the database and close the entity manager.

        @After
    
        public void tearDown() throws Exception {
            try {
                log.debug("tearDown() started, em=" + em);
                em.getTransaction().begin();
                em.flush();            
                //logAutos();            
                em.getTransaction().commit();            
                em.close();
                log.debug("tearDown() complete, em=" + em);
            }
            catch (Exception ex) {
                log.fatal("tearDown failed", ex);
                throw ex;
            }
         }
  5. Provide a tearDownClass() method that will be called after all testMethods have completed. This method must be static and should close the entity manager factory.

        @AfterClass
    
        public static void tearDownClass() {
            log.debug("closing entity manager factory");
            emf.close();
        }
  6. Add in a logAutos() method to query and print all autos in the database. Do this after flushing the entity manager in the tearDown() method so you can see the changes from the previous test. The following example uses the entity manager to create an ad-hoc EJB-QL statement.

        @After
    
        public void tearDown() throws Exception {
    ...
                em.flush();            
                logAutos();            
                em.getTransaction().commit();            
    ...
         }
        public void logAutos() {
            Query query = em.createQuery("select a from Auto as a");
            for (Object o: query.getResultList()) {
                log.info("EM_AUTO:" + o);
            }        
        }
  7. You might also want to add a cleanup() to clear out the Auto table between tests. The example below uses the entity manager to create a native SQL statement.

        @Before
    
        public void setUp() throws Exception {
            ...
            em = emf.createEntityManager();
            cleanup();
        }
        public void cleanup() {
            em.getTransaction().begin();
            Query query = em.createNativeQuery("delete from EM_AUTO");
            int rows = query.executeUpdate();
            em.getTransaction().commit();
            log.info("removed " + rows + " rows");
        }
  8. Add a log4j.xml file to src/test/resources that has your desired settings. The one below produces less timestamp information at the console and more details in the logfile.

    
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
    <log4j:configuration
        xmlns:log4j="http://jakarta.apache.org/log4j/"
        debug="false">

        <appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
            <param name="Target" value="System.out"/>

            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern"
                       value="(%F:%M:%L)  -%m%n"/>
            </layout>
        </appender>

        <appender name="logfile" class="org.apache.log4j.RollingFileAppender">
            <param name="File" value="target/log4j-out.txt"/>
            <param name="Append" value="false"/>
            <param name="MaxFileSize" value="100KB"/>
            <param name="MaxBackupIndex" value="1"/>
            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern"
                       value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L)  -%m%n"/>
            </layout>
       </appender>

       <logger name="myorg">
          <level value="debug"/>
          <appender-ref ref="logfile"/>
       </logger>

       <root>
          <priority value="fatal"/>
          <appender-ref ref="CONSOLE"/>
       </root>

    </log4j:configuration>

    Note

    Although it might be a bit entertaining to set the priority of the root logger to debug and see everything the persistence provider has to say, it is quite noisy. Consider changing the root priority to fatal so that a majority of the log statements are yours.

  9. You should be able to build and test your module at this time.

    $ mvn clean test
    
    Running myorg.entitymgrex.EntityMgrTest
    (EntityMgrTest.java:setUpClass:25)  -creating entity manager factory
    (EntityMgrTest.java:setUp:31)  -creating entity manager
    Hibernate: 
        delete 
        from
            EM_AUTO
    (EntityMgrTest.java:cleanup:58)  -removed 0 rows
    (EntityMgrTest.java:testTemplate:75)  -testTemplate
    (EntityMgrTest.java:tearDown:39)  -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@3e52a475
    Hibernate: 
        select
            auto0_.id as id0_,
            auto0_.color as color0_,
            auto0_.make as make0_,
            auto0_.mileage as mileage0_,
            auto0_.model as model0_ 
        from
            EM_AUTO auto0_
    (EntityMgrTest.java:tearDown:45)  -tearDown() complete, em=org.hibernate.ejb.EntityManagerImpl@3e52a475
    (EntityMgrTest.java:tearDownClass:69)  -closing entity manager factory
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.337 sec
    
    Results :
    
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
    ...
    [INFO] BUILD SUCCESS
  10. Check that you have the following artifacts in your project tree.

    |-- pom.xml
    
    |-- src
    |   |-- main
    |   |   |-- java
    |   |   |   `-- myorg
    |   |   |       `-- entitymgrex
    |   |   |           `-- Auto.java
    |   |   `-- resources
    |   |       |-- ddl
    |   |       |   |-- emauto_create.ddl
    |   |       |   |-- emauto_delete.ddl
    |   |       |   |-- emauto_drop.ddl
    |   |       |   |-- emauto_tuningadd.ddl
    |   |       |   `-- emauto_tuningremove.ddl
    |   |       `-- META-INF
    |   |           `-- persistence.xml
    |   `-- test
    |       |-- java
    |       |   `-- myorg
    |       |       `-- entitymgrex
    |       |           `-- EntityMgrTest.java
    |       `-- resources
    |           |-- hibernate.properties
    |           `-- log4j.xml
    `-- target
        |-- antrun
        |   `-- build-main.xml
        |-- classes
        |   |-- ddl
        |   |   |-- emauto_create.ddl
        |   |   |-- emauto_delete.ddl
        |   |   |-- emauto_drop.ddl
        |   |   |-- emauto_tuningadd.ddl
        |   |   `-- emauto_tuningremove.ddl
        |   |-- META-INF
        |   |   `-- persistence.xml
        |   `-- myorg
        |       `-- entitymgrex
        |           `-- Auto.class
        |-- generated-sources
        |   `-- annotations
        |-- generated-test-sources
        |   `-- test-annotations
        |-- h2db
        |   `-- ejava.h2.db
        |-- log4j-out.txt
        |-- maven-status
        |   `-- maven-compiler-plugin
        |       |-- compile
        |       |   `-- default-compile
        |       |       |-- createdFiles.lst
        |       |       `-- inputFiles.lst
        |       `-- testCompile
        |           `-- default-testCompile
        |               |-- createdFiles.lst
        |               `-- inputFiles.lst
        |-- surefire-reports
        |   |-- myorg.entitymgrex.EntityMgrTest.txt
        |   `-- TEST-myorg.entitymgrex.EntityMgrTest.xml
        `-- test-classes
            |-- hibernate.properties
            |-- log4j.xml
            `-- myorg
                `-- entitymgrex
                    `-- EntityMgrTest.class

Over the years/versions, Eclipse has progressed from being ignorant of Maven (with all integration coming from the Maven side) to being very much integrated with Maven. In that later/integrated mode, Eclipse will try very hard to do the right thing within Eclipse for what was defined to be done outside of Eclipse. For example, Eclipse will turn Maven dependencies directly into an Eclipse build path. There exist, however, some plugins that Eclipse has yet to learn about, and it will alert you to that fact. Many of these have no role within Eclipse and you simply need to explicitly instruct Eclipse to ignore the plugin. Luckily these cases get fewer and fewer each year, and Eclipse will update your pom.xml with the necessary configuration when it is needed.

In this chapter we will proactively tell Eclipse to ignore the sql-maven-plugin, but feel free to wait until Eclipse complains (who knows, there may be an update that makes this step unnecessary).

  1. Add the following profile to your pom.xml. The profile is activated when the m2e.version property is defined -- a property that m2e (Maven Integration for Eclipse) sets within Eclipse.


    <profiles>
       ...
        <!--  tell Eclipse what to do with some of the plugins -->
        <profile>
          <id>m2e</id>
          <activation>
            <property>
              <name>m2e.version</name>
            </property>
          </activation>
          <build>
            <pluginManagement>
                <plugins>
                    <plugin>
                      <groupId>org.eclipse.m2e</groupId>
                      <artifactId>lifecycle-mapping</artifactId>
                      <version>1.0.0</version>
                      <configuration>
                        <lifecycleMappingMetadata>
                          <pluginExecutions>
                            
                            <pluginExecution>
                              <pluginExecutionFilter>
                                <groupId>org.codehaus.mojo</groupId>
                                <artifactId>hibernate3-maven-plugin</artifactId>
                                <versionRange>[3.0,)</versionRange>
                                <goals>
                                  <goal>run</goal>
                                </goals>
                              </pluginExecutionFilter>
                              <action>
                                <ignore/>
                              </action>
                            </pluginExecution>

                            <pluginExecution>
                              <pluginExecutionFilter>
                                <groupId>org.codehaus.mojo</groupId>
                                <artifactId>sql-maven-plugin</artifactId>
                                <versionRange>[1.0.0,)</versionRange>
                                <goals>
                                  <goal>execute</goal>
                                </goals>
                              </pluginExecutionFilter>
                              <action>
                                <ignore />
                              </action>
                            </pluginExecution>

                          </pluginExecutions>
                        </lifecycleMappingMetadata>
                      </configuration>
                    </plugin>

                </plugins>
            </pluginManagement>
           </build>
        </profile>
    ...
    </profiles>
  2. Import the project into Eclipse using the "Existing Maven Projects" option.

This chapter will demonstrate various methods to perform create, read, update, and delete (CRUD) operations on the database using the EntityManager, the persistence unit/context, and the entity class.

Note

The following changes are all made to the EntityMgrTest.java JUnit test class. Everything is being done within this file to keep things simple. This test case is playing the role of the business and persistence (Data Access Object (DAO)) logic.

  1. add a testCreate() method to test the functionality of EntityManager.persist(). This will add an object to the database once the entity manager is associated with a transaction.

        @Test
    
        public void testCreate() {
            log.info("testCreate");
            
            Auto car = new Auto();
            car.setMake("Chrysler");
            car.setModel("Gold Duster");
            car.setColor("Gold");
            car.setMileage(60*1000);
            
            log.info("creating auto:" + car);                        
            em.persist(car);        
        }
     -testCreate
     -creating auto:myorg.entitymgrex.Auto@140984b, id=0, make=Chrysler, model=Gold Duster, color=Gold, mileage=60000
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@3ac93e
     -EM_AUTO:myorg.entitymgrex.Auto@140984b, id=1, make=Chrysler, model=Gold Duster, color=Gold, mileage=60000
     -removed 1 rows
  2. add a testMultiCreate() to test creating several objects. This should also help verify that unique primary keys are being generated.

        @Test
    
        public void testMultiCreate() {
            log.info("testMultiCreate");
            for(int i=0; i<5; i++) {
                Auto car = new Auto();
                car.setMake("Plymouth " + i);
                car.setModel("Grand Prix");
                car.setColor("Green");
                car.setMileage(80*1000);            
                log.info("creating auto:" + car);                        
                em.persist(car);        
            }
        }
     -testMultiCreate
     -creating auto:myorg.entitymgrex.Auto@c3e9e9, id=0, make=Plymouth 0, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@31f2a7, id=0, make=Plymouth 1, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@131c89c, id=0, make=Plymouth 2, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@1697b67, id=0, make=Plymouth 3, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@24c4a3, id=0, make=Plymouth 4, model=Grand Prix, color=Green, mileage=80000
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@1e9c82e
     -EM_AUTO:myorg.entitymgrex.Auto@c3e9e9, id=2, make=Plymouth 0, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@31f2a7, id=3, make=Plymouth 1, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@131c89c, id=4, make=Plymouth 2, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@1697b67, id=5, make=Plymouth 3, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@24c4a3, id=6, make=Plymouth 4, model=Grand Prix, color=Green, mileage=80000
  3. add a testFind() to test the ability to find an object by its primary key value.

        @Test
    
        public void testFind() {
            log.info("testFind");
            
            Auto car = new Auto();
            car.setMake("Ford");
            car.setModel("Bronco II");
            car.setColor("Red");
            car.setMileage(0*1000);
            log.info("creating auto:" + car);                        
            em.persist(car);
            
            //we need to associate the em with a transaction to get a 
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            
            Auto car2 = em.find(Auto.class, car.getId());
            assertNotNull("car not found:" + car.getId(), car2);
            log.info("found car:" + car2);
        }
     -testFind
     -creating auto:myorg.entitymgrex.Auto@aae86e, id=0, make=Ford, model=Bronco II, color=Red, mileage=0
     -found car:myorg.entitymgrex.Auto@aae86e, id=7, make=Ford, model=Bronco II, color=Red, mileage=0
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@97d026
     -EM_AUTO:myorg.entitymgrex.Auto@aae86e, id=7, make=Ford, model=Bronco II, color=Red, mileage=0
  4. add a testGetReference() method to test the ability to get a reference to an object. With such a shallow object, this will act much like find().

        @Test
    
        public void testGetReference() {
            log.info("testGetReference");
            
            Auto car = new Auto();
            car.setMake("Ford");
            car.setModel("Escort");
            car.setColor("Red");
            car.setMileage(0*1000);
            log.info("creating auto:" + car);                        
            em.persist(car);
            
            //we need to associate the em with a transaction to get a 
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            
            Auto car2 = em.getReference(Auto.class, car.getId());
            assertNotNull("car not found:" + car.getId(), car2);
            log.info("found car:" + car2);        
        }
     -testGetReference
     -creating auto:myorg.entitymgrex.Auto@608760, id=0, make=Ford, model=Escort, color=Red, mileage=0
     -found car:myorg.entitymgrex.Auto@608760, id=8, make=Ford, model=Escort, color=Red, mileage=0
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@157ea4a
     -EM_AUTO:myorg.entitymgrex.Auto@608760, id=8, make=Ford, model=Escort, color=Red, mileage=0
  5. add a testUpdate() method to test the ability to have the setter() of a managed object update the database.

        @Test
    
        public void testUpdate() {
            log.info("testUpdate");
            
            Auto car = new Auto();
            car.setMake("Pontiac");
            car.setModel("Gran Am");
            car.setColor("Red");
            car.setMileage(0*1000);
            log.info("creating auto:" + car);                        
            em.persist(car);
            
            //we need to associate the em with a transaction to get a 
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            
            for(int mileage=car.getMileage(); mileage<(100*1000); mileage+=20000) {
                //here's where the update is done
                car.setMileage(mileage);
                
                //commit the update to the database for query 
                em.getTransaction().begin();
                em.getTransaction().commit();
                
                //inspect database for value
                int value = getMileage(car.getId());
                assertTrue("unexpected mileage:" + value, value == mileage);
                log.info("found mileage:" + value);        
            }
            
        }
        private int getMileage(long id) {
            Query query = 
                em.createQuery("select a.mileage from Auto as a where a.id=:pk");
            query.setParameter("pk", id);
            return (Integer)query.getSingleResult();        
        }
     -testUpdate
     -creating auto:myorg.entitymgrex.Auto@6a3960, id=0, make=Pontiac, model=Gran Am, color=Red, mileage=0
     -found mileage:0
     -found mileage:20000
     -found mileage:40000
     -found mileage:60000
     -found mileage:80000
     -EM_AUTO:myorg.entitymgrex.Auto@6a3960, id=9, make=Pontiac, model=Gran Am, color=Red, mileage=80000
  6. add a testMerge() method to test the ability to perform updates based on the current values of a detached object. Note that we are using Java serialization to simulate sending a copy of the object to/from a remote process and then performing the merge based on the updated object.

        @Test
    
        public void testMerge() throws Exception {
            log.info("testMerge");
            
            Auto car = new Auto();
            car.setMake("Chrystler");
            car.setModel("Concord");
            car.setColor("Red");
            car.setMileage(0*1000);
            log.info("creating auto:" + car);                        
            car = em.merge(car); //using merge to persist new
            
            //we need to associate the em with a transaction to get a 
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            
            for(int mileage=(10*1000); mileage<(100*1000); mileage+=20000) {
                //simulate sending to remote system for update
                Auto car2 = updateMileage(car, mileage);
                
                //verify the object is not being managed by the EM
                assertFalse("object was managed", em.contains(car2));
                assertTrue("object wasn't managed", em.contains(car));
                assertTrue("mileage was same", 
                        car.getMileage() != car2.getMileage());
                
                //commit the update to the database for query 
                em.merge(car2);
                assertTrue("car1 not merged:" + car.getMileage(), 
                        car.getMileage() == mileage);
                em.getTransaction().begin();
                em.getTransaction().commit();
                
                //inspect database for value
                int value = getMileage(car.getId());
                assertTrue("unexpected mileage:" + value, value == mileage);
                log.info("found mileage:" + value);        
            }        
        }
        
        private Auto updateMileage(Auto car, int mileage) throws Exception {
            //simulate sending the object to a remote system
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(car);
            oos.close();
            
            //simulate receiving an update to the object from remote system
            ByteArrayInputStream bis = 
                new ByteArrayInputStream(bos.toByteArray());
            ObjectInputStream ois = new ObjectInputStream(bis);
            Auto car2 = (Auto)ois.readObject();
            ois.close();
            
            //here's what they would have changed in remote process 
            car2.setMileage(mileage);
            
            return car2;
        }
     -testMerge
     -creating auto:myorg.entitymgrex.Auto@147358f, id=0, make=Chrystler, model=Concord, color=Red, mileage=0
     -found mileage:10000
     -found mileage:30000
     -found mileage:50000
     -found mileage:70000
     -found mileage:90000
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@1b4c1d7
     -EM_AUTO:myorg.entitymgrex.Auto@147358f, id=10, make=Chrystler, model=Concord, color=Red, mileage=90000
    
  7. add a testRemove() method to verify that we can delete objects from the database.

        @Test
    
        public void testRemove() {
            log.info("testRemove");
            
            Auto car = new Auto();
            car.setMake("Jeep");
            car.setModel("Cherokee");
            car.setColor("Green");
            car.setMileage(30*1000);
            log.info("creating auto:" + car);                        
            em.persist(car);
            //we need to associate the em with a transaction to get a 
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            
            Auto car2 = em.find(Auto.class, car.getId());
            assertNotNull("car not found:" + car.getId(), car2);
            log.info("found car:" + car2);
            
            //now remove the car
            log.info("removing car:" + car);
            em.remove(car);
            //we need to associate the em with a transaction to  
            //physically remove from database
            em.getTransaction().begin();
            em.getTransaction().commit();
            
            Auto car3 = em.find(Auto.class, car.getId());
            assertNull("car found", car3);
        }
     -testRemove
     -creating auto:myorg.entitymgrex.Auto@28305d, id=0, make=Jeep, model=Cherokee, color=Green, mileage=30000
     -found car:myorg.entitymgrex.Auto@28305d, id=11, make=Jeep, model=Cherokee, color=Green, mileage=30000
     -removing car:myorg.entitymgrex.Auto@28305d, id=11, make=Jeep, model=Cherokee, color=Green, mileage=30000
    

In a previous chapter, you manually created a set of DDL files to create schema, delete rows from the schema in the database, and drop the schema from the database. Since your persistence provider knows how to work with schema, you can optionally get it to create the schema for you rather than writing it manually. Even if you are working with legacy schema (and won't be changing the database), it is extremely helpful to see the persistence provider's version of the schema so you can more quickly spot a mismatch in the mapping rather than waiting until runtime testing. To add schema generation to your project you can add one of the following: runtime schema generation or compile-time schema generation. Runtime schema generation is fine for examples and small prototypes, but compile-time generation is suitable for more realistic development scenarios.

  1. runtime schema generation can be added to your project by adding the following property to your persistence-unit or hibernate.properties. Coldstart your database, comment out your SQL plugin, and re-run your tests if you want to verify that this property alone will create the database schema at runtime.

    
    #persistence.xml
       <property name="hibernate.hbm2ddl.auto" value="create"/> 

    #hibernate.properties
        hibernate.hbm2ddl.auto=create
  2. compile-time schema generation can be added to your project with the following plugin entry. Add it to your pluginManagement section. This passive definition defines the reusable details for how we want to set up the hibernate plugin for generating database schema. It will write a drop script to a file called ...dropJPA.ddl and a create script to a file called ...createJPA.ddl. It cannot create a delete script.

    
        <build>
            <pluginManagement>
                <plugins>
                    ...
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>hibernate3-maven-plugin</artifactId>
                        <version>${hibernate3-maven-plugin.version}</version>
                        <extensions>true</extensions>
                        <dependencies>
                            <dependency>
                                <groupId>org.hibernate</groupId>
                                <artifactId>hibernate-entitymanager</artifactId>
                                <version>${hibernate3.version}</version>
                            </dependency>
                        </dependencies>
                        <executions>
                            <execution>
                                <id>generate-drop-ddl</id>
                                <phase>process-test-classes</phase>
                                <goals>
                                    <goal>run</goal>
                                </goals>
                                <configuration>
                                    <hibernatetool>
                                        <hbm2ddl export="false" create="false" drop="true" format="true" 
                                            outputfilename="${project.artifactId}-dropJPA.ddl"/>
                                    </hibernatetool>
                                </configuration>
                            </execution>
                            <execution>
                                <id>generate-create-ddl</id>
                                <phase>process-test-classes</phase>
                                <goals>
                                    <goal>run</goal>
                                </goals>
                                <configuration>
                                    <hibernatetool>
                                        <hbm2ddl export="false" create="true" drop="false" format="true" 
                                            outputfilename="${project.artifactId}-createJPA.ddl"/>
                                    </hibernatetool>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                    ...
                </plugins>
            </pluginManagement>
        </build>
  3. Add the following active declaration to your pom to activate the plugin and fill in the module specifics. Since this plugin can operate without a database server, add it to the global build.plugins section and not within a profile.

    
        <build>
            ...
            <plugins>
                <plugin>
                    <artifactId>hibernate3-maven-plugin</artifactId>
                    <groupId>org.codehaus.mojo</groupId>
                    <configuration>
                        <hibernatetool destdir="target/classes/ddl">
                            <classpath>
                                <path location="${project.build.directory}/classes" />
                                <path location="${project.build.directory}/test-classes" />
                            </classpath>
                            <jpaconfiguration persistenceunit="entityMgrEx"
                                propertyfile="${basedir}/target/test-classes/hibernate.properties" />
                        </hibernatetool>
                    </configuration>
                </plugin>
            </plugins>
        </build>
  4. Build your module and notice the generated JPA.ddl files

    $ mvn clean process-test-classes
    
    
    ...
    [hibernatetool] Executing Hibernate Tool with a JPA Configuration
    [hibernatetool] 1. task: hbm2ddl (Generates database schema)
    (...SLF4J warnings...)
    [INFO] Executed tasks
    [INFO] 
    [INFO] --- hibernate3-maven-plugin:3.0:run (generate-create-ddl) @ entityMgrEx ---
    [INFO] Executing tasks
    main:
    [hibernatetool] Executing Hibernate Tool with a JPA Configuration
    [hibernatetool] 1. task: hbm2ddl (Generates database schema)
    ...
    ---
    ---
    `-- target
       ...
        |-- classes
        |   |-- ddl
        |   |   |-- emauto_create.ddl
        |   |   |-- emauto_delete.ddl
        |   |   |-- emauto_drop.ddl
        |   |   |-- emauto_tuningadd.ddl
        |   |   |-- emauto_tuningremove.ddl
        |   |   |-- entityMgrEx-createJPA.ddl
        |   |   `-- entityMgrEx-dropJPA.ddl
  5. (Optionally) update your SQL plugin definition added in a previous chapter to reference the dynamically generated schema in the target tree.

  6. If Eclipse reports an error for the plugin, add a lifecycle mapping for the hibernate3-maven-plugin telling Eclipse to ignore the plugin; this eliminates any errors Eclipse might display. This mapping goes alongside the definition you created for the sql-maven-plugin.

    
    
                                <pluginExecution>
                                  <pluginExecutionFilter>
                                    <groupId>org.codehaus.mojo</groupId>
                                    <artifactId>hibernate3-maven-plugin</artifactId>
                                    <versionRange>[2.2,)</versionRange>
                                    <goals>
                                      <goal>run</goal>
                                    </goals>
                                  </pluginExecutionFilter>
                                  <action>
                                    <ignore/>
                                  </action>
                                </pluginExecution>

Since you will likely have many JPA modules in your enterprise application, let's take a moment to break the current module into a parent and a child before you quit. That way you can better visualize which parts are specific to the child module and which are reusable from a common parent.

  1. Create a sibling module called ../jpa-parent

    
    $ mkdir ../jpa-parent
  2. Add the module definition (../jpa-parent/pom.xml)

    
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.jpa</groupId>
        <artifactId>jpa-parent</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>pom</packaging>

        <name>JPA Parent POM</name>
        <description>
            This parent pom is intended to provide common and re-usable 
            definitions and constructs across JPA projects.
        </description>
    </project>
  3. Add the following parent declaration to your existing module. The relativePath is only useful if you find yourself changing the parent pom frequently; otherwise the parent module can be found in the localRepository once it has been installed.

    
        <parent>
            <groupId>myorg.jpa</groupId>
            <artifactId>jpa-parent</artifactId>
            <version>1.0-SNAPSHOT</version>
            <relativePath>../jpa-parent</relativePath>
        </parent>

        <groupId>myorg.jpa</groupId>
        <artifactId>entityMgrEx-child</artifactId>

        <name>Entity Manager Exercise</name>
  4. Verify your project still builds. This will verify your relativePath is correct.

    $ mvn clean verify
    ...
    [INFO] BUILD SUCCESS
  5. Move the following constructs from the entityMgrEx module to the jpa-parent module. These represent the *passive* definitions that will not directly impact the child module until the child requests that feature. Your child module should still have the same build and test functionality, except now it should look a little smaller. One could also make a case for moving some of the SQL/DDL script execution definitions to the parent as well, which would make this module almost trivial in size.

    • properties

    • repositories

    • dependencyManagement

    • pluginManagement

    • select profiles

    
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.jpa</groupId>
        <artifactId>jpa-parent</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>pom</packaging>

        <name>JPA Parent POM</name>
        <description>
            This parent pom is intended to provide common and re-usable 
            definitions and constructs across JPA projects.
        </description>

       <!-- needed to resolve some hibernate dependencies -->
        <repositories>
            <repository>
                <id>jboss-nexus</id>
                <name>JBoss Nexus Repository</name>
                <url>https://repository.jboss.org/nexus/content/groups/public-jboss/</url>
            </repository>
        </repositories>

        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <java.source.version>1.8</java.source.version>
            <java.target.version>1.8</java.target.version>

            <jboss.host>localhost</jboss.host>
            <db.host>${jboss.host}</db.host>

            <maven-compiler-plugin.version>3.1</maven-compiler-plugin.version>
            <maven-jar-plugin.version>2.5</maven-jar-plugin.version>
            <maven-surefire-plugin.version>2.17</maven-surefire-plugin.version>
            <sql-maven-plugin.version>1.5</sql-maven-plugin.version>        

            <commons-logging.version>1.1.1</commons-logging.version>
            <h2db.version>1.3.168</h2db.version>
            <hibernate-jpa-2.1-api.version>1.0.0.Final</hibernate-jpa-2.1-api.version>
            <hibernate-entitymanager.version>4.3.4.Final</hibernate-entitymanager.version>
            <hibernate3-maven-plugin.version>3.0</hibernate3-maven-plugin.version>
            <hibernate3.version>3.6.10.Final</hibernate3.version>
            <junit.version>4.10</junit.version>
            <log4j.version>1.2.13</log4j.version>
            <slf4j.version>1.7.2</slf4j.version>
        </properties>

        <dependencyManagement>
            <dependencies>
                <dependency>
                  <groupId>commons-logging</groupId>
                  <artifactId>commons-logging</artifactId>
                  <version>${commons-logging.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.hibernate.javax.persistence</groupId>
                    <artifactId>hibernate-jpa-2.1-api</artifactId>
                    <version>${hibernate-jpa-2.1-api.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.hibernate</groupId>
                    <artifactId>hibernate-entitymanager</artifactId>
                    <version>${hibernate-entitymanager.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                    <version>${slf4j.version}</version>
                </dependency>
                <dependency>
                    <groupId>junit</groupId>
                    <artifactId>junit</artifactId>
                    <version>${junit.version}</version>
                </dependency>
                <dependency>
                  <groupId>log4j</groupId>
                  <artifactId>log4j</artifactId>
                  <version>${log4j.version}</version>
                </dependency>    
            </dependencies>
        </dependencyManagement>

        <build>
            <pluginManagement>
                <plugins>
                      <!-- make sure we are building the configured java version -->
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-compiler-plugin</artifactId>
                        <version>${maven-compiler-plugin.version}</version>
                        <configuration>
                                <source>${java.source.version}</source>
                                <target>${java.target.version}</target>
                        </configuration>                    
                    </plugin>      

                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-jar-plugin</artifactId>
                        <version>${maven-jar-plugin.version}</version>
                    </plugin>
              
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>${maven-surefire-plugin.version}</version>
                        <configuration>
                            <argLine>${surefire.argLine}</argLine>
                        </configuration>
                    </plugin>            

                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>hibernate3-maven-plugin</artifactId>
                        <version>${hibernate3-maven-plugin.version}</version>
                        <extensions>true</extensions>
                        <dependencies>
                            <dependency>
                                <groupId>org.hibernate</groupId>
                                <artifactId>hibernate-entitymanager</artifactId>
                                <version>${hibernate3.version}</version>
                            </dependency>
                        </dependencies>
                        <executions>
                            <execution>
                                <id>generate-drop-ddl</id>
                                <phase>process-test-classes</phase>
                                <goals>
                                    <goal>run</goal>
                                </goals>
                                <configuration>
                                    <hibernatetool>
                                        <hbm2ddl export="false" create="false" drop="true" format="true" 
                                            outputfilename="${project.artifactId}-dropJPA.ddl"/>
                                    </hibernatetool>
                                </configuration>
                            </execution>
                            <execution>
                                <id>generate-create-ddl</id>
                                <phase>process-test-classes</phase>
                                <goals>
                                    <goal>run</goal>
                                </goals>
                                <configuration>
                                    <hibernatetool>
                                        <hbm2ddl export="false" create="true" drop="false" format="true" 
                                            outputfilename="${project.artifactId}-createJPA.ddl"/>
                                    </hibernatetool>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>

                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>sql-maven-plugin</artifactId>        
                        <version>${sql-maven-plugin.version}</version>        
                    
                        <dependencies>
                            <dependency>
                                <groupId>com.h2database</groupId>
                                <artifactId>h2</artifactId>
                                <version>${h2db.version}</version>
                            </dependency>
                        </dependencies>
                    
                        <configuration>
                            <username>${jdbc.user}</username>
                            <password>${jdbc.password}</password>
                            <driver>${jdbc.driver}</driver>
                            <url>${jdbc.url}</url>          
                        </configuration>
                    </plugin>          

                </plugins>    
            </pluginManagement>
        </build>

        <profiles>
            <profile> <!-- H2 server-based DB -->
                <id>h2srv</id>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:tcp://${db.host}:9092/h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>

            <profile> <!-- H2 file-based DB -->
                <id>h2db</id>
                <activation>
                    <property> 
                        <name>!jdbcdb</name>
                    </property>
                </activation>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:${basedir}/target/h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>

            <!--  tell Eclipse what to do with some of the plugins -->
            <profile>
              <id>m2e</id>
              <activation>
                <property>
                  <name>m2e.version</name>
                </property>
              </activation>
              <build>
                <pluginManagement>
                    <plugins>
                        <plugin>
                          <groupId>org.eclipse.m2e</groupId>
                          <artifactId>lifecycle-mapping</artifactId>
                          <version>1.0.0</version>
                          <configuration>
                            <lifecycleMappingMetadata>
                              <pluginExecutions>
                                
                                <pluginExecution>
                                  <pluginExecutionFilter>
                                    <groupId>org.codehaus.mojo</groupId>
                                    <artifactId>hibernate3-maven-plugin</artifactId>
                                    <versionRange>[3.0,)</versionRange>
                                    <goals>
                                      <goal>run</goal>
                                    </goals>
                                  </pluginExecutionFilter>
                                  <action>
                                    <ignore/>
                                  </action>
                                </pluginExecution>

                                <pluginExecution>
                                  <pluginExecutionFilter>
                                    <groupId>org.codehaus.mojo</groupId>
                                    <artifactId>sql-maven-plugin</artifactId>
                                    <versionRange>[1.0.0,)</versionRange>
                                    <goals>
                                      <goal>execute</goal>
                                    </goals>
                                  </pluginExecutionFilter>
                                  <action>
                                    <ignore />
                                  </action>
                                </pluginExecution>

                              </pluginExecutions>
                            </lifecycleMappingMetadata>
                          </configuration>
                        </plugin>

                    </plugins>
                </pluginManagement>
               </build>
            </profile>
        </profiles>
    </project>
  6. Leave the following in the child project. These are the *active* project constructs.

    • plugins

    • dependencies

    • module-specific properties

    • profiles that declare plugins and dependencies

    
     <project
        xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <groupId>myorg.jpa</groupId>
            <artifactId>jpa-parent</artifactId>
            <version>1.0-SNAPSHOT</version>
            <relativePath>../jpa-parent</relativePath>
        </parent>

        <groupId>myorg.jpa</groupId>
        <artifactId>entityMgrEx-child</artifactId>

        <name>Entity Manager Exercise</name>
        <build>
            <!-- filtering will replace URLs, credentials, etc in the 
                 files copied to the target directory and used during testing.
                -->
            <testResources>
                <testResource>
                    <directory>src/test/resources</directory>
                    <filtering>true</filtering>
                </testResource>
            </testResources>

            <plugins>
                <plugin>
                    <artifactId>hibernate3-maven-plugin</artifactId>
                    <groupId>org.codehaus.mojo</groupId>
                    <configuration>
                        <hibernatetool destdir="target/classes/ddl">
                            <classpath>
                                <path location="${project.build.directory}/classes" />
                                <path location="${project.build.directory}/test-classes" />
                            </classpath>
                            <jpaconfiguration persistenceunit="entityMgrEx"
                                propertyfile="${basedir}/target/test-classes/hibernate.properties" />
                        </hibernatetool>
                    </configuration>
                </plugin>
            </plugins>
        </build>

        <dependencies>
            <dependency>
              <groupId>commons-logging</groupId>
              <artifactId>commons-logging</artifactId>
              <scope>compile</scope>
            </dependency>
        
            <dependency>
                <groupId>org.hibernate.javax.persistence</groupId>
                <artifactId>hibernate-jpa-2.1-api</artifactId>
                <scope>provided</scope>
            </dependency>

            <dependency>
                <groupId>org.hibernate</groupId>
                <artifactId>hibernate-entitymanager</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
                <scope>test</scope>
            </dependency>

            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
              <groupId>log4j</groupId>
              <artifactId>log4j</artifactId>
              <scope>test</scope>
            </dependency>    
        </dependencies>

        <profiles>
            <profile>
                <id>testing</id>
                <activation>
                    <property>
                        <name>!skipTests</name>
                    </property>
                </activation>
          
                <build>
                    <plugins>
                        <plugin>
                            <!-- runs schema against the DB -->
                            <groupId>org.codehaus.mojo</groupId>
                            <artifactId>sql-maven-plugin</artifactId>        

                            <executions>

                                <!-- place execution elements here  -->
                                <execution>
                                    <id>drop-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*drop*.ddl</include>
                                            </includes>
                                        </fileset>
                                        <onError>continue</onError>
                                    </configuration>
                                </execution>

                                <execution>
                                    <id>create-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*create*.ddl</include>

                                            </includes>
                                        </fileset>
                                        <print>true</print>
                                    </configuration>
                                </execution>

                                <execution>
                                    <id>populate-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>test/resources/ddl/**/*populate*.ddl</include>
                                            </includes>
                                        </fileset>
                                    </configuration>
                                </execution>

                                <!--
                                <execution>
                                    <id>drop-db-after-test</id>
                                    <phase>test</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*drop*.ddl</include>     
                                                </includes>
                                        </fileset>
                                    </configuration>
                                </execution>
                                -->
                            </executions>
                        </plugin>          
                    </plugins>          
                </build>          
            </profile>
        </profiles>
    </project> 
  7. Verify your project still builds with the constructs moved to the parent.

    $ mvn clean verify
    ...
    [INFO] BUILD SUCCESS
  8. Optionally change your jpa-parent reference to the class examples' base parent project.

    
    
        <parent>
            <groupId>info.ejava.examples.build</groupId>
            <artifactId>dependencies</artifactId>
            <version>x.x.x-SNAPSHOT</version>
            <relativePath>build/dependencies/pom.xml</relativePath>
        </parent>

    Note

    Replace x.x.x-SNAPSHOT with the correct version for the class.

Note

It is never a good idea to declare *active* POM constructs in the parent of a multi-module project unless *ALL* child modules serve the same purpose. Strive for parent Maven projects that define standards to follow without inserting unnecessary dependencies or other constructs.

This chapter will take you through the steps to register a Java POJO with the JPA persistence unit using both orm.xml mapping-file descriptors and Java class annotations. It will also take you through the steps to make a POJO class legal for use as a JPA entity class.

JPA Classes are required to ...

  • Be identified as being a JPA entity class

  • Have a non-private default constructor

  • Have at least one property defined as the primary key
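
Purely for orientation, here is a minimal sketch of a class that satisfies all three rules using annotations. The Dog class is hypothetical and not part of the exercise sources; the steps below start from a plain POJO and add each of these pieces one at a time.

    package myorg.entityex;

    @javax.persistence.Entity          //identified as a JPA entity class
    public class Dog {
        @javax.persistence.Id          //at least one property acts as the primary key
        private int id;
        private String name;

        public Dog() {}                //non-private default constructor

        public int getId() { return id; }
        public void setId(int id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }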

  1. Create a POJO Java class in the ...mapped Java package

    package myorg.entityex.mapped;
    
    
    import java.util.Date;
    public class Animal {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Animal(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
        
        public int getId() { return id; }
        public void setId(int id) {
            this.id = id;
        }
        
        public String getName() { return name; }
        public void setName(String name) {
            this.name = name;
        }
        
        public Date getDob() { return dob; }
        public void setDob(Date dob) {
            this.dob = dob;
        }
        
        public double getWeight() { return weight; }
        public void setWeight(double weight) {
            this.weight = weight;
        }
    }
  2. Copy the existing AutoTest.java to AnimalTest.java and remove (or ignore) references to the Auto class from AnimalTest.java

  3. Attempt to persist the Animal by adding the following @Test method to the AnimalTest.java JUnit class.

    # src/test/java/myorg/entityex/AnimalTest.java
    
     
        @Test
        public void testCreateAnimal() {
            log.info("testCreateAnimal");
            Animal animal = new Animal("bessie", 
                    new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
            em.persist(animal);        
            
            assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
        }
  4. Attempt to build and run your test. Your test should fail with the following error message. This means that although your class is a valid Java POJO, it has not been made known to the persistence unit as a JPA entity.

    testCreateAnimal(myorg.entityex.AutoTest): Unknown entity: myorg.entityex.mapped.Animal
    ...
    java.lang.IllegalArgumentException: Unknown entity: myorg.entityex.mapped.Animal
            at org.hibernate.ejb.AbstractEntityManagerImpl.persist(AbstractEntityManagerImpl.java:856)
            at myorg.entityex.AutoTest.testCreateAnimal(AutoTest.java:100)
    
  5. Add the POJO class to the persistence unit by adding an orm.xml JPA mapping file to your project. Place the file in the src/main/resources/orm directory.

    
    # src/main/resources/orm/Animal-orm.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <entity-mappings xmlns="http://java.sun.com/xml/ns/persistence/orm"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://java.sun.com/xml/ns/persistence/orm http://java.sun.com/xml/ns/persistence/orm_2_0.xsd" version="2.0">

        <entity class="myorg.entityex.mapped.Animal"/>

    </entity-mappings>
  6. Register the orm.xml file with the persistence unit by adding a mapping-file element reference.

    
    # src/test/resources/META-INF/persistence.xml

        <persistence-unit name="entityEx-test">
            <provider>org.hibernate.ejb.HibernatePersistence</provider>

            <mapping-file>orm/Animal-orm.xml</mapping-file>
            <class>myorg.entityex.Auto</class>
            <properties>
            ...
  7. Attempt to build and run your test. Your test should fail with the following error message. The specifics of the error message will depend upon whether you are running just the JUnit test or building within Maven since the pom is configured to build database schema from the JPA mappings prior to running the JUnit test.

    
    [PersistenceUnit: entityEx-test] Unable to configure EntityManagerFactory: No identifier specified for entity: myorg.entityex.mapped.Animal

    Caused by: org.hibernate.AnnotationException: No identifier specified for entity: myorg.entityex.mapped.Animal

    Although the class is a valid POJO and we followed the deployment descriptor mechanism for registering it with the persistence unit, it is not a legal entity. The error message indicates it is lacking a primary key field.

  8. Update the orm.xml file and define the "id" property as the primary key for the entity.

    
        <entity class="myorg.entityex.mapped.Animal">
            <attributes>
                <id name="id"/>
            </attributes>
        </entity>
  9. Rebuild your module and it should now persist the POJO as a JPA entity. The SQL should be printed in the debug output.

    $ mvn clean test
    ...
    Hibernate: 
        insert 
        into
            Animal
            (dob, name, weight, id) 
        values
            (?, ?, ?, ?)
     -tearDown() complete, em=org.hibernate.ejb.EntityManagerImpl@12a80ea3
     -closing entity manager factory
     -HHH000030: Cleaning up connection pool [jdbc:h2:/home/jcstaff/workspaces/ejava-javaee/git/jpa/jpa-entity/entityEx/target/h2db/ejava]
    Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.94 sec
    
    Results :
    
    Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
    
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
  10. Update your JUnit test method to look like the following. The unit test now clears the cache of entities and forces the entity manager to instantiate a new instance for the value returned from the find().

        @Test
    
        public void testCreateAnimal() {
            log.info("testCreateAnimal");
            Animal animal = new Animal("bessie", 
                    new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
            em.persist(animal);        
            
            assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
            
            em.flush(); //make sure all writes were issued to DB
            em.clear(); //purge the local entity manager entity cache to cause new instance
            assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
        }
  11. Attempt to rebuild your module. It should fail because the entity class does not have a default constructor. Remember that Java provides a default constructor for free until you add the first constructor yourself. Once you add a custom constructor, you must add a default constructor back to keep the class a legal entity class.

    javax.persistence.PersistenceException: org.hibernate.InstantiationException: No default constructor for entity: myorg.entityex.mapped.Animal
    
  12. Update the POJO with a default constructor.

        public Animal() {} //must have default ctor
    
        public Animal(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
  13. Rebuild the module. It should now pass because you have defined and registered a compliant entity class. The class was registered with the persistence unit through the orm.xml mapping file, given an id property to act as the primary key, and given a default constructor.

  1. Copy the POJO class to a new java package and class name (Animal2).

    package myorg.entityex.annotated;
    
    
    import java.util.Date;
    public class Animal2 {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Animal2() {} //must have default ctor
    ...
    }
  2. Add a javax.persistence.Entity annotation to the class

    import javax.persistence.Entity;
    
    
    @javax.persistence.Entity
    public class Animal2 {
  3. Register the new entity with the persistence.xml using a class element reference

    
        <persistence-unit name="entityEx-test">
            <provider>org.hibernate.ejb.HibernatePersistence</provider>

            <mapping-file>orm/Animal-orm.xml</mapping-file>
            <class>myorg.entityex.Auto</class>
            <class>myorg.entityex.annotated.Animal2</class>
            <properties>
  4. Add a new test method to work with the new class added to the module.

        @Test
    
        public void testCreateAnimalAnnotated() {
            log.info("testCreateAnimalAnnotated");
            myorg.entityex.annotated.Animal2 animal = new myorg.entityex.annotated.Animal2("bessie", 
                    new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
            em.persist(animal);        
            
            assertNotNull("animal not found", em.find(myorg.entityex.annotated.Animal2.class,animal.getId()));
            
            em.flush(); //make sure all writes were issued to DB
            em.clear(); //purge the local entity manager entity cache to cause new instance
            assertNotNull("animal not found", em.find(myorg.entityex.annotated.Animal2.class,animal.getId()));
  5. Attempt to build/run your module at this point. You should get a familiar error about Animal2 not having an identifier.

    Unable to configure EntityManagerFactory: No identifier specified for entity: myorg.entityex.annotated.Animal2
    
  6. Since we want to use annotations for the new class, fix the issue by adding a @javax.persistence.Id annotation to the id attribute. This is called FIELD access in JPA. You can alternatively use PROPERTY access by moving the annotation to the getId() method.

        @javax.persistence.Id
    
        private int id;
  7. Re-run your test. It should succeed this time.

    $ mvn clean test
    ...
    [INFO] BUILD SUCCESS
    ...
  8. If you would like to observe the data in the database, run your build against the server-based database (activate the h2srv profile) and connect to it with the H2 browser UI.

  9. Type the following command in the H2 browser UI

    SELECT * FROM ANIMAL2;
    ID      DOB     NAME    WEIGHT  
    0   1960-02-01 00:00:00.0   bessie  1400.2
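
If you would rather confirm the row from within the JUnit test instead of the H2 UI, a native SQL query can be issued through the same EntityManager. The method below is only a sketch: the testVerifyAnimalRow name and the count query are illustrative, and it assumes the AnimalTest fixture (em, log, static JUnit asserts, GregorianCalendar import) already used above.

        @Test
        public void testVerifyAnimalRow() {
            log.info("testVerifyAnimalRow");
            myorg.entityex.annotated.Animal2 animal = new myorg.entityex.annotated.Animal2("bessie",
                    new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
            em.persist(animal);
            em.flush(); //push the INSERT to the database so raw SQL can see the row

            //count matching rows directly from the ANIMAL2 table
            Number rows = (Number) em.createNativeQuery(
                    "SELECT COUNT(*) FROM ANIMAL2 WHERE NAME='bessie'")
                    .getSingleResult();
            assertEquals("unexpected row count", 1, rows.intValue());
        }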

In this chapter we will create custom class/database mappings for some class properties:

  • Map a class to a specific table

  • Map a property to a specific column

  • Define constraints for properties

  • Take a look at using getters and setters

  1. Copy your Animal.java class to Cat.java

    package myorg.entityex.mapped;
    
    
    import java.util.Date;
    public class Cat {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Cat() {} //must have default ctor
        public Cat(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
        
        public int getId() { return id; }
    ...
  2. Copy your Animal2.java class to Cat2.java

    package myorg.entityex.annotated;
    
    
    import java.util.Date;
    @javax.persistence.Entity
    public class Cat2 {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Cat2() {} //must have default ctor
        public Cat2(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
        
        @javax.persistence.Id
        public int getId() { return id; }
    ...
  3. Name the new Cat entity class in the Animal-orm.xml

    
    # src/main/resources/orm/Animal-orm.xml

        <entity class="myorg.entityex.mapped.Animal">
    ...
        <entity class="myorg.entityex.mapped.Cat">
            <attributes>
                <id name="id"/>
            </attributes>
        </entity>
  4. Name the new Cat2 entity class in the persistence.xml

    
    # src/test/resources/META-INF/persistence.xml

            <mapping-file>orm/Animal-orm.xml</mapping-file>
            <class>myorg.entityex.Auto</class>
            <class>myorg.entityex.annotated.Animal2</class>
            <class>myorg.entityex.annotated.Cat2</class>
  5. Rebuild your module from the command line and observe the create schema generated for Cat and Cat2. Notice that the JPA provider used the class name as the default entity name and will attempt to map each entity to a database table with the same name as the entity.

    $ more target/classes/ddl/*
    ...
       create table Cat (
            id integer not null,
            dob timestamp,
            name varchar(255),
            weight double not null,
            primary key (id)
        );
    
        create table Cat2 (
            id integer not null,
            dob timestamp,
            name varchar(255),
            weight double not null,
            primary key (id)
        );
  6. Add a table element to the orm.xml definition to map Cat to the ENTITYEX_CAT table.

    
    
        <entity class="myorg.entityex.mapped.Cat">
            <table name="ENTITYEX_CAT"/>
            <attributes>
  7. Add a @javax.persistence.Table annotation to the Cat2 class to map instances to the ENTITYEX_CAT table.

    @javax.persistence.Entity
    
    @javax.persistence.Table(name="ENTITYEX_CAT")
    public class Cat2 {
        private int id;
  8. Rebuild your module from the command line and observe the create schema generated for Cat and Cat2. Notice that we have now mapped two entity classes to the same table using a custom table name.

    $ more target/classes/ddl/*
    ...
       create table ENTITYEX_CAT (
            id integer not null,
            dob timestamp,
            name varchar(255),
            weight double not null,
            primary key (id)
        );
  9. Map the id property for both Cat and Cat2 to the CAT_ID column. Also have the persistence provider automatically generate a value for the primary key during the persist(). A later exercise will go into generated primary key types in more detail.

        @javax.persistence.Id
    
        @javax.persistence.Column(name="CAT_ID")
        @javax.persistence.GeneratedValue
        private int id;
    
        <entity class="myorg.entityex.mapped.Cat">
            <table name="ENTITYEX_CAT"/>
            <attributes>
                <id name="id">
                    <column name="CAT_ID"/>
                    <generated-value/>
                </id>
            </attributes>
        </entity>
  10. Make the name column mandatory (nullable=false) and define the length of the string to be 20 characters. Note that these property assignments are only useful for documentation and for generating schema; many of the column properties are not used at runtime by the provider.

        @javax.persistence.Column(nullable=false, length=20)
    
        private String name;
    
                <basic name="name">
                    <column nullable="false" length="20"/>
                </basic>
  11. Have the weight column stored with a precision of 3 digits, with 1 digit (scale) to the right of the decimal place. You will need to change the datatype of the mapped property to BigDecimal to fully leverage this capability.

    # src/main/java/myorg/entityex/annotated/Cat2.java
    
    
        @javax.persistence.Column(precision=3, scale=1)  //10.2lbs
        private BigDecimal weight;
    ...
        public Cat2(String name, Date dob, BigDecimal weight) {
    ...
        public BigDecimal getWeight() { return weight; }
        public void setWeight(BigDecimal weight) {
    # src/main/java/myorg/entityex/mapped/Cat.java
    
    
        private BigDecimal weight;
    ...
        public Cat(String name, Date dob, BigDecimal weight) {
    ...
        public BigDecimal getWeight() { return weight; }
        public void setWeight(BigDecimal weight) {
    
    # src/main/resources/orm/Animal-orm.xml
                <basic name="weight">
                    <column precision="3" scale="1"/>
                </basic>
  12. Rebuild the module from the command line and observe the database schema generated for the ENTITYEX_CAT table.

    # target/classes/ddl/entityEx-createJPA.ddl
    
        create table ENTITYEX_CAT (
            CAT_ID integer generated by default as identity,
            dob date,
            name varchar(20) not null,
            weight decimal(3,1),
            primary key (CAT_ID)
        );

    Notice how the generated schema now reflects the mappings: CAT_ID is the generated identity primary key, name is a mandatory varchar(20), and weight is stored as decimal(3,1).

In the above example, you used FIELD access to the property values. This is the preferred method if your business object attributes provide an accurate representation of what should be stored in the database. FIELD access was chosen by the provider because our annotated class placed the @Id annotation on a Java attribute and not on a Java getter().

# implies FIELD access


    @javax.persistence.Id
    @javax.persistence.Column(name="CAT_ID")
    @javax.persistence.GeneratedValue
    private int id;
...    
    public int getId() { return id; }

If we had moved the @Id property definitions to the getter(), then the access would have been switched to PROPERTY. That was how JPA 1.0 annotated classes worked, and it was always one way or the other.

# implies PROPERTY access


    private int id;
...    
    @javax.persistence.Id
    @javax.persistence.Column(name="CAT_ID")
    @javax.persistence.GeneratedValue
    public int getId() { return id; }

Since it was always one way or the other with JPA 1.0, the access specification in the orm.xml file was placed on the root element of the entity:


    <entity class="myorg.entityex.mapped.Cat"
        access="FIELD">

Starting with JPA 2.0, we can also make the specification more explicit (like the XML technique) with the addition of the @Access annotation

@javax.persistence.Access(javax.persistence.AccessType.FIELD)

public class Cat2 {

Although choosing between FIELD and PROPERTY access for a class was always a capability in JPA 1.0, JPA 2.0 added the ability to choose on a per-property basis. This is done by applying the @Access annotation to the getter() you want to have property access. In this section, we will continue to expose all our properties to the provider through FIELD access, but define PROPERTY access for the "weight" property.
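
As a preview of where that leads, here is a minimal sketch of mixing the two access types on the Cat2 class. It abbreviates the surrounding code, and marking the weight field @javax.persistence.Transient so that only the getter mapping applies is an assumption of this sketch, not a line taken from the exercise sources.

    package myorg.entityex.annotated;

    import java.math.BigDecimal;

    @javax.persistence.Entity
    @javax.persistence.Table(name="ENTITYEX_CAT")
    @javax.persistence.Access(javax.persistence.AccessType.FIELD) //class default stays FIELD
    public class Cat2 {
        @javax.persistence.Id
        @javax.persistence.Column(name="CAT_ID")
        @javax.persistence.GeneratedValue
        private int id;

        @javax.persistence.Column(nullable=false, length=20)
        private String name;

        @javax.persistence.Transient //assumed here so the field itself is not also mapped
        private BigDecimal weight;

        public Cat2() {} //must have default ctor

        public int getId() { return id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }

        //only this property is accessed by the provider through its getter/setter
        @javax.persistence.Access(javax.persistence.AccessType.PROPERTY)
        @javax.persistence.Column(precision=3, scale=1)
        public BigDecimal getWeight() { return weight; }
        public void setWeight(BigDecimal weight) { this.weight = weight; }
    }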