Enterprise Java Development

Enterprise Computing with Java (605.784.31) Course Exercises

Revision: v2018-08-21

Built on: 2018-10-16 22:46 EST

Abstract

This book contains the lab exercises for JHU 605.784.31


General Instructions
1. Lab Exercises
I. Enterprise Java (605.784.31) Development Environment Setup
Purpose
2. Java JDK Setup
2.1. Download and Install JDK
2.2. Verify your JDK is installed
3. Git Client Setup
3.1. Install Git Client
3.2. Get Class Repository
4. Maven Environment Setup
4.1. Maven Installation
4.2. Maven Configuration
4.3. Test Maven Build
4.4. Missing Dependencies
4.5. Build Local Site
5. JBoss (Wildfly) Setup
5.1. Download and Install Wildfly 13.0.0.Final
5.2. Configure JBoss Server
5.3. Add JBoss Admin Account
5.4. Enable JBoss Remote Debugging
5.5. (Optional) Using Alternate Networks
6. H2 Database Setup
6.1. Locate the h2*.jar file
6.2. Start Server
6.3. Access DB User Interface
6.4. Activate H2 Server Profile for Builds
6.5. Update JBoss to use Server Mode
7. Eclipse Setup
7.1. Download and Install Eclipse
7.2. Define JDK location
7.3. Setup Maven Eclipse Integration (m2e)
7.4. Setup Git Eclipse Team Provider
7.5. Setup JBoss Eclipse Integration
8. Ant Setup
8.1. Install Ant
II. JavaSE: First Simple Module Exercise
Purpose
1. Goals
2. Objectives
9. Develop and Test Module using Command Line Tools (OPTIONAL!)
9.1. Summary
10. Automate Build and Testing with Ant (OPTIONAL!)
10.1. Summary
11. Adding Logging
11.1. Summary
12. Creating Portable and Repeatable Project Builds with Maven
12.1. Summary
13. Leverage IDE using Eclipse
13.1. Import a project into Eclipse
13.2. Setup Eclipse to be able to execute Maven project goals
13.3. Setup environment to enable interactive debugging
13.4. Summary
III. Java Persistence API: Entity Manager Exercise
Purpose
1. Goals
2. Objectives
14. Setup and Start Database
14.1. Summary
15. Create Core Project POM
15.1. Summary
16. Setup Database Schema
16.1. Summary
17. Add SQL Tuning
17.1. Summary
18. JPA Persistence Unit Setup
18.1. Summary
19. Setup JPA TestCase
19.1. Summary
20. Ready Project for Import into Eclipse
20.1. Summary
21. Test EntityManager Methods
21.1. Summary
22. Automatically Generate Database Schema
22.1. Summary
23. Create JPA Parent POM
23.1. Summary
IV. Java Persistence API: Entity Mapping Exercise
Purpose
24. JPA Entity Exercise Setup
24.1. Setup Maven Project
25. JPA Entity Class Basics
25.1. Create POJO Class using descriptor
25.2. Create POJO Class using annotations
25.3. Summary
26. Mapping Class Properties
26.1. Map Entity to Specific Table
26.2. Using JPA Property Access
26.3. Summary
27. JPA Enum Mapping
27.1. Mapping Enum Ordinal Values
27.2. Mapping Enum Name Values
27.3. Mapping Enum Alternate Values
27.4. Summary
28. Mapping Temporal Types
28.1. Mapping Temporal Types
28.2. Summary
29. Mapping Large Objects
29.1. Mapping CLOBS
29.2. Mapping BLOBS
29.3. Summary
30. Primary Key Generation
30.1. IDENTITY Primary Key Generation Strategy
30.2. SEQUENCE Primary Key Generation Strategy
30.3. TABLE Primary Key Generation Strategy
30.4. Summary
31. Mapping Compound Primary Keys
31.1. Using Embedded Compound Primary Keys
31.2. Using Compound Primary Keys as IdClass
31.3. Summary
32. Mapping Embedded Objects within Classes
32.1. Mapping an Embedded Object
32.2. Mapping Multi-level Embedded Objects
32.3. Summary
33. Objects Mapped to Multiple Tables
33.1. Mapping to Secondary Tables
33.2. Summary
V. Java Persistence API: Relationship Exercise
Purpose
34. JPA Entity Exercise Setup
34.1. Setup Maven Project
35. Mapping One-to-One Relationships
35.1. Setup
35.2. One-to-One Uni-directional Relationships
35.2.1. One-to-One Uni-directional Using a Foreign Key
35.2.2. One-to-One Uni-directional Using a Join Table
35.2.3. One-to-One Uni-directional Using a Primary Key Join
35.2.4. One-to-One Uni-directional Using MapsId
35.2.5. One-to-One Uni-directional Using Composite Primary/Foreign Keys
35.3. Mapping One-to-One Bi-directional Relationships
35.3.1. One-to-One Bi-directional Joined By Primary Key
35.3.2. One-to-One Bi-directional 0..1 Owner Relationship
35.3.3. One-to-One Bi-directional 0..1 Inverse Relationship
35.4. One-to-One EntityManager Automated Actions
35.4.1. One-to-One Using Cascades From Owner
35.4.2. One-to-One Using Cascades From Inverse
35.4.3. One-to-One Using Orphan Removal
35.5. Summary
36. Mapping One-to-Many Uni-directional Relationships
36.1. Setup
36.2. One-to-Many Uni-directional
36.2.1. One-to-Many Uni-directional with Join Table
36.2.2. One-to-Many Uni-directional using Foreign Key Join (from Child Table)
36.2.3. One-to-Many Uni-directional Mapping of Simple Types using ElementCollection
36.2.4. One-to-Many Uni-directional Mapping of Embeddable Type using ElementCollection
36.3. One-to-Many Provider Actions
36.3.1. One-to-Many Orphan Removal
36.4. Summary
37. JPA Collections
37.1. Setup
37.2. Entity Identity
37.2.1. Instance Id
37.2.2. Primary Key Id
37.2.3. Switching Ids
37.2.4. Business Id
37.3. Collection Ordering
37.3.1. Ordered Collections
37.4. Collection Interfaces
37.4.1. Using Maps with Collections
37.5. Summary
38. Mapping Many-to-One, Uni-directional Relationships
38.1. Setup
38.2. Many-to-One Uni-directional
38.2.1. Many-to-One Uni-directional Using a Foreign Key
38.2.2. Many-to-One Uni-directional Using a Compound Foreign Key
38.2.3. Many-to-One Uni-directional Using a MapsId
38.3. Summary
39. Mapping One-to-Many/Many-to-One Bi-directional Relationships
39.1. Setup
39.2. One-to-Many Bi-directional using Foreign Key Join
39.3. One-to-Many Bi-directional using Join Table
39.4. One-to-Many Bi-directional using Derived Composite Primary
39.5. Summary
40. Mapping Many-to-Many Relationships
40.1. Setup
40.2. Many-to-Many Uni-directional
40.3. Many-to-Many Bi-directional
40.4. Summary
VI. Java Persistence API: Query Exercise
Purpose
41. Exercise Data Model
41.1. Class Model
41.2. Database Schema
41.3. Object Instances
42. JPA Entity Exercise Setup
42.1. Setup Maven Project
43. Creating JPA Queries
43.1. Setup
43.2. Create/Execute Query
43.2.1. Multiple Results
43.2.2. Single Result
43.2.3. Single Result - NoResultException
43.2.4. Single Result - NonUniqueResultException
43.3. Query Parameters
43.4. Paging Query Results
43.5. Named Query
43.6. Value Queries
43.6.1. Retrieve Value
43.6.2. Retrieve Function Result Value
43.6.3. Retrieve Multiple Values
43.6.4. Encapsulate Row Values with ResultClass
43.7. Summary
44. SQL Queries
44.1. Setup
44.2. Create/Execute SQL Query
44.3. SQL Query Entity Result Mapping
44.4. SQL Result Set Mapping
44.5. Summary
45. Bulk Updates
45.1. Setup
45.2. Additional Setup
45.3. Using JPQL Bulk Update
45.4. Using Native SQL Bulk Update
45.5. Summary
46. Query Locks
46.1. Setup
46.2. Additional Setup
46.3. Using No Locks
46.4. Adding Lock Mode
46.5. Using Pessimistic Write Lock
46.6. Summary
VII. Basic EJB Development Exercise
Purpose
1. Goals
2. Objectives
47. Multi-Module JavaEE Project
47.1. Purpose
47.1.1. Goals
47.1.2. Objectives
47.2. Create Root Module
47.3. Create EJB Module
47.4. Manage Application Server
47.4.1. Application Server Setup
47.4.2. Standalone Application Server
47.4.3. Embedded Application Server
47.5. Summary
48. EAR Deployment
48.1. Purpose
48.1.1. Goals
48.1.2. Objectives
48.2. Create EAR Module
48.3. Create RMI Test Module
48.4. Deploy the EAR
48.5. Lookup and Invoke @Remote Interface
48.6. Directory Sanity Check
48.7. Summary
49. WAR Deployment
49.1. Purpose
49.1.1. Goals
49.1.2. Objectives
49.2. Create WAR Module
49.3. Add RMI Test
49.4. Embed EJB in WAR Module
49.5. Summary
50. Build Commands
50.1. Purpose
50.1.1. Goals
50.1.2. Objectives
50.2. mvn (phase)
50.3. mvn (phase) -rf :module
50.4. mvn (phase) -f (path to module)
50.5. mvn clean -Pundeploy
50.6. mvn clean -Dit.test=fully.qualified.ITPath#testMethod
50.7. Summary
51. Controlling JNDI Names
51.1. Purpose
51.1.1. Goals
51.1.2. Objectives
51.2. Eliminate Version# from EAR-based JNDI Name
51.3. Eliminate Version# from WAR-based JNDI Name
51.4. Summary
52. Debug Remote EJB
52.1. Purpose
52.1.1. Goals
52.1.2. Objectives
52.2. Running IT Tests in IDE
52.3. Debugging Deployment to Standalone Server
52.4. Debugging Deployment to Embedded Server
52.5. Summary
53. EJB Parent POM
53.1. Purpose
53.1.1. Goals
53.1.2. Objectives
53.2. Create Root POM
53.3. Summary


You will need a copy of Java 8 SDK installed.

Keep 32/64-bit choices consistent

Keep the 32/64-bit choice consistent with what you download later for Eclipse.
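
You can quickly verify that a full JDK (not just a JRE) is installed and on your PATH by checking both the java and javac commands. The exact version string will vary with your installation -- any Java 8 JDK is fine.

    $ javac -version
    javac 1.8.0_181

    $ java -version
    java version "1.8.0_181"
    ...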

You will use Git in this class to perform an initial checkout and get updates for source files. Any Git client should be able to perform that function. You can determine if you have a command line Git client already installed using the following simple command.
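
    $ git --version
    git version 2.x.y     #any reasonably recent client will work

If the command is not found, install a client using one of the options described below.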


There are a number of options and some are going to be based on your platform. Your basic options include the command line or an Eclipse plugin.

The class repository is located on github and can be browsed using the following http URL https://github.com/ejavaguy/ejava-student. With a cloned copy, you can receive file updates during the semester.

  1. cd to a directory where you wish to place the source code. Make sure the path to this directory contains no spaces.

  2. Clone the class repository using the following URL git://github.com/ejavaguy/ejava-student.git

    $ git clone git://github.com/ejavaguy/ejava-student.git
    Cloning into 'ejava-student'...
    ...
    Checking out files: 100% (1289/1289), done.
    ...
    
    $ ls ejava-student/
    ...
    
    $ cd ejava-student
    $ git branch -a    //list all branches -- local and remote
    * master
      remotes/origin/HEAD -> origin/master
      remotes/origin/master
    

    Note

    Git leaves you with all branches fetched and a local master branch referencing the class' master branch on github. You will be using the master branch for the duration of the semester. Other branches may show up, including my working branches where I am actively working on the next wave of updates. The master branch is usually updated the evening before or the day of class and should always be stable.

  3. Perform a mock update. This is what you will be doing several times this semester to get file updates.

    $ git checkout master #switches to master branch
    $ git pull            #downloads changes and attempts merge
    Already up-to-date.
    

    Note

    There are many modules within the class repository. Some are ready for use, some are still being worked on, and some are not for use this semester. The ones ready for your use will be wired into the build and will be compiled during a follow-on section. The list will increase as the semester moves forward. Please ignore these extra modules. Keeping them within the baseline helps me keep related things centralized.

    Note

    If you ever make changes to the class examples and would like to keep those changes separate from the updates, store them in a new branch at any time using the following git commands.

    $ git checkout -b new-branch       #creates new branch from current branch 
                                       #and switches to that branch
    $ git commit -am "saving my stuff" #commits all dirty files to new branch
    $ git checkout master              #switches back to the master branch 
    

    If you simply want to throw away any changes you made, you can discard those changes to tracked files using the following git commands.

    $ git reset --hard master
    $ git clean -dn  #shows you what it would delete without deleting 
    $ git clean -df  #deletes files not managed or specifically ignored by git
    

  1. Download Maven 3 http://maven.apache.org/download.html

  2. Unzip the contents into a directory with no spaces in its path.

    $ ls apache-maven-3.5.4
    bin  boot  conf  lib  LICENSE  NOTICE  README.txt
  3. Add an environment variable for MAVEN_HOME and add MAVEN_HOME/bin to your PATH

    # my linux system -- should be done in .bashrc
    export MAVEN_HOME=/opt/apache-maven-3.5.4
    export PATH=$MAVEN_HOME/bin:$PATH
    
    # my windows system -- should be done in Advanced System Settings->Environment Variables
    set MAVEN_HOME=/apps/apache-maven-3.5.4
    set PATH=%MAVEN_HOME%\bin;%PATH%
    
  4. Verify maven is installed and in the path

    //my fedora system
    $ mvn -version
    Apache Maven 3.5.4 (1edded0938998edf8bf061f1ceb3cfdeccf443fe; 2018-06-17T14:33:14-04:00)
    Maven home: /opt/apache-maven-3.5.4
    Java version: 1.8.0_181, vendor: Oracle Corporation, runtime: /usr/java/jdk1.8.0_181-amd64/jre
    Default locale: en_US, platform encoding: UTF-8
    OS name: "linux", version: "4.1.13-100.fc21.x86_64", arch: "amd64", family: "unix"
    
    //my windows xp system
    > mvn --version
    
  1. Add a skeletal settings.xml file that will be used to provide local overrides for the build. This is the place where you can customize the build for local environment specifics like directory locations, server address, server ports, etc.

    1. Add the following to the .m2/settings.xml file in your HOME directory.

      
      <?xml version="1.0"?>
      <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">

          <offline>false</offline>
          
          <profiles>
          </profiles>
          
          <activeProfiles>
          </activeProfiles>
      </settings>    
    2. You can test whether your settings.xml file is seen by Maven by temporarily making it an invalid XML file and verifying that the next Maven build command fails with a parsing error.

      $ mvn clean
      [ERROR] Error executing Maven.
      [ERROR] 1 problem was encountered while building the effective settings
      [FATAL] Non-parseable settings /home/user/.m2/settings.xml: only whitespace content allowed before start tag and not s (position: START_DOCUMENT seen <?xml version="1.0"?>\ns... @2:2)  @ /home/user/.m2/settings.xml, line 2, column 2
      
    3. Add a default specification for the database profile we will be using for class at the bottom of the .m2/settings.xml file in your HOME directory.

      
      
          <activeProfiles>
              <activeProfile>h2db</activeProfile>
          </activeProfiles>
    4. If your operating system HOME directory has spaces in the path (e.g., Windows XP's Documents and Settings) then add a localRepository path specification to the .m2/settings.xml file and have it point to a location that does not have spaces in the path. The path does not have to exist. It will be created during the next build.

      
      
          <offline>false</offline>
          <!-- this overrides the default $HOME/.m2/repository location. --> 
          <localRepository>c:/jhu/repository</localRepository>
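
Once the h2db profile has been activated in settings.xml, you can sanity check that Maven sees it. A minimal check, assuming the standard maven-help-plugin, is to run the following from a directory containing a pom.xml (e.g., the root of ejava-student); the output should list h2db among the active profiles.

    $ mvn help:active-profiles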

There are a few cases where dependencies cannot be hosted in public repositories and must be downloaded and installed manually. Oracle DB Client is one example.


If the message is a warning (i.e., for site/javadoc documentation), it can be ignored. If you want to eliminate the warning, or if it is coming up as an error, you can download the artifact directly from the vendor and manually install it in your local repository.

Note

This is only an example. You are *not* required to download the Oracle database driver for class. You can create a dummy file ($ touch dummy.jar) and register it using a dummy groupId, artifactId, and version if you wish (an example is shown at the end of this section).

  1. Download the driver jar from Oracle and accept the license agreement.

  2. Install it manually into your localRepository

    $ mvn install:install-file -Dfile=/home/jcstaff/Downloads/ojdbc6.jar -DgroupId=com.oracle -DartifactId=ojdbc6 -Dversion=11.2.0.3 -Dpackaging=jar
    [INFO] Scanning for projects...
    ...
    [INFO] --- maven-install-plugin:2.4:install-file (default-cli) @ standalone-pom ---
    [INFO] Installing /home/jcstaff/Downloads/ojdbc6.jar to /home/jcstaff/.m2/repository/com/oracle/ojdbc6/11.2.0.3/ojdbc6-11.2.0.3.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    ...
    
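If you choose the dummy-file shortcut mentioned in the note above rather than downloading the real driver, the registration would look something like the following -- the file and coordinates are arbitrary placeholders used only to practice the install-file mechanics.

    $ touch dummy.jar
    $ mvn install:install-file -Dfile=dummy.jar -DgroupId=dummy -DartifactId=dummy -Dversion=1 -Dpackaging=jar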

We will be using the JBoss/Wildfly Application Server this semester. This is a fully-compliant JavaEE 7 application server that includes a preview mode for JavaEE 8.

JBoss Application Server/Wildfly (AS) and JBoss Enterprise Application Platform (EAP)

JBoss has a community version (formerly called JBoss AS, renamed Wildfly around 2012) and a commercial version (JBoss EAP) of their JavaEE application server. Both are open source and built off the same code base. In theory, changes propagate through the community version first in daily changes and short iterations, and the commercial version is a roll-up of a stable community version with the ability to purchase support for that specific version. With commercial support you can receive patches for a specific issue prior to upgrading to the latest release. With the community version you pretty much need to keep up with the latest release to get any patches. Of course, with either version you are free to perform your own support and code changes, but you can only get this commercially with the EAP release. There is a newsgroup post and slide show that provides a decent, short description of the two.

JBoss makes the EAP version available for *development* use from jboss.org but lags behind Wildfly (at wildfly.org) for obvious reasons. We will be using the open source/Wildfly version of the server.

JBoss AS/Wildfly version numbers are ahead of JBoss EAP because not every community version becomes a commercial version. JBoss AS 6 was skipped entirely by EAP.

  1. Download Wildfly 13.0.0.Final http://www.wildfly.org/downloads/. The 'Quickstarts' examples are also helpful but class notes, exercises, and guidance may have simplified, generalized, or alternative approaches to what is contained in the guides.

  2. Install JBoss into a directory that does not have any spaces in its path.

    $ unzip ~/Downloads/wildfly-13.0.0.Final.zip                
                    
    $ ls wildfly-13.0.0.Final/
    appclient  bin  copyright.txt  docs  domain  jboss-modules.jar  LICENSE.txt  modules  README.txt  standalone  welcome-content
    
  3. Test the installation by starting the default configuration installation.

    $ ./wildfly-13.0.0.Final/bin/standalone.sh
    =========================================================================
    
      JBoss Bootstrap Environment
    
      JBOSS_HOME: /opt/wildfly-13.0.0.Final
    
      JAVA: java
    
      JAVA_OPTS:  -server -Xms64m -Xmx512m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -Djava.net.preferIPv4Stack=true -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true
    
    =========================================================================
    
    12:09:04,979 INFO  [org.jboss.modules] (main) JBoss Modules version 1.8.5.Final
    12:10:29,008 INFO  [org.jboss.msc] (main) JBoss MSC version 1.4.2.Final
    12:10:29,051 INFO  [org.jboss.threads] (main) JBoss Threads version 2.3.2.Final
    12:10:29,319 INFO  [org.jboss.as] (MSC service thread 1-2) WFLYSRV0049: WildFly Full 13.0.0.Final (WildFly Core 5.0.0.Final) starting
    ...
    
    12:10:56,481 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0060: Http management interface listening on http://127.0.0.1:9990/management
    12:10:56,483 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0051: Admin console listening on http://127.0.0.1:9990
    12:10:56,486 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: WildFly Full 13.0.0.Final (WildFly Core 5.0.0.Final) started in 46320ms - Started 290 of 511 services (308 services are lazy, passive or on-demand)
    

    Note

    There are .sh versions of the scripts for *nix platforms and .bat versions for Windows platforms. Use the one that is appropriate for your environment.

  4. Verify you can access the server
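
    With the default configuration, the public interface listens on port 8080 and the management interface on port 9990 (as shown in the startup log above). For example, point a browser (or curl) at:

    http://localhost:8080            (Wildfly welcome page)
    http://localhost:9990/console    (admin console -- requires the admin user added in a later section)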

  1. Shutdown the server using Control-C

  2. Copy over the class example server files from what you cloned and built from github earlier.

    $ cd wildfly-13.0.0.Final
    
    wildfly-13.0.0.Final]$ unzip -l .../ejava-student/servers/ejava-wildfly1300/target/ejava-wildfly1300-5.0.0-SNAPSHOT-server.zip 
    Archive:  /home/jim/workspaces/ejava-class/ejava-student2/servers/ejava-wildfly1300/target/ejava-wildfly1300-5.0.0-SNAPSHOT-server.zip
      Length      Date    Time    Name
    ---------  ---------- -----   ----
            0  08-13-2018 12:21   domain/
            0  08-13-2018 12:21   domain/configuration/
            0  08-13-2018 12:21   standalone/
            0  08-13-2018 12:21   standalone/configuration/
         1774  08-13-2018 12:21   domain/configuration/application-roles.properties
         2402  08-13-2018 12:21   domain/configuration/application-users.properties
        37888  08-13-2018 12:21   standalone/configuration/standalone.xml
         1774  08-13-2018 12:21   standalone/configuration/application-roles.properties
         1190  08-13-2018 12:21   standalone/configuration/server.keystore
         2490  08-13-2018 12:21   standalone/configuration/application-users.properties
          575  08-13-2018 12:21   standalone/configuration/server.cer
    ---------                     -------
        48093                     11 files
    
    wildfly-13.0.0.Final]$ unzip .../ejava-student/servers/ejava-wildfly1300/target/ejava-wildfly1300-5.0.0-SNAPSHOT-server.zip
    Archive:  .../ejava-student2/servers/ejava-wildfly1300/target/ejava-wildfly1300-5.0.0-SNAPSHOT-server.zip
    replace domain/configuration/application-roles.properties? [y]es, [n]o, [A]ll, [N]one, [r]ename: A
      inflating: domain/configuration/application-roles.properties  
    ...
    
  3. Restart the server

  1. Use the add-user script to add an admin user to the system. Note the password must have at least one digit and one non-alphanumeric character. If you run the application server on a remote machine or under a different account, please use the jboss.user and jboss.password supplied in build/dependencies/pom.xml. JBoss/Wildfly will bypass user credentials when the client executes on the same machine under the same account that started the server.

    ejava-student]$ egrep 'jboss.user|jboss.password' build/dependencies/pom.xml
            <jboss.user>admin</jboss.user>
            <jboss.password>password1!</jboss.password>
    
    $ ./bin/add-user.sh 
    
    What type of user do you wish to add? 
     a) Management User (mgmt-users.properties) 
     b) Application User (application-users.properties)
    (a): 
    
    Enter the details of the new user to add.
    Using realm 'ManagementRealm' as discovered from the existing property files.
    Username : admin
    User 'admin' already exists and is disabled, would you like to... 
     a) Update the existing user password and roles 
     b) Enable the existing user 
     c) Type a new username
    (a): a
    
    What groups do you want this user to belong to? (Please enter a comma separated list, or leave blank for none)[  ]: 
    Updated user 'admin' to file '/opt/wildfly-13.0.0.Final/standalone/configuration/mgmt-users.properties'
    Updated user 'admin' to file '/opt/wildfly-13.0.0.Final/domain/configuration/mgmt-users.properties'
    Updated user 'admin' with groups  to file '/opt/wildfly-13.0.0.Final/standalone/configuration/mgmt-groups.properties'
    Updated user 'admin' with groups  to file '/opt/wildfly-13.0.0.Final/domain/configuration/mgmt-groups.properties'
    Is this new user going to be used for one AS process to connect to another AS process? 
    e.g. for a slave host controller connecting to the master or for a Remoting connection for server to server EJB calls.
    yes/no? no
    
  2. Retry logging into the Admin Application http://localhost:9990/console

If you already have a process listening on localhost:8080 or any of the other JBoss ports on 127.0.0.1, you can switch addresses by editing the interfaces section of standalone.xml. You can also do this at runtime by adding -Djboss.bind.address.management=... and/or -Djboss.bind.address=... on the command line.
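
For example (the address below is just a placeholder -- substitute whatever free address/interface you want the server to use):

    $ ./bin/standalone.sh -Djboss.bind.address=127.0.0.2 -Djboss.bind.address.management=127.0.0.2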

The application server and application clients used in class require a relational database. Application server vendors generally package a lightweight database with their downloads so that the server can be used immediately for basic scenarios. JBoss comes packaged with the H2 database. This database can run in one of three modes:

  • Embedded/in-memory

  • Embedded/file

  • Server-based

File-based versus in-memory allows you to do post-mortem analysis of the database after a test completes. File-based also allows you to initialize the database schema in one process and use the database within another. Using server-based mode allows you to inspect the database while the application is running.

JBoss and the class examples come setup with embedded drivers. You can change the configuration at any time to a server-based configuration using the following instructions.

Choose Right Mode for Right Need

Using embedded mode requires less administration overhead in the test environment.

Using server mode provides access to database state during application execution -- which is good for debugging.

Note

This will create a database folder called "ejava" relative to where you started the database server.
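
As a rough sketch of server mode -- the jar path below is a placeholder for wherever your h2*.jar is located:

    # start H2 with a TCP listener and the browser-based console
    $ java -cp /path/to/h2-x.y.z.jar org.h2.tools.Server -tcp -web
    # by default the TCP server listens on port 9092 and the web console on port 8082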

Note

LOCK_MODE refers to how you want your connection impacted by other transactions in progress. A normal application would want some isolation between transactions, but it is useful to have the UI be able to watch in-progress transactions (i.e., perform dirty reads). The options include:

  • 0 - Read Uncommitted - transaction isolation disabled

  • 1 - Serializable - database is (read and write) locked until transaction commits

  • 3 - Read Committed (default) - read locks released after statement completes

The most current version of Eclipse is Photon. However, August/September is the worst time to switch to the newest Eclipse platform since they normally release in the summer, and the previous release from last year has had a chance to receive bug fixes and performance patches. I am going to follow my own advice and stick with last year's model (Oxygen) -- which has had some time to mature. You may use Photon or another IDE entirely.

  1. Download Eclipse IDE for JavaEE Developers https://www.eclipse.org/downloads/packages/release/oxygen/3a/eclipse-ide-java-ee-developers or latest from https://www.eclipse.org/downloads/packages/release/oxygen or http://eclipse.org/downloads

  2. Unzip the downloaded archive.

    $ tar xzf ~/Downloads/eclipse-jee-xxxx-xx-linux-gtk-x86_64.tar.gz
    
    $ ls eclipse
    artifacts.xml  configuration  dropins  eclipse  eclipse.ini  features  icon.xpm  p2  plugins  readme
    
  3. Create a shortcut for starting Eclipse

  4. Start Eclipse

m2e is a plugin installed into Eclipse that configures Eclipse based on the Maven pom.xml configuration. When adjusting your builds, you should always define changes within the Maven pom.xml and rely on m2e to translate that into Eclipse. Any changes added directly to Eclipse will not be seen by the command-line build.

  1. Add the Java Package Explorer to the JavaEE Perspective. I find this easier to work with than the Project Explorer used by default in the JavaEE perspective.

  2. Import the class examples into Eclipse as a Maven Project

JBoss maintains a set of Eclipse plugins to help with development and use of their products. There are too many to describe -- let alone understand in total. However, we can make use of a few. The primary one is to optionally run/manage Wildfly within Eclipse versus the command line. Follow these steps if you want to enable a few additional productivity JBoss Tools plugins.

  1. Open the Eclipse Marketplace panel using Help->Eclipse Marketplace

  2. Type Wildfly into the search field and press Go

  3. Click Install for the JBoss Tools

  4. Complete the installation steps for JBoss Tools. There are many tools in the repository; not all of them are needed for class, nor is it obvious how to use them without more investigation. Choose the following suggested minimal set.

  5. Define a Server Instance for JBoss

Ant is used in class to wrap command lines and encapsulate the building of classpaths for stand-alone applications. Just download Ant and add it to your PATH.

Note

The latest version of Ant is 1.10.5. Older versions of Ant will work as well (e.g., 1.9.x) if you already have it installed.
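
Similar to the Maven setup earlier, installation boils down to unzipping the download and putting its bin directory on the PATH (paths below are examples from my linux system).

    export ANT_HOME=/opt/apache-ant-1.10.5
    export PATH=$ANT_HOME/bin:$PATH

    $ ant -version
    Apache Ant(TM) version 1.10.5 compiled on ...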

In this chapter you will be introduced to a standard module file structure that contains a class we intend to use in production and a unit test to verify the functionality of the production class. You will be asked to create the directory structure and files and to execute the commands required to build and run the unit test.

Warning

This chapter is optional!!! It contains many tedious steps that are somewhat shell-specific. The intent is to simply introduce the raw data structure and actions that need to take place and then to later automate all of this through Maven. If you wish to just skim the steps -- please do. Please do not waste time trying to port these bash shell commands to your native shell.

Note

This part requires junit.jar. It should have been downloaded for you when you built the class examples and can be located in $M2_REPO/junit/junit/(version)/, where M2_REPO is $HOME/.m2/repository or the location you have specified in the localRepository element of $HOME/.m2/settings.xml.
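
For example, the following locates the JUnit jar within a default local repository location (your version may differ).

    $ ls $HOME/.m2/repository/junit/junit/*/junit-*.jar
    /home/user/.m2/repository/junit/junit/4.12/junit-4.12.jar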

  1. Set a few shell variables to represent root directories. For the purposes of the follow-on steps, PROJECT_BASEDIR is the root directory for this exercise. In the example below, the user has chosen a directory of $HOME/proj/784/exercises to be the root directory for all class exercises and named the root directory for this project "ex1". An alternative for CLASS_HOME might be c:/jhu/784. M2_REPO is the path to your Maven repository.

    export CLASS_HOME=$HOME/proj/784
    export PROJECT_BASEDIR=$CLASS_HOME/exercises/ex1
    mkdir -p $PROJECT_BASEDIR
    cd $PROJECT_BASEDIR
    export M2_REPO=$HOME/.m2/repository
    
  2. Create project directory structure. In this example, the developer used $HOME/proj/784 for all work in this class.

    $PROJECT_BASEDIR
    |-- src
    |   |-- main
    |   |   `-- java
    |   |       `-- myorg
    |   |           `-- mypackage
    |   |               `-- ex1
    |   `-- test
    |       |-- resources
    |       `-- java
    |           `-- myorg
    |               `-- mypackage
    |                   `-- ex1
    `-- target
        |-- classes
        |-- test-classes
        `-- test-reports
        
    
    mkdir -p src/main/java/myorg/mypackage/ex1
    mkdir -p src/test/java/myorg/mypackage/ex1
    mkdir -p src/test/resources
    mkdir -p target/classes
    mkdir -p target/test-classes
    mkdir -p target/test-reports
  3. Add the following Java implementation class to $PROJECT_BASEDIR/src/main/java/myorg/mypackage/ex1/App.java

    package myorg.mypackage.ex1;
    
    
    public class App {
        public int returnOne() { 
            System.out.println( "Here's One!" );
            return 1; 
        }
        public static void main( String[] args ) {
            System.out.println( "Hello World!" );
        }
    }
  4. Add the following Java test class to $PROJECT_BASEDIR/src/test/java/myorg/mypackage/ex1/AppTest.java

    package myorg.mypackage.ex1;
    
    
    import static org.junit.Assert.*;
    import org.junit.Test;
    /**
     * Unit test for simple App.
     */
    public class AppTest {
        @Test
        public void testApp() {
            System.out.println("testApp");
            App app = new App();
            assertTrue("app didn't return 1", app.returnOne() == 1);
        }
    }

    Note

    Make sure you put AppTest.java in the src/test tree.

  5. Compile the application and place it in target/ex1.jar. The compiled classes will go in target/classes.

    javac src/main/java/myorg/mypackage/ex1/App.java -d target/classes
    jar cvf target/ex1.jar -C target/classes .
    jar tf target/ex1.jar
    $ javac src/main/java/myorg/mypackage/ex1/App.java -d target/classes
    $ jar cvf target/ex1.jar -C target/classes .
    added manifest
    adding: myorg/(in = 0) (out= 0)(stored 0%)
    adding: myorg/mypackage/(in = 0) (out= 0)(stored 0%)
    adding: myorg/mypackage/ex1/(in = 0) (out= 0)(stored 0%)
    adding: myorg/mypackage/ex1/App.class(in = 519) (out= 350)(deflated 32%)
    
    $ jar tf target/ex1.jar
    META-INF/
    META-INF/MANIFEST.MF
    myorg/
    myorg/mypackage/
    myorg/mypackage/ex1/
    myorg/mypackage/ex1/App.class
  6. Compile the JUnit test and place the compiled tests in target/test-classes.

    export JUNIT_JARS="$M2_REPO/junit/junit/4.12/junit-4.12.jar:$M2_REPO/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar"
    javac -classpath "target/ex1.jar:$JUNIT_JARS" src/test/java/myorg/mypackage/ex1/AppTest.java -d target/test-classes
  7. Verify you have your "production" class from src/main compiled into target/classes directory, your unit test class from src/test compiled into target/test-classes directory, and the Java archive with the production class is in target directory.

    target
    |-- classes
    |   `-- myorg
    |       `-- mypackage
    |           `-- ex1
    |               `-- App.class
    |-- ex1.jar
    |-- test-classes
    |   `-- myorg
    |       `-- mypackage
    |           `-- ex1
    |               `-- AppTest.class
    `-- test-reports
    
  8. Run the JUnit test framework.

    java -classpath "target/ex1.jar:$JUNIT_JARS:target/test-classes" org.junit.runner.JUnitCore myorg.mypackage.ex1.AppTest
    
    JUnit version 4.12
    .testApp
    Here's One!
    
    Time: 0.003
    
    OK (1 test)
    
  9. Add a test that will fail, re-compile the test class, and re-run.

        //AppTest.java
    
        @Test
        public void testFail() {
            System.out.println("testFail");
            App app = new App();
            assertTrue("app didn't return 0", app.returnOne() == 0);
        }
    javac -classpath "target/ex1.jar:$JUNIT_JARS" src/test/java/myorg/mypackage/ex1/AppTest.java -d target/test-classes
    java -classpath "target/ex1.jar:$JUNIT_JARS:target/test-classes" org.junit.runner.JUnitCore myorg.mypackage.ex1.AppTest
    JUnit version 4.12
    .testApp
    Here's One!
    .testFail
    Here's One!
    E
    Time: 0.007
    There was 1 failure:
    1) testFail(myorg.mypackage.ex1.AppTest)
    java.lang.AssertionError: app didn't return 0
            at org.junit.Assert.fail(Assert.java:93)
            at org.junit.Assert.assertTrue(Assert.java:43)
            at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:26)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    ...
            at org.junit.runner.JUnitCore.main(JUnitCore.java:45)
    
    FAILURES!!!
    Tests run: 2,  Failures: 1

This chapter demonstrates the basics of automating the manual steps in the previous chapter using the Apache Ant build tool. If you just skim through this chapter, please be sure to take note of how everything gets explicitly defined in Ant. There are few rules of the road or standard defaults to live by. That will be a big contrast when working with Maven.

Note

All course examples and projects submitted will use Maven. Ant will be used to wrap command lines for Java SE clients executed outside the normal build environment. However, this exercise shows Ant only being used as part of the artifact build and test environment as a stepping stone to understanding some of the basic build and test concepts within Maven.

Note

If you do not have Ant installed on your system, it can be downloaded from http://ant.apache.org/

Warning

This chapter is optional!!! It contains many tedious steps to set up a module build using the Ant build tool -- which will not be part of class. It is presented here as an alternative to building the module with shell scripts. If you wish to just skim the steps -- please do. Please do not waste time trying to get Ant to build your Java modules for this class.

  1. Create a build.properties file in $PROJECT_BASEDIR. This will be used to define any non-portable property values. Place the most non-portable base variables (e.g., M2_REPO location) towards the top and build lower-level paths from them. This makes the scripts much easier to port to another environment. If you still have your maven repository in your $HOME directory, you can make use of the ${user.home} property rather than a hard-coded path.

    #ex1 build.properties
    #M2_REPO=c:/jhu/repository
    M2_REPO=${user.home}/.m2/repository
    
    junit.classpath=${M2_REPO}/junit/junit/4.10/junit-4.10.jar
  2. Create a build.xml file in $PROJECT_BASEDIR. Note the following key elements.

    • project - a required root for build.xml files

      • name - not significant, but helpful

      • default - the target to run if none is supplied on command line

      • basedir - specifies current directory for all tasks

    • property - defines an immutable name/value

      • file - imports declarations from a file; in this case build.properties created earlier

      • name/value - specifies a property within the script

    • target - defines an entry point into the build.xml script. It hosts one or more tasks.

      • name - defines name of target, which can be supplied on command line.

    • echo - a useful Ant task to printout status and debug information. See Ant docs for more information.

    
    <?xml version="1.0" encoding="utf-8" ?> 
    <!-- ex1 build.xml 
    -->
    <project name="ex1" default="" basedir=".">
        <property file="build.properties"/>

        <property name="artifactId" value="ex1"/>
        <property name="src.dir"    value="${basedir}/src"/>
        <property name="build.dir"  value="${basedir}/target"/>

        <target name="echo">
            <echo>basedir=${basedir}</echo>
            <echo>artifactId=${artifactId}</echo>
            <echo>src.dir=${src.dir}</echo>
            <echo>build.dir=${build.dir}</echo>
            <echo>junit.classpath=${junit.classpath}</echo>
        </target>
    </project>
  3. Sanity check your build.xml and build.properties file with the echo target.

    $ ant echo
    Buildfile: /home/jim/proj/784/exercises/ex1/build.xml
    
    echo:
         [echo] basedir=/home/jim/proj/784/exercises/ex1
         [echo] artifactId=ex1
         [echo] src.dir=/home/jim/proj/784/exercises/ex1/src
         [echo] build.dir=/home/jim/proj/784/exercises/ex1/target
         [echo] junit.classpath=/home/jim/.m2/repository/junit/junit/4.10/junit-4.10.jar
    
    BUILD SUCCESSFUL
    Total time: 0 seconds
  4. Add the "package" target to compile and archive your /src/main classes. Note the following tasks in this target.

    • mkdir - creates a directory. See Ant Mkdir docs for more information.

    • javac - compiles java sources files. See Ant Javac docs for more information. Note that we are making sure we get JavaSE 8 classes compiled.

    • jar - builds a java archive. See Ant Jar Docs for more information.

    
        <target name="package">
            <mkdir dir="${build.dir}/classes"/>
            <javac srcdir="${src.dir}/main/java"
                   destdir="${build.dir}/classes"
                   debug="true"
                   source="1.8"
                   target="1.8"
                   includeantruntime="false">
                   <classpath>
                   </classpath>
            </javac>

            <jar destfile="${build.dir}/${artifactId}.jar">
                <fileset dir="${build.dir}/classes"/>
            </jar>
        </target>
  5. Execute the "package" target just added. This should compile the production class from src/main into target/classes and build a Java archive with the production class in target/.

    $ rm -rf target/; ant package
    Buildfile: /home/jim/proj/784/exercises/ex1/build.xml
    
    package:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar
    
    BUILD SUCCESSFUL
    Total time: 2 seconds

    Note

    You may get the following error when you execute the javac task. If so, export JAVA_HOME=(path to JDK_HOME) on your system to provide Ant a reference to a JDK instance.

    build.xml:26: Unable to find a javac compiler;
    com.sun.tools.javac.Main is not on the classpath.
    Perhaps JAVA_HOME does not point to the JDK.
    It is currently set to ".../jre"

    After a successful build, the project tree should look like the following:

    $ find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
  6. Add the "test" target to compile your /src/test classes. Make this the default target for your build.xml file. Note too that it should depend on the successful completion of the "package" target and include the produced archive in its classpath.

    
    <project name="ex1" default="test" basedir=".">
    ...
        <target name="test" depends="package">
            <mkdir dir="${build.dir}/test-classes"/>
            <javac srcdir="${src.dir}/test/java"
                   destdir="${build.dir}/test-classes"
                   debug="true"
                   source="1.8"
                   target="1.8"
                   includeantruntime="false">
                   <classpath>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement path="${junit.classpath}"/>
                   </classpath>
            </javac>
        </target>
  7. Execute the new "test" target after clearing out the contents of the target directory. Note that the target directory gets automatically re-populated with the results of the "package" target and augmented with the test class from src/test compiled into target/test-classes.

    $ rm -rf target/; ant
    Buildfile: /home/jim/proj/784/exercises/ex1/build.xml
    
    package:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
    
    BUILD SUCCESSFUL
    Total time: 3 seconds
    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    
  8. Add the junit task to the test target. The junit task is being configured to run in batch mode and write TXT and XML reports to the target/test-reports directory. See Ant docs for more details on the junit task. Make special note of the following:

    • printsummary - produce a short summary to standard out showing the number of tests run and a count of errors, etc.

    • fork - since Ant runs in a JVM, any time you run a task that requires a custom classpath, it is usually required that it be forked into a separate process (with its own classpath).

    • batchtest - run all tests found and write results of each test into the test-reports directory.

    • formatter - write a text and XML report of results

    
            <mkdir dir="${build.dir}/test-reports"/>
            <junit printsummary="true" fork="true">
                   <classpath>
                       <pathelement path="${junit.classpath}"/>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement location="${build.dir}/test-classes"/>
                   </classpath>

                <batchtest fork="true" todir="${build.dir}/test-reports">
                    <fileset dir="${build.dir}/test-classes">
                        <include name="**/*Test*.class"/>
                    </fileset>
                </batchtest>

                <formatter type="plain"/>
                <formatter type="xml"/>
            </junit>

    Note

    A few years ago when I sanity checked this exercise I got the common error below. I corrected the issue by downloading a full installation from the Ant website, exporting my ANT_HOME to the root of that installation (export ANT_HOME=/opt/apache-ant-1.9.4), and adding $ANT_HOME/bin to the PATH (export PATH=$ANT_HOME/bin:$PATH). ANT_HOME is required for Ant to locate the junit task.

    BUILD FAILED
    
    /home/jim/proj/784/exercises/ex1/build.xml:57: Problem: failed to create task or type junit
    Cause: the class org.apache.tools.ant.taskdefs.optional.junit.JUnitTask was not found.        
            This looks like one of Ant's optional components.
    Action: Check that the appropriate optional JAR exists in
            -/usr/share/ant/lib
            -/home/jim/.ant/lib
            -a directory added on the command line with the -lib argument
    
    Do not panic, this is a common problem.
    The commonest cause is a missing JAR.
    
    This is not a bug; it is a configuration problem
  9. Execute the updated "test" target with the JUnit test.

    $ rm -rf target; ant
    package:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.143 sec
        [junit] Test myorg.mypackage.ex1.AppTest FAILED
    
    BUILD SUCCESSFUL
    Total time: 17 seconds
    $ find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.xml

    Note

    Note that the 17 seconds it took to run/complete the test seems excessive. I was able to speed that up to 0.001 sec by commenting out the XML report option (which we will not use in this exercise).

  10. Test output of each test is in the TXT and XML reports.

    $ more target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt 
    Testsuite: myorg.mypackage.ex1.AppTest
    Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.246 sec
    ------------- Standard Output ---------------
    testApp
    Here's One!
    testFail
    Here's One!
    ------------- ---------------- ---------------
    
    Testcase: testApp took 0.007 sec
    Testcase: testFail took 0.022 sec
            FAILED
    app didn't return 0
    junit.framework.AssertionFailedError: app didn't return 0
            at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:26)
  11. Add a clean target to the build.xml file to delete built artifacts. See Ant docs for details on the delete task.

    
        <target name="clean">
            <delete dir="${build.dir}"/>
        </target>
  12. Re-run and use the new "clean" target you just added.

    $ ant clean test
    Buildfile: /home/jim/proj/784/exercises/ex1/build.xml
    
    clean:
       [delete] Deleting directory /home/jim/proj/784/exercises/ex1/target
    
    package:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.123 sec
        [junit] Test myorg.mypackage.ex1.AppTest FAILED
    
    BUILD SUCCESSFUL
    Total time: 17 seconds
  13. Comment out the bogus testFail and rerun.

    $ cat src/test/java/myorg/mypackage/ex1/AppTest.java
    ...
        //@Test
        public void testFail() {
    $ ant clean test
    Buildfile: /home/jim/proj/784/exercises/ex1/build.xml
    
    clean:
       [delete] Deleting directory /home/jim/proj/784/exercises/ex1/target
    
    package:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.161 sec
    
    BUILD SUCCESSFUL
    Total time: 17 seconds

In this chapter we will refine the use of print and debug statements by using a "logger". By adopting a logger into your production and test code you can avoid print statements to stdout/stderr and instead re-direct output to log files, databases, messaging topics, etc. There are several logging options to choose from (Java's built-in logger, the Commons Logging API, the SLF4J API, and log4j to name a few). This course uses the SLF4J API and the log4j implementation.

  1. Change the System.out.println() calls in App and AppTest from the earlier chapters to use the SLF4J logging API. The slf4j-api Javadoc and manual will be helpful in understanding this interface.

    package myorg.mypackage.ex1;
    
    
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    public class App {
        private static Logger logger = LoggerFactory.getLogger(App.class);
        public int returnOne() { 
            //System.out.println( "Here's One!" );
            logger.debug( "Here's One!" );
            return 1; 
        }
        public static void main( String[] args ) {
            //System.out.println( "Hello World!" );
            logger.info( "Hello World!" );
        }
    }
    package myorg.mypackage.ex1;
    
    ...
    import static org.junit.Assert.*;
    import org.junit.Test;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    public class AppTest {
        private static Logger logger = LoggerFactory.getLogger(AppTest.class);
    ...
        @Test
        public void testApp() {
            //System.out.println("testApp");
            logger.info("testApp");
            App app = new App();
            assertTrue("app didn't return 1", app.returnOne() == 1);
        }
    }
  2. Add a log4j.xml configuration file to the directory structure. Place this file in src/test/resources/log4j.xml. This file is used to control logging output. Refer to the log4j manual for more information on how to configure and use log4j. It doesn't matter whether you use a log4j.xml format or log4j.properties format. Of note, the implementation we are using is based on "Log4j 1" -- which reached its end of life in 2015. It still works but all current development energy is within "Log4j 2".

    
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">

    <log4j:configuration 
        xmlns:log4j="http://jakarta.apache.org/log4j/" 
        debug="false">
       
        <appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
            <param name="Target" value="System.out"/>

            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern" 
                       value="%-5p %d{dd-MM HH:mm:ss,SSS} (%F:%M:%L)  -%m%n"/>
            </layout>
        </appender>

        <appender name="logfile" class="org.apache.log4j.RollingFileAppender">
            <param name="File" value="target/log4j-out.txt"/>
            <param name="Append" value="false"/>
            <param name="MaxFileSize" value="100KB"/>
            <param name="MaxBackupIndex" value="1"/>
            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern" 
                       value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L)  -%m%n"/>
            </layout>
       </appender>

       <logger name="myorg.mypackage">
          <level value="debug"/>
          <appender-ref ref="logfile"/>  
       </logger>

       <root>
          <priority value="info"/>    
          <appender-ref ref="CONSOLE"/>  
       </root>   
       
    </log4j:configuration>

    Note

    The log4j.xml is placed in the JVM classpath, where log4j will locate it by default. However, it should not be packaged with the main classes (ex1.jar). Placing it in our JAR file would pollute the application assembler and deployer's job of specifying the correct configuration file at runtime. Our test classes and resources are not a part of follow-on deployment.

  3. Add the slf4j-api.jar to the compile classpaths and the slf4j-api.jar, slf4j-log4j.jar, and log4j.jar to the runtime classpath used during tests. Also add a task to copy the log4j.xml file into target/test-classes so that it is seen by the classloader as a resource. Realize that your classes have no compile-time dependency on log4j; log4j is only used if it is located at runtime.

    # ex1 build.properties
    junit.classpath=${M2_REPO}/junit/junit/4.12/junit-4.12.jar:\
    ${M2_REPO}/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar
    
    slf4j-api.classpath=${M2_REPO}/org/slf4j/slf4j-api/1.7.25/slf4j-api-1.7.25.jar
    slf4j-log4j.classpath=${M2_REPO}/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar
    log4j.classpath=${M2_REPO}/log4j/log4j/1.2.17/log4j-1.2.17.jar
    
    
        <target name="echo">
            ...
            <echo>slf4j-api.classpath=${slf4j-api.classpath}</echo>
            <echo>slf4j-log4j.classpath=${slf4j-log4j.classpath}</echo>
            <echo>log4j.classpath=${log4j.classpath}</echo>
        </target>
    
            <javac srcdir="${src.dir}/main/java"
                   destdir="${build.dir}/classes"
                   debug="true"
                   source="1.8"
                   target="1.8"
                   includeantruntime="false">
                   <classpath>
                       <pathelement path="${slf4j-api.classpath}"/>
                   </classpath>
            </javac>
    
            <javac srcdir="${src.dir}/test/java"
                   destdir="${build.dir}/test-classes"
                   debug="true"
                   source="1.8"
                   target="1.8"
                   includeantruntime="false">
                   <classpath>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement path="${junit.classpath}"/>
                       <pathelement path="${slf4j-api.classpath}"/>
                   </classpath>
            </javac>

            <copy todir="${build.dir}/test-classes">
                <fileset dir="${src.dir}/test/resources"/>
            </copy>
    
            <junit printsummary="true" fork="true">
                   <classpath>
                       <pathelement path="${junit.classpath}"/>
                       <pathelement location="${build.dir}/${artifactId}.jar"/>
                       <pathelement location="${build.dir}/test-classes"/>
                       <pathelement path="${commons-logging.classpath}"/>
                       <pathelement path="${log4j.classpath}"/>
                   </classpath>
            ...
  4. Test the application and inspect the reports. All loggers inherit from the root logger and may only extend its definition, not limit it. Notice that the root logger's priority filter value of "info" allows log.info() (and higher: warn, error, fatal) messages to be printed to the console. The myorg.mypackage logger's level filter allows log.debug() messages from the myorg.mypackage.* classes to appear in both the console and the logfile. This means that any Java classes not in our package hierarchy will only have INFO or higher priority messages logged.

    $ ant clean test
    Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
    
    clean:
       [delete] Deleting directory /home/jcstaff/proj/784/exercises/ex1/target
    
    package:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
          [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
    
    test:
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
         [copy] Copying 1 file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
        [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-reports
        [junit] Running myorg.mypackage.ex1.AppTest
        [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.127 sec
    
    BUILD SUCCESSFUL
    Total time: 17 seconds

    You won't see the output come to stdout when using Ant, but you can locate all of the output written by the logfile appender, which is configured to write to target/log4j-out.txt. This behavior will get a little better under Maven.

    $ more target/log4j-out.txt 
    INFO  13-08 18:24:21,983 [myorg.mypackage.ex1.AppTest] (AppTest.java:testApp:18)  -testApp
    DEBUG 13-08 18:24:21,986 [myorg.mypackage.ex1.App] (App.java:returnOne:11)  -Here's One!
    

    Your project structure should look like the following at this point.

    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/test/resources/log4j.xml
    ./build.properties
    ./build.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/ex1.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    ./target/test-classes/log4j.xml
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt
    ./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.xml
    ./target/log4j-out.txt
  5. Change the logging configuration so that only the App class logs to the logfile. By extending the logger name specification all the way down to the class, we further limit which classes this logger applies to.

    
        <logger name="myorg.mypackage.ex1.App">
          <level value="debug"/>
          <appender-ref ref="logfile"/>
        </logger>

    After re-running the build you should notice that DEBUG output is now included only for the App class, because of the change we made to the logger configuration outside of the code.

    $ more target/log4j-out.txt 
    DEBUG 26-08 23:07:04,809 [myorg.mypackage.ex1.App] (App.java:returnOne:11)  -Here's One!
  6. Repeat after me: "I will never use System.out.println() in this class." Doing so would make it difficult to control and access the log output of your components as they are instantiated in unit testing, integration testing, and deployment environments.
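
    To make the point concrete, here is a hedged contrast (the LoggingStyle class name is hypothetical and used only for illustration): a logger call can be filtered by level, routed by appenders, and formatted lazily, while System.out.println() always writes to stdout regardless of configuration.

        import org.slf4j.Logger;
        import org.slf4j.LoggerFactory;

        public class LoggingStyle {  // hypothetical class used only for illustration
            private static final Logger log = LoggerFactory.getLogger(LoggingStyle.class);

            void report(int value) {
                // discouraged: always prints to stdout and ignores the log4j.xml configuration
                System.out.println("value=" + value);

                // preferred: level-controlled, routed by appenders, and only formatted when DEBUG is enabled
                log.debug("value={}", value);
            }
        }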

In this chapter you will automate the build using Maven by adding a simple Maven project definition to your project tree. In the previous chapters you worked with a reasonable project tree that could have been laid out in a number of different ways and handled with different path constructs. However, why be different? The project tree we put together, which accounts for production classes, test classes, resource files, archives, unit tests, test reports, etc., follows Maven's standard build tree almost exactly (with the exception of the name of the target/test-reports directory). We will be able to add a Maven project definition without much effort.

Tip

The Maven community has a tremendous amount of documentation, examples, and on-line discussions. This course has many examples that are more specific for the work you will be actively performing. Many of these resources are a quick google search away but know that I go out of my way to make sure you spend as much time as possible on design and JavaEE aspects in class. If you are stuck on Maven -- ask. I know what you are trying to do and can likely point you to an example that is relevant to what you are doing in class. If you are still stuck on Maven issues -- send it to me. I will fix it personally. There is nothing more irritating for you than to be fighting with the builds when you want to be spending more time understanding, designing, trying, and mastering the product of what is being built.

Note

Using Maven requires only an initial download and installation. Plugins and dependencies will be downloaded from remote repositories as needed. Connectivity to the internet is required until all dependencies have been satisfied.

Note

Maven will automatically go out and download any missing dependencies and recursively download what they depend upon. If you are running Maven for the first time, this could result in a significant amount of downloading and may encounter an occasional connection failure with repositories. Once a non-SNAPSHOT version is downloaded (e.g., 1.3), Maven will not re-attempt to download it. Maven will, however, go out and check various resources to stay in sync. If you know you already have everything you need, you can run in off-line mode using the "-o" flag on the command line or its equivalent entry within the settings.xml file. This can save you seconds of build time when disconnected from the Internet. If you want to make sure that all dependencies have been resolved, use the mvn dependency:go-offline goal to eagerly resolve all dependencies.

  1. Create a pom.xml file in the project basedir. This will be used to define your entire project. Refer to the Maven POM Reference for details about each element.

    • modelVersion - yes, it's required

    • groupId - just as it sounds, this value is used to group related artifacts. groupId is a hierarchical value and the individual names are used to form a directory structure in the Maven repository (e.g., artifacts in the myorg.myproject.foo groupId will be located below the HOME/.m2/repository/myorg/myproject/foo directory).

    • version - Maven has a strong versioning system and versions appended with the word SNAPSHOT are handled differently. Projects with a version ending in -SNAPSHOT are thought to be in constant change, with no official release yet available. Projects with a version lacking the -SNAPSHOT ending are meant to be an official release, with no other variants available with the same tag.

    • dependency.scope - this is used to define the scope in which the dependency gets applied. It defines the visibility of the dependency within the project and whether it is carried along with the module through transitive dependency analysis. With open-source software, a typical JavaEE application could have 10s to 100s of individual modules it depends upon, and the proper use of transitive dependencies makes this manageable.

      • scope=compile is the default and is used to describe artifacts that the src/main directory depends upon and will also be visible by classes in src/test. These dependency artifacts will be brought along with the module when transitive dependencies are evaluated.

      • scope=test is used to define artifacts which src/test depends upon. These will be made available during testing, but will not be visible to classes in src/main and will not be considered a dependency for downstream users of the module. Consult the maven documentation for other scopes, but one other that is commonly used in class is scope=provided.

      • scope=provided is similar to scope=compile in that the src/main tree can see it; however, like scope=test, it is not carried forward. Each downstream module is required to know about the dependency and provide a replacement. This is common when using JavaEE APIs, which may be packaged by different vendors and chosen independently by different module developers.

      • maven-compiler-plugin - this declaration is only necessary if we need to override the default Java version, like we did in our Ant script.

      • properties.project.build.sourceEncoding - this defines the default handling of file content for all plugins within a module. The default is platform-specific if left unspecified and we avoid an annoying warning by specifying it.

    Note

    Although the m2e Eclipse plugin reads the pom dependency and creates a classpath within Eclipse, it does not honor the differences between the different scope values. All dependencies are blended together when inside the IDE. The result is that something may compile and run fine within Eclipse and report a missing class when built at the command line. If that happens, check for classes using artifacts that have been brought in as scope=test or for classes incorrectly placed within the src/main tree.
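
    As a concrete (and hypothetical) illustration of the pitfall: a class placed under src/main/java that references a test-scoped artifact such as JUnit will compile inside Eclipse, where m2e blends all scopes onto one classpath, but will fail a command-line mvn build with a "package org.junit does not exist" style error.

        package myorg.mypackage.ex1;

        import org.junit.Assert;  // junit is declared with scope=test in the POM

        // hypothetical example of the mistake -- do not place test-only usage in src/main
        public class BadMainClass {
            public void check(int value) {
                // compiles in Eclipse (blended scopes), fails in the command-line build
                Assert.assertTrue("value must be positive", value > 0);
            }
        }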

    
    <?xml version="1.0"?>
    <project>
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.myproject</groupId>
        <artifactId>ex1</artifactId>

        <name>My First Simple Project</name>
        <version>1.0-SNAPSHOT</version>

        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        </properties>

        <dependencies>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <version>1.7.25</version>
                <scope>compile</scope>
            </dependency>

            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.12</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
                <version>1.7.25</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
                <version>1.2.17</version>
                <scope>test</scope>
            </dependency>
        </dependencies>

        <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>3.7.0</version>
                    <configuration>
                        <source>1.8</source>
                        <target>1.8</target>
                    </configuration>
                </plugin>
            </plugins>
        </build>
    </project>

    Your project tree should look like the following at this point.

    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/test/resources/log4j.xml
    ./build.properties
    ./build.xml
    ./pom.xml
  2. Note that the pom.xml file is not required to have an assigned schema. However, adding one does allow for XML editing tools to better assist in creating a more detailed POM. Replace the project element from above with the following declarations to assign an XML schema.

    
    <project xmlns="http://maven.apache.org/POM/4.0.0" 
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  3. Run the package "phase" and watch the project compile, assemble, and test. Maven has many well-known phases that correspond to the lifecycle of build steps that goes into validating, preparing, building, testing, and deploying artifacts of a module. You can find out more about Maven phases here I refer to this page very often.

    $ mvn package
    [INFO] Scanning for projects...
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building JavaSE::Simple Module Exercise Solution 5.0.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO]
    [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ firstSimpleModuleEx ---
    [INFO] Using 'UTF-8' encoding to copy filtered resources.
    [INFO] skip non existing resourceDirectory /home/jim/proj/784/exercises/ex1/src/main/resources
    [INFO]
    [INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ firstSimpleModuleEx ---
    [INFO] Changes detected - recompiling the module!
    [INFO] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
    [INFO]
    [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ firstSimpleModuleEx ---
    [INFO] Using 'UTF-8' encoding to copy filtered resources.
    [INFO] Copying 1 resource
    [INFO]
    [INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ firstSimpleModuleEx ---
    [INFO] Changes detected - recompiling the module!
    [INFO] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
    [INFO]
    [INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ firstSimpleModuleEx ---
    [INFO] Surefire report directory: /home/jim/proj/784/exercises/ex1/target/surefire-reports
    
    -------------------------------------------------------
     T E S T S
    -------------------------------------------------------
    Running myorg.mypackage.ex1.AppTest
    INFO  13-08 19:03:19,264 (AppTest.java:testApp:18)  -testApp
    DEBUG 13-08 19:03:19,267 (App.java:returnOne:11)  -Here's One!
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.13 sec
    
    Results :
    
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
    
    [INFO]
    [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ firstSimpleModuleEx ---
    [INFO] Building jar: /home/jim/proj/784/exercises/ex1/target/firstSimpleModuleEx-5.0.0-SNAPSHOT.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 1.553 s
    [INFO] Finished at: 2018-08-13T19:03:19-04:00
    [INFO] Final Memory: 19M/309M
    [INFO] ------------------------------------------------------------------------
    [WARNING] The requested profile "h2db" could not be activated because it does not exist.
    

    Ignore WARNING for Non-existent Profile

    You were asked to declare an h2db profile as active within $HOME/.m2/settings.xml during the software installation instructions. Maven will warn you about any requested profile that is not found within the module, to help identify spelling errors. In this case, we simply do not need the profile and have not defined it.

  4. The contents of your development tree should look as follows.

    > find . -type f
    ./build.xml
    ./build.properties
    ./pom.xml
    ./target/surefire-reports/TEST-myorg.mypackage.ex1.AppTest.xml
    ./target/surefire-reports/myorg.mypackage.ex1.AppTest.txt
    ./target/log4j-out.txt
    ./target/maven-archiver/pom.properties
    ./target/ex1-1.0-SNAPSHOT.jar
    ./target/test-classes/myorg/mypackage/ex1/AppTest.class
    ./target/test-classes/log4j.xml
    ./target/classes/myorg/mypackage/ex1/App.class
    ./target/maven-status/...
    ./src/test/resources/log4j.xml
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/main/java/myorg/mypackage/ex1/App.java
    • src/main/java classes were built in the target/classes directory by convention by the maven-compiler-plugin that is automatically wired into JAR module builds. We didn't have to configure it because we structured our project using the Maven directory structure and used the default packaging=jar module type (since packaging=jar is the default, it could be left unspecified). Many of the standard features are enacted for modules with the packaging=jar type.

    • src/test/java classes were built in the target/test-classes directory by convention by the maven-compiler-plugin.

    • src/test/resources files were copied to the target/test-classes directory by convention by the maven-resources-plugin that is automatically wired into JAR module builds.

    • test cases were run and their reports were placed in target/surefire-reports by convention by the maven-surefire-plugin that is automatically wired into JAR module builds.

    • The build.xml and build.properties files from our work with Ant are still allowed to exist. We could even delegate from Maven to Ant using the maven-antrun-plugin if we had legacy build.xml scripts that we wanted to leverage.

  5. For *fun*, let's add a README that describes something about your project and have it processed as part of the documentation for the module. You do not need to do this for class projects, but walking through it may be helpful in understanding how the course website is created from the source you have on your disk. Maven supports a couple of documentation generation languages, but let's just use HTML to keep this simple. Place the following content in src/site/resources/README.html

    
    $ mkdir -p src/site/resources
    $ cat src/site/resources/README.html 

    <?xml version="1.0"?>
    <html>
        <head>
            <title>My First Project</title>
        </head>
    <body>
        <section><h1>My First Project</h1>
        <p/>
        This is my first project. Take a look at 
        <p/>
        <ul>
            <li>this ....</li>
            <li>that ....</li>
            <li>or <a href="./index.html">go home</a></li>
        </ul>
        </section>
    </body>
    </html>
  6. The above is enough to provide the page. Now add a link to it from the project menu. Add the following content to src/site/site.xml

    
    $ cat src/site/site.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <project name="${project.name}">
      <body>
        <menu name="Content">
            <item name="README" href="README.html"/>
        </menu>
      </body>
    </project>

    You must also specify a version number for the maven-site-plugin and maven-project-info-reports-plugin in the pom.xml. Maven is extremely version-aware.

    
    <plugins>
    ...
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-site-plugin</artifactId>
                    <version>3.4</version>
                </plugin>

                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-project-info-reports-plugin</artifactId>
                    <version>2.8</version>
                </plugin>
    </plugins>

    Your project tree should look like the following at this point.

    > find . -type f
    ./src/main/java/myorg/mypackage/ex1/App.java
    ./src/test/java/myorg/mypackage/ex1/AppTest.java
    ./src/test/resources/log4j.xml
    ./src/site/resources/README.html
    ./src/site/site.xml
    ./build.properties
    ./build.xml
    ./pom.xml
  7. Build the site and open target/site/index.html in your browser. You should see a link to the README on the left side.

    $ mvn site                                                                                                                                                       
    [INFO] Scanning for projects...   
    ...
    [INFO] BUILD SUCCESS
    $ find target/site/ -name "*.html"
    
    target/site/plugin-management.html
    target/site/index.html
    target/site/mail-lists.html
    target/site/issue-tracking.html
    target/site/license.html
    target/site/project-info.html
    target/site/dependency-info.html
    target/site/README.html
    target/site/dependencies.html
    target/site/team-list.html
    target/site/source-repository.html
    target/site/integration.html
    target/site/distribution-management.html
    target/site/project-summary.html
    target/site/plugins.html

    Note

    If you use the posted firstSimpleModuleEx as a starting point for your work you will need to re-enable site generation under the maven-site-plugin. This was turned off since the posted examples do not contain enough information to be posted with the rest of the class examples.

    
    
                <!-- exclude this modules from site generation -->
                <plugin> 
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-site-plugin</artifactId>
                    <version>3.4</version>
                    <configuration>
                        <skip>false</skip>
                        <skipDeploy>false</skipDeploy>
                    </configuration>
                </plugin>
  8. Okay, that was a lot of work just to copy an HTML file. Now let's add javadoc to our project and create a link to it. Add the following contents to the bottom of the pom.xml file.

    
    ...
        </build>
        <reporting>
            <plugins>
                <plugin>
                    <artifactId>maven-javadoc-plugin</artifactId>
                    <groupId>org.apache.maven.plugins</groupId>
                    <version>3.0.1</version>
                    <configuration>
                        <detectLinks>false</detectLinks>
                        <detectOfflineLinks>true</detectOfflineLinks>
                        <show>private</show>
                        <source>1.8</source>
                        <additionalparam>-Xdoclint:none</additionalparam>
                        <failOnError>false</failOnError>
                        <links>
                            <link>http://download.oracle.com/javase/8/docs/api/</link>
                            <link>https://javaee.github.io/javaee-spec/javadocs/</link>
                        </links>
                    </configuration>
                </plugin>
            </plugins>
        </reporting>
  9. We could create a link to the apidocs/index.html like we did with README.html, but that would be something we'd have to keep updating each time we added a new report. Let's add a reports menu reference to the site.xml so links to the Javadoc and other reports can drop in automatically.

    
    # src/site/site.xml
    <?xml version="1.0" encoding="UTF-8"?>
    <project name="${project.name}">
      <body>
        <menu name="Content">
            <item name="README" href="README.html"/>
        </menu>

        <menu ref="reports"/>

      </body>
    </project>
  10. Re-generate the site documentation with the site phase. Open the target/site/index.html page and you should now see a menu item for "Project Reports" -> "JavaDocs". Our App class should be included in the Javadoc.

  11. Note

    The pom.xml file is the main configuration source for 99% of what you develop with Maven. There is an additional $HOME/.m2/settings.xml file where you can specify build site-specific properties. These will be available to all pom.xml files. You want to be careful not to over-populate the settings.xml file (taking advantage of its re-usable specification) since it will make your pom.xml files too dependent on a particular build location. Refer to the Settings Descriptor for detailed information on settings.xml. The following provides a step-wise generation of the settings.xml file you put in place during Development Environment Setup. Read through this for reference since you likely already have everything in place you need.

    Let's start a settings.xml file to store properties that are specific to our build site. You can find details about each setting in the Settings Descriptor referenced above.

    
    cat $HOME/.m2/settings.xml

    <?xml version="1.0"?>
    <settings xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">

    </settings>
  12. If your $HOME directory path contains spaces, you will want to provide an override for the localRepository. Provide a custom path that does not contain spaces. This value will default to a "$HOME/.m2/repository" directory.

    
        <!-- this overrides the default $HOME/.m2/repository location. -->
        <localRepository>c:/jhu/repository</localRepository>
  13. Add the following specification to either the settings.xml file or the local pom.xml file. If you specify it in the local pom.xml file -- it will only apply to that project. If you specify it in the settings.xml file -- it will be global to all projects in your area. More will be covered on this later. However, it should be noted that this profile is not active unless someone specifically asks for it (-Pdebugger) or the "debugger" system property is set (-Ddebugger=(anything)).

    
        <profiles>
            <profile>
                <id>debugger</id>
                <!-- this should only be activated when performing interactive
                     debugging -->
                <activation>
                    <property>
                        <name>debugger</name>
                    </property>
                </activation>
                <properties>
                    <surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
                </properties>                                  
            </profile>
        </profiles>
  14. Although not needed for this class -- at times you will need access to a dependency that is not available in a Maven repository. COTS libraries are generally not available at ibiblio.org. You must download it and manually install it locally.

    This step will go through importing a stand-alone archive into the repository to resolve the dependency. Start by declaring the dependency before we do the import. Note that a new scope element was added. See the Dependency Mechanism Intro Page for a discussion of scope; in this case it indicates that the artifact should only be present on the compile classpath and not carried along on the runtime classpath.

    
            <dependency>
                <groupId>foo</groupId>
                <artifactId>bar</artifactId>
                <version>1.1</version>
                <scope>provided</scope>
            </dependency>
  15. Attempt to build the module with the missing dependency. The build should fail, but note that Maven attempted all known external repositories.

    > mvn package
    [INFO] Scanning for projects...
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building My First Simple Project 1.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2/foo/bar/1.1/bar-1.1.pom
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2-snapshot/foo/bar/1.1/bar-1.1.pom
    Downloading: http://repo1.maven.org/maven2/foo/bar/1.1/bar-1.1.pom
    [WARNING] The POM for foo:bar:jar:1.1 is missing, no dependency information available
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2/foo/bar/1.1/bar-1.1.jar
    Downloading: http://webdev.apl.jhu.edu/~jcs/maven2-snapshot/foo/bar/1.1/bar-1.1.jar
    Downloading: http://repo1.maven.org/maven2/foo/bar/1.1/bar-1.1.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 1.437s
    [INFO] Finished at: Wed Feb 02 12:20:51 EST 2011
    [INFO] Final Memory: 2M/15M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal on project ex1: Could not resolve dependencies for project myorg.myproject:ex1:jar:1.0-SNAPSHOT: 
    Could not find artifact foo:bar:jar:1.1 in webdev-baseline (http://webdev.apl.jhu.edu/~jcs/maven2) -> [Help 1]
  16. The old error message provided by Maven 2 was much more helpful if a manual install was what you really needed; the newer (Maven 3) message does not provide instructions. In this case, manually install a jar file that satisfies the declaration. Assign it a groupId of foo, an artifactId of bar, and a version of 1.1. Don't forget to add -DgeneratePom=true or you will get a download warning every time you try to build. All we need is a valid .jar file. If you don't have one lying around, just create an empty placeholder file as shown below.

    $ touch bar.jar
    $ mvn install:install-file -DgroupId=foo -DartifactId=bar -Dversion=1.1 -Dpackaging=jar -Dfile=bar.jar -DgeneratePom=true
    
    [INFO] Scanning for projects...
    [INFO]                                                                         
    [INFO] ------------------------------------------------------------------------
    [INFO] Building My First Simple Project 1.0-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [INFO] 
    [INFO] --- maven-install-plugin:2.4:install-file (default-cli) @ ex1 ---
    [INFO] Installing /home/jim/proj/784/exercises/ex1/bar.jar to /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.jar
    [INFO] Installing /tmp/mvninstall5322334237902777597.pom to /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    
  17. After successfully installing the dummy archive, you should be able to locate the JAR and other supporting files in the local repository. Be sure to look where you have directed localRepository in $HOME/.m2/settings.xml

    
    $ find /home/jim/.m2/repository/foo/bar/
    /home/jim/.m2/repository/foo/bar/
    /home/jim/.m2/repository/foo/bar/1.1
    /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom.lastUpdated
    /home/jim/.m2/repository/foo/bar/1.1/_remote.repositories
    /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.jar.lastUpdated
    /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.jar
    /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom
    /home/jim/.m2/repository/foo/bar/maven-metadata-local.xml
  18. Notice that Maven always makes sure there is a POM file present -- whether it had to generate it or not.

    
    $ more /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom

    <?xml version="1.0" encoding="UTF-8"?>
    <project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <modelVersion>4.0.0</modelVersion>
      <groupId>foo</groupId>
      <artifactId>bar</artifactId>
      <version>1.1</version>
      <description>POM was created from install:install-file</description>
    </project>
  19. Now try running "mvn package" and it should successfully resolve the fake dependency on the bar.jar.

  20. One last thing... Maven pulls in definitions from many places in the build environment. If you ever want to know what the total sum of those sources is (the "effective POM"), execute the help:effective-pom goal.

    
     $ mvn help:effective-pom
    [INFO] Scanning for projects...

    ...

    <project xmlns...
      <modelVersion>4.0.0</modelVersion>
      <groupId>myorg.myproject</groupId>
      <artifactId>ex1</artifactId>
      <version>1.0-SNAPSHOT</version>
      <name>My First Simple Project</name>
      <properties>
        <jboss.home>/opt/wildfly-13.0.0.Final</jboss.home>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      </properties>
      <dependencies>
        <dependency>
          <groupId>foo</groupId>
          <artifactId>bar</artifactId>
          <version>1.1</version>
          <scope>provided</scope>
        </dependency>
    ...

In this chapter we will be importing the project into the Eclipse IDE, running a few project goals, and demonstrating a debug session. IDEs provide very useful code navigation and refactoring tools to name only a few features. However, one of the unique tools offered by the IDEs is the ability to step through the code in a debugging session. Please do not end this exercise before becoming comfortable with the ability to use the debugger.

Note

Maven/Eclipse integration was once one of the most volatile aspects of the environment. However, over the years the two have been made to work well together as long as you keep things Maven-centric.

Warning

The Maven/Eclipse integration is a Maven-first approach where the Eclipse project always follows the Maven pom.xml. That is one of the main reasons this exercise started you with a pom.xml file first and progressed to the IDE later. It is wrong (or at least non-portable) to manually adjust the build path of a project within Eclipse. You must adjust the build path of a project by editing the pom.xml and having Eclipse automatically detect and follow that change.

  1. Select File->Import->Maven->Existing Maven Projects, navigate to the directory with the project you have been working with and select OK.


  2. The project should successfully import. Note that Eclipse has imported the project configuration from the Maven POM and has done at least the following...


  1. Right-click on the pom.xml file or project folder and execute Run As->"Maven install". You can also get back to this window through the Run As option on the toolbar once you have the project selected. This mode runs the JUnit test you wrote within the context of the full Maven project. All pre-test and post-test setup and teardown you wired into the Maven command-line build will be executed.


    Note that you can create a separate window for any of the Eclipse tabs. When using dual monitors, I commonly display the IDE on one monitor and the Console output on the other when using debug statements.


  2. Rerun the tests as a JUnit test. This mode runs the JUnit test raw within Eclipse. This is very efficient for making and testing Java code changes, but it will not run any Maven setup or teardown plugins (which are not always required or can often be avoided).


    Always Make Projects Eclipse/JUnit-Friendly

    Maven is a very useful and powerful tool. However, there is a point where the information from Maven has been captured by the IDE and we don't need to run full Maven builds (e.g., RunAs: Maven Install). As you saw from the RunAs: JUnit test, we were able to run the unit test exceptionally fast without Maven. I strongly recommend making your unit tests Eclipse/JUnit-friendly so that you can work efficiently in certain areas. That means hard-coding reasonable defaults rather than relying on the maven-surefire-plugin to pass in properties from the outside environment. Allow overrides, but code a usable default into the test, as sketched below.
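
    A minimal sketch of that pattern, using a hypothetical test class and the jdbc.url property as an example: read the system property that surefire would normally supply and fall back to a hard-coded, usable default so the same test also runs when launched directly as a JUnit test in Eclipse.

        import static org.junit.Assert.assertNotNull;

        import org.junit.Test;

        public class FriendlyDefaultsTest {  // hypothetical test class for illustration
            // use the value surefire passes in when present, otherwise a usable hard-coded default
            private static final String JDBC_URL =
                    System.getProperty("jdbc.url", "jdbc:h2:./target/h2db/ejava");

            @Test
            public void configurationResolves() {
                assertNotNull("jdbc.url not resolved", JDBC_URL);
            }
        }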

There are two primary ways to use the debugger: separate/remote process and embedded (within Eclipse). The second is much easier to use but is limited to what you can execute within the Eclipse IDE. The first takes a minor amount of setup but can be re-used to debug code running within application servers on your local and remote machines.

  1. Let's start with a remote debugging session by recalling the profile you were asked to add to either your pom.xml or settings.xml. If you have not done so, you can add it to either at this time.

        <profiles>
            <profile> <!-- tells surefire to run JUnit tests with remote debug -->
                <id>debugger</id>
                <activation>
                    <property>
                        <name>debugger</name>
                    </property>
                </activation>
                <properties>
                    <surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
                </properties>       
            </profile>            
        
        </profiles>
  2. Add a definition for the "surefire.argLine" within the maven-surefire-plugin declaration. Surefire is already being pulled into the project; this declaration just specifies the extra configuration along with a specific version. Maven will start complaining if you leave off that version.

                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-surefire-plugin</artifactId>
                    <version>2.17</version>
                    <configuration>
                        <argLine>${surefire.argLine}</argLine>
                    </configuration>
                </plugin>
    
  3. Uncomment (or re-add) your failure test in AppTest.java.

        @Test
        public void testFail() {
            //System.out.println("testFail");
            log.info("testFail");
            App app = new App();
            assertTrue("app didn't return 0", app.returnOne() == 0);
        }
  4. Execute a Run As->Maven test by selecting the project, right-clicking, and choosing the appropriate target. You should see the following error in the console.

    Running myorg.mypackage.ex1.AppTest
    INFO  28-08 23:52:31,809 (AppTest.java:testApp:17)  -testApp
    DEBUG 28-08 23:52:31,821 (App.java:returnOne:11)  -Here's One!
    INFO  28-08 23:52:31,829 (AppTest.java:testFail:25)  -testFail
    DEBUG 28-08 23:52:31,831 (App.java:returnOne:11)  -Here's One!
    Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.409 sec <<< FAILURE!
    testFail(myorg.mypackage.ex1.AppTest)  Time elapsed: 0.016 sec  <<< FAILURE!
    java.lang.AssertionError: app didn't return 0
        at org.junit.Assert.fail(Assert.java:93)
        at org.junit.Assert.assertTrue(Assert.java:43)
        at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:27)
  5. Click on the hyperlink to one of the lines in the project source code in the failure stack trace. Place a breakpoint at that line by double-clicking on the line number. A blue ball should appear to the left of the numbers.

  6. Debug As->Maven build..., change the base directory to a re-usable ${project_loc} variable, assign the "test" goal, and activate the "debugger" profile. Click "Debug" when finished. It will automatically save.


    You should see the Maven build start but pause before executing the first JUnit test. Think of this as the "server-side" of the debugger session.

    [INFO] --- maven-surefire-plugin:2.17:test (default-test) @ firstSimpleModuleEx ---
    [INFO] Surefire report directory: /home/jcstaff/workspaces/ejava-class/ejava-student/javase/firstSimpleModuleEx/target/surefire-reports
    
    -------------------------------------------------------
     T E S T S
    -------------------------------------------------------
    Listening for transport dt_socket at address: 8000
  7. Start the "client-side" of the debugger session by clicking on the bug pulldown at the top of the window. Select debug configurations, double click on Remote Java Application, select your project, and notice the localhost:8000 that came up agrees with your server-side configuration. Press "Debug" when you are ready and answer the prompt to change you view.


  8. The IDE should switch to a debugger view and be waiting at the line you set the breakpoint on. From here you can look at the state of any variables (we don't have any) and step into the next call.


  9. Once you are done with the debugging session you can click continue (the green right arrow at top) or detach from the server (the red squiggly line at top).

  10. Terminate the debugger session and return to one of the JavaEE or Java-based views. Select a specific JUnit test, test method, package, or the entire application and click Debug As JUnit test.

  11. Note the IDE again switches to the Debug view and is stopped at the breakpoint. You may step into the call, look at the state of any variable, or terminate the program (red square at top of window).

This chapter will cover the setup required to start the development database in server mode. The database has at least three (3) modes.

  • In-memory mode

  • File-based mode

  • Server-based mode

The in-memory option manages all tables within the memory footprint of the JVM. This is very fast and requires no server-process setup -- which makes it a good option for automated unit tests. However, since the DB instance is created and destroyed with each JVM execution, it is a bad choice for modules relying on multiple tools to pre-populate the database prior to executing tests.

The file-based option stores all information in the filesystem. It is useful for multi-JVM (sequential) setup and post-mortem analysis. However, only a single client may access the files at one time. I have seen this used effectively when simulating external databases -- where an external setup script populates the database and the JVM running the unit tests just queries the information as it would in production. We will use this as an alternative to server-based mode since we are using separate plugins to initialize the database. We also want to treat our database schema as a first-class artifact of our application -- and not rely on test capabilities to instantiate the database for each test.

The server-based option requires a separate process to be started, but it allows concurrent connections from a database user interface while the JVM is under test. This chapter will focus on the setup required to run the database in server mode.
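
From the client's point of view the three modes differ only in the JDBC URL. The following is a minimal sketch, assuming the h2 driver jar is on the classpath; the H2Modes class name is just for illustration, the file and server URLs match the ones used in this exercise, and the in-memory form is shown for comparison.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class H2Modes {
        static boolean canConnect(String url) {
            // attempts a connection using the default sa/empty-password credentials used in this exercise
            try (Connection c = DriverManager.getConnection(url, "sa", "")) {
                return c.isValid(2);
            } catch (SQLException ex) {
                return false;
            }
        }

        public static void main(String[] args) {
            // in-memory: lives and dies with this JVM (good for isolated unit tests)
            canConnect("jdbc:h2:mem:ejava");
            // file-based: persists under ./target/h2db, but only one client may have the files open at a time
            canConnect("jdbc:h2:./target/h2db/ejava");
            // server-based: requires the H2 server process started in the steps below; allows concurrent clients
            canConnect("jdbc:h2:tcp://localhost:9092/./h2db/ejava");
        }
    }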

  1. Prepare your environment to run the database in server mode for this exercise by following the instructions defined in Development Environment Setup.

  2. Start the database and web server in a directory where you wish to create the database files. Your h2.jar file should be located at M2_REPO/com/h2database/h2/*/h2*.jar, to name at least one location. Another location is JBOSS_HOME/modules/com/h2database/h2/main/h2*.jar

    cd /tmp
    
    java -jar h2.jar

    This should result in a database server process and access to a web-based database UI through the following local URL: http://localhost:8082/login.jsp

  3. Connect to the database server from the web-based UI.

    Driver Class: org.h2.Driver
    
    JDBC URL: jdbc:h2:tcp://localhost:9092/./h2db/ejava
    User Name: sa
    Password: 

    Look in the directory where you started the server. After connecting with a relative URL using the UI, there should be a new "h2db" directory with one or more files called "ejava*". You want to make sure you use the same URL in the UI and application so you are seeing the same instance of the database.

    If you use file-based mode, the connection information would look like the following, where the file path ("./target/h2db/ejava" below) must point to the exact same location your JVM under test uses. This can be a relative or fully-qualified path.

    Driver Class: org.h2.Driver
    
    JDBC URL: jdbc:h2:./target/h2db/ejava
    User Name: sa
    Password: 

This chapter will put in place a couple of core constructs that will allow Maven and the IDE to recognize the elements of your source tree.

  1. Create a root directory for your project and populate it with a pom.xml file.

    
    <?xml version="1.0"?>
    <project
        xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.jpa</groupId>
        <artifactId>entityMgrEx</artifactId>
        <version>1.0-SNAPSHOT</version>

        <name>Entity Manager Exercise</name>

    </project>
  2. Define a remote repository to use to download hibernate artifacts

    
     <!-- needed to resolve some hibernate dependencies -->
        <repositories>
            <repository>
                <id>jboss-nexus</id>
                <name>JBoss Nexus Repository</name>
                <url>https://repository.jboss.org/nexus/content/groups/public-jboss/</url>
            </repository>
        </repositories> 
  3. Add property definitions for versions of dependencies and plugins we will be using.

    
        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <java.source.version>1.8</java.source.version>
            <java.target.version>1.8</java.target.version>

            <jboss.host>localhost</jboss.host>
            <db.host>${jboss.host}</db.host>

            <maven-compiler-plugin.version>3.7.0</maven-compiler-plugin.version>
            <maven-jar-plugin.version>3.1.0</maven-jar-plugin.version>
            <maven-surefire-plugin.version>2.22.0</maven-surefire-plugin.version>
            <sql-maven-plugin.version>1.5</sql-maven-plugin.version>

            <h2db.version>1.4.197</h2db.version>
            <javax.persistence-api.version>2.2</javax.persistence-api.version>
            <hibernate-entitymanager.version>5.3.1.Final</hibernate-entitymanager.version>
            <junit.version>4.12</junit.version>
            <log4j.version>1.2.17</log4j.version>
            <slf4j.version>1.7.25</slf4j.version>
            <ejava.version>5.0.0-SNAPSHOT</ejava.version>
        </properties>
  4. Add a dependencyManagement section to define and configure the dependencies we will be using to work with JPA. These are passive definitions -- meaning they don't actually add any dependencies to your project. They just define the version and possible exclusions for artifacts so all child/leaf projects stay consistent.

    
        <dependencyManagement>
            <dependencies>
                <dependency>
                    <groupId>javax.persistence</groupId>
                    <artifactId>javax.persistence-api</artifactId>
                    <version>${javax.persistence-api.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.hibernate</groupId>
                    <artifactId>hibernate-core</artifactId>
                    <version>${hibernate-entitymanager.version}</version>
                </dependency>
                <dependency>
                    <groupId>junit</groupId>
                    <artifactId>junit</artifactId>
                    <version>${junit.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-api</artifactId>
                    <version>${slf4j.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                    <version>${slf4j.version}</version>
                </dependency>
                <dependency>
                  <groupId>log4j</groupId>
                  <artifactId>log4j</artifactId>
                  <version>${log4j.version}</version>
                </dependency>
            </dependencies>
        </dependencyManagement> 

    Note

    Knowing this exercise will always be a single module, we could do this more simply. However, it is assumed that you will soon take the information you learn here to a real enterprise project that will have many modules and could benefit from reusing a standard parent configuration. All definitions will be housed in a single module during the conduct of this exercise, but the properties, dependencyManagement, and pluginManagement sections we will build below can be lifted and moved to a parent pom in a multi-module project.

  5. Add pluginManagement definitions for certain plugins we will use in this module. Like above -- these are passive definitions that define the configuration for certain plugins when the child/leaf projects choose to use them. Let's start with a simple example and add a few more complex ones later. In this example, we are making sure all uses of the jar plugin are of a specific version.

    
     <build>
            <pluginManagement>
                <plugins>

                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-jar-plugin</artifactId>
                        <version>${maven-jar-plugin.version}</version>
                    </plugin>

                </plugins>    
            </pluginManagement>
        </build> 
  6. Add the src/main dependencies. This represents what your code depends upon at compile time and runtime.

    • scope=compile is used when your src/main code depends on the artifact to compile and you wish the design of transitive dependency to automatically bring this dependency with the module.

    • scope=provided is used when your src/main code depends on the artifact to compile but you do not wish it to be automatically brought along when used by downstream clients. Normally this type of artifact is an API, and the downstream client will provide its own version of the API packaged with its provider.

    
     <dependencies>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <scope>provided</scope>
            </dependency>

            <dependency>
                <groupId>javax.persistence</groupId>
                <artifactId>javax.persistence-api</artifactId>
                <scope>provided</scope>
            </dependency>
            ...
        </dependencies> 

    Note

    Notice how the definitions above are lacking a version element. The dependency declaration actively brings the dependency into the project and inherits the definition specified by the dependencyManagement section above.

  7. Add the src/test standard dependencies.

    • scope=test is used for anything that your src/test code depends upon (but not your src/main) or that your unit tests need at runtime to operate the test. For example, a module may declare a scope=test dependency on the h2 database (later) to do some local unit testing and then ultimately be deployed against a postgres server in a downstream client. In this case we are picking JUnit 4 as the testing framework, log4j as the logging implementation behind the slf4j API, and hibernate as the JPA implementation for the jpa-api. We are also adding an extra dependency (slf4j-log4j12) to allow hibernate's slf4j logging to be integrated with log4j.

    
            <dependency>
                <groupId>org.hibernate</groupId>
                <artifactId>hibernate-core</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
                <scope>test</scope>
            </dependency>

            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
              <groupId>log4j</groupId>
              <artifactId>log4j</artifactId>
              <scope>test</scope>
            </dependency>    
  8. Add a testResources definition to the build section to get src/test/resources files filtered when copied into the target tree. We do this so we have a chance to replace the variables in the persistence.xml and hibernate.properties files.

    
     <build>
            <!-- filtering will replace URLs, credentials, etc in the 
                 files copied to the target directory and used during testing.
                -->
            <testResources>
                <testResource>
                    <directory>src/test/resources</directory>
                    <filtering>true</filtering>
                </testResource>
            </testResources>

            <pluginManagement>
            ...
        </build> 
  9. Add a compiler specification to control the source and target Java versions. In this case we are picking up the specific values from the property variables set above, which can be overridden in the developer's settings.xml or on the command line using system properties.

    
     <build>
            <pluginManagement>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-compiler-plugin</artifactId>
                        <version>${maven-compiler-plugin.version}</version>
                        <configuration>
                                <source>${java.source.version}</source>
                                <target>${java.target.version}</target>
                        </configuration>                    
                    </plugin>      
                ...      
                </plugins>
            </pluginManagement>
        </build> 
  10. Add a definition for the maven-surefire-plugin so we can set properties needed for testing.

    
     <build>
            <pluginManagement>
                <plugins>
                    ...

                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>${maven-surefire-plugin.version}</version>
                        <configuration>
                            <argLine>${surefire.argLine}</argLine>
                        </configuration>
                    </plugin>            

                </plugins>    
            </pluginManagement>
        </build> 

    Note

    At this point, we are just allowing the argLine defined elsewhere to be optionally specified (for debugging). We do not yet have a need for system properties, but if we did, the example below shows how -Dname=value settings would be specified within the plugin configuration. The plugin (not pluginManagement) definition in the child pom will include any necessary system properties to be passed to the unit test.

    
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-surefire-plugin</artifactId>
                    <configuration>
                        <systemPropertyVariables>
                            <name1>value1</name1>
                            <name2>value2</name2>
                        </systemPropertyVariables>
                    </configuration>
                </plugin>
  11. Add a set of profiles that define H2 and Hibernate as our database and persistence provider. In the example below we are adding two database definitions (which happen to both be from the same vendor). One requires the server to be running and the other uses an embedded server and the local filesystem. The embedded version can be easier to test with; the server version can be easier to debug. Activate whichever one you want with either your settings.xml#activeProfile settings or a -Pprofile-name argument on the command line. If you already have a settings.xml#activeProfile setting, you can turn it off using -P\!deactivated-profile-name (bash) or -P!deactivated-profile-name (dos) and follow it up with -Pactivated-profile-name.

    
     <profiles>
            <profile> <!-- H2 server-based DB -->
                <id>h2srv</id>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:tcp://${db.host}:9092/./h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>

            <profile> <!-- H2 file-based DB -->
                <id>h2db</id>
                <activation>
                    <property> 
                        <name>!jdbcdb</name>
                    </property>
                </activation>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:${basedir}/target/h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>
        </profiles> 

    Note

    Profiles can be used to control which options are enabled at build time to make the module more portable. I also use them to help identify which dependencies are brought in for what reason -- especially for profiles that are configured to always activate.

  12. Perform a test of your pom.xml by issuing a sample build command. All should complete even though there is nothing yet in your source tree.

    
    $ mvn clean test

This chapter will take you through steps that will populate your database with a (simple) database schema. A database schema is required by any module that directly interacts with an RDBMS. The JPA provider can automatically generate a database schema, but that is generally restricted to early development and quick prototypes. A module within the data tier will ultimately be responsible for providing a separate artifact to create and/or migrate the schema from version to version. That is typically finalized by humans knowledgeable about particular databases and can be aided by the tool(s) we introduce in this exercise.

  1. Create a set of ddl scripts in src/main/resources/ddl to handle creating the schema, deleting rows in the schema, and dropping tables in the schema. Make sure each script has the word "create", "delete", or "drop" in its file name to match some search strings we'll use later. Have the database generate a value for the primary key. That value should not be allowed to be null.

    `-- src
        |-- main
        |   |-- java
        |   `-- resources
        |       |-- ddl
        |       |   |-- emauto_create.ddl
        |       |   |-- emauto_delete.ddl
        |       |   `-- emauto_drop.ddl
        `-- test
            |-- java
            `-- resources 

    Note

    We could actually skip this step and have the persistence provider create the table for us. That approach is great for quick Java-first prototypes. However, creating the schema outside of the persistence provider is a more realistic scenario for larger developments.

    # src/main/resources/ddl/emauto_create.ddl
    CREATE TABLE EM_AUTO (
        ID BIGINT generated by default as identity (start with 1) not null,
        MAKE VARCHAR(32),
        MODEL VARCHAR(32),
        COLOR VARCHAR(32),
        MILEAGE INT,
    
        CONSTRAINT em_autoPK PRIMARY KEY(ID)
    )
    
    # src/main/resources/ddl/emauto_delete.ddl
    DELETE FROM EM_AUTO;
    
    # src/main/resources/ddl/emauto_drop.ddl
    DROP TABLE EM_AUTO if EXISTS;
  2. You can perform a sanity check of the above scripts by pasting them into the DB UI SQL area and executing.

  3. Add the standard database setup and teardown scripts. This allows us to create a legacy database schema and write classes that map to that schema. We will later have the persistence provider create the schema for us when we are in quick prototype mode. First create the reusable portion of the definition in the pluginManagement section. This will define the version, database dependencies, and property information for all to inherit.

    
        <build>
            <pluginManagement>
                <plugins>
                    ...
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>sql-maven-plugin</artifactId>        
                        <version>${sql-maven-plugin.version}</version>        
                    
                        <dependencies>
                            <dependency>
                                <groupId>com.h2database</groupId>
                                <artifactId>h2</artifactId>
                                <version>${h2db.version}</version>
                            </dependency>
                        </dependencies>
                    
                        <configuration>
                            <username>${jdbc.user}</username>
                            <password>${jdbc.password}</password>
                            <driver>${jdbc.driver}</driver>
                            <url>${jdbc.url}</url>          
                        </configuration>
                    </plugin>          

                </plugins>    
            </pluginManagement>
        </build> 
  4. Next add the potentially project-specific portion to a build/plugins/plugin section that would normally be in the child module. However, when you add this to the module -- do so within a profile that is wired to always run except when the system property -DskipTests is defined. This is a standard Maven system property that builders use to build the module and bypass both unit and integration testing. By honoring the property here -- our module will only attempt to work with the database if we are not skipping tests. Note that the ! (bang/not) character means "the absence of this system property".

    
         <profiles>
            ...
            <profile>
                <id>testing</id>
                <activation>
                    <property>
                        <name>!skipTests</name>
                    </property>
                </activation>
          
                <build>
                    <plugins>
                        <plugin>
                            <!-- runs schema against the DB -->
                            <groupId>org.codehaus.mojo</groupId>
                            <artifactId>sql-maven-plugin</artifactId>        

                            <executions>

                                <!-- place execution elements here  -->

                            </executions>
                        </plugin>          
                    </plugins>          
                </build>          
            </profile>
        </profiles> 
  5. Configure the sql-maven-plugin executions element to run any drop scripts in the source tree before running tests.

    
            <execution>
                <id>drop-db-before-test</id>
                <phase>process-test-classes</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>main/resources/ddl/**/*drop*.ddl</include>
                        </includes>
                    </fileset>
                    <onError>continue</onError>
                </configuration>
            </execution> 

    Note

    Note that we are controlling when the scripts are executed using the phase element. This names a well-known Maven lifecycle phase; process-test-classes runs after the test classes have been compiled and just before the test phase runs the unit tests.

  6. Configure the sql-maven-plugin executions element to run any scripts from the source tree to create schema before running tests.

    
            <execution>
                <id>create-db-before-test</id>
                <phase>process-test-classes</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>main/resources/ddl/**/*create*.ddl</include>

                        </includes>
                    </fileset>
                    <print>true</print>
                </configuration>
            </execution>
  7. Configure the sql-maven-plugin executions element to run any populate scripts from the source tree to add rows to the database before running tests.

    
            <execution>
                <id>populate-db-before-test</id>
                <phase>process-test-classes</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>test/resources/ddl/**/*populate*.ddl</include>
                        </includes>
                    </fileset>
                </configuration>
            </execution>
  8. Configure the sql-maven-plugin executions element to run any drop scripts after testing. You may want to comment this out if you want to view database changes in a GUI after the test.

    
            <execution>
                <id>drop-db-after-test</id>
                <phase>test</phase>
                <goals>
                    <goal>execute</goal>
                </goals>
                <configuration>
                    <autocommit>true</autocommit>
                    <fileset>
                        <basedir>${basedir}/src</basedir>
                        <includes>
                            <include>main/resources/ddl/**/*drop*.ddl</include>     
                            </includes>
                    </fileset>
                </configuration>
            </execution>
  9. Build and run the tests. The schema should show up in the DB UI.

    $ mvn clean test -P\!h2db -Ph2srv

    Note

    Remember to turn off (-P!profile-name) the embedded profile (h2db) if active by default and turn on the server profile (h2srv) if you wish to use the server and DB UI while the unit test is running. The DB UI can only inspect the embedded file once all active clients close the file. The backslash is only needed for commands from the bash shell.

In this chapter we are going to add tuning aspects to the schema put in place above. Examples of this include any indexes we believe would enhance the query performance. This example is still quite simple and lacks enough context to determine what would and would not be a helpful index. Simply treat this exercise as a tutorial in putting an index in place when properly identified. Adding the physical files mentioned here could be considered optional if all schema is hand-crafted. You control the contents of each file in a 100% hand-crafted DDL solution. However, for those cases where auto-generated schema files are created, you may want a separate set of files designated for "tuning" the schema that was auto-generated. We will demonstrate using two extra files to create/drop database indexes.

  1. Add a file to add database indexes for your schema

    # src/main/resources/ddl/emauto_tuningadd.ddl
    
    CREATE INDEX EM_AUTO_MAKEMODEL ON EM_AUTO(MAKE, MODEL);
  2. Wire this new file into your SQL plugin definition for creating schema. Have it run after your table creates. Add an "orderFile" element to the configuration to specify to the plugin that you wish the files be executed in a specific order. Otherwise the order will be non-deterministic and the tuning may get executed before the schema is created.

    
     <execution>
        <id>create-db-before-test</id>
        <phase>process-test-classes</phase>
        <goals>
            <goal>execute</goal>
        </goals>
        <configuration>
            <autocommit>true</autocommit>
            <orderFile>ascending</orderFile>
            <fileset>
                <basedir>${basedir}/src</basedir>
                <includes>
                    <include>main/resources/ddl/**/*create*.ddl</include>
                    <include>main/resources/ddl/**/*tuningadd*.ddl</include>
                </includes>
            </fileset>
            <print>true</print>
        </configuration>
    </execution>
  3. Add a file to augment the drop script and remove indexes from your schema

    # src/main/resources/ddl/emauto_tuningremove.ddl
    
    DROP INDEX EM_AUTO_MAKEMODEL if exists;
  4. Wire this new file into your SQL plugin definition for dropping schema. Have it run before your table drops. Add an "orderFile" element to the configuration to specify to the plugin that you wish the files be executed in a specific order.

    
    <configuration>
        <autocommit>true</autocommit>
        <orderFile>descending</orderFile>
        <fileset>
            <basedir>${basedir}/src</basedir>
            <includes>
                <include>main/resources/ddl/**/*tuningremove*.ddl</include>
                <include>main/resources/ddl/**/*drop*.ddl</include>
            </includes>
        </fileset>
        <onError>continue</onError>
    </configuration>
  5. Build the schema for your module

    $ mvn clean process-test-classes
    ...
    [INFO] --- sql-maven-plugin:1.4:execute (drop-db-before-test) @ entityMgrEx ---
    [INFO] Executing file: .../src/main/resources/ddl/emauto_drop.ddl
    [INFO] Executing file: .../src/main/resources/ddl/emauto_tuningremove.ddl
    [INFO] 2 of 2 SQL statements executed successfully
    [INFO] 
    [INFO] --- sql-maven-plugin:1.4:execute (create-db-before-test) @ entityMgrEx ---
    [INFO] Executing file: .../src/main/resources/ddl/emauto_create.ddl
    [INFO] Executing file: .../entityMgrEx/src/main/resources/ddl/emauto_tuningadd.ddl
    [INFO] 2 of 2 SQL statements executed successfully

This chapter will add an entity class, the persistence.xml, and associated property file to define the persistence unit.

The entity class represents one or more tables in the database and each instance of the entity class represents a specific row in those tables.

The persistence.xml defines a JPA persistence unit (along with other related XML files and entity class annotations). An instance of a persistence unit is called a persistence context, and the persistence context is accessed through an EntityManager.
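
As a preview of the JUnit setup later in this chapter, the persistence unit name declared in persistence.xml is what gets passed to the JPA bootstrap API. The following is only a minimal sketch (the BootstrapSketch class name is illustrative, not part of the exercise); it assumes the entityMgrEx persistence unit defined below.

    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;

    public class BootstrapSketch {
        public static void main(String[] args) {
            // the name must match the persistence-unit name in META-INF/persistence.xml
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("entityMgrEx");
            // each EntityManager provides access to a persistence context
            EntityManager em = emf.createEntityManager();
            em.close();
            emf.close();
        }
    }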

`-- src
    |-- main
    |   |-- java
    |   |   `-- myorg
    |   |       `-- entitymgrex
    |   |           `-- Auto.java
    |   `-- resources
    |       `-- META-INF
    |           `-- persistence.xml
    `-- test
        |-- java
        `-- resources
            `-- hibernate.properties
  1. Create a Plain Old Java Object (POJO) class to represent an automobile. Use class annotations to provide the following:

    # src/main/java/myorg/entitymgrex/Auto.java
    
    package myorg.entitymgrex;
    import java.io.Serializable;
    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GenerationType;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Table;
    @Entity @Table(name="EM_AUTO")
    public class Auto implements Serializable {
        private static final long serialVersionUID = 1L;
        @Id @GeneratedValue(strategy=GenerationType.IDENTITY)
        private long id;
        private String make;
        private String model;
        private String color;
        private int mileage;
        public Auto(){}
        public Auto(int id) { this.id=id; }
        public long getId() { return id; }
        //more getter/setters go here
    }

    Note

    @Entity, @Id, and a default constructor are the only requirements on an entity class. The implementation of java.io.Serializable is not a JPA requirement but will be leveraged by a later example within this exercise.

  2. Add the remaining setter/getter methods to the class. If you are using Eclipse to author the class -- right click->Source->Generate Getters and Setters will generate all of this for you. Since we are using generated primary key values, there is no immediate need for a constructor to set the id. If you add one later, remember to also keep a default (no-argument) constructor -- the compiler stops generating one automatically once you declare any other constructor.

        public void setMake(String make) {
    
            this.make = make;
        }
        
        public int getMileage() { return mileage; }
        public void setMileage(int mileage) {
            this.mileage = mileage;
        }
        
        public String getModel() { return model; }
        public void setModel(String model) {
            this.model = model;
        }
        
        public String getColor() { return color; }
        public void setColor(String color) {
            this.color = color;
        }
  3. You may also want to add a public toString():String method to conveniently print your Auto objects. Eclipse can also generate that on demand, and the generated output is configurable.

        @Override
    
        public String toString() {
            StringBuilder builder = new StringBuilder();
            builder
                .append("id=").append(id)
                .append(", make=").append(make)
                .append(", model=").append(model)
                .append(", color=").append(color)
                .append(", mileage=").append(mileage);
            return builder.toString();
        }
  4. Create a META-INF/persistence.xml file to define the persistence unit for our jar file.

    • persistence-unit name: must match what we place in our JUnit test

    • provider: specify that this persistence unit is defined for the org.hibernate.jpa.HibernatePersistenceProvider provider.

    • define provider-specific properties that tell the provider how to obtain a connection to the database as well as some other configuration properties.

    Note

    The provider-specific properties include somewhat sensitive information, such as user credentials. If we place them in the persistence.xml file within the src/main tree, these properties will become part of our deployed artifact. To avoid this, we will define them in a separate hibernate.properties file placed in the src/test tree.

    
    # src/main/resources/META-INF/persistence.xml
    <?xml version="1.0" encoding="UTF-8"?>
    <persistence xmlns="http://java.sun.com/xml/ns/persistence"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">

        <persistence-unit name="entityMgrEx">
            <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>

            <properties>
               <!-- defined in src/test/resources/hibernate.properties -->
            </properties>
        </persistence-unit>            
    </persistence>
  5. Create a hibernate.properties file in src/test/resources to hold information we need to support testing but may not want to be part of the deployed artifact. Leave the volatile values as variables so they can be expanded into the target tree at build time.

    • the variables will be filled in during the build process using "filtering" and the resources plugin.

    • the show and format_sql options are only turned on during early development and debug.

    • the jdbc.batch_size property set to 0 is also used during debug. Setting it to this value will eliminate any batching of SQL commands, allowing errors about the commands to be better reported to the developer.

    # src/test/resources/hibernate.properties
    hibernate.connection.url=${jdbc.url}
    hibernate.connection.driver_class=${jdbc.driver}
    hibernate.connection.username=${jdbc.user}
    hibernate.connection.password=${jdbc.password}
    
    hibernate.dialect=${hibernate.dialect}
    #hibernate.hbm2ddl.auto=create
    hibernate.hbm2ddl.import_files=/ddl/emauto-tuningdrop.ddl,/ddl/emauto-tuning.ddl
    hibernate.show_sql=true
    hibernate.format_sql=true
    #hibernate.jdbc.batch_size=0
    
  6. Make sure your project still builds. Your area should look something like the following after the build.

    $ mvn clean install -P\!h2db -Ph2srv
    |-- pom.xml
    |-- src
    |   |-- main
    |   |   |-- java
    |   |   |   `-- myorg
    |   |   |       `-- entitymgrex
    |   |   |           `-- Auto.java
    |   |   `-- resources
    |   |       |-- ddl
    |   |       |   |-- emauto_create.ddl
    |   |       |   |-- emauto_delete.ddl
    |   |       |   |-- emauto_drop.ddl
    |   |       |   |-- emauto_tuningadd.ddl
    |   |       |   `-- emauto_tuningremove.ddl
    |   |       `-- META-INF
    |   |           `-- persistence.xml
    |   `-- test
    |       `-- resources
    |           `-- hibernate.properties
    `-- target
        |-- classes
        |   |-- ddl
        |   |   |-- emauto_create.ddl
        |   |   |-- emauto_delete.ddl
        |   |   |-- emauto_drop.ddl
        |   |   |-- emauto_tuningadd.ddl
        |   |   `-- emauto_tuningremove.ddl
        |   |-- META-INF
        |   |   `-- persistence.xml
        |   `-- myorg
        |       `-- entitymgrex
        |           `-- Auto.class
       ...
        `-- test-classes
            `-- hibernate.properties
    
  7. You should also check that your hibernate.properties file was filtered by your build.testResources definition provided earlier. The variable definitions within your src/test/resources source file(s) should be replaced with property values from your environment.

    $ more src/test/resources/hibernate.properties target/test-classes/hibernate.properties 
    ::::::::::::::
    src/test/resources/hibernate.properties
    ::::::::::::::
    hibernate.dialect=${hibernate.dialect}
    hibernate.connection.url=${jdbc.url}
    hibernate.connection.driver_class=${jdbc.driver}
    hibernate.connection.password=${jdbc.password}
    hibernate.connection.username=${jdbc.user}
    #hibernate.hbm2ddl.auto=create
    hibernate.show_sql=true
    hibernate.format_sql=true
    #hibernate.jdbc.batch_size=0
    
    ::::::::::::::
    target/test-classes/hibernate.properties
    ::::::::::::::
    hibernate.dialect=org.hibernate.dialect.H2Dialect
    hibernate.connection.url=jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava
    hibernate.connection.driver_class=org.h2.Driver
    hibernate.connection.password=
    hibernate.connection.username=sa
    #hibernate.hbm2ddl.auto=create
    #hibernate.hbm2ddl.import_files=/ddl/emauto-tuningdrop.ddl,/ddl/emauto-tuning.ddl 
    hibernate.show_sql=true
    hibernate.format_sql=true
    #hibernate.jdbc.batch_size=0

This chapter will create a JUnit test case that will instantiate an EntityManager, perform some basic operations, and log information about the tests.

`-- src
    `-- test
        |-- java
        |   `-- myorg
        |       `-- entitymgrex
        |           `-- EntityMgrTest.java
        `-- resources
            `-- log4j.xml
  1. Create a JUnit test case to hold your test code. The following is an example of a JUnit 4.x test case that uses annotations.

    # src/test/java/myorg/entitymgrex/EntityMgrTest.java
    
    package myorg.entitymgrex;
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;
    import javax.persistence.Query;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import static org.junit.Assert.*;
    import org.junit.After;
    import org.junit.AfterClass;
    import org.junit.Before;
    import org.junit.BeforeClass;
    import org.junit.Test;
    public class EntityMgrTest {
        private static final Logger logger = LoggerFactory.getLogger(EntityMgrTest.class);
        @Test
        public void testTemplate() {
            logger.info("testTemplate");
        }
    }
  2. Provide a setUpClass() method that runs once, before any of the tests, and creates the entity manager factory. This method must be static.

        private static final String PERSISTENCE_UNIT = "entityMgrEx";
    
        private static EntityManagerFactory emf;
        @BeforeClass
        public static void setUpClass() {
            logger.debug("creating entity manager factory");
            emf = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT);
        }
  3. Provide a setUp() method that will be called before each testMethod is executed. Have this method create an entity manager for the tests to use.

        private EntityManager em;    
    
    
        @Before
        public void setUp() throws Exception {
            logger.debug("creating entity manager");
            em = emf.createEntityManager();
            //cleanup();
        }
  4. Provide a tearDown() method that will be called after each testMethod. Have this flush all remaining items in the persistence context to the database and close the entity manager.

        @After
    
        public void tearDown() throws Exception {
            if (em!=null) {
                logger.debug("tearDown() started, em={}", em);
                em.getTransaction().begin();
                em.flush();
                //logAutos();
                em.getTransaction().commit();
                em.close();
                logger.debug("tearDown() complete, em={}", em);
                em=null;
            }
         }
  5. Provide a tearDownClass() method that will be called after all testMethods have completed. This method must be static and should close the entity manager factory.

        @AfterClass
    
        public static void tearDownClass() {
            if (emf!=null) {
                logger.debug("closing entity manager factory");
                emf.close();
                emf=null;
            }
        }
  6. Add in a logAutos() method to query and print all autos in the database. Do this after flushing the entity manager in the tearDown() method so you can see the changes from the previous test. The following example uses the entity manager to create an ad-hoc JPQL statement.

        @After
    
        public void tearDown() throws Exception {
    ...
                em.flush();            
                logAutos();            
                em.getTransaction().commit();            
    ...
         }
        public void logAutos() {
            Query query = em.createQuery("select a from Auto as a");
            for (Object o: query.getResultList()) {
                logger.info("EM_AUTO: {}", o);
            }
        }
  7. You might also want to add a cleanup() to clear out the Auto table between tests. The example below uses the entity manager to create a native SQL statement.

        @Before
    
        public void setUp() throws Exception {
            ...
            em = emf.createEntityManager();
            cleanup();
        }
        public void cleanup() {
            em.getTransaction().begin();
            Query query = em.createNativeQuery("delete from EM_AUTO");
            int rows = query.executeUpdate();
            em.getTransaction().commit();
            logger.info("removed {} rows", rows);
        }
  8. Add a log4j.xml file to src/test/resources that has your desired settings. The one below produces less timestamp information at the console and more details in the logfile.

    
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
    <log4j:configuration
        xmlns:log4j="http://jakarta.apache.org/log4j/"
        debug="false">

        <appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
            <param name="Target" value="System.out"/>

            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern"
                       value="(%F:%M:%L)  -%m%n"/>
            </layout>
        </appender>

        <appender name="logfile" class="org.apache.log4j.RollingFileAppender">
            <param name="File" value="target/log4j-out.txt"/>
            <param name="Append" value="false"/>
            <param name="MaxFileSize" value="100KB"/>
            <param name="MaxBackupIndex" value="1"/>
            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern"
                       value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L)  -%m%n"/>
            </layout>
       </appender>

       <logger name="myorg">
          <level value="debug"/>
          <appender-ref ref="logfile"/>
       </logger>

       <root>
          <priority value="fatal"/>
          <appender-ref ref="CONSOLE"/>
       </root>

    </log4j:configuration>

    Note

    Although it might be a bit entertaining to set the priority of the root logger to debug to see everything the persistence provider has to say, it is quite noisy. Consider changing the root priority to fatal so that the majority of the log statements are yours.

  9. You should be able to build and test your module at this time.

    $ mvn clean test
    
    Running myorg.entitymgrex.EntityMgrTest
    (EntityMgrTest.java:setUpClass:25)  -creating entity manager factory
    (EntityMgrTest.java:setUp:31)  -creating entity manager
    Hibernate: 
        delete 
        from
            EM_AUTO
    (EntityMgrTest.java:cleanup:58)  -removed 0 rows
    (EntityMgrTest.java:testTemplate:75)  -testTemplate
    (EntityMgrTest.java:tearDown:39)  -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@3e52a475
    Hibernate: 
        select
            auto0_.id as id0_,
            auto0_.color as color0_,
            auto0_.make as make0_,
            auto0_.mileage as mileage0_,
            auto0_.model as model0_ 
        from
            EM_AUTO auto0_
    (EntityMgrTest.java:tearDown:45)  -tearDown() complete, em=org.hibernate.ejb.EntityManagerImpl@3e52a475
    (EntityMgrTest.java:tearDownClass:69)  -closing entity manager factory
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.337 sec
    
    Results :
    
    Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
    ...
    [INFO] BUILD SUCCESS

    Raise org.hibernate verbosity to help debug errors

    If your tests fail and there is no key information printed, try raising the verbosity for all org.hibernate classes (e.g., to DEBUG or TRACE).

    # src/test/resources/log4j.xml
    <logger name="org.hibernate">
       <level value="trace"/>
       <appender-ref ref="logfile"/>
    </logger>
  10. Check that you have the following artifacts in your project tree.

    |-- pom.xml
    
    |-- src
    |   |-- main
    |   |   |-- java
    |   |   |   `-- myorg
    |   |   |       `-- entitymgrex
    |   |   |           `-- Auto.java
    |   |   `-- resources
    |   |       |-- ddl
    |   |       |   |-- emauto_create.ddl
    |   |       |   |-- emauto_delete.ddl
    |   |       |   |-- emauto_drop.ddl
    |   |       |   |-- emauto_tuningadd.ddl
    |   |       |   `-- emauto_tuningremove.ddl
    |   |       `-- META-INF
    |   |           `-- persistence.xml
    |   `-- test
    |       |-- java
    |       |   `-- myorg
    |       |       `-- entitymgrex
    |       |           `-- EntityMgrTest.java
    |       `-- resources
    |           |-- hibernate.properties
    |           `-- log4j.xml
    `-- target
        |-- antrun
        |   `-- build-main.xml
        |-- classes
        |   |-- ddl
        |   |   |-- emauto_create.ddl
        |   |   |-- emauto_delete.ddl
        |   |   |-- emauto_drop.ddl
        |   |   |-- emauto_tuningadd.ddl
        |   |   `-- emauto_tuningremove.ddl
        |   |-- META-INF
        |   |   `-- persistence.xml
        |   `-- myorg
        |       `-- entitymgrex
        |           `-- Auto.class
        |-- generated-sources
        |   `-- annotations
        |-- generated-test-sources
        |   `-- test-annotations
        |-- h2db
        |   `-- ejava.h2.db
        |-- log4j-out.txt
        |-- maven-status
        |   `-- maven-compiler-plugin
        |       |-- compile
        |       |   `-- default-compile
        |       |       |-- createdFiles.lst
        |       |       `-- inputFiles.lst
        |       `-- testCompile
        |           `-- default-testCompile
        |               |-- createdFiles.lst
        |               `-- inputFiles.lst
        |-- surefire-reports
        |   |-- myorg.entitymgrex.EntityMgrTest.txt
        |   `-- TEST-myorg.entitymgrex.EntityMgrTest.xml
        `-- test-classes
            |-- hibernate.properties
            |-- log4j.xml
            `-- myorg
                `-- entitymgrex
                    `-- EntityMgrTest.class

Over the years/versions, Eclipse has progressed from being ignorant of Maven (with all integration coming from the Maven side) to being very much integrated with Maven. In that later/integrated mode, Eclipse will try really hard to do the right thing within Eclipse for what was defined to be done outside of Eclipse. For example, Eclipse will turn Maven dependencies directly into an Eclipse build path. There exist, however, some plugins that Eclipse has yet to learn about, and it will alert you to that fact. Many of these have no role within Eclipse and you simply need to explicitly give Eclipse instruction to ignore the plugin. Luckily these cases get fewer and fewer each year, and Eclipse will update your pom.xml with the necessary configuration when it is needed.

  • Import the project into Eclipse using the "Existing Maven Projects" option. You should see an error for the sql-maven-plugin. Ignore the error and continue with the import. We will address the error in the next step.

  • Add the following profile to your pom.xml. The profile is activated when the m2e.version property is defined -- which is a property m2e (Maven To Eclipse) sets within Eclipse.

    
        <profiles>
           ...
            <!--  tell Eclipse what to do with some of the plugins -->
            <profile>
              <id>m2e</id>
              <activation>
                <property>
                  <name>m2e.version</name>
                </property>
              </activation>
              <build>
                <pluginManagement>
                    <plugins>
                        <plugin>
                          <groupId>org.eclipse.m2e</groupId>
                          <artifactId>lifecycle-mapping</artifactId>
                          <version>1.0.0</version>
                          <configuration>
                            <lifecycleMappingMetadata>
                              <pluginExecutions>

                                <pluginExecution>
                                  <pluginExecutionFilter>
                                    <groupId>org.codehaus.mojo</groupId>
                                    <artifactId>sql-maven-plugin</artifactId>
                                    <versionRange>[1.0.0,)</versionRange>
                                    <goals>
                                      <goal>execute</goal>
                                    </goals>
                                  </pluginExecutionFilter>
                                  <action>
                                    <ignore />
                                  </action>
                                </pluginExecution>

                              </pluginExecutions>
                            </lifecycleMappingMetadata>
                          </configuration>
                        </plugin>

                    </plugins>
                </pluginManagement>
               </build>
            </profile>
        ...
        </profiles>

    The red error marks should have cleared from the pom.xml editor. The above profile is activated when the "m2e.version" property is present within Eclipse and tells Eclipse to ignore the sql-maven-plugin.

This chapter will demonstrate various methods to perform create, read, update, and delete (CRUD) operations on the database using the EntityManager, the persistence unit/context, and the entity class.

Note

The following changes are all made to the EntityMgrTest.java JUnit test class. Everything is being done within this file to keep things simple. This test case is playing the role of the business and persistence (Data Access Object (DAO)) logic.
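
As a quick orientation, the EntityManager calls exercised by the steps below follow the general pattern sketched here. This is only an illustrative outline (the CrudSketch class and crud() method are not part of the exercise); the real test methods in the numbered steps wrap each operation in a transaction and add logging and assertions.

    import javax.persistence.EntityManager;

    public class CrudSketch {
        static void crud(EntityManager em, Auto detachedCopy) {
            Auto car = new Auto();                           // transient instance
            em.persist(car);                                 // now managed; INSERT at flush/commit
            em.getTransaction().begin();
            em.getTransaction().commit();                    // generated primary key now assigned

            Auto found = em.find(Auto.class, car.getId());   // read by primary key
            found.setMileage(10000);                         // managed change; UPDATE at next flush/commit
            Auto merged = em.merge(detachedCopy);            // copy detached state onto a managed instance
            em.remove(merged);                               // DELETE at next flush/commit
            em.getTransaction().begin();
            em.getTransaction().commit();
        }
    }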

  1. add a testCreate() method to test the functionality of EntityManager.persist(). This will add an object to the database once associated with a transaction.

        @Test
    
        public void testCreate() {
            logger.info("testCreate");
            Auto car = new Auto();
            car.setMake("Chrysler");
            car.setModel("Gold Duster");
            car.setColor("Gold");
            car.setMileage(60*1000);
            logger.info("creating auto: {}", car);
            em.persist(car);
        }
     -testCreate
     -creating auto:myorg.entitymgrex.Auto@140984b, id=0, make=Chrysler, model=Gold Duster, color=Gold, mileage=60000
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@3ac93e
     -EM_AUTO:myorg.entitymgrex.Auto@140984b, id=1, make=Chrysler, model=Gold Duster, color=Gold, mileage=60000
     -removed 1 rows
  2. add a testMultiCreate() to test creating several objects. This should also help verify that unique primary keys are being generated.

        @Test
    
        public void testMultiCreate() {
            logger.info("testMultiCreate");
            for(int i=0; i<5; i++) {
                Auto car = new Auto();
                car.setMake("Plymouth " + i);
                car.setModel("Grand Prix");
                car.setColor("Green");
                car.setMileage(80*1000);
                logger.info("creating auto: {}", car);
                em.persist(car);
            }
        }
     -testMultiCreate
     -creating auto:myorg.entitymgrex.Auto@c3e9e9, id=0, make=Plymouth 0, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@31f2a7, id=0, make=Plymouth 1, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@131c89c, id=0, make=Plymouth 2, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@1697b67, id=0, make=Plymouth 3, model=Grand Prix, color=Green, mileage=80000
     -creating auto:myorg.entitymgrex.Auto@24c4a3, id=0, make=Plymouth 4, model=Grand Prix, color=Green, mileage=80000
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@1e9c82e
     -EM_AUTO:myorg.entitymgrex.Auto@c3e9e9, id=2, make=Plymouth 0, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@31f2a7, id=3, make=Plymouth 1, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@131c89c, id=4, make=Plymouth 2, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@1697b67, id=5, make=Plymouth 3, model=Grand Prix, color=Green, mileage=80000
     -EM_AUTO:myorg.entitymgrex.Auto@24c4a3, id=6, make=Plymouth 4, model=Grand Prix, color=Green, mileage=80000
  3. add a testFind() to test the ability to find an object by its primary key value.

        @Test
    
        public void testFind() {
            logger.info("testFind");
            Auto car = new Auto();
            car.setMake("Ford");
            car.setModel("Bronco II");
            car.setColor("Red");
            car.setMileage(0*1000);
            logger.info("creating auto: {}", car);
            em.persist(car);
            //we need to associate the em with a transaction to get a
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            Auto car2 = em.find(Auto.class, car.getId());
            assertNotNull("car not found:" + car.getId(), car2);
            logger.info("found car: {}", car2);
        }
     -testFind
     -creating auto:myorg.entitymgrex.Auto@aae86e, id=0, make=Ford, model=Bronco II, color=Red, mileage=0
     -found car:myorg.entitymgrex.Auto@aae86e, id=7, make=Ford, model=Bronco II, color=Red, mileage=0
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@97d026
     -EM_AUTO:myorg.entitymgrex.Auto@aae86e, id=7, make=Ford, model=Bronco II, color=Red, mileage=0
  4. add a testGetReference() to test the ability to get a reference to an object. getReference() may return a lazy proxy that defers loading until a property is accessed; with such a shallow object, this will act much like find().

        @Test
    
        public void testGetReference() {
            logger.info("testGetReference");
            Auto car = new Auto();
            car.setMake("Ford");
            car.setModel("Escort");
            car.setColor("Red");
            car.setMileage(0*1000);
            logger.info("creating auto: {}", car);
            em.persist(car);
            //we need to associate the em with a transaction to get a
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            Auto car2 = em.getReference(Auto.class, car.getId());
            assertNotNull("car not found:" + car.getId(), car2);
            logger.info("found car: {}", car2);
        }
     -testGetReference
     -creating auto:myorg.entitymgrex.Auto@608760, id=0, make=Ford, model=Escort, color=Red, mileage=0
     -found car:myorg.entitymgrex.Auto@608760, id=8, make=Ford, model=Escort, color=Red, mileage=0
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@157ea4a
     -EM_AUTO:myorg.entitymgrex.Auto@608760, id=8, make=Ford, model=Escort, color=Red, mileage=0
  5. add a testUpdate() method to test the ability to have a setter() call on a managed object update the database.

        @Test
    
        public void testUpdate() {
            logger.info("testUpdate");
            Auto car = new Auto();
            car.setMake("Pontiac");
            car.setModel("Gran Am");
            car.setColor("Red");
            car.setMileage(0*1000);
            logger.info("creating auto: {}", car);
            em.persist(car);
            //we need to associate the em with a transaction to get a
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            for(int mileage=car.getMileage(); mileage<(100*1000); mileage+=20000) {
                //here's where the update is done
                car.setMileage(mileage);
                //commit the update to the database for query
                em.getTransaction().begin();
                em.getTransaction().commit();
                //inspect database for value
                int value = getMileage(car.getId());
                assertTrue("unexpected mileage:" + value, value == mileage);
                logger.info("found mileage: {}", value);
            }
        }
        private int getMileage(long id) {
            Query query =
                em.createQuery("select a.mileage from Auto as a where a.id=:pk");
            query.setParameter("pk", id);
            return (Integer)query.getSingleResult();
        }
     -testUpdate
     -creating auto:myorg.entitymgrex.Auto@6a3960, id=0, make=Pontiac, model=Gran Am, color=Red, mileage=0
     -found mileage:0
     -found mileage:20000
     -found mileage:40000
     -found mileage:60000
     -found mileage:80000
     -EM_AUTO:myorg.entitymgrex.Auto@6a3960, id=9, make=Pontiac, model=Gran Am, color=Red, mileage=80000
  6. add a testMerge() method to test the ability to perform updates based on the current values of a detached object. Note that we are using Java serialization to simulate sending a copy of the object to/from a remote process and then performing the merge based on the updated object.

        @Test
    
        public void testMerge() throws Exception {
            logger.info("testMerge");
            Auto car = new Auto();
            car.setMake("Chrystler");
            car.setModel("Concord");
            car.setColor("Red");
            car.setMileage(0*1000);
            logger.info("creating auto: {}", car);
            car = em.merge(car); //using merge to persist new
            //we need to associate the em with a transaction to get a
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            for(int mileage=(10*1000); mileage<(100*1000); mileage+=20000) {
                //simulate sending to remote system for update
                Auto car2 = updateMileage(car, mileage);
                //verify the object is not being managed by the EM
                assertFalse("object was managed", em.contains(car2));
                assertTrue("object wasn't managed", em.contains(car));
                assertTrue("mileage was same",
                        car.getMileage() != car2.getMileage());
                //commit the update to the database for query
                em.merge(car2);
                assertTrue("car1 not merged:" + car.getMileage(),
                        car.getMileage() == mileage);
                em.getTransaction().begin();
                em.getTransaction().commit();
                //inspect database for value
                int value = getMileage(car.getId());
                assertTrue("unexpected mileage:" + value, value == mileage);
                logger.info("found mileage:" + value);
            }
        }
        
        private Auto updateMileage(Auto car, int mileage) throws Exception {
            //simulate sending the object to a remote system
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(car);
            oos.close();
            //simulate receiving an update to the object from remote system
            ByteArrayInputStream bis =
                new ByteArrayInputStream(bos.toByteArray());
            ObjectInputStream ois = new ObjectInputStream(bis);
            Auto car2 = (Auto)ois.readObject();
            ois.close();
            //here's what they would have changed in remote process
            car2.setMileage(mileage);
            return car2;
        }
     -testMerge
     -creating auto:myorg.entitymgrex.Auto@147358f, id=0, make=Chrystler, model=Concord, color=Red, mileage=0
     -found mileage:10000
     -found mileage:30000
     -found mileage:50000
     -found mileage:70000
     -found mileage:90000
     -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@1b4c1d7
     -EM_AUTO:myorg.entitymgrex.Auto@147358f, id=10, make=Chrystler, model=Concord, color=Red, mileage=90000
    
  7. add a testRemove() method to verify that we can delete objects from the database.

        @Test
    
        public void testRemove() {
            logger.info("testRemove");
            Auto car = new Auto();
            car.setMake("Jeep");
            car.setModel("Cherokee");
            car.setColor("Green");
            car.setMileage(30*1000);
            logger.info("creating auto: {}", car);
            em.persist(car);
            //we need to associate the em with a transaction to get a
            //primary key generated and assigned to the auto
            em.getTransaction().begin();
            em.getTransaction().commit();
            Auto car2 = em.find(Auto.class, car.getId());
            assertNotNull("car not found:" + car.getId(), car2);
            logger.info("found car: {}", car2);
            //now remove the car
            logger.info("removing car: {}", car);
            em.remove(car);
            //we need to associate the em with a transaction to
            //physically remove from database
            em.getTransaction().begin();
            em.getTransaction().commit();
            Auto car3 = em.find(Auto.class, car.getId());
            assertNull("car found", car3);
        }
     -testRemove
     -creating auto:myorg.entitymgrex.Auto@28305d, id=0, make=Jeep, model=Cherokee, color=Green, mileage=30000
     -found car:myorg.entitymgrex.Auto@28305d, id=11, make=Jeep, model=Cherokee, color=Green, mileage=30000
     -removing car:myorg.entitymgrex.Auto@28305d, id=11, make=Jeep, model=Cherokee, color=Green, mileage=30000
    

In a previous chapter, you manually created a set of DDL files to create schema, delete rows from the schema in the database, and drop the schema from the database. Since your persistence provider knows how to work with schema, you can optionally get it to create schema for you rather than generating it manually. Even if you are working with legacy schema (and won't be changing the database), it is extremely helpful to see the persistence provider's version of the schema to more quickly determine a mismatch in the mapping rather than waiting until runtime testing. In order to add schema generation to your projects you can add one of the following: runtime schema generation or compile-time schema generation. Runtime schema generation is fine for examples and small prototypes, but compile-time generation is suitable for more realistic development scenarios.

  1. runtime schema generation can be added to your project by adding the following property to your persistence-unit or hibernate.properties. Coldstart your database, comment out your SQL plugin, and re-run your tests if you want to verify this property will create the schema at runtime.

    
    #persistence.xml
       <property name="hibernate.hbm2ddl.auto" value="create"/> 

    #hibernate.properties
        hibernate.hbm2ddl.auto=create
  2. A set of files for schema can be generated by adding a standard set of properties to the persistence.xml properties element.

    
    <properties>
        <property name="javax.persistence.schema-generation.scripts.action" value="drop-and-create"/>
        <property name="javax.persistence.schema-generation.scripts.create-target" value="target/classes/ddl/entityMgrEx-JPAcreate.ddl"/>
        <property name="javax.persistence.schema-generation.scripts.drop-target" value="target/classes/ddl/entityMgrEx-JPAdrop.ddl"/>
    </properties>

    With the above configuration in place, the persistence unit will create two files in the target/classes/ddl directory that represent the JPA provider's view of the mapping.

    target/classes/ddl/
    |-- emauto_create.ddl
    |-- emauto_delete.ddl
    |-- emauto_drop.ddl
    |-- emauto_tuningadd.ddl
    |-- emauto_tuningremove.ddl
    |-- entityMgrEx-JPAcreate.ddl  <== generated 
    `-- entityMgrEx-JPAdrop.ddl    <== by the persistence unit
    

    The primary drawback of this approach is that the schema is generated too late in the build for us to use the Maven SQL plugin to populate the schema, and the generation behavior will carry all the way into production.

  3. compile-time schema generation can be moved forward in the build cycle by instantiating the persistence unit twice: once in a small program designed only to generate schema (see the sketch after the plugin definition below) and once for our unit tests. I have wrapped that small program in a Maven plugin which we can install in our pom. It can be configured somewhat. However, since I wrote it for use with this course -- it pretty much does what we want without much configuration.

    Add the following plugin definition to the pluginManagement section of your pom.xml. This will define the core behavior of the jpa-schemagen-maven-plugin to execute the generate goal. By default it executes during the process-test-classes phase.

    
        <build>
            <pluginManagement>
                <plugins>
                    ...
                    <plugin>
                        <groupId>info.ejava.utils.jpa</groupId>
                        <artifactId>jpa-schemagen-maven-plugin</artifactId>
                        <version>${ejava.version}</version>
                        <executions>
                            <execution>
                                <goals>
                                  <goal>generate</goal>
                                </goals>
                            </execution>
                        </executions>
                   </plugin>
                    ...
                </plugins>
            </pluginManagement>
        </build>
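
    The plugin essentially wraps the kind of small schema-generation program described above. Conceptually -- and this is only a sketch under that assumption, not the plugin's actual source (the SchemaGenSketch class name is made up) -- the standard JPA 2.1 API can produce the scripts using the same javax.persistence.schema-generation properties shown earlier:

    import java.util.HashMap;
    import java.util.Map;
    import javax.persistence.Persistence;

    public class SchemaGenSketch {
        public static void main(String[] args) {
            Map<String, Object> props = new HashMap<>();
            props.put("javax.persistence.schema-generation.scripts.action", "drop-and-create");
            props.put("javax.persistence.schema-generation.scripts.create-target",
                      "target/classes/ddl/entityMgrEx-create.ddl");
            props.put("javax.persistence.schema-generation.scripts.drop-target",
                      "target/classes/ddl/entityMgrEx-drop.ddl");
            // instantiates the persistence unit solely to write the DDL scripts
            Persistence.generateSchema("entityMgrEx", props);
        }
    }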
  4. Add the following active declaration to your pom to activate the plugin and fill in the module-specifics. We could optionally add it to the database profiles.

    
            ...
            </pluginManagement>

            <plugins>
                <plugin>
                    <artifactId>jpa-schemagen-maven-plugin</artifactId>
                    <groupId>info.ejava.utils.jpa</groupId>
                    <configuration>
                        <persistenceUnit>entityMgrEx</persistenceUnit>
                    </configuration>
                </plugin>
            </plugins>
        </build>
  5. Build your module and notice the generated DDL files in target/classes/ddl

    $ mvn clean process-test-classes
    
    ...
    [INFO] --- jpa-schemagen-maven-plugin:5.0.0-SNAPSHOT:generate (default) @ entityMgrEx ---
    [INFO] Generating database schema for: entityMgrEx
    [INFO] removing existing target file:/Users/jim/proj/784/entityMgrEx/target/classes/ddl/entityMgrEx-drop.ddl
    [INFO] removing existing target file:/Users/jim/proj/784/entityMgrEx/target/classes/ddl/entityMgrEx-create.ddl
    Aug 14, 2018 10:28:50 PM org.hibernate.jpa.internal.util.LogHelper logPersistenceUnitInformation
    INFO: HHH000204: Processing PersistenceUnitInfo [
        name: entityMgrEx
        ...]
    Aug 14, 2018 10:28:50 PM org.hibernate.Version logVersion
    ...
    INFO: HHH000476: Executing import script 'org.hibernate.tool.schema.internal.exec.ScriptSourceInputNonExistentImpl@10850d17'
    ...
    ---
    ---
    target/classes/ddl/
    |-- emauto_create.ddl
    |-- emauto_delete.ddl
    |-- emauto_drop.ddl
    |-- emauto_tuningadd.ddl
    |-- emauto_tuningremove.ddl
    |-- entityMgrEx-JPAcreate.ddl  <== generated thru
    |-- entityMgrEx-JPAdrop.ddl    <===== configuration in persistence.xml
    |-- entityMgrEx-create.ddl     <== generated thru
    `-- entityMgrEx-drop.ddl       <===== plugin we just added
  6. (Optionally) update your SQL plugin definition added in the previous chapter to reference the dynamically generated schema in the target tree.

  7. (Optionally) update your persistence.xml to turn off schema generation from within all uses of the persistence unit.

  8. Eclipse will again report a plugin error within the pom.xml editor. Add the following definition to the lifecycle-mapping plugin to have the error ignored.

    
         <pluginExecution>
           <pluginExecutionFilter>
             <groupId>info.ejava.utils.jpa</groupId>
             <artifactId>jpa-schemagen-maven-plugin</artifactId>
             <versionRange>[5.0.0-SNAPSHOT,)</versionRange>
             <goals>
               <goal>generate</goal>
             </goals>
           </pluginExecutionFilter>
           <action>
             <ignore/>
           </action>
         </pluginExecution>

Since you will likely have many JPA modules in your enterprise application, let's take a moment to break the current module into a parent and child before you quit. That way you can better visualize which parts are specific to the child module and which are reusable from a common parent.

  1. Create a sibling module called ../jpa-parent

    
    $ mkdir ../jpa-parent
  2. Add the module definition (../jpa-parent/pom.xml)

    
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.jpa</groupId>
        <artifactId>jpa-parent</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>pom</packaging>

        <name>JPA Parent POM</name>
        <description>
            This parent pom is intended to provide common and re-usable 
            definitions and constructs across JPA projects.
        </description>
    </project>
  3. Add the following parent declaration to your existing module. The relativePath is only useful if you find yourself changing the parent pom on a frequent basis. Otherwise the parent module can be found in the localRepository once it has been installed.

    
        <parent>
            <groupId>myorg.jpa</groupId>
            <artifactId>jpa-parent</artifactId>
            <version>1.0-SNAPSHOT</version>
            <relativePath>../jpa-parent</relativePath>
        </parent>

        <groupId>myorg.jpa</groupId>
        <artifactId>entityMgrEx-child</artifactId>

        <name>Entity Manager Exercise</name>
  4. Verify your project still builds. This will verify your relativePath is correct.

    $ mvn clean verify
    ...
    [INFO] BUILD SUCCESS
  5. Move the following constructs from the entityMgrEx module to the jpa-parent module. These represent the *passive* definitions that will not directly impact the child module until the child requests that feature. Your child module should still have the same build and test functionality, except now it should look a little smaller. One could also make a case for moving some of the SQL/DDL script execution definitions to the parent as well -- which would make this module almost trivial in size.

    • properties

    • repositories

    • dependencyManagement

    • pluginManagement

    • select profiles

    
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>

        <groupId>myorg.jpa</groupId>
        <artifactId>jpa-parent</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>pom</packaging>

        <name>JPA Parent POM</name>
        <description>
            This parent pom is intended to provide common and re-usable 
            definitions and constructs across JPA projects.
        </description>

        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <java.source.version>1.8</java.source.version>
            <java.target.version>1.8</java.target.version>

            <jboss.host>localhost</jboss.host>
            <db.host>${jboss.host}</db.host>

            <maven-compiler-plugin.version>3.7.0</maven-compiler-plugin.version>
            <maven-jar-plugin.version>3.1.0</maven-jar-plugin.version>
            <maven-surefire-plugin.version>2.22.0</maven-surefire-plugin.version>
            <sql-maven-plugin.version>1.5</sql-maven-plugin.version>        

            <h2db.version>1.4.197</h2db.version>
            <javax.persistence-api.version>2.2</javax.persistence-api.version>
            <hibernate-entitymanager.version>5.3.1.Final</hibernate-entitymanager.version>
            <junit.version>4.12</junit.version>
            <log4j.version>1.2.17</log4j.version>
            <slf4j.version>1.7.25</slf4j.version>
            <ejava.version>5.0.0-SNAPSHOT</ejava.version>
        </properties>

        <dependencyManagement>
            <dependencies>
                <dependency>
                    <groupId>javax.persistence</groupId>
                    <artifactId>javax.persistence-api</artifactId>
                    <version>${javax.persistence-api.version}</version>
                </dependency>        
                <dependency>
                    <groupId>org.hibernate</groupId>
                    <artifactId>hibernate-core</artifactId>
                    <version>${hibernate-entitymanager.version}</version>
                </dependency>
                <dependency>
                    <groupId>junit</groupId>
                    <artifactId>junit</artifactId>
                    <version>${junit.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-api</artifactId>
                    <version>${slf4j.version}</version>
                </dependency>
                <dependency>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                    <version>${slf4j.version}</version>
                </dependency>
                <dependency>
                  <groupId>log4j</groupId>
                  <artifactId>log4j</artifactId>
                  <version>${log4j.version}</version>
                </dependency>    
            </dependencies>
        </dependencyManagement>

        <build>
            <pluginManagement>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-compiler-plugin</artifactId>
                        <version>${maven-compiler-plugin.version}</version>
                        <configuration>
                                <source>${java.source.version}</source>
                                <target>${java.target.version}</target>
                        </configuration>                    
                    </plugin>      

                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-jar-plugin</artifactId>
                        <version>${maven-jar-plugin.version}</version>
                    </plugin>

                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>${maven-surefire-plugin.version}</version>
                        <configuration>
                            <argLine>${surefire.argLine}</argLine>
                            <systemPropertyVariables>
                                <property.name>value</property.name>
                            </systemPropertyVariables>
                        </configuration>
                    </plugin>            

                    <plugin>
                        <groupId>info.ejava.utils.jpa</groupId>
                        <artifactId>jpa-schemagen-maven-plugin</artifactId>
                        <version>${ejava.version}</version>
                        <executions>
                            <execution>
                                <goals>
                                  <goal>generate</goal>
                                </goals>
                            </execution>
                        </executions>
                   </plugin>

                   <plugin>
                       <groupId>org.codehaus.mojo</groupId>
                       <artifactId>sql-maven-plugin</artifactId>        
                       <version>${sql-maven-plugin.version}</version>        
                    
                       <dependencies>
                           <dependency>
                               <groupId>com.h2database</groupId>
                               <artifactId>h2</artifactId>
                               <version>${h2db.version}</version>
                           </dependency>
                       </dependencies>
                    
                       <configuration>
                           <username>${jdbc.user}</username>
                           <password>${jdbc.password}</password>
                           <driver>${jdbc.driver}</driver>
                           <url>${jdbc.url}</url>          
                       </configuration>
                    </plugin>
                </plugins>    
            </pluginManagement>
        </build>

        <profiles>
            <profile> <!-- H2 server-based DB -->
                <id>h2srv</id>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:tcp://${db.host}:9092/./h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>

            <profile> <!-- H2 file-based DB -->
                <id>h2db</id>
                <activation>
                    <property> 
                        <name>!jdbcdb</name>
                    </property>
                </activation>
                <properties>
                      <jdbc.driver>org.h2.Driver</jdbc.driver>
                      <jdbc.url>jdbc:h2:${basedir}/target/h2db/ejava</jdbc.url>
                      <jdbc.user>sa</jdbc.user>
                      <jdbc.password/>
                      <hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
                </properties>
                <dependencies>
                    <dependency>
                        <groupId>com.h2database</groupId>
                        <artifactId>h2</artifactId>
                        <version>${h2db.version}</version>
                        <scope>test</scope>
                    </dependency>
                </dependencies>
            </profile>

            <profile>
                <id>testing</id>
                <activation>
                    <property>
                        <name>!skipTests</name>
                    </property>
                </activation>
          
                <build>
                    <plugins>
                        <plugin>
                            <!-- runs schema against the DB -->
                            <groupId>org.codehaus.mojo</groupId>
                            <artifactId>sql-maven-plugin</artifactId>        

                            <executions>

                                <!-- place execution elements here  -->
                                <execution>
                                    <id>drop-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <orderFile>descending</orderFile>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*tuningremove*.ddl</include>
                                                <include>main/resources/ddl/**/*drop*.ddl</include>
                                            </includes>
                                        </fileset>
                                        <onError>continue</onError>
                                    </configuration>
                                </execution>

                                <execution>
                                    <id>create-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <orderFile>ascending</orderFile>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*create*.ddl</include>
                                                <include>main/resources/ddl/**/*tuningadd*.ddl</include>                 
                                            </includes>
                                        </fileset>
                                        <print>true</print>
                                    </configuration>
                                </execution>

                                <execution>
                                    <id>populate-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>test/resources/ddl/**/*populate*.ddl</include>
                                            </includes>
                                        </fileset>
                                    </configuration>
                                </execution>

                                <!--
                                <execution>
                                    <id>drop-db-after-test</id>
                                    <phase>test</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*drop*.ddl</include>     
                                                </includes>
                                        </fileset>
                                    </configuration>
                                </execution>
                                -->
                            </executions>
                        </plugin>          
                    </plugins>          
                </build>
            </profile>
            
            <!--  tell Eclipse what to do with some of the plugins -->
            <profile>
              <id>m2e</id>
              <activation>
                <property>
                  <name>m2e.version</name>
                </property>
              </activation>
              <build>
                <pluginManagement>
                    <plugins>
                        <plugin>
                          <groupId>org.eclipse.m2e</groupId>
                          <artifactId>lifecycle-mapping</artifactId>
                          <version>1.0.0</version>
                          <configuration>
                            <lifecycleMappingMetadata>
                              <pluginExecutions>
                                
                                <pluginExecution>
                                  <pluginExecutionFilter>
                                    <groupId>org.codehaus.mojo</groupId>
                                    <artifactId>sql-maven-plugin</artifactId>
                                    <versionRange>[1.0.0,)</versionRange>
                                    <goals>
                                      <goal>execute</goal>
                                    </goals>
                                  </pluginExecutionFilter>
                                  <action>
                                    <ignore />
                                  </action>
                                </pluginExecution>

                              </pluginExecutions>
                            </lifecycleMappingMetadata>
                          </configuration>
                        </plugin>

                    </plugins>
                </pluginManagement>
               </build>
            </profile>
            
        </profiles>
    </project>
  6. Leave the following in the child project. This is a collection of *active* project constructs.

    • plugins

    • dependencies

    • module-specific properties

    • profiles that declare plugins and dependencies

    
    <?xml version="1.0"?>
    <project
        xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <groupId>myorg.jpa</groupId>
            <artifactId>jpa-parent</artifactId>
            <version>1.0-SNAPSHOT</version>
            <relativePath>../jpa-parent</relativePath>
        </parent>


        <groupId>myorg.jpa</groupId>
        <artifactId>entityMgrEx</artifactId>
        <version>1.0-SNAPSHOT</version>

        <name>Entity Manager Exercise</name>


        <build>
            <!-- filtering will replace URLs, credentials, etc in the 
                 files copied to the target directory and used during testing.
                -->
            <testResources>
                <testResource>
                    <directory>src/test/resources</directory>
                    <filtering>true</filtering>
                </testResource>
            </testResources>
            <plugins>
                <plugin>
                    <artifactId>jpa-schemagen-maven-plugin</artifactId>
                    <groupId>info.ejava.utils.jpa</groupId>
                    <configuration>
                        <persistenceUnit>entityMgrEx</persistenceUnit>
                    </configuration>
                </plugin>
            </plugins>
        </build>

        <dependencies>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <scope>provided</scope>
            </dependency>
        
            <dependency>
                <groupId>javax.persistence</groupId>
                <artifactId>javax.persistence-api</artifactId>
                <scope>provided</scope>
            </dependency>        

            <dependency>
                <groupId>org.hibernate</groupId>
                <artifactId>hibernate-core</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
                <scope>test</scope>
            </dependency>

            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
              <groupId>log4j</groupId>
              <artifactId>log4j</artifactId>
              <scope>test</scope>
            </dependency>    
        </dependencies>

        <profiles>
            <profile>
                <id>testing</id>
                <activation>
                    <property>
                        <name>!skipTests</name>
                    </property>
                </activation>
          
                <build>
                    <plugins>
                        <plugin>
                            <!-- runs schema against the DB -->
                            <groupId>org.codehaus.mojo</groupId>
                            <artifactId>sql-maven-plugin</artifactId>        

                            <executions>

                                <!-- place execution elements here  -->
                                <execution>
                                    <id>drop-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <orderFile>descending</orderFile>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*tuningremove*.ddl</include>
                                                <include>main/resources/ddl/**/*drop*.ddl</include>
                                            </includes>
                                        </fileset>
                                        <onError>continue</onError>
                                    </configuration>
                                </execution>

                                <execution>
                                    <id>create-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <orderFile>ascending</orderFile>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*create*.ddl</include>
                                                <include>main/resources/ddl/**/*tuningadd*.ddl</include>                 
                                            </includes>
                                        </fileset>
                                        <print>true</print>
                                    </configuration>
                                </execution>

                                <execution>
                                    <id>populate-db-before-test</id>
                                    <phase>process-test-classes</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>test/resources/ddl/**/*populate*.ddl</include>
                                            </includes>
                                        </fileset>
                                    </configuration>
                                </execution>

                                <!--
                                <execution>
                                    <id>drop-db-after-test</id>
                                    <phase>test</phase>
                                    <goals>
                                        <goal>execute</goal>
                                    </goals>
                                    <configuration>
                                        <autocommit>true</autocommit>
                                        <fileset>
                                            <basedir>${basedir}/src</basedir>
                                            <includes>
                                                <include>main/resources/ddl/**/*drop*.ddl</include>     
                                                </includes>
                                        </fileset>
                                    </configuration>
                                </execution>
                                -->
                            </executions>
                        </plugin>          
                    </plugins>          
                </build>
            </profile>
        </profiles>

    </project>
  7. Verify your project still builds after moving these constructs to the parent.

    $ mvn clean verify
    ...
    [INFO] BUILD SUCCESS
  8. Optionally change your parent declaration from the jpa-parent module to the class examples base parent project.

    
    
        <parent>
            <groupId>info.ejava.examples.build</groupId>
            <artifactId>dependencies</artifactId>
            <version>x.x.x-SNAPSHOT</version>
            <relativePath>build/dependencies/pom.xml</relativePath>
        </parent>

    Note

    Replace x.x.x-SNAPSHOT with the correct version for the class.

Note

It is never a good idea to declare *active* POM constructs in a parent of a multi-module project unless *ALL* child modules have the same purpose. Strive for parent Maven projects to define standards to follow without inserting unnecessary dependencies or other constructs, as illustrated by the sketch below.
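
As a compact illustration of the distinction (a sketch reusing the junit artifact from the parent above): a dependency listed under dependencyManagement is *passive*, since children only inherit its version if and when they declare the dependency themselves, while a dependency listed directly under a top-level dependencies element in the parent would be *active* and forced onto every child.

    <!-- passive: only pins the version for children that declare junit -->
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>${junit.version}</version>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <!-- active: every child would now carry a junit dependency, needed or not -->
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit.version}</version>
        </dependency>
    </dependencies>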

This chapter will take you through the steps to register a Java POJO with the JPA persistence unit using both orm.xml mapping-file descriptors and Java class annotations. It will also take you through the steps to define a POJO class that is legal to use as a JPA entity class.

JPA entity classes are required to ...

  • Be identified as being a JPA entity class

  • Have a non-private default constructor

  • Have at least one property defined as the primary key

  1. Create a POJO Java class in the myorg.entityex.mapped Java package

    package myorg.entityex.mapped;
    
    
    import java.util.Date;
    public class Animal {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Animal(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
        
        public int getId() { return id; }
        public void setId(int id) {
            this.id = id;
        }
        
        public String getName() { return name; }
        public void setName(String name) {
            this.name = name;
        }
        
        public Date getDob() { return dob; }
        public void setDob(Date dob) {
            this.dob = dob;
        }
        
        public double getWeight() { return weight; }
        public void setWeight(double weight) {
            this.weight = weight;
        }
    }
  2. Copy the existing AutoTest.java to AnimalTest.java and remove (or ignore) references to the Auto class from AnimalTest.java

  3. Attempt to persist the Animal by adding the following @Test method to the AnimalTest.java JUnit class.

    # src/test/java/myorg/entityex/AnimalTest.java
    
     
        @Test
        public void testCreateAnimal() {
            logger.info("testCreateAnimal");
            Animal animal = new Animal("bessie", 
                    new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
            em.persist(animal);        
            
            assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
        }
  4. Attempt to build and run your test. Your test should fail with the following error message. This means that although your class is a valid Java POJO, it has not been made known to the persistence unit as a JPA entity.

    testCreateAnimal(myorg.entityex.AutoTest): Unknown entity: myorg.entityex.mapped.Animal
    ...
    java.lang.IllegalArgumentException: Unknown entity: myorg.entityex.mapped.Animal
            at org.hibernate.ejb.AbstractEntityManagerImpl.persist(AbstractEntityManagerImpl.java:856)
            at myorg.entityex.AutoTest.testCreateAnimal(AutoTest.java:100)
    
  5. Add the POJO class to the persistence unit by adding an orm.xml JPA mapping file to your project. Place the file in the src/main/resources/orm directory.

    
    # src/main/resources/orm/Animal-orm.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <entity-mappings xmlns="http://java.sun.com/xml/ns/persistence/orm"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://java.sun.com/xml/ns/persistence/orm http://java.sun.com/xml/ns/persistence/orm_2_0.xsd" version="2.0">

        <entity class="myorg.entityex.mapped.Animal"/>

    </entity-mappings>
  6. Register the orm.xml file with the persistence unit by adding a mapping-file element reference.

    
    # src/test/resources/META-INF/persistence.xml

        <persistence-unit name="entityEx-test">
            <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>

            <mapping-file>orm/Animal-orm.xml</mapping-file>
            <class>myorg.entityex.Auto</class>
            <properties>
            ...
  7. Attempt to build and run your test. Your test should fail with the following error message. The specifics of the error message will depend upon whether you are running just the JUnit test or building within Maven since the pom is configured to build database schema from the JPA mappings prior to running the JUnit test.

    
    PersistenceUnit: entityEx-test] Unable to configure EntityManagerFactory: No identifier specified for entity: myorg.entityex.mapped.Animal

    Caused by: org.hibernate.AnnotationException: No identifier specified for entity: myorg.entityex.mapped.Animal

    Although the class is a valid POJO and we followed the deployment descriptor mechanism for registering it with the persistence unit, it is not a legal entity. The error message indicates it is lacking a primary key field.

  8. Update the orm.xml file and define the "id" property as the primary key for the entity.

    
        <entity class="myorg.entityex.mapped.Animal">
            <attributes>
                <id name="id"/>
            </attributes>
        </entity>
  9. Rebuild your module and it should now persist the POJO as a JPA entity. The SQL should be printed in the debug output.

    $ mvn clean test
    ...
    Hibernate: 
        insert 
        into
            Animal
            (dob, name, weight, id) 
        values
            (?, ?, ?, ?)
     -tearDown() complete, em=org.hibernate.ejb.EntityManagerImpl@12a80ea3
     -closing entity manager factory
     -HHH000030: Cleaning up connection pool [jdbc:h2:/home/jcstaff/workspaces/ejava-javaee/git/jpa/jpa-entity/entityEx/target/h2db/ejava]
    Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.94 sec
    
    Results :
    
    Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
    
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
  10. Update your JUnit test method to look like the following. The unit test now clears the cache of entities and forces the entity manager to instantiate a new instance for the value returned from the find().

        @Test
    
        public void testCreateAnimal() {
            logger.info("testCreateAnimal");
            Animal animal = new Animal("bessie", 
                    new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
            em.persist(animal);        
            
            assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
            
            em.flush(); //make sure all writes were issued to DB
            em.clear(); //purge the local entity manager entity cache to cause new instance
            assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
        }
  11. Attempt to rebuild your module. It should fail because the entity class does not have a default constructor. Remember that default constructors are provided for free in POJOs until you add the first constructor. Once you add a custom constructor you are required to add a default constructor to make it a legal entity class.

    javax.persistence.PersistenceException: org.hibernate.InstantiationException: No default constructor for entity: myorg.entityex.mapped.Animal
    
  12. Update the POJO with a default constructor.

        public Animal() {} //must have default ctor
    
        public Animal(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
  13. Rebuild the module. It should now pass because you have defined and registered a compliant entity class. The class was registered with the persistence unit through the orm.xml mapping-file descriptor and made compliant by adding an identifier property and a default constructor.

  1. Copy the POJO class to a new Java package (myorg.entityex.annotated) and class name (Animal2).

    package myorg.entityex.annotated;
    
    
    import java.util.Date;
    public class Animal2 {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Animal2() {} //must have default ctor
    ...
    }
  2. Add a javax.persistence.Entity annotation to the class

    import javax.persistence.Entity;
    
    
    @javax.persistence.Entity
    public class Animal2 {
  3. Register the new entity with the persistence.xml using a class element reference

    
        <persistence-unit name="entityEx-test">
            <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>

            <mapping-file>orm/Animal-orm.xml</mapping-file>
            <class>myorg.entityex.Auto</class>
            <class>myorg.entityex.annotated.Animal2</class>
            <properties>
  4. Add a new test method to work with the new class added to the module.

        @Test
    
        public void testCreateAnimalAnnotated() {
            logger.info("testCreateAnimalAnnotated");
            myorg.entityex.annotated.Animal2 animal = new myorg.entityex.annotated.Animal2("bessie", 
                    new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
            em.persist(animal);        
            
            assertNotNull("animal not found", em.find(myorg.entityex.annotated.Animal2.class,animal.getId()));
            
            em.flush(); //make sure all writes were issued to DB
            em.clear(); //purge the local entity manager entity cache to cause new instance
            assertNotNull("animal not found", em.find(myorg.entityex.annotated.Animal2.class,animal.getId()));
  5. Attempt to build/run your module at this point. You should get a familiar error about Animal2 not having an identifier.

    Unable to configure EntityManagerFactory: No identifier specified for entity: myorg.entityex.annotated.Animal2
    
  6. Since we want to use annotations for the new class, fix the issue by adding a @javax.persistence.Id annotation to the id attribute. This is called FIELD access in JPA. You can alternatively use PROPERTY access by moving the annotation to the getId() method.

        @javax.persistence.Id
    
        private int id;
  7. Re-run your test. It should succeed this time.

    $ mvn clean test
    ...
    [INFO] BUILD SUCCESS
    ...
  8. If you would like to observe the data left in the database, run your test against the H2 server-based database (h2srv profile) and connect to that same database with the H2 browser UI.

  9. Type the following command in the H2 browser UI

    SELECT * FROM ANIMAL2;
    ID      DOB     NAME    WEIGHT  
    0   1960-02-01 00:00:00.0   bessie  1400.2

In this chapter we will create custom class/database mappings for some class properties:

  • Map a class to a specific table

  • Map a property to a specific column

  • Define constraints for properties

  • Take a look at using getters and setters

  1. Copy your Animal.java class to Cat.java

    package myorg.entityex.mapped;
    
    
    import java.util.Date;
    public class Cat {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Cat() {} //must have default ctor
        public Cat(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
        
        public int getId() { return id; }
    ...
  2. Copy your Animal2.java class to Cat2.java

    package myorg.entityex.annotated;
    
    
    import java.util.Date;
    @javax.persistence.Entity
    public class Cat2 {
        private int id;
        private String name;
        private Date dob;
        private double weight;
        
        public Cat2() {} //must have default ctor
        public Cat2(String name, Date dob, double weight) {
            this.name = name;
            this.dob = dob;
            this.weight = weight;
        }
        
        @javax.persistence.Id
        public int getId() { return id; }
    ...
  3. Register the new Cat entity class in the Animal-orm.xml

    
    # src/main/resources/orm/Animal-orm.xml

        <entity class="myorg.entityex.mapped.Animal">
    ...
        <entity class="myorg.entityex.mapped.Cat">
            <attributes>
                <id name="id"/>
            </attributes>
        </entity>
  4. Register the new Cat2 entity class in the persistence.xml

    
    # src/test/resources/META-INF/persistence.xml

            <mapping-file>orm/Animal-orm.xml</mapping-file>
            <class>myorg.entityex.Auto</class>
            <class>myorg.entityex.annotated.Animal2</class>
            <class>myorg.entityex.annotated.Cat2</class>
  5. Rebuild your module from the command line and observe the schema generated for Cat and Cat2. Notice that the JPA provider used the class name as the default entity name and will attempt to map the entity to a database table with the same name as the entity.

    $ more target/classes/ddl/*
    ...
       create table Cat (
            id integer not null,
            dob timestamp,
            name varchar(255),
            weight double not null,
            primary key (id)
        );
    
        create table Cat2 (
            id integer not null,
            dob timestamp,
            name varchar(255),
            weight double not null,
            primary key (id)
        );
  6. Add a table element to the orm.xml definition to map Cat to the ENTITYEX_CAT table.

    
    
        <entity class="myorg.entityex.mapped.Cat">
            <table name="ENTITYEX_CAT"/>
            <attributes>
  7. Add a @javax.persistence.Table annotation to the Cat2 class to map instances to the ENTITYEX_CAT table.

    @javax.persistence.Entity
    
    @javax.persistence.Table(name="ENTITYEX_CAT")
    public class Cat2 {
        private int id;
  8. Rebuild your module from the command line and observe the schema generated for Cat and Cat2. Notice that we have now mapped two entity classes to the same table using a custom table name.

    $ more target/classes/ddl/*
    ...
       create table ENTITYEX_CAT (
            id integer not null,
            dob timestamp,
            name varchar(255),
            weight double not null,
            primary key (id)
        );
  9. Map the id property for both Cat and Cat2 to the CAT_ID column. Also have the persistence provider automatically generate a value for the primary key during the persist(). A later exercise will cover generated primary key strategies in more detail.

        @javax.persistence.Id
    
        @javax.persistence.Column(name="CAT_ID")
        @javax.persistence.GeneratedValue
        private int id;
    
        <entity class="myorg.entityex.mapped.Cat">
            <table name="ENTITYEX_CAT"/>
            <attributes>
                <id name="id">
                    <column name="CAT_ID"/>
                    <generated-value/>
                </id>
            </attributes>
        </entity>
  10. Make the name column mandatory (nullable=false) and define the length of the string to be 20 characters. Note that these property assignments are only useful for documentation and for generating schema. Many of the column properties are not used by the provider at runtime.

        @javax.persistence.Column(nullable=false, length=20)
    
        private String name;
    
                <basic name="name">
                    <column nullable="false" length="20"/>
                </basic>
  11. Have the weight column stored with a precision of 3 digits, with 1 digit (scale) to the right of the decimal place. You will need to change the datatype of the mapped property to BigDecimal to fully leverage this capability.

    # src/main/java/myorg/entityex/annotated/Cat2.java
    
    
        @javax.persistence.Column(precision=3, scale=1)  //10.2lbs
        private BigDecimal weight;
    ...
        public Cat2(String name, Date dob, BigDecimal weight) {
    ...
        public BigDecimal getWeight() { return weight; }
        public void setWeight(BigDecimal weight) {
    # src/main/java/myorg/entityex/mapped/Cat.java
    
    
        private BigDecimal weight;
    ...
        public Cat(String name, Date dob, BigDecimal weight) {
    ...
        public BigDecimal getWeight() { return weight; }
        public void setWeight(BigDecimal weight) {
    
    # src/main/resources/orm/Animal-orm.xml
                <basic name="weight">
                    <column precision="3" scale="1"/>
                </basic>
  12. Rebuild the module from the command line and observe the database schema generated for the ENTITYEX_CAT table.

    # target/classes/ddl/entityEx-createJPA.ddl
    
        create table ENTITYEX_CAT (
            CAT_ID integer generated by default as identity,
            dob date,
            name varchar(20) not null,
            weight decimal(3,1),
            primary key (CAT_ID)
        );

    Notice how the generated schema now reflects your mappings: the id column is named CAT_ID and generated as an identity, name is varchar(20) not null, and weight is decimal(3,1).

In the above example, you used FIELD access to the property values. This is the preferred method if your business object attributes provide an accurate representation of what should be stored in the database. The provider chose FIELD access because our annotated class placed the @Id annotation on a Java field rather than on a getter().

# implies FIELD access


    @javax.persistence.Id
    @javax.persistence.Column(name="CAT_ID")
    @javax.persistence.GeneratedValue
    private int id;
...    
    public int getId() { return id; }

If we had moved the @Id property definitions to the getter(), the access would have been switched to PROPERTY. That was how JPA 1.0 annotated classes worked; it was always one way or the other.

# implies PROPERTY access


    private int id;
...    
    @javax.persistence.Id
    @javax.persistence.Column(name="CAT_ID")
    @javax.persistence.GeneratedValue
    public int getId() { return id; }

Since it was always one way or the other with JPA 1.0, the access specification in the orm.xml file was placed on the root element of the entity.


    <entity class="myorg.entityex.mapped.Cat"
        access="FIELD">

Starting with JPA 2.0, we can also make the specification more explicit (like the XML technique) with the addition of the @Access annotation

@javax.persistence.Access(javax.persistence.AccessType.FIELD)

public class Cat2 {

Although choosing between FIELD and PROPERTY access for a class was always a capability in JPA 1.0, JPA 2.0 added the ability to choose on a per-property basis. This is done by applying the @Access annotation to the getter() you want to have PROPERTY access. In this section, we will continue to expose all our properties to the provider through FIELD access, but define PROPERTY access for the "weight" property.

  1. Update the annotated Cat2 entity to store weight as a double and expose it to the provider as a BigDecimal.

        private double weight;
    
    ...
        @javax.persistence.Column(precision=3, scale=1)  //10.2lbs
        @javax.persistence.Access(javax.persistence.AccessType.PROPERTY)
        public BigDecimal getWeight() {
            return new BigDecimal(weight); 
        }
        public void setWeight(BigDecimal weight) {
            this.weight = weight==null ? 0 : weight.doubleValue();
        }
  2. Add a logger and some log statements to help identify the calls to the getter and setter methods

    # src/main/java/myorg/entityex/annotated/Cat2.java
    
    
        private static final Log logger = LogFactory.getLog(Cat2.class);
    ...
        public BigDecimal getWeight() {
            logger.debug("annotated.getWeight()");
            return new BigDecimal(weight); 
        }
        public void setWeight(BigDecimal weight) {
            logger.debug("annotated.setWeight()");
            this.weight = weight==null ? 0 : weight.doubleValue();
        }
  3. Add the following test method to your AnimalTest. By persisting the entity, we will force the provider to get properties from the entity. By flushing and then detaching the entity prior to executing the find(), we will force the provider to instantiate a new entity instance and set the properties within it.

    # src/test/java/myorg/entityex/AnimalTest.java
    
    
        @Test
        public void testCreateCatAnnotated() {
            logger.info("testCreateCatAnnotated");
            myorg.entityex.annotated.Cat2 cat = new myorg.entityex.annotated.Cat2("fluffy", null, 99.9);
            em.persist(cat);                                                 //get provider to call getters
            em.flush(); em.detach(cat);
            cat = em.find(myorg.entityex.annotated.Cat2.class, cat.getId()); //get provider to call setters
        }
  4. Run your new test method and observe the calls to getWeight and setWeight printed.

    $ mvn clean test -Dtest=myorg.entityex.AnimalTest#testCreateCatAnnotated
    
    ...
    -testCreateCatAnnotated
     -annotated.getWeight() //<----------------
    Hibernate: 
        insert 
        into
            ENTITYEX_CAT
            (CAT_ID, dob, name, weight) 
        values
            (null, ?, ?, ?)
     -annotated.getWeight() //<----------------
     -annotated.getWeight() //<----------------