Benchmarking Hibernate Validator and Apache BeanValidation: the two JSR-303 implementations

Until recently, if you decided to use JSR-303 validations on a project, the choice of what implementation to use was an easy one: Hibernate Validator was the reference implementation and the only one that passed the TCK to ensure its correct behavior. But on the 11th of June the Apache BeanValidation team released a version of its validator that passes the TCK, providing a new compliant implementation and giving more choice to the end users.

One of the features that can influence what implementation suits your needs better is how well they perform. So, in this post I will analyse the performance of both validators in various operations (validating, parsing the constraints, etc.) and 3 different scenarios (from simple beans to complex beans using inheritance and groups).

Additionally, I will describe how to use the benchmarking tool so you can run benchmarks better suited to your own environment.

The contenders

In case you need more information about the two implementations, here is a brief description of what they have to offer.

Apache BeanValidation

Formerly known as Agimatec Validation, it migrated to Apache in March 2010, where it is currently under incubation. One of its most useful extra features is the ability to perform method validation using the same kind of JSR-303 annotations.
The benchmarked version is: 0.1-incubating.

Hibernate Validator

The reference implementation of the standard, coming from the JBoss people. Amongst other extra features, its 4.1.0 release will allow modifying constraint definitions at runtime via its programmatic constraint configuration API.
The benchmarked version is: 4.1.0.CR1.

Benchmarking procedure

There are two main operations that a validator has to perform. First, it needs to parse the constraints defined in a bean to be able to validate it. If you follow good practices when using the validator (reusing the factory), this operation will only be done once per bean, so its performance, while important, won’t be critical.
The second operation is the validation itself, which is performed on every validation call. For this reason, the performance of this operation is very important.
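
To make the distinction concrete, here is a minimal sketch of the “reuse the factory” practice using the standard javax.validation API (ValidationExample and MyBean are hypothetical names, not part of the benchmark tool):

import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.ValidatorFactory;
import javax.validation.constraints.NotNull;

public class ValidationExample {

    // A tiny hypothetical bean used only for this example
    public static class MyBean {
        @NotNull
        public String name;
    }

    // Build the factory once and reuse it: the parsed constraint metadata is cached in it
    private static final ValidatorFactory FACTORY = Validation.buildDefaultValidatorFactory();

    public static void main(String[] args) {
        Validator validator = FACTORY.getValidator();

        // Each validate() call only pays the validation cost; the parsing cost is paid
        // once per bean class as long as the same factory is reused.
        MyBean bean = new MyBean();
        Set<ConstraintViolation<MyBean>> violations = validator.validate(bean);
        System.out.println("Violations found: " + violations.size());
    }
}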

Generating the test cases

In order to be able to programmatically test beans with different properties, a tool that autogenerates a set of beans has been created. You can grab its code from the sandbox area in Apache’s BeanValidation subversion: jsr303-impl-bench.

Using the generator is simple: you specify values for various parameters, such as the total number of beans, the proportion of valid values, or the lower and upper bounds for the number of fields per bean, and the generator creates the source files for those beans plus a holder bean containing an array with all of them.

To benchmark the implementations, two JUnit tests are provided that use the generated beans as input. Everything is integrated inside a Maven project, so simply calling mvn test will generate the beans, compile them and run the tests against them.

Additionally, a simple shell script (runner.sh) is provided. It reads .properties files from the current directory that define overridden parameters for the generator, and benchmarks a JSR-303 implementation against those scenarios.

Benchmarked scenarios

Three different scenarios have been benchmarked with the objective of providing an idea of the performance of the two validators when dealing with simple beans; beans with inheritance and no groups; and beans with both inheritance and groups.

  • A very simple scenario (Scenario 1), which will benchmark against 300 beans with no inheritance (all beans inherit from Object) and no groups.
  • A more complex scenario (Scenario 2), which will benchmark against 300 beans, 30% of which will inherit from one of 5 base beans. Again, no groups will be used.
  • An even more complex scenario (Scenario 3), which will benchmark against 300 beans, 60% of which will inherit from one of 10 base beans, and 60% of the beans will have a group sequence definition and their constraints will use some of the 10 defined groups.

The common properties, constant across all the scenarios, are:

  • 80% of the values will be valid.
  • All beans will have between 4 and 7 annotated basic fields (Strings, Integers and ints currently).
  • All beans will have between 1 and 3 annotated fields which reference other beans.
  • The bean graph (created using the references to other beans), will have fill ratios of 80%, 40% and 20% for the first, second and third level respectively.

You can learn about the configuration options by checking the file: generator.default.properties

And for each scenario, four operations will be benchmarked:

  • Raw validation: Validating already parsed beans.
  • Raw parsing: Parsing the metadata constraints defined in the beans.
  • Combined: Validates beans which have not been previously parsed.
  • MT: Launches 4 threads which will validate already parsed beans.

Copies of the properties file for each scenario are available here: scenarios.zip.

Results

Benchmarking the above scenarios on my Core Duo T2300 @ 1.66GHz with 2GB of RAM produced the following results. (Each scenario is benchmarked using 20 different bean graphs, and every operation is run 50 times. Times are normalized to represent the execution of a single operation over the 300 beans.)

Scenario 1

Results for 300 beans, no inheritance, no groups
The Apache implementation is faster when validating beans, both in the single-threaded and multi-threaded benchmarks. Parsing speed is similar, although Apache is a little faster.

Scenario 2

Results for 300 beans, 30% inheritance, no groups
Adding inheritance increases the time spent on parsing and validating, but it must be taken into account that the base beans are also annotated, so the amount of required work is also greater.

Results are similar to the first scenario, with the Apache implementation performing better.

Scenario 3

Results for 300 beans, 60% inheritance, 60% beans with groups
Parsing time has increased as more work has to be done in this phase (processing groups and more beans with inheritance). Validating time decreases, but that is to be expected as validation will stop once one group in the sequence is found to have constraint violations.

Again, Apache performs better with results similar to the other two scenarios.

Conclusion

Several conclusions can be extracted from the results:

  • The Apache implementation is around 50% faster when validating under these scenarios, so using it is likely to improve your application’s performance compared to Hibernate Validator.
  • Both implementations’ performance scales equally from a single thread to a multi-threaded scenario with 4 threads.
  • Parsing time is roughly equivalent, but this shouldn’t affect your application’s performance as much as validation performance does.
  • The relative performance of the two implementations doesn’t change when going from a simple scenario to a more complex one.

How to replicate the results / run your own benchmarks

To replicate the results in this post, you can fetch revision 956368 of the jsr303-impl-bench project, unzip the scenario definitions linked above, and execute runner.sh Apache and runner.sh Hibernate to run the benchmarks.

Also, you may want to benchmark the implementations using a scenario which replicates the one in your application more closely. To do so, simply create a new properties file overriding all the needed variables and run the tests against it.

Reliable execution of persistent transactional asynchronous tasks using only Spring

I recently needed to add the ability to execute asynchronous tasks to a project. The project in question had these requirements:

  • Ability to execute transactional tasks in the background (like emailing a generated report to a user).
  • Ability to schedule a task so it doesn’t get executed sooner than the scheduled date.
  • Tasks execute transactional DB operations, so if the transaction fails, they should retry X times before failing.
  • Reliable execution: if a task is scheduled, it is guaranteed to be executed, even if the server fails or the thread executing it dies.
  • Also, the app already uses a DB for data persistence, so it is preferable to also use it to store the task queue instead of requiring extra infrastructure to manage.
  • No need for a JavaEE application server: Tomcat or Jetty should suffice.

As an additional requirement, the project’s size didn’t justify using Quartz or JMS, as they add too much complexity and too many dependencies to solve a problem that only requires a small fraction of the functionality these solutions provide.

That just left me with plain Spring. Spring has support for scheduling and task execution, but the provided executors either rely on plain threads/timers or need Quartz. Plain threads/timers cover almost all of the needs, but not reliable execution: if, for example, the server is rebooted, your tasks would be lost (JavaEE timers can be made persistent, but the project’s target environment was Tomcat).

So, building on Spring’s task execution capabilities, this solution adds persistent storage for the tasks to ensure reliable execution.

Initial requirements

For this solution to work, you just need Spring 3 with working declarative transactions (@Transactional). I’m using JPA2 for persistence and optimistic locking to ensure data integrity. If you are using different technologies, adapting the solution should just be a matter of changing the exception handling and modifying the DAOs.

Configuring Spring

As I said earlier, this solution builds on Spring’s task execution capabilities. This means that I will use Spring to manage the thread pool that handles the asynchronous execution of methods annotated with @Scheduled. Then, in those methods I will add the necessary logic to handle the actual task execution.

Assuming you have the task schema added to your configuration file, these two lines are the only configuration required to create a thread pool of 10 threads and configure Spring to use that pool to run the annotated methods.

    <!-- A task scheduler that will call @Scheduled methods -->
    <task:scheduler id="myScheduler" pool-size="10"/>
    <task:annotation-driven scheduler="myScheduler"/>

A holder class to store the tasks in a queue

Tasks need to be persisted, and in their persisted state they need to carry some extra information so they can be executed correctly. So each enqueued task will store the following:

  • Creation time-stamp: the moment when the task was initially queued for execution.
  • Triggering time-stamp: a task cannot be executed sooner than this.
  • Started time-stamp: the exact moment when a thread starts executing this task.
  • Completed time-stamp: when a task is successfully completed, this gets filled. Along with the started time-stamp, this allows the executor to detect stalled or dead tasks.
  • Serialized task: the actual task.

My JPA2 entity is as follows:

/**
 * Persistent entity that stores an async task.
 * 
 * @author Carlos Vara
 */
@Entity
@Table(name="TASK_QUEUE")
public class QueuedTaskHolder {
 
    // Getters -----------------------------------------------------------------
 
    @Id
    @MyAppId
    public String getId() {
        if ( this.id == null ) {
            this.setId(UUIDHelper.newUUID());
        }
        return this.id;
    }
 
    @NotNull
    @Past
    @Temporal(TemporalType.TIMESTAMP)
    public Calendar getCreationStamp() {
        return this.creationStamp;
    }
 
    @Temporal(TemporalType.TIMESTAMP)
    public Calendar getTriggerStamp() {
        return triggerStamp;
    }
 
    @Past
    @Temporal(TemporalType.TIMESTAMP)
    public Calendar getStartedStamp() {
        return this.startedStamp;
    }
 
    @Past
    @Temporal(TemporalType.TIMESTAMP)
    public Calendar getCompletedStamp() {
        return this.completedStamp;
    }
 
    @Lob
    @NotNull
    public byte[] getSerializedTask() {
        return this.serializedTask;
    }
 
    @Version
    protected int getVersion() {
        return this.version;
    }
 
 
    // Setters -----------------------------------------------------------------
 
    protected void setId(String id) {
        this.id = id;
    }
 
    public void setCreationStamp(Calendar creationStamp) {
        this.creationStamp = creationStamp;
    }
 
    public void setTriggerStamp(Calendar triggerStamp) {
        this.triggerStamp = triggerStamp;
    }
 
    public void setStartedStamp(Calendar startedStamp) {
        this.startedStamp = startedStamp;
    }
 
    public void setCompletedStamp(Calendar completedStamp) {
        this.completedStamp = completedStamp;
    }
 
    public void setSerializedTask(byte[] serializedTask) {
        this.serializedTask = serializedTask;
    }
 
    public void setVersion(int version) {
        this.version = version;
    }
 
 
    // Fields ------------------------------------------------------------------
 
    private String id;
    private Calendar creationStamp;
    private Calendar triggerStamp = null;
    private Calendar startedStamp = null;
    private Calendar completedStamp = null;
    private byte[] serializedTask;
    private int version;
 
 
    // Lifecycle events --------------------------------------------------------
 
    @SuppressWarnings("unused")
    @PrePersist
    private void onAbstractBaseEntityPrePersist() {
        this.ensureId();
        this.markCreation();
    }
 
    /**
     * Ensures that the entity has a unique UUID.
     */
    private void ensureId() {
        this.getId();
    }
 
    /**
     * Sets the creation stamp to now.
     */
    private void markCreation() {
        setCreationStamp(Calendar.getInstance(TimeZone.getTimeZone("Etc/UTC")));
    }
 
 
    @Override
    public String toString() {
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy.MM.dd HH:mm:ss z");
        return new ToStringCreator(this).append("id", getId())
            .append("creationStamp", (getCreationStamp()!=null)?sdf.format(getCreationStamp().getTime()):null)
            .append("startedStamp", (getStartedStamp()!=null)?sdf.format(getStartedStamp().getTime()):null)
            .append("completedStamp", (getCompletedStamp()!=null)?sdf.format(getCompletedStamp().getTime()):null)
            .toString();
    }    
 
}

A DAO to retrieve tasks from the queue

The executor will do 3 things: enqueue new tasks, get tasks from the queue and execute them, and re-queue tasks that are suspected to be stalled (usually because their executing thread has died). So the DAO has to provide operations to cover those scenarios.

The interface that defines this DAO:

/**
 * DAO operations for the {@link QueuedTaskHolder} entities.
 * 
 * @author Carlos Vara
 */
public interface QueuedTaskHolderDao {
 
    /**
     * Adds a new task to the current persistence context. The task will be
     * persisted into the database at flush/commit.
     * 
     * @param queuedTask
     *            The task to be saved (enqueued).
     */
    void persist(QueuedTaskHolder queuedTask);
 
 
    /**
     * Finder that retrieves a task by its id.
     * 
     * @param taskId
     *            The id of the requested task.
     * @return The task with that id, or <code>null</code> if no such task
     *         exists.
     */
    QueuedTaskHolder findById(String taskId);
 
 
    /**
     * @return A task which is candidate for execution. The receiving thread
     *         will need to ensure a lock on it. <code>null</code> if no
     *         candidate task is available.
     */
    QueuedTaskHolder findNextTaskForExecution();
 
 
    /**
     * @return A task which has been in execution for too long without
     *         finishing. <code>null</code> if there aren't stalled tasks.
     */
    QueuedTaskHolder findRandomStalledTask();
 
}

And my JPA2 implementation (I’m using the new typesafe criteria query):

/**
 * JPA2 implementation of {@link QueuedTaskHolderDao}.
 * 
 * @author Carlos Vara
 */
@Repository
public class QueuedTaskHolderDaoJPA2 implements QueuedTaskHolderDao {
 
 
    // QueuedTaskDao methods ---------------------------------------------------
 
    @Override
    public void persist(QueuedTaskHolder queuedTask) {
        this.entityManager.persist(queuedTask);
    }
 
    @Override
    public QueuedTaskHolder findById(String taskId) {
        return this.entityManager.find(QueuedTaskHolder.class, taskId);
    }
 
    @Override
    public QueuedTaskHolder findNextTaskForExecution() {
 
        Calendar NOW = Calendar.getInstance();
 
        // select qth from QueuedTaskHolder qth where
        //      qth.startedStamp == null AND
        //      (qth.triggerStamp == null || qth.triggerStamp < NOW)
        // order by qth.version ASC, qth.creationStamp ASC
        CriteriaBuilder cb = this.entityManager.getCriteriaBuilder();
        CriteriaQuery<QueuedTaskHolder> cq = cb.createQuery(QueuedTaskHolder.class);
        Root<QueuedTaskHolder> qth = cq.from(QueuedTaskHolder.class);
        cq.select(qth)
            .where(cb.and(cb.isNull(qth.get(QueuedTaskHolder_.startedStamp)), 
                    cb.or(
                            cb.isNull(qth.get(QueuedTaskHolder_.triggerStamp)),
                            cb.lessThan(qth.get(QueuedTaskHolder_.triggerStamp), NOW))))
            .orderBy(cb.asc(qth.get(QueuedTaskHolder_.version)), cb.asc(qth.get(QueuedTaskHolder_.creationStamp)));
 
        List<QueuedTaskHolder> results = this.entityManager.createQuery(cq).setMaxResults(1).getResultList();
        if ( results.isEmpty() ) {
            return null;
        }
        else {
            return results.get(0);
        }
 
    }
 
    @Override
    public QueuedTaskHolder findRandomStalledTask() {
 
        Calendar TOO_LONG_AGO = Calendar.getInstance();
        TOO_LONG_AGO.add(Calendar.SECOND, -7200);
 
        // select qth from QueuedTaskHolder qth where
        //      qth.completedStamp == null AND
        //      qth.startedStamp < TOO_LONG_AGO
        CriteriaBuilder cb = this.entityManager.getCriteriaBuilder();
        CriteriaQuery<QueuedTaskHolder> cq = cb.createQuery(QueuedTaskHolder.class);
        Root<QueuedTaskHolder> qth = cq.from(QueuedTaskHolder.class);
        cq.select(qth).where(
                cb.and(
                        cb.isNull(qth.get(QueuedTaskHolder_.completedStamp)),
                        cb.lessThan(qth.get(QueuedTaskHolder_.startedStamp), TOO_LONG_AGO)));
 
        List<QueuedTaskHolder> stalledTasks = this.entityManager.createQuery(cq).getResultList();
 
        if ( stalledTasks.isEmpty() ) {
            return null;
        }
        else {
            Random rand = new Random(System.currentTimeMillis());
            return stalledTasks.get(rand.nextInt(stalledTasks.size()));
        }
 
    }
 
 
    // Injected dependencies ---------------------------------------------------
 
    @PersistenceContext
    private EntityManager entityManager;
 
}

As can be seen in the implementation, the “definition” of a stalled task and the priorities given to the tasks in the queue can easily be tweaked if needed.

Currently, tasks can be retrieved from the queue as soon as their trigger stamp is reached, and they are ordered by the number of times their execution has been attempted (a trick using the version column) and by how old they are. It’s easy to add an extra condition, for example to never return tasks that have already failed too many times, as sketched below.
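
For instance, here is a minimal sketch of such a tweak: a variant of findNextTaskForExecution() that also skips tasks whose version column has grown too large (MAX_ATTEMPTS is a hypothetical constant, not part of the code above):

    // Hypothetical cap on how many lock attempts a task may accumulate before it is ignored
    private static final int MAX_ATTEMPTS = 5;

    @Override
    public QueuedTaskHolder findNextTaskForExecution() {

        Calendar NOW = Calendar.getInstance();

        CriteriaBuilder cb = this.entityManager.getCriteriaBuilder();
        CriteriaQuery<QueuedTaskHolder> cq = cb.createQuery(QueuedTaskHolder.class);
        Root<QueuedTaskHolder> qth = cq.from(QueuedTaskHolder.class);
        cq.select(qth)
            .where(cb.and(
                    cb.isNull(qth.get(QueuedTaskHolder_.startedStamp)),
                    // New condition: ignore tasks that have already been attempted too many times
                    cb.lt(qth.get(QueuedTaskHolder_.version), MAX_ATTEMPTS),
                    cb.or(
                            cb.isNull(qth.get(QueuedTaskHolder_.triggerStamp)),
                            cb.lessThan(qth.get(QueuedTaskHolder_.triggerStamp), NOW))))
            .orderBy(cb.asc(qth.get(QueuedTaskHolder_.version)),
                     cb.asc(qth.get(QueuedTaskHolder_.creationStamp)));

        List<QueuedTaskHolder> results = this.entityManager.createQuery(cq)
                .setMaxResults(1).getResultList();
        return results.isEmpty() ? null : results.get(0);
    }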

The executor

Now, the most important piece in the system. The executor will:

  • Enqueue (persist) tasks received.
  • Retrieve tasks that need to be executed. Ensure that the current thread gets a proper lock on the task so it’s the only one attempting its execution.
  • Check for stalled tasks and re-queue them.

The first operation is synchronous and, in my scenario, runs in the same transaction as the operation that requests the task execution. This way, if the current transaction fails for whatever reason, no spurious tasks are queued.

The other two operations are asynchronous and their execution is managed by the thread pool configured in the first step. The rate of execution can be adjusted depending on the number of tasks your system needs to handle. Also, these methods will execute/re-queue as many tasks as they can while there is work to do, so there is no need to set the rates too high.

The executor implements Spring’s TaskExecutor interface, so it can easily be substituted by another implementation should the need arise.

/**
 * A task executor with persistent task queueing.
 * 
 * @author Carlos Vara
 */
@Component("MyTaskExecutor")
public class MyTaskExecutor implements TaskExecutor {
 
    final static Logger logger = LoggerFactory.getLogger(MyTaskExecutor.class);
 
    @Autowired
    protected QueuedTaskHolderDao queuedTaskDao;
 
    @Autowired
    protected Serializer serializer;
 
 
    /**
     * Additional requirement: must be run inside a transaction.
     * Currently using MANDATORY as the app won't create tasks outside a
     * transaction.
     * 
     * @see org.springframework.core.task.TaskExecutor#execute(java.lang.Runnable)
     */
    @Override
    @Transactional(propagation=Propagation.MANDATORY)
    public void execute(Runnable task) {
 
        logger.debug("Trying to enqueue: {}", task);
 
        AbstractBaseTask abt; 
        try {
            abt = AbstractBaseTask.class.cast(task);
        } catch (ClassCastException e) {
            logger.error("Only runnables that extend AbstractBaseTask are accepted.");
            throw new IllegalArgumentException("Invalid task: " + task);
        }
 
        // Serialize the task
        QueuedTaskHolder newTask = new QueuedTaskHolder();
        byte[] serializedTask = this.serializer.serializeObject(abt);
        newTask.setTriggerStamp(abt.getTriggerStamp());
 
        logger.debug("New serialized task takes {} bytes", serializedTask.length);
 
        newTask.setSerializedTask(serializedTask);
 
        // Store it in the db
        this.queuedTaskDao.persist(newTask);
 
        // POST: Task has been enqueued
    }
 
 
    /**
     * Runs enqueued tasks.
     */
    @Scheduled(fixedRate=60l*1000l) // Every minute
    public void runner() {
 
        logger.debug("Started runner {}", Thread.currentThread().getName());
 
        QueuedTaskHolder lockedTask = null;
 
        // While there is work to do...
        while ( (lockedTask = tryLockTask()) != null ) {
 
            logger.debug("Obtained lock on {}", lockedTask);
 
            // Deserialize the task
            AbstractBaseTask runnableTask = this.serializer.deserializeAndCast(lockedTask.getSerializedTask());
            runnableTask.setQueuedTaskId(lockedTask.getId());
 
            // Run it
            runnableTask.run();
        }
 
        logger.debug("Finishing runner {}, nothing else to do.", Thread.currentThread().getName());
    }
 
 
    /**
     * The hypervisor re-queues for execution possible stalled tasks.
     */
    @Scheduled(fixedRate=60l*60l*1000l) // Every hour
    public void hypervisor() {
 
        logger.debug("Started hypervisor {}", Thread.currentThread().getName());
 
        // Reset stalled threads, one at a time to avoid too wide transactions
        while ( tryResetStalledTask() );
 
        logger.debug("Finishing hypervisor {}, nothing else to do.", Thread.currentThread().getName());
    }
 
 
    /**
     * Tries to ensure a lock on a task in order to execute it.
     * 
     * @return A locked task, or <code>null</code> if there is no task available
     *         or no lock could be obtained.
     */
    private QueuedTaskHolder tryLockTask() {
 
        int tries = 3;
 
        QueuedTaskHolder ret = null;
        while ( tries > 0 ) {
            try {
                ret = obtainLockedTask();
                return ret;
            } catch (OptimisticLockingFailureException e) {
                tries--;
            }
        }
 
        return null;
    }
 
    /**
     * Tries to reset a stalled task.
     * 
     * @return <code>true</code> if one task was successfully re-queued,
     *         <code>false</code> if no task was re-queued, either because there
     *         are no stalled tasks or because there was a conflict re-queueing
     *         it.
     */
    private boolean tryResetStalledTask() {
        int tries = 3;
 
        QueuedTaskHolder qt = null;
        while ( tries > 0 ) {
            try {
                qt = resetStalledTask();
                return qt != null;
            } catch (OptimisticLockingFailureException e) {
                tries--;
            }
        }
 
        return false;
    }
 
    /**
     * @return A locked task ready for execution, <code>null</code> if no ready
     *         task is available.
     * @throws OptimisticLockingFailureException
     *             If getting the lock fails.
     */
    @Transactional
    public QueuedTaskHolder obtainLockedTask() {
        QueuedTaskHolder qt = this.queuedTaskDao.findNextTaskForExecution();
        logger.debug("Next possible task for execution {}", qt);
        if ( qt != null ) {
            qt.setStartedStamp(Calendar.getInstance(TimeZone.getTimeZone("Etc/UTC")));
        }
        return qt;
    }
 
 
    /**
     * Tries to reset a stalled task, returns null if no stalled task was reset.
     * 
     * @return The re-queued task, <code>null</code> if no stalled task is
     *         available.
     * @throws OptimisticLockingFailureException
     *             If the stalled task is modified by another thread during
     *             re-queueing.
     */
    @Transactional
    public QueuedTaskHolder resetStalledTask() {
        QueuedTaskHolder stalledTask = this.queuedTaskDao.findRandomStalledTask();
        logger.debug("Obtained this stalledTask {}", stalledTask);
        if ( stalledTask != null ) {
            stalledTask.setStartedStamp(null);
        }
        return stalledTask;
    }
 
}

The base task and an example task

Now, to ensure that task execution is correctly transactional and that tasks get properly de-queued upon completion, some extra work has to be done during their execution. This extra functionality is centralized in an abstract base task class, from which all the tasks in the system inherit.

/**
 * Superclass for all async tasks.
 * <ul>
 *  <li>Ensures that its associated queued task is marked as completed in the same tx.</li>
 *  <li>Marks the task as serializable.</li>
 * </ul>
 * 
 * @author Carlos Vara
 */
public abstract class AbstractBaseTask implements Runnable, Serializable {
 
    final static Logger logger = LoggerFactory.getLogger(AbstractBaseTask.class);
 
 
    // Common data -------------------------------------------------------------
 
    private transient String queuedTaskId;
    private transient QueuedTaskHolder qth;
    private transient Calendar triggerStamp;
 
 
    public void setQueuedTaskId(String queuedTaskId) {
        this.queuedTaskId = queuedTaskId;
    }
 
    public String getQueuedTaskId() {
        return queuedTaskId;
    }
 
    public void setTriggerStamp(Calendar triggerStamp) {
        this.triggerStamp = triggerStamp;
    }
 
    public Calendar getTriggerStamp() {
        return triggerStamp;
    }
 
 
    // Injected components -----------------------------------------------------
 
    @Autowired(required=true)
    protected transient QueuedTaskHolderDao queuedTaskHolderDao;
 
 
    // Lifecycle methods -------------------------------------------------------
 
    /**
     * Entrance point of the task.
     * <ul>
     *  <li>Ensures that the associated task in the queue exists.</li>
     *  <li>Marks the queued task as finished upon tx commit.</li>
     *  <li>In case of tx rollback, frees the task.</li>
     * </ul>
     * 
     * @see java.lang.Runnable#run()
     */
    @Override
    final public void run() {
 
        try {
            transactionalOps();
        } catch (RuntimeException e) {
            // Free the task, so it doesn't stall
            logger.warn("Exception forced task tx rollback: {}", e);
            freeTask();
        }
 
    }
 
    @Transactional
    private void transactionalOps() {
        doInTxBeforeTask();
        doTaskInTransaction();
        doInTxAfterTask();
    }
 
    @Transactional
    private void freeTask() {
        QueuedTaskHolder task = this.queuedTaskHolderDao.findById(this.queuedTaskId);
        task.setStartedStamp(null);
    }
 
 
    /**
     * Ensures that there is an associated task and that its state is valid.
     * Shouldn't be needed, just for extra security.
     */
    private void doInTxBeforeTask() {
        this.qth = this.queuedTaskHolderDao.findById(this.queuedTaskId);
        if ( this.qth == null ) {
            throw new IllegalArgumentException("Not executing: no associated task exists: " + this.getQueuedTaskId());
        }
        if ( this.qth.getStartedStamp() == null || this.qth.getCompletedStamp() != null ) {
            throw new IllegalArgumentException("Illegal queued task status: " + this.qth);
        }
    }
 
 
    /**
     * Method to be implemented by concrete tasks where their operations are
     * performed.
     */
    public abstract void doTaskInTransaction();
 
 
    /**
     * Marks the associated task as finished.
     */
    private void doInTxAfterTask() {
        this.qth.setCompletedStamp(Calendar.getInstance());
    }
 
 
    private static final long serialVersionUID = 1L;
}

The class also holds a trigger stamp field that can be used before calling MyTaskExecutor.execute() to schedule the task for a given time and date.

A simple (and useless) example task that extends this base task:

/**
 * Logs the status of a User.
 * 
 * @author Carlos Vara
 */
@Configurable
public class ExampleTask extends AbstractBaseTask {
 
    final static Logger logger = LoggerFactory.getLogger(ExampleTask.class);
 
 
    // Injected components -----------------------------------------------------
 
    @Autowired
    private transient UserDao userDao;
 
 
    // Data --------------------------------------------------------------------
 
    private final String userId;
 
 
    public ExampleTask(String userId) {
        this.userId = userId;
    }
 
 
    /**
     * Logs the status of a user.
     */
    @Override
    public void doTaskInTransaction() {
 
        // Get the user
        User user = this.userDao.findById(this.userId);
        if ( user == null ) {
            logger.error("User {} doesn't exist in the system.", userId);
            return;
        }
 
        // Log the user status
        logger.info("User status: {}", user.getStatus());
    }
 
    private static final long serialVersionUID = 1L;
}

It’s important to note that I’m using Spring’s @Configurable to manage dependency injection after the tasks have been deserialized. You can solve this problem in a different way if using AspectJ isn’t a possibility.

And finally, an example of how to use it

Last thing, a piece of simple code that shows how to send a task to the background to be executed as soon as possible and how to schedule a task so it will be executed the next day:

@Service
public class ExampleServiceImpl implements ExampleService {
 
    @Qualifier("BountyExecutor")
    @Autowired
    private TaskExecutor taskExecutor;
 
    @Transactional
    public void example() {
        // Task will execute ASAP
        this.taskExecutor.execute(new ExampleTask("1"));
        // Task won't execute until tomorrow
        ExampleTask et = new ExampleTask("2");
        Calendar tomorrow = Calendar.getInstance();
        tomorrow.add(Calendar.DAY_OF_MONTH, 1);
        et.setTriggerStamp(tomorrow);
        this.taskExecutor.execute(et);
    }
}

An explanation of a task lifetime

Given that the algorithm presented here is a bit complex, I will detail the steps in the lifetime of a task to clarify how the system ensures reliable execution.

Step 1: task queuing

A task is enqueued when calling MyTaskExecutor.execute(). The enqueuing is part of the transaction opened in the service method that creates the task, so if that transaction fails, both your service method’s changes and the task data are left uncommitted, which is the correct behavior.

Step 2: task locking

Your task is stored in the DB with its started and completed stamps set to null. This means that it hasn’t been executed yet and that nobody appears to be trying to execute it. The executor then tries to lock it by fetching it from the DB and setting its started stamp. If that transaction succeeds, it’s guaranteed that the thread is the only one with that task assigned. If the thread were to die now, in between transactions, the task would eventually become stalled and be re-queued by the hypervisor.

Step 3: task execution

Now that the thread has a lock on the task, execution starts. A new transaction is opened, and the task operations are performed inside it, along with marking the task as completed at the end of the transaction. If the transaction succeeds, the task will be correctly de-queued as part of it. If it fails, an attempt is made to free the task immediately, but if that attempt also fails (or its code is never reached) the task will eventually be collected by the hypervisor.

And that’s it. Hope you find it useful, please post a comment if you successfully re-use the system :-)

Edit: 2010-07-05
I shared a template project which illustrates this system at github: http://github.com/CarlosVara/spring-async-persistent-tasks

Automatic validation of method calls with JSR-303 (Appendix-C of the specification)

The recently approved Bean Validation standard (JSR-303) left one great (and requested) feature out of the specification: method validation. This proposal defined an additional API for the Validator, with methods that allow validation of method/constructor parameters as well as method return values. Thankfully, even though the proposal didn’t make it into the final approved document, all constraint annotations accept PARAMETER as a target, so the door was left open for it to be implemented as an extra feature.

Apache BeanValidation

There are currently two implementations of the standard: Hibernate Validator and Apache BeanValidation (formerly agimatec-validator, and as of this March an Apache Incubator project). Of the two, only Apache BeanValidation supports the additional method validation API, so it is the only choice if you need that feature, and it’s what I will use as the base for this example.

Method validation API

The proposed additions to the Validator API are the following:

<T> Set<ConstraintViolation<T>> validateParameters(Class<T> clazz, Method method,
                                                       Object[] parameterValues,
                                                       Class<?>... groups);
 
<T> Set<ConstraintViolation<T>> validateParameter(Class<T> clazz, Method method,
                                                   Object parameterValue,
                                                   int parameterIndex,
                                                   Class<?>... groups);
 
<T> Set<ConstraintViolation<T>> validateReturnedValue(Class<T> clazz, Method method,
                                                       Object returnedValue,
                                                       Class<?>... groups);
 
<T> Set<ConstraintViolation<T>> validateParameters(Class<T> clazz,
                                                    Constructor constructor,
                                                    Object[] parameterValues,
                                                    Class<?>... groups);
 
 
<T> Set<ConstraintViolation<T>> validateParameter(Class<T> clazz,
                                                   Constructor constructor,
                                                   Object parameterValue,
                                                   int parameterIndex,
                                                   Class<?>... groups);

So, to validate the parameters of a method call, one would call validateParameters with the declaring class, the method and the parameter values as arguments, and the output would be similar to that of validating a bean.

And how do you specify the constraints? In the method declaration, as in this example:

@NotNull
@NotEmpty
public String operation(@NotNull @Pattern(regexp="[0-9]{2}") String param) {
   // Your code
   return val;
}

This enhanced method declaration indicates that the param value cannot be null and must match the regular expression [0-9]{2}. Likewise, the value returned by the method cannot be null or an empty string.
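
Before wiring this into an aspect, here is a minimal sketch of calling the extension API directly. The ManualMethodValidation class and the nested MyService are hypothetical names for this example; the unwrap call to MethodValidator is the Apache BeanValidation specific part:

import java.lang.reflect.Method;
import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.ValidatorFactory;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Pattern;
import org.apache.bval.jsr303.extensions.MethodValidator;

public class ManualMethodValidation {

    // Hypothetical service declaring a constrained method like the example above
    public static class MyService {
        @NotNull
        public String operation(@NotNull @Pattern(regexp = "[0-9]{2}") String param) {
            return param;
        }
    }

    public static void main(String[] args) throws NoSuchMethodException {
        ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
        MethodValidator methodValidator = factory.getValidator().unwrap(MethodValidator.class);

        Method operation = MyService.class.getMethod("operation", String.class);

        // "4x" doesn't match [0-9]{2}, so a parameter violation should be reported
        Set<? extends ConstraintViolation<?>> violations =
                methodValidator.validateParameters(MyService.class, operation, new Object[] { "4x" });

        System.out.println("Violations: " + violations.size());
    }
}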

Automatic validation using AspectJ

Validation is a good example of a crosscutting concern: validation code can easily pollute your whole application and make maintenance really difficult. So, a good way to implement it automatically is with AspectJ. This way, you decide in a single place (the pointcut) which methods and constructors you want validated, and the validation code is also centralized in a single place (the advice).

The aspect implementing this functionality is as follows:

package net.carinae.methodvalidation;
 
import java.util.Arrays;
import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.ValidationException;
import javax.validation.ValidatorFactory;
import org.apache.bval.jsr303.extensions.MethodValidator;
import org.aspectj.lang.reflect.ConstructorSignature;
import org.aspectj.lang.reflect.MethodSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
 
/**
 * Enforces correct parameters and return values on the advised methods and constructors.
 * <p>
 * NOTE: Currently only works with Apache BeanValidation.
 * 
 * @author Carlos Vara
 */
public aspect MethodValidationAspect {
 
	final static Logger logger = LoggerFactory.getLogger(MethodValidationAspect.class);
 
	static private ValidatorFactory factory;
 
	static {
		factory = Validation.buildDefaultValidatorFactory();
	}
 
	static private MethodValidator getMethodValidator() {
		return factory.getValidator().unwrap(MethodValidator.class);
	}
 
 
	pointcut validatedMethodCall() : execution(@ValidatedMethodCall * *(..));
 
	pointcut validatedConstructorCall() : execution(@ValidatedConstructorCall * .new(..));
 
	pointcut validatedReturnValue() : validatedMethodCall() && execution(!void *(..));
 
 
	/**
	 * Validates the method parameters.
	 */
	before() : validatedMethodCall() {
 
		MethodSignature methodSignature = (MethodSignature)thisJoinPoint.getSignature();
 
		logger.trace("Validating call: {} with args {}", methodSignature.getMethod(), Arrays.toString(thisJoinPoint.getArgs()));
 
		Set<? extends ConstraintViolation<?>> validationErrors = getMethodValidator().validateParameters(thisJoinPoint.getThis().getClass(), methodSignature.getMethod(), thisJoinPoint.getArgs());
 
		if ( validationErrors.isEmpty() ) {
			logger.trace("Valid call");
		}
		else {
			logger.warn("Invalid call");
			RuntimeException ex = buildValidationException(validationErrors);
			throw ex;
		}
 
	}
 
 
	/**
	 * Validates the constructor parameters.
	 */
	before() : validatedConstructorCall() {
 
		ConstructorSignature constructorSignature = (ConstructorSignature)thisJoinPoint.getSignature();
 
		logger.trace("Validating constructor: {} with args {}", constructorSignature.getConstructor(), Arrays.toString(thisJoinPoint.getArgs()));
 
		Set<? extends ConstraintViolation<?>> validationErrors = getMethodValidator().validateParameters(thisJoinPoint.getThis().getClass(), constructorSignature.getConstructor(), thisJoinPoint.getArgs());
 
		if ( validationErrors.isEmpty() ) {
			logger.trace("Valid call");
		}
		else {
			logger.warn("Invalid call");
			RuntimeException ex = buildValidationException(validationErrors);
			throw ex;
		}
	}
 
 
	/**
	 * Validates the returned value of a method call.
	 * 
	 * @param ret The returned value
	 */
	after() returning(Object ret) : validatedReturnValue() {
 
		MethodSignature methodSignature = (MethodSignature)thisJoinPoint.getSignature();
 
		logger.trace("Validating returned value {} from call: {}", ret, methodSignature.getMethod());
 
		Set<? extends ConstraintViolation<?>> validationErrors = getMethodValidator().validateReturnedValue(thisJoinPoint.getThis().getClass(), methodSignature.getMethod(), ret);
 
		if ( validationErrors.isEmpty() ) {
			logger.info("Valid call");
		}
		else {
			logger.warn("Invalid call");
			RuntimeException ex = buildValidationException(validationErrors);
			throw ex;
		}
 
	}
 
 
	/**
	 * @param validationErrors The errors detected in a method/constructor call.
	 * @return A RuntimeException with information about the detected validation errors. 
	 */
	private RuntimeException buildValidationException(Set<? extends ConstraintViolation<?>> validationErrors) {
		StringBuilder sb = new StringBuilder();
		for (ConstraintViolation<?> cv : validationErrors ) {
			sb.append("\n" + cv.getPropertyPath() + "{" + cv.getInvalidValue() + "} : " + cv.getMessage());
		}
		return new ValidationException(sb.toString());
	}
 
}

I have defined three simple pointcuts (to advise parameter validation of methods and constructors, and return value validation of methods), and I have used two custom annotations to mark the methods on which I want validation to be performed. You may of course want to tweak those pointcuts to adapt them to your environment. The code of the two annotations is included in the packaged project, and a sketch of them is shown below.
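
In case you prefer to write them yourself, a minimal sketch of the two marker annotations could look like this (each one goes in its own file; RUNTIME retention keeps them visible in the woven classes, and the names must match the ones used in the pointcuts):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Marks methods whose parameters (and return value) should be validated by the aspect.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface ValidatedMethodCall {
}

// In its own file, ValidatedConstructorCall.java: marks constructors whose parameters
// should be validated by the aspect.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.CONSTRUCTOR)
public @interface ValidatedConstructorCall {
}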

Some gotchas

Be aware that the current implementation of the method validation API is still experimental. As of this writing, many constraints still don’t work (I used a patched build to get the @Size and @Pattern constraints working), but it’s just a matter of time before all the features available for bean validation also work for parameter validation.

If you want to start from a template, I have attached a simple maven project that shows the use of this technique and has the aspect and needed interfaces code in it.
Download it here: methodvalidation.tar.gz

Testing that an entity doesn’t get modified (no setters called) with mockito

I have recently started using mockito to improve the quality of my unit tests. It’s a great tool, with a very clear syntax that makes tests very readable. Also, if you follow a BDD pattern in your tests, mockito has aliases in BDDMockito so that your actions can clearly follow the given/when/then template.

I’m using it to test the behaviour of some transactional services. The underlying persistence technology used is JPA, so checking that some entity is not modified is a bit hard. Why? Because with JPA, the usual way of updating an entity is by bringing it into your persistence context, modifying it, and then simply closing the context. The entity manager will detect that the retrieved entity has changed, and will create the appropriate update statement.

So, if I want to ensure that a method in the service layer doesn’t modify an entity, a clean way of doing it is ensuring that no setter methods are called on that entity. In order to be able to test this with mockito, you need to do two things: first, make the mocked DAO return a “spy” of the entity; and second, verify that no spurious set* methods have been called on that spy during the service call.

The first part is the easy one: simply create the entity that should be returned and, once it has all of its expected values set, wrap it in a spy.

MyEntity entity = new MyEntity();
entity.setId(1);
entity.setText("hello");
entity = spy(entity);
given(mockedEntityDao.findEntityById(1)).willReturn(entity);

The second part is trickier: a new verification mode needs to be created. Its verify method has access to the list of all the methods called on the spy, so it’s simply a matter of checking that no method with the signature of a setter (a name matching the expression set[A-Z].* and exactly one parameter) has been called. The code of this verification mode is as follows:

import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.mockito.exceptions.Reporter;
import org.mockito.internal.invocation.Invocation;
import org.mockito.internal.verification.api.VerificationData;
import org.mockito.internal.verification.api.VerificationMode;
 
/**
 * Mockito verification mode to ensure that no setter calls have been performed
 * on a mock/spy.
 * 
 * @author Carlos Vara
 */
public class NoSetterCalls implements VerificationMode {
 
    private final Reporter reporter;
    private final Pattern regexPattern;
 
    public NoSetterCalls() {
        this.reporter = new Reporter();
        this.regexPattern = Pattern.compile("set[A-Z].*");
    }
 
    @Override
    public void verify(VerificationData data) {
 
        List<Invocation> invocations = data.getAllInvocations();
        for ( Invocation inv : invocations ) {
 
            Matcher m = this.regexPattern.matcher(inv.getMethodName());
            if ( m.matches() && inv.getArgumentsCount() == 1 ) {
                // A setter has been called!
                this.reporter.neverWantedButInvoked(inv, inv.getLocation());
            }
 
        }
 
    }
 
}

And finally, an example test would be as follows:

@Test
public void checkSimpleAlternativeParagraph() {
 
    // GIVEN
    //  - A valid call to a service method that shouldn't modify the queried entity
    String callParam = "1";
    //  - And the expected DAO behaviour
    MyEntity entity = new MyEntity();
    entity.setId(1);
    entity.setText("hello");
    entity = spy(entity);
    given(mockedEntityDao.findEntityById(1)).willReturn(entity);
 
    // WHEN
    this.entityService.doSomethingButDontModify(callParam);
 
    // THEN
    //  - Verify that the queried entity hasn't been modified
    verify(entity, new NoSetterCalls()).setId("this setId() call is ignored");
 
}

And it works. The only drawback is that you have to complete the verification call with an extra method call, the setId() call in this case, though it could be any other method. This call has no side effects (it just gets passed as a parameter to the NoSetterCalls verifier) but it slightly disturbs the clarity of the test.

Note when using JPA2

In JPA2, the entity manager has a detach method. If your application uses it, the way to test that no modification is performed would be to check that no setters followed by a flush are called on the entity prior to calling detach. Or, more easily, add a DAO method that returns a detached entity and move this check to the DAO, as sketched below.
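
A minimal sketch of such a DAO method, assuming a DAO with an injected JPA2 EntityManager (MyEntity, the field and the method name are hypothetical):

    // Returns the entity already detached from the persistence context, so any setter
    // calls made on it by the service layer can never be flushed back to the database.
    public MyEntity findDetachedEntityById(int id) {
        MyEntity entity = this.entityManager.find(MyEntity.class, id);
        if (entity != null) {
            this.entityManager.detach(entity); // JPA2-only operation
        }
        return entity;
    }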

Integration of JSR 303 bean validation standard and Wicket 1.4

In this entry I will show a way to integrate the new JSR 303 bean validation standard with Wicket 1.4. The resulting example form will have AJAX callbacks to inform the user promptly about validation errors and these messages will be internationalized according to the locale associated with the user’s session. Spring 3 will be used to manage the Validator instance.

To put things in perspective, in this example a user registration form (inside a panel) will be created. The form will have four inputs: email, password, password verification and user age. When the user fills in an input, an AJAX callback to the server validates that input. In case of a validation error, the reported errors appear next to that input.

The UserRegistrationPanel

This panel will contain two components: the form and a feedback panel where the errors that are not associated with a single input will be reported (for example, when the two supplied passwords don’t match).

The panel markup is as follows:

<html xmlns:wicket="http://wicket.apache.org/dtds.data/wicket-xhtml1.3-strict.dtd">         
<body>
<wicket:panel>
 
    <form wicket:id="registrationForm">
        <fieldset>        
            <div wicket:id="validatedEmailBorder" class="entry">
                <label for="email">Your e-mail:</label>
                <input wicket:id="email" name="email" type="text" />
            </div>            
            <div wicket:id="validatedPasswordBorder" class="entry">
                <label for="password">Choose password:</label>
                <input wicket:id="password" name="password" type="password" />
            </div>            
            <div wicket:id="validatedPasswordVerificationBorder" class="entry">
                <label for="passwordVerification">Re-type password:</label>
                <input wicket:id="passwordVerification" name="passwordVerification" type="password" />
            </div>            
            <div wicket:id="validatedAgeBorder" class="entry">
                <label for="age">Your age:</label>
                <input wicket:id="age" name="age" type="text" />
            </div>        
            <input type="submit" value="Register!"/>
        </fieldset>
    </form>
 
    <div wicket:id="feedback" class="feedback"></div>
 
</wicket:panel>
</body>
</html>

Only one thing to explain here: the inputs are surrounded with a border component. This way, it is easy to control the extra markup needed to show the input-related validation errors.

And now, the associated code UserRegistrationPanel.java:

public class UserRegistrationPanel extends Panel {
 
	public UserRegistrationPanel(String id) {
		super(id);
 
		// Insert the form and the feedback div
		RegistrationForm regForm;
		add(regForm = new RegistrationForm("registrationForm"));
		add(new FeedbackPanel("feedback", new ComponentFeedbackMessageFilter(regForm)).setOutputMarkupId(true));
	}
 
	public final class RegistrationForm extends StatelessForm<NewUser> {
 
		public RegistrationForm(String id) {
			super(id, new CompoundPropertyModel<NewUser>(new NewUser()));
 
			TextField<String> emailInput = new TextField<String>("email");
			add(new InputValidationBorder<NewUser>("validatedEmailBorder", this, emailInput));
 
			PasswordTextField passwordInput = new PasswordTextField("password");
			add(new InputValidationBorder<NewUser>("validatedPasswordBorder", this, passwordInput));
 
			PasswordTextField passwordVerificationInput = new PasswordTextField("passwordVerification");
			add(new InputValidationBorder<NewUser>("validatedPasswordVerificationBorder", this, passwordVerificationInput));
 
			TextField<Integer> ageInput = new TextField<Integer>("age");
			add(new InputValidationBorder<NewUser>("validatedAgeBorder", this, ageInput));
 
			add(new Jsr303FormValidator(emailInput, passwordInput, passwordVerificationInput, ageInput));
		}
 
		@Override
		protected void onSubmit() {
			// The NewUser model object is valid!
			// Perform your logic in here...
		}
 
	}
 
}

Now, there are a few things to explain here:

  • The form has a NewUser bean associated. Its code will be shown in the next section.
  • The InputValidationBorder encapsulates the functionality to validate an input without validating the full bean and show the validation errors next to that input.
  • The Jsr303FormValidator is a form validator. It will only be called when its associated input components are valid (the email, passwords and age) and it will perform a bean scoped validation (in this case, it will check that the 2 supplied passwords are the same). In case it fails, the error will be reported in the panel’s feedback panel.
  • As the feedback panel should only report the errors that aren’t associated with a single input, its filter is set so that only errors related to the RegistrationForm are reported. The only source of these messages will be the Jsr303FormValidator.

The bean to be validated

The form’s model is a NewUser bean. This bean encapsulates all the data that is requested from a new user. Some constraints must be enforced for every property in the bean, so the properties are annotated following the JSR 303 standard. This is the resulting code, NewUser.java:

@PasswordVerification
public class NewUser implements Serializable {
 
	// The email
	private String email;
 
	@NotNull
	@Email
	@Size(min=4,max=255)
	public String getEmail() {
		return this.email;
	}
 
	public void setEmail(String email) {
		this.email = email;
	}
 
 
	// The password (uncyphered at this stage)
	private String password;
 
	@NotNull
	@Size(min=4)
	public String getPassword() {
		return this.password;
	}
 
	public void setPassword(String password) {
		this.password = password;
	}
 
 
	// The password verification
	private String passwordVerification;
 
	@NotNull
	public String getPasswordVerification() {
		return this.passwordVerification;
	}
 
	public void setPasswordVerification(String passwordVerification) {
		this.passwordVerification = passwordVerification;
	}
 
 
	// The age
	private Integer age;
 
	@NotNull
	@Max(140)
	@Min(18)
	public Integer getAge() {
		return this.age;
	}
 
	public void setAge(Integer age) {
		this.age = age;
	}
 
}

The @PasswordVerification constraint enforces that both supplied passwords match; it will be explained later. The @Email annotation enforces a valid email address. It is a non-standard constraint that is part of Hibernate Validator, but you can easily code a replacement in case you are using a different validation engine that doesn’t provide it; a possible sketch follows.
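
For reference, a minimal sketch of such a replacement could be a composed constraint built on the standard @Pattern annotation (the regular expression and message key here are deliberately simple assumptions, not a complete email validator):

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import javax.validation.Constraint;
import javax.validation.Payload;
import javax.validation.constraints.Pattern;

// Composed constraint: anything annotated with @Email must also satisfy this @Pattern
@Pattern(regexp = "^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$")
@Constraint(validatedBy = {})
@Target({ ElementType.METHOD, ElementType.FIELD })
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface Email {

    String message() default "{newuser.email.invalid}";

    Class<?>[] groups() default {};

    Class<? extends Payload>[] payload() default {};

}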

The InputValidationBorder

This border component performs two functions:

  • It encapsulates the input scoped validation logic.
  • And it provides a way to show the related errors close to the input.

It is a generic class whose parameter T is the class of the form’s model object. Its code is as follows:

public class InputValidationBorder<T> extends Border {
 
	protected FeedbackPanel feedback;
 
	public InputValidationBorder(String id, final Form<T> form, final FormComponent<? extends Object> inputComponent) {
		super(id);
		add(inputComponent);
		inputComponent.setRequired(false);
		inputComponent.add(new AjaxFormComponentUpdatingBehavior("onblur") {
 
			@Override
			protected void onUpdate(AjaxRequestTarget target) {
				target.addComponent(InputValidationBorder.this.feedback);
			}
 
			@Override
			protected void onError(AjaxRequestTarget target, RuntimeException e) {
				target.addComponent(InputValidationBorder.this.feedback);
			}
 
		});
 
		inputComponent.add(new Jsr303PropertyValidator(form.getModelObject().getClass(), inputComponent.getId()));
 
		add(this.feedback = new FeedbackPanel("inputErrors", new ContainerFeedbackMessageFilter(this)));
		this.feedback.setOutputMarkupId(true);
	}
 
}

Again, a few things must be explained:

  • The input component is set to not-required. Bean validation will take care of that constraint in case the property is marked as @NotNull.
  • The added AjaxFormComponentUpdatingBehavior must override both onUpdate and onError. In both cases, when the methods are called the validation has already taken place. When the validation fails, onError is called, and the feedback component must be in the target to show the error messages. And when the validation succeeds, onUpdate is called, and the feedback component must again be in the target, so any older messages get cleared.
  • To save code, a convention is used where the input component ids match the names of their associated properties in NewUser. That’s the reason the Jsr303PropertyValidator is instantiated with the inputComponent’s id.

The associated markup, InputValidationBorder.html, is very simple. It just provides a placeholder for the feedback panel next to the input component:

<html xmlns:wicket="http://wicket.apache.org/dtds.data/wicket-xhtml1.3-strict.dtd">  
<wicket:border>
    <wicket:body/>
    <span wicket:id="inputErrors"></span>
</wicket:border>
</html>

Jsr303PropertyValidator

This is a custom validator that enforces the JSR 303 constraints on the indicated bean property. It implements INullAcceptingValidator (which extends IValidator) so that null values are also passed to the validator.

The validator instance is a Spring-supplied bean. It is very easy to integrate Wicket with Spring; just take into account that if you must inject dependencies into something that is not a component, you will have to call the injector manually (as is done in this validator). Also, in case you decide not to use Spring, you can easily change the code to obtain the validator from the Validation class, as sketched below.
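
For example, a minimal sketch of the non-Spring variant would be to replace the @SpringBean field injection and the injectDependencies() call with a direct lookup through the Bean Validation bootstrap API:

	// Instead of @SpringBean injection, bootstrap the default provider directly.
	// (For real use you would probably cache the ValidatorFactory in a static field
	// instead of rebuilding it for every validator instance.)
	private void obtainValidator() {
		this.validator = Validation.buildDefaultValidatorFactory().getValidator();
	}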

The code of Jsr303PropertyValidator.java is as follows:

public class Jsr303PropertyValidator<T, Z> implements INullAcceptingValidator<T> {
 
	@SpringBean
	protected Validator validator;
 
	protected String propertyName;
	protected Class<Z> beanType;
 
	public Jsr303PropertyValidator(Class<Z> clazz, String propertyName) {
		this.propertyName = propertyName;
		this.beanType = clazz;
		injectDependencies();
	}
 
 
	private void injectDependencies() {
		InjectorHolder.getInjector().inject(this);
	}
 
 
	@Override
	public void validate(IValidatable<T> validatable) {
		Set<ConstraintViolation<Z>> res = this.validator.validateValue(this.beanType, this.propertyName, validatable.getValue());
		for ( ConstraintViolation<Z> vio : res ) {
			validatable.error(new ValidationError().setMessage(vio.getMessage()));
		}
	}
 
}

The class is generic: T is the class of the property to be validated, while Z is the class of the bean which contains the property (in this case, NewUser).

Jsr303FormValidator

This class implements IFormValidator, and it will be called when all the validations for the associated components have succeeded. It performs a full bean validation (not just the class level annotations), so you may use it to enforce individual properties as well. In this example, as all the properties’ constraints get previously validated via the Jsr303PropertyValidator, only the bean scoped constraints can fail.

This is the code of the class:

public class Jsr303FormValidator implements IFormValidator {
 
	@SpringBean
	protected Validator validator;
 
	private final FormComponent<?>[] components;
 
 
	public Jsr303FormValidator(FormComponent<?>...components) {
		this.components = components;
		injectDependencies();
	}
 
	private void injectDependencies() {
		InjectorHolder.getInjector().inject(this);
	}
 
 
	@Override
	public FormComponent<?>[] getDependentFormComponents() {
		return this.components;
	}
 
	@Override
	public void validate(Form<?> form) {
 
		for ( ConstraintViolation<?> vio : this.validator.validate(form.getModelObject()) ) {
			form.error(new ValidationError().setMessage(vio.getMessage()));
		}
 
	}
 
}
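
Attaching it is then a single call inside the form, passing the components whose individual validations must succeed first (the password and passwordVerification references below are just illustrative names):

// Inside the form's constructor; 'password' and 'passwordVerification'
// are illustrative references to the corresponding FormComponents.
add(new Jsr303FormValidator(password, passwordVerification));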

The @PasswordVerification constraint

The NewUser bean is annotated with this constraint, which enforces that the password and passwordVerification fields are the same. For it to work, both the annotation definition and the implementation of its validator are needed. This is not really relevant to the integration part, but I provide it so there is a bean-scoped constraint and you can see the Jsr303FormValidator in action. Here is the code for the annotation and the validator:

PasswordVerification.java

@Documented
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = PasswordVerificationValidator.class)
public @interface PasswordVerification {
 
    String message() default "{newuser.passwordverification}";
 
    Class<?>[] groups() default {};
 
    Class<? extends Payload>[] payload() default {};
 
}

PasswordVerificationValidator.java

public class PasswordVerificationValidator implements ConstraintValidator<PasswordVerification, NewUser>{
 
	@Override
	public void initialize(PasswordVerification constraintAnnotation) {
		// Nothing to do
	}
 
	@Override
	public boolean isValid(NewUser value, ConstraintValidatorContext context) {
		if ( value.getPassword() == null && value.getPasswordVerification() == null ) {
			return true;
		}
		else if ( value.getPassword() == null ) {
			return false;
		}
		return ( value.getPassword().equals(value.getPasswordVerification()));
	}
 
}

Final touches, i18n

With the above code, you have all you need to use JSR 303 validation in your Wicket forms: you can validate both the individual properties associated with each input and the whole bean in the form’s model.

But the example is incomplete if you need your application to be available in various languages. The validation output messages are produced and interpolated by the validation engine, which isn’t aware of Wicket’s session locale. To correct this, a new MessageInterpolator which can access Wicket’s locale will be supplied to the validator bean.

The code of the new message interpolator (WicketSessionLocaleMessageInterpolator) is as follows:

import java.util.Locale;
import org.apache.wicket.Session;
import org.hibernate.validator.engine.ResourceBundleMessageInterpolator;
import org.springframework.stereotype.Component;
 
@Component(value="webLocaleInterpolator")
public class WicketSessionLocaleMessageInterpolator extends ResourceBundleMessageInterpolator {
 
	@Override
	public String interpolate(String message, Context context) {
		return super.interpolate(message, context, Session.get().getLocale());
	}
 
	@Override
	public String interpolate(String message, Context context, Locale locale) {
		return super.interpolate(message, context, Session.get().getLocale());
	}
 
}

This class extends ResourceBundleMessageInterpolator, which is specific to Hibernate Validator, but if you use a different provider you can most likely code a similar class.
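
A provider-independent alternative is to implement javax.validation.MessageInterpolator directly and delegate to the default interpolator exposed by the bootstrap API. The following is only a sketch (the class name is illustrative; it reuses the webLocaleInterpolator bean name so the Spring configuration below stays unchanged):

import java.util.Locale;
import javax.validation.MessageInterpolator;
import javax.validation.Validation;
import org.apache.wicket.Session;
import org.springframework.stereotype.Component;
 
@Component(value = "webLocaleInterpolator")
public class DelegatingWicketLocaleInterpolator implements MessageInterpolator {
 
	// The provider's default interpolator, obtained through the standard bootstrap API.
	private final MessageInterpolator delegate =
			Validation.byDefaultProvider().configure().getDefaultMessageInterpolator();
 
	@Override
	public String interpolate(String message, Context context) {
		return this.delegate.interpolate(message, context, Session.get().getLocale());
	}
 
	@Override
	public String interpolate(String message, Context context, Locale locale) {
		// Ignore the requested locale and always use Wicket's session locale.
		return this.delegate.interpolate(message, context, Session.get().getLocale());
	}
 
}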

And the last needed step is to provide this bean to the validator declaration in Spring. This is the relevant part of the applicationContext.xml:

<bean id="validator" class="org.springframework.validation.beanvalidation.LocalValidatorFactoryBean">
    <property name="messageInterpolator" ref="webLocaleInterpolator" />
</bean>

Now you have everything set up: form and individual input validation with AJAX callbacks, and localized messages. Hope it can be of help :-)

Testing the layered architecture with Spring and TestNG

Following the post explaining a layered architecture with Spring and Hibernate, this entry will explain how to easily test its DAO and Service components using Spring’s TestNG integration.

When it comes to isolating the environment for each layer, the main conceptual difference between the layers is this: the service layer is transactional on its own, so its methods can be tested without adding any extra components; but the DAO layer requires an ongoing transaction for its methods to work, so one must be supplied.

Preparing the environment

Two extra dependencies must be added to the project’s pom.xml file: the Spring TestContext Framework, which provides the helper classes for the different test libraries, and TestNG, the test framework that is going to be used. Both are added with test scope, as they only have to be present in the classpath during that phase.

This is the relevant fragment of the pom.xml file:

<dependencies>
        [...]
        <dependency>
            <groupId>org.testng</groupId>
            <artifactId>testng</artifactId>
            <version>${testng.version}</version>
            <classifier>jdk15</classifier>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>org.springframework.test</artifactId>
            <version>${springframework.version}</version>
            <scope>test</scope>
        </dependency>
	[...]
</dependencies>

Note that you will need to define the properties holding the version values for the springframework and testng dependencies if you don’t have them already. In this entry, I assume the following versions:

<properties>
	[...]
	<springframework.version>3.0.0.RC2</springframework.version>
	<testng.version>5.10</testng.version>
	[...]
</properties>

Testing the DAO layer

So, DAO methods must be executed inside a transactional scope. To provide it, the test class inherits from AbstractTransactionalTestNGSpringContextTests. As an extra, you may point it to a different Spring application context configuration to adapt it to your test. Just make sure that, if you use a different configuration, it includes a transaction manager so the test transactions can be created.

An example of a DAO test would be like this:

@ContextConfiguration(locations = { "classpath:applicationContext.xml" })
public class UserDaoTest 
		extends AbstractTransactionalTestNGSpringContextTests {
 
	@Autowired
	private UserDao userDao;
 
	@Test
	@Rollback(true)
	public void simpleTest() {
		User user1 = this.userDao.findById(1L);
		assertNotNull(user1, "User 1 could not be retrieved.");
	}
}

The @Rollback annotation lets you decide whether the supplied transaction should be committed or rolled back at the end of the test. In this case it doesn’t matter, as the test performs a read-only operation, but it’s very handy when testing write operations.
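
For illustration, a write test added to the UserDaoTest above could look like the following sketch (it relies on the persistOrMerge method of the UserDao from the previous post); thanks to the rollback, the temporary user is never committed to the database:

	@Test
	@Rollback(true)
	public void persistTest() {
		User user = new User();
		user.setName("Temporary user");
		user.setAge(30);
		// persistOrMerge returns the managed instance with the generated id set
		user = this.userDao.persistOrMerge(user);
		assertNotNull(user.getId(), "Persisted user did not get a generated id.");
	}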

Testing the service layer

Testing the service layer is even easier, as it doesn’t need any extra scope or configuration to work. Still, to get the extra value provided by the Spring TestContext Framework (selection of the Spring configuration, context caching, dependency injection, etc.) it is a good idea to inherit from the AbstractTestNGSpringContextTests class.

An example of a service test would look like this:

@ContextConfiguration( locations={"classpath:applicationContext.xml"} )
public class UserServiceTest extends AbstractTestNGSpringContextTests {
 
	@Autowired
	private UserService userService;
 
	@Test
	public void simpleTest() {
		Collection<User> users = this.userService.getAllUsers();
		assertEquals(users.size(), 3,
			"Incorrect number of users retrieved.");
	}
}

Note that to unit test the service layer, you should provide a mock DAO to the service so its operations are tested in isolation. Spring’s dependency injection again comes in handy here: as the DAO is injected into the service, it’s easy to adapt the test configuration so the service gets a mocked DAO instead. How to configure that may be explained in a following post.

Layered architecture with Hibernate and Spring 3

In this post you will learn one way to create a layered, data-driven application using Hibernate and Spring 3. The architecture goes up from the database to the service layer, so how to do the presentation part is your choice. I will try to adhere to Spring’s best practices in the separation of layers, so the resulting architecture offers both a clear separation between layers and few dependencies on the Spring framework itself.

Setting up

I use Maven to take care of the compilation and life-cycle of the project. You may use this pom.xml file as the starting point; it basically defines the repositories and dependencies that will be used in this guide.

<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
        http://maven.apache.org/maven-v4_0_0.xsd">
 
	<modelVersion>4.0.0</modelVersion>
	<groupId>tld.example</groupId>
	<artifactId>layeredarch-example</artifactId>
	<packaging>war</packaging>
	<version>0.0.1-SNAPSHOT</version>
	<name>Layered Arch Example</name>
 
    <properties>
        <aspectj.version>1.6.6</aspectj.version>
        <commons-dbcp.version>1.2.2</commons-dbcp.version>
        <hibernate-annotations.version>3.4.0.GA</hibernate-annotations.version>
        <hibernate-core.version>3.3.2.GA</hibernate-core.version>
        <hsqldb.version>1.8.0.10</hsqldb.version>
        <javassist.version>3.7.ga</javassist.version>
        <log4j.version>1.2.15</log4j.version>
        <slf4j-log4j12.version>1.5.6</slf4j-log4j12.version>
        <springframework.version>3.0.0.RC1</springframework.version>
    </properties>
 
    <dependencies>
        <!-- Compile time dependencies -->
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjrt</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjweaver</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>${log4j.version}</version>
            <exclusions>
               <exclusion>
                  <groupId>javax.jms</groupId>
                  <artifactId>jms</artifactId>
               </exclusion>
               <exclusion>
                  <groupId>com.sun.jdmk</groupId>
                  <artifactId>jmxtools</artifactId>
               </exclusion>
               <exclusion>
                  <groupId>com.sun.jmx</groupId>
                  <artifactId>jmxri</artifactId>
               </exclusion>
               <exclusion>
                  <groupId>javax.mail</groupId>
                  <artifactId>mail</artifactId>
               </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-core</artifactId>
            <version>${hibernate-core.version}</version>
        </dependency>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-annotations</artifactId>
            <version>${hibernate-annotations.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>org.springframework.core</artifactId>
            <version>${springframework.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>org.springframework.orm</artifactId>
            <version>${springframework.version}</version>
        </dependency>
        <!-- Runtime dependencies -->
        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>${commons-dbcp.version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>hsqldb</groupId>
            <artifactId>hsqldb</artifactId>
            <version>${hsqldb.version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>${slf4j-log4j12.version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>jboss</groupId>
            <artifactId>javassist</artifactId>
            <version>${javassist.version}</version>
            <scope>runtime</scope>
        </dependency>
    </dependencies>
 
    <build>
        <finalName>layeredarch-example</finalName>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.6</source>
                    <target>1.6</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
 
    <repositories>
        <!-- Legacy java.net repository -->
        <repository>
            <id>java-net</id>
            <url>http://download.java.net/maven/1</url>
            <layout>legacy</layout>
        </repository>
        <!-- JBoss repositories: hibernate, etc. -->
        <repository>
            <id>jboss</id>
            <url>http://repository.jboss.com/maven2</url>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
        <repository>
            <id>jboss-snapshot</id>
            <url>http://snapshots.jboss.org/maven2</url>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
        </repository>
        <!-- SpringSource repositories -->
        <repository>
            <id>springsource-milestone</id>
            <url>http://repository.springsource.com/maven/bundles/milestone</url>
        </repository>
        <repository>
            <id>springsource-release</id>
            <url>http://repository.springsource.com/maven/bundles/release</url>
        </repository>
        <repository>
            <id>springsource-external</id>
            <url>http://repository.springsource.com/maven/bundles/external</url>
        </repository>
    </repositories>
 
</project>

Defining Entities

The entities represent the domain of your project. They are simple JavaBean classes that also configure how this domain will be persisted to the database. They are annotated with standard javax.persistence annotations, so there is no dependency on either Hibernate or Spring.

The following User.java is a simple entity that represents a user in the application. In this example, each user’s name and age are stored along with an auto-generated ID that identifies every persisted user.

package tld.example.domain;
 
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
 
@Entity
public class User {
 
	private Long id;
	private String name;
	private Integer age;
 
	public User() {
	}
 
	@Id
	@GeneratedValue
	public Long getId() {
		return this.id;
	}
 
	private void setId(Long id) {
		this.id = id;
	}
 
	@Column
	public String getName() {
		return this.name;
	}
 
	public void setName(String name) {
		this.name = name;
	}
 
	@Column
	public Integer getAge() {
		return age;
	}
 
	public void setAge(Integer age) {
		this.age = age;
	}	
 
}

You may save this class in the tld.example.domain package, where you will also place any additional entities you add.

DAO Layer

The first layer up in the architecture is the DAO layer. These objects take care of the operations needed to query the database in order to fetch, store and update your entities. I defined the DAOs in an interface/implementation manner; it is not only good design practice, but it also plays well with Spring’s AOP proxying.

The UserDao interface would be as follows:

package tld.example.dao;
 
import tld.example.domain.User;
 
public interface UserDao {
 
	public User findById(Long id);	
	public User persistOrMerge(User user);
 
}

For the implementation, I have chosen to use Hibernate directly. Nevertheless, it’s quite easy to switch to JPA with the provider you prefer. This is the HibernateUserDao class:

package tld.example.dao.impl;
 
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
 
import tld.example.dao.UserDao;
import tld.example.domain.User;
 
@Repository
public class HibernateUserDao implements UserDao {
 
	@Autowired(required=true)
	private SessionFactory sessionFactory;
 
	public User findById(Long id) {
		return (User) this.sessionFactory.getCurrentSession().createQuery(
			"from User user where user.id=?").setParameter(0, id)
			.uniqueResult();
	}
 
	public User persistOrMerge(User user) {
		return (User) this.sessionFactory.getCurrentSession().merge(user);
	}
 
}

Note that there is no transaction management code in this layer. The DAOs’ mission is to abstract the CRUD tasks away from your service layer; transactional logic lives one layer above.

Also, the code has two Spring dependencies (the @Repository and @Autowired annotations) because annotations were used to configure the application’s dependency injection. You could easily remove them by moving this configuration to XML.

Service Layer

This is the highest layer of this example’s architecture. The service layer provides your application with transactional operations for your business logic. The idea is that a service method is the smallest atomic operation your application performs against the database: a service method either completes, leaving the database in a state that is consistent for your application, or rolls back to the previous state (which should also be consistent).

Again, the services are split into an interface and an implementation. I defined the following simple UserService interface:

package tld.example.service;
 
import tld.example.domain.User;
 
public interface UserService {
 
	public User retrieveUser(Long id);
	public User createUser(User user);
 
}

And the implementation UserServiceImpl.java:

package tld.example.service.impl;
 
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
 
import tld.example.dao.UserDao;
import tld.example.domain.User;
import tld.example.service.UserService;
 
@Service
public class UserServiceImpl implements UserService {
 
	@Autowired(required=true)
	private UserDao userDao;
 
	@Transactional
	public User createUser(User user) {
		return this.userDao.persistOrMerge(user);
	}
 
	@Transactional(readOnly=true)
	public User retrieveUser(Long id) {
		return this.userDao.findById(id);
	}
 
}

As you can see, when a service method only performs read operations, you can tell Spring so, and it will be able to optimize the call (this is very useful when the backend is Hibernate).

One very important thing to note here: this is a very simple example with only one entity and consequently only one DAO, so the service is very simple. But with this layering you may very well have services that use more than one DAO and whose functionality spans multiple entities. The transaction management takes care of that; you only need to design the service methods so that they leave the data in a correct state.
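
Purely as an illustration (the MembershipService, the GroupDao and its addMember method are hypothetical and not part of this example; imports are the same as in UserServiceImpl above), a service method that spans two DAOs inside a single transaction could look like this:

@Service
public class MembershipServiceImpl implements MembershipService {
 
	@Autowired(required=true)
	private UserDao userDao;
 
	@Autowired(required=true)
	private GroupDao groupDao;	// hypothetical second DAO
 
	@Transactional
	public User createUserInGroup(User user, Long groupId) {
		// Both calls run in the same transaction: if addMember fails,
		// the user creation is rolled back as well.
		User saved = this.userDao.persistOrMerge(user);
		this.groupDao.addMember(groupId, saved.getId());	// hypothetical method
		return saved;
	}
 
}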

Making it all work together

Now, the last step is to configure Hibernate and Spring to make everything work together. As you will see, thanks to the use of annotations, very few configuration lines are needed.

Hibernate will be mostly managed by Spring, so the only configuration it needs is a pointer to the annotated entities; in this case, only the User class. This is the hibernate.cfg.xml:

<!DOCTYPE hibernate-configuration PUBLIC
 "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
 "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
 
<hibernate-configuration>
    <session-factory>
        <mapping class="tld.example.domain.User" />
    </session-factory>
</hibernate-configuration>

On the Spring side, a bit more configuration is needed to set up the declarative transactions. HSQLDB is used as the database. This is the applicationContext.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:aop="http://www.springframework.org/schema/aop"
    xmlns:context="http://www.springframework.org/schema/context"
    xmlns:tx="http://www.springframework.org/schema/tx"
    xsi:schemaLocation="
     http://www.springframework.org/schema/beans 
     http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
     http://www.springframework.org/schema/context
     http://www.springframework.org/schema/context/spring-context-3.0.xsd
     http://www.springframework.org/schema/tx
     http://www.springframework.org/schema/tx/spring-tx-3.0.xsd
     http://www.springframework.org/schema/aop 
     http://www.springframework.org/schema/aop/spring-aop-3.0.xsd">
 
    <!-- Configure annotated beans -->
    <context:annotation-config />
    <context:component-scan base-package="tld.example" />
 
    <!-- DataSource: hsqldb file -->
    <bean id="myDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
        <property name="driverClassName" value="org.hsqldb.jdbcDriver" />
        <property name="url" value="jdbc:hsqldb:file:target/data/example" />
        <property name="username" value="sa" />
        <property name="password" value="" />
    </bean>
 
    <!-- Hibernate -->
    <bean id="mySessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
        <property name="dataSource" ref="myDataSource" />
        <property name="configLocation">
            <value>classpath:hibernate.cfg.xml</value>
        </property>
        <property name="configurationClass">
            <value>org.hibernate.cfg.AnnotationConfiguration</value>
        </property>
        <property name="hibernateProperties">
            <props>
                <prop key="hibernate.show_sql">true</prop>
                <prop key="hibernate.hbm2ddl.auto">create</prop>
                <prop key="hibernate.dialect">org.hibernate.dialect.HSQLDialect</prop>
            </props>
        </property>
    </bean>
 
    <!-- Transaction management -->
    <tx:annotation-driven/>
    <bean id="transactionManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager">
        <property name="sessionFactory" ref="mySessionFactory"/>
    </bean>
 
</beans>

Basically, the following is configured:

  • Spring is told to scan all your classes under the tld.example package and configure the beans according to the annotations. This will create the UserDao and UserService singletons and inject the autowired fields.
  • A DataSource is configured. It uses Apache DBCP for pooling, and hsqldb as Database (both are included in the project dependencies).
  • Spring will inject Hibernate’s SessionFactory into the DAOs. The mySessionFactory bean is all that is needed to do so correctly.
  • And finally, the configuration needed for declarative transaction management. It ensures that every @Transactional method in the service layer either runs in a full transaction or rolls back when an exception occurs.

And that’s it :D Time to code all your entities, DAOs and services now. There are lots of ways in which you can customize or improve this setup (use JPA, configure Spring’s exception translator, add bean validation, etc.), but the important part is that the layered architecture allows you to do so easily.

Maven, a first day guide

This is the first chapter of a mini-guide that will first try to make clear what the purpose of Apache Maven is, and then show you how you can use it in your Java projects.

I have always preferred to start learning by example and by doing things instead of by reading lengthy manuals (there will be time for that once I have started to get a feel for the technology). So, I will write this aimed at a learner like me, hoping that some of you also prefer learning this way.

Anyways, the introduction is over, let’s get to the meat :-)

Maven

There are plenty of sites telling you what Maven is, so I will tell you what you will usually use Maven for. When developing a Java project, you will use Maven to perform tasks such as building, packaging, deploying or testing it. So why use it instead of any other tool? These are some of Maven’s strong points:

  • All information regarding how you want to perform all these tasks is centralized in a single file: pom.xml.
  • Convention over configuration. If you adhere to Maven’s conventions (for example, where to place your .java files), your pom.xml file will be very concise.
  • A nice plugin ecosystem. Maven will probably have a plugin for that not-so-usual task you want to perform.
  • And finally, dependency control. It is one of its best-known features: with Maven it’s very easy to configure and check which JARs you need for each stage (building, testing and running).

If you have used Ant to build, test, deploy, etc. your project, Maven can probably substitute for it, provided that you find Maven’s way of doing those tasks a better fit.

Installation

If you haven’t already, you can get Maven from here: http://maven.apache.org/download.html. There are also installation instructions for Windows and Unix-like systems.

A note to Linux users: even though you can probably get Maven from your distro’s packaging system, consider installing it standalone. At least on Debian/Ubuntu systems the package pulls in a gazillion dependencies that you don’t need.

The minimal pom.xml

OK, you have Maven installed; let’s see what it can do for you. Create a directory and place this pom.xml file in it:

<project xmlns="http://maven.apache.org/POM/4.0.0"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
    http://maven.apache.org/maven-v4_0_0.xsd">
 
  <!-- Maven's POM version -->
  <modelVersion>4.0.0</modelVersion>
 
  <!-- When packaging, a JAR file will be produced -->
  <packaging>jar</packaging>
 
  <!-- And the file will be named my-jar.jar -->
  <artifactId>my-jar</artifactId>
 
  <!-- This tells Maven how your jar should be archived, should you
     want to use it as a dependency for another project -->
  <groupId>tld.testing</groupId>
  <version>0.1-Alpha</version>
 
  <!-- The project's name -->
  <name>My Project</name>
 
</project>

It basically tells Maven that it should create a JAR file. You may now execute mvn package in that directory and you will see Maven create the requested JAR file inside the newly created target directory. Of course, the JAR will be empty apart from an auto-generated manifest, as there are no Java source files yet.

Be warned: the first time you execute a Maven stage, it will download the internet (everything is saved locally, so the next time you execute it, probably nothing will have to be downloaded).

Adding files to compile

If you read Maven’s output for the last command, it will have told you that it had no files to compile. So, let’s add a simple Java file called App.java.

package mypkg;
 
public class App {
 
  public static void main(String[] args) {
    System.out.println("Hello Maven!");
  }
 
}

You will have to create the directory src/main/java/mypkg and place the file in there. Once that’s done, run mvn package again from the root of your project. You can check the target directory and see your compiled class under the classes directory, as well as the updated JAR file with App.class now inside.

As a final check of this stage, execute mvn exec:java -Dexec.mainClass="mypkg.App". This tells Maven to execute your compiled class, so you will see if everything is OK (you should see the “Hello Maven!” output between Maven’s info messages).

As you have seen, because you adhered to Maven’s convention for where source files are placed, no extra configuration was needed in the pom.xml file.

Testing stage and the first dependency

If you read Maven’s output when packaging, you may have noticed a “No tests to run” message between the compile and package steps. In Maven’s way of doing things, there is a test stage between compiling and packaging. That means that once it has compiled your project, Maven will run any unit tests you have against the compiled files and, if they succeed, it will proceed to package. That’s a good thing in my book, so let’s give Maven a test to run.

Suppose we want to run a simple unit test coded in JUnit. We will need the JUnit JAR in the classpath, but only during the test stage. It’s now time to start using Maven’s dependency control, so we add the following before the closing </project> tag in the pom:

  <dependencies>
    <dependency>
      <!-- Group and artifact id tell Maven where to look 
        for a dependency -->
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <!-- And the version completes the information so it 
        knows exactly what JAR it must download -->
      <version>3.8.1</version>
      <!-- We want JUnit only in the test stage -->
      <scope>test</scope>
    </dependency>
  </dependencies>

With that information, the next time you package your project Maven will automatically download the JUnit JAR and add it to the classpath during your tests.

So let’s see if that works. Create a very simple (and not useful at all) JUnit test in a file called AppTest.java:

package mypkg;
 
import junit.framework.TestCase;
 
public class AppTest extends TestCase {
 
  public void testSum() throws Exception {
    assertEquals(2, 1+1);
  }
 
}

Place that file in src/test/java/mypkg and execute mvn package again. You will see how Maven runs the test, outputs the test-stage report and proceeds to package, as there were no test failures.

With a minimal pom.xml, you have configured Maven to compile, test and package your project. That pom.xml file and directory structure would make a good starting template, but Maven has a better solution than copy-pasting that structure…

The quickstart archetype

With Maven you can use lots of predefined archetypes that act as templates when starting a Maven-managed project. This is very handy when you are starting, say, a WAR project or anything else that requires a predetermined pom and file structure.

Open a terminal in a directory other than the one in which you did your first project, and:

  1. Execute: mvn archetype:generate
  2. Select maven-archetype-quickstart; it’s number 15 on my list.
  3. Define value for groupId: : tld.testing
  4. Define value for artifactId: : my-jar
  5. Define value for version: 1.0-SNAPSHOT: : 0.1-Alpha
  6. Define value for package: tld.testing: : mypkg

Once you finish, Maven will create a my-jar directory containing a pom and directory structure equivalent to the one you hand-crafted in the previous sections. The good thing is that now you know why it defines those things, and you have a better understanding of Maven than if you had just used the archetype from the start.

Finishing the day

Well, enough Maven for one day, I would say. It came out more verbose than I initially intended, but without the intro it felt a little lacking. I promise the next chapters will be more to the point, with more examples and less talking!