Category Archives: java

Documenting Jersey REST API

Documentation is error-prone. So the best documentation should be the code itself… or automatically generated from the code.

I haven’t found any open source project that can do this, so I created one. As I’m currently learning Scala, I decided to write it in Scala. You can find the code on Bitbucket: https://bitbucket.org/enefem/restdoc

It’s by no means complete, so feel free to fork it and add functionality, or ask for improvements.

Screenshot

How To Use

To use the tool, you must define an init parameter in the Jersey container. The param-name should be packageName and the param-value should be the complete package name where your REST resources are located. An example follows:

<servlet>
    <servlet-name>Jersey REST Service</servlet-name>
    <servlet-class>
         com.sun.jersey.spi.container.servlet.ServletContainer
    </servlet-class>
    <init-param>
         <param-name>
              com.sun.jersey.config.property.packages
         </param-name>
         <param-value>
              de.fraunhofer.iais.tat.targeting.resources
         </param-value>
    </init-param>
    <init-param>
         <param-name>packageName</param-name>
         <param-value>
             de.fraunhofer.iais.tat.targeting.resources
         </param-value>
    </init-param>
</servlet>

If you are using Guice and jersey-guice (which I do), you can define the init parameter as follows:

serve("/*").with(GuiceContainer.class,
   ImmutableMap.of("packageName", "de.fraunhofer.iais.tat.resource"));

And that’s all! The REST documentation should then be available at the path /restDoc.

Deployment parameter for REST resource

When you create a resource in a REST application, it will only be instantiated automatically if it has a no-argument constructor. The problem is, sometimes we need a resource that is configurable, so it can be reused by other applications.

We can solve this problem by defining an init parameter in our web.xml like this:

<servlet>
    <servlet-name>JerseyTest</servlet-name>
    <servlet-class>com.sun.jersey.spi.container.servlet.ServletContainer</servlet-class>
    <init-param> 
        <param-name>param</param-name> 
        <param-value>test</param-value> 
    </init-param> 
</servlet>

Or, if you’re using Guice (and jersey-guice), by changing the end of your ServletModule like this:

serve("/*").with(GuiceContainer.class,
     ImmutableMap.of("param", "test"));

The parameter can then be read by your resource: declare a ServletConfig field and annotate it with @Context. Example:

@Path("/test")
public class TestResource {
   @Context ServletConfig config;

   @GET public String testMethod() {
      return config.getInitParameter("param");
   }
}
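
If you prefer not to keep the ServletConfig as a field, JAX-RS also allows injecting it directly into the resource method. A small sketch of the same resource written that way (the method body is otherwise unchanged):

@Path("/test")
public class TestResource {

   @GET
   public String testMethod(@Context ServletConfig config) {
      // Same lookup as above, but the config is injected per call.
      return config.getInitParameter("param");
   }
}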

Java Tips: Process Object Based On Its Type Without if-then-else Solution

I want to share my answer to this question on StackOverflow.

Say you want to process several objects of different types. Each type must be processed differently, but there are some concerns:

  1. You don’t want an if-then-else solution, which is obviously not great in the long term
  2. Configuration is also bad, for the same reason

So what is the solution? Here is one that uses the Reflections library.

public class A {

}
public class B {

}
import java.lang.reflect.ParameterizedType;

public abstract class Processor<T> {

	private final Class<T> processedClass;

	@SuppressWarnings("unchecked")
	public Processor() {
		// Read the actual type argument (e.g. A for "ProcessorA extends Processor<A>")
		// from the generic superclass of the concrete subclass.
		ParameterizedType parameterizedType =
			(ParameterizedType) getClass().getGenericSuperclass();
		processedClass =
			(Class<T>) parameterizedType.getActualTypeArguments()[0];
	}

	public Class<T> getProcessedClass() {
		return processedClass;
	}

	protected abstract void process(T message);

}
public class ProcessorA extends Processor<A> {

	@Override
	protected void process(A message) {
		System.out.println("Processing object A");
	}

}
public class ProcessorB extends Processor<B> {

	@Override
	protected void process(B message) {
		System.out.println("Processing object B");
	}

}
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

import org.reflections.Reflections;

public class Adapter {

	private final Map<Class<?>, Processor<?>> mapping =
		new HashMap<Class<?>, Processor<?>>();

	public Adapter() throws Exception {
		// Scan the classpath for all Processor subclasses and index them
		// by the class they are able to process.
		Reflections r = new Reflections("");
		Set<Class<? extends Processor>> subTypesOf =
			r.getSubTypesOf(Processor.class);

		for (Class<? extends Processor> c : subTypesOf) {
			Processor<?> p = c.getConstructor().newInstance();
			mapping.put(p.getProcessedClass(), p);
		}
	}

	@SuppressWarnings("unchecked")
	public <T> Processor<T> getProcessor(Class<? extends T> c) {
		return (Processor<T>) mapping.get(c);
	}
}
public class Main {

	public static void main(String[] args)
			throws Exception {
		Adapter adapter = new Adapter();

		A a = new A();

		adapter.getProcessor(a.getClass()).process(a);

		B b = new B();

		adapter.getProcessor(b.getClass()).process(b);
	}

}

The console output after running the main method:

14:01:37.640 [main] INFO  org.reflections.Reflections - Reflections took 375 ms to scan 4 urls, producing 222 keys and 919 values
Processing object A
Processing object B

It’s kind of magic, isn’t it?
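
The payoff shows up when a new type arrives: supporting it only means dropping in another Processor subclass, which the Adapter picks up automatically through the classpath scan. A small sketch with a hypothetical class C:

public class C {

}
public class ProcessorC extends Processor<C> {

	@Override
	protected void process(C message) {
		System.out.println("Processing object C");
	}

}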

Guava: Using CheckedFuture

In this post, I want to discuss another class from Google Guava. Today, it will be CheckedFuture. Let’s first see the JavaDoc of the class:

A CheckedFuture is an extension of Future that includes versions of the get methods that can throw a checked exception and allows listeners to be attached to the future. This makes it easier to create a future that executes logic which can throw an exception.

Implementations of this interface must adapt the exceptions thrown by Future#get(): CancellationException, ExecutionException and InterruptedException into the type specified by the E type parameter.

This interface also extends the ListenableFuture interface to allow listeners to be added. This allows the future to be used as a normal Future or as an asynchronous callback mechanism as needed. This allows multiple callbacks to be registered for a particular task, and the future will guarantee execution of all listeners when the task completes.

From this explanation, we can conclude two things about CheckedFuture:

  1. It extends ListenableFuture which has been discussed in this article.
  2. It aims to simplify exception management of Future.

Let’s look at some code. Normally, if a Future fails because the inner process throws an exception, the exception is wrapped in an ExecutionException. So our code will look something like this:

try {
	task.get();
} catch (ExecutionException e) {
	if (e.getCause()
			instanceof ApplicationException) {
		throw (ApplicationException) e.getCause();
	}
	throw new ApplicationException(e);
} catch (InterruptedException e) {
	throw new ApplicationException(e);
} catch (CancellationException e) {
	throw new ApplicationException(e);
}
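
(In these snippets, ApplicationException is assumed to be an application-specific checked exception, something along these lines:)

// Assumed application-specific checked exception used in the snippets.
public class ApplicationException extends Exception {
	public ApplicationException(Throwable cause) {
		super(cause);
	}
}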

A lazy programmer (aren’t we all lazy?) will write it like this:

try {
	task.get();
} catch (Exception e) {
	if (e.getCause()
			instanceof ApplicationException) {
		throw (ApplicationException) e.getCause();
	}
	throw new ApplicationException(e);
}

The same code can be implemented using CheckedFuture like this:

Function<Exception, ApplicationException> mapper =
	new Function<Exception, ApplicationException>() {

	@Override
	public ApplicationException apply(
			Exception from) {
		// Return (not throw) the mapped exception;
		// checkedGet() will throw whatever we return here.
		if (from.getCause() instanceof ApplicationException) {
			return (ApplicationException) from.getCause();
		}
		return new ApplicationException(from);
	}
};
CheckedFuture<String, ApplicationException> checkedTask =
	Futures.makeChecked(task, mapper);

checkedTask.checkedGet();

So I guess you’ll say: ‘but the code is longer than the original’. It’s true that in this example the code is longer. In fact, if your case is as simple as this, I suggest you just use the original version.

This is pretty much the same as adopting a functional-like Java coding style: it’s overkill when the case is this small. The advantage only becomes visible once your code grows bigger.

In the CheckedFuture case, one example where it gives you a huge benefit is when you start passing the Future/CheckedFuture around and get()/checkedGet() is executed in many places. Then you’ll see that wrapping each call with try/catch is annoying, and this is where the CheckedFuture-based solution shines.
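
A minimal sketch of such a consumer (the method and its name are mine): once the future has been wrapped with Futures.makeChecked, callers only declare the checked exception instead of repeating the unwrapping logic.

// Hypothetical consumer elsewhere in the code base: no try/catch,
// no instanceof checks, just the checked exception in the signature.
public String readResult(
		CheckedFuture<String, ApplicationException> task)
		throws ApplicationException {
	return task.checkedGet();
}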

Guava: Using ListenableFuture

Google Guava has many interesting classes that we can use in our applications. The ones from the collection package are already used by many developers, and this blog has a tutorial on how to use the computing map.

I want to move to another package, com.google.common.util.concurrent; specifically, I want to introduce ListenableFuture. The documentation of the class is as follows:

This interface defines a future that has listeners attached to it, which is useful for asynchronous workflows. Each listener has an associated executor, and is invoked using this executor once the Future’s computation is complete (isDone()). The listener will be executed even if it is added after the computation is complete.

Consider the following example: we have tasks T1, T2, and T3. T2 can only be done when T1 is finished, and T3 can only be done once T2 has ended. The diagram below shows the dependency.

The easiest solution without any concurrency is of course to just run each task one after another. But consider that we have 5 sets of these operations. Without threads we end up with the serial solution depicted in the following picture.

ListenableFuture makes it easy to create the concurrent version of the solution.

Here is a code example of this solution. It will print “1”, then pause for a second before printing “2”, and pause another second before finally printing “3”. Note that ListenableFutureTask implements ListenableFuture.

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import com.google.common.util.concurrent.ListenableFutureTask;

public class TestListenableFuture {

	static class SimpleTask extends
			ListenableFutureTask<Void> {
		SimpleTask(final String message) {
			super(new Callable<Void>() {

				@Override
				public Void call() throws Exception {
					System.out.println(message);
					Thread.sleep(1000);
					return null;
				}
			});
		}
	}

	public static void main(String[] args) {
		ListenableFutureTask<Void> task1 =
			new SimpleTask("1");
		ListenableFutureTask<Void> task2 =
			new SimpleTask("2");
		ListenableFutureTask<Void> task3 =
			new SimpleTask("3");

		ExecutorService exec =
			Executors.newSingleThreadExecutor();
		exec.submit(task1);             // run task1 first
		task1.addListener(task2, exec); // task2 starts once task1 is done
		task2.addListener(task3, exec); // task3 starts once task2 is done

		try {
			Thread.sleep(10000); // crude wait so all three tasks can finish
		} catch (InterruptedException e) {
			// ignored in this simple demo
		}
		exec.shutdown();
	}
}

Pretty easy, isn’t it? And we could probably extend this solution into a full workflow framework.
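
To make the concurrency benefit of the 5-sets scenario visible, here is a rough sketch of mine (not from the original example) that reuses the SimpleTask class above and runs five independent 1 → 2 → 3 chains side by side; it assumes the method lives in the same class as SimpleTask.

	// Rough sketch: five independent 1 -> 2 -> 3 chains on a small pool.
	public static void runFiveSets() throws InterruptedException {
		ExecutorService pool = Executors.newFixedThreadPool(5);
		for (int i = 1; i <= 5; i++) {
			ListenableFutureTask<Void> t1 = new SimpleTask("set " + i + ": 1");
			ListenableFutureTask<Void> t2 = new SimpleTask("set " + i + ": 2");
			ListenableFutureTask<Void> t3 = new SimpleTask("set " + i + ": 3");
			pool.submit(t1);           // kick off the first task of this chain
			t1.addListener(t2, pool);  // the second task runs when the first is done
			t2.addListener(t3, pool);  // the third task runs when the second is done
		}
		Thread.sleep(5000);            // crude wait, as in the example above
		pool.shutdown();
	}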

UPDATE: Remember that the API is still in beta. I’ll try to update this post once the final version is released.

Java Tips: Initializing Collection

Especially in unit tests, it is a common case that we have to initialize an array or a collection.

Well, for arrays it’s OK… a simple piece of code that we all know solves the problem:

String[] s = new String [] {"1", "2"};

But how about a Collection? The normal way to initialize a collection is something like this (which is pretty ugly):

List<String> s = new ArrayList<String>();
s.add("1");
s.add("2");

I could hardly find an elegant solution until I saw this post. There are at least three better solutions for this case.

First solution:

List<String> s = new ArrayList<String>() {{ add("1"); add("2"); }};

Which, unfortunately, doesn’t survive the Java code conventions (that is, if you format the code, it becomes uglier than the original):

List<String> s = new ArrayList<String>() {
   {
      add("1");
      add("2");
   }
};

Second solution:

List<String> s = Arrays.asList(new String[]{"1", "2"});

This solution is the best if you use Java 1.4 or earlier. But if you use Java 5, the third solution is more elegant:

List<String> s = Arrays.asList("1", "2");

Great!

EDIT: this solution creates a fixed-size but modifiable collection (you can set elements but not add or remove them), so you may want to wrap it in an ArrayList (or another Collection class) if you need to change its size.
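
For example, a small sketch of that wrapping:

// Arrays.asList returns a fixed-size list; copying it into an ArrayList
// makes it fully modifiable.
List<String> s = new ArrayList<String>(Arrays.asList("1", "2"));
s.add("3"); // works here; calling add on the Arrays.asList result would throw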

Hello World… Using Spring Roo 1.1.0M1 under STS 2.3.3M1

The first day’s Google I/O keynote showed how much work has been done to integrate Spring Roo with GWT. Despite the fact that the demo did not go smoothly, it is an interesting combination. Before, we had to do all the work to integrate Spring + Hibernate + GWT manually; now that work has already been done. Even nicer, it has tight integration with Eclipse, my favorite Java IDE :D.

Unfortunately, the information on how to start creating and playing with it is scattered, and I spent quite some time to finally run a very simple application using all these technologies. Let me share my experience here…

  1. What you need first is STS 2.3.3M1. It’s not that easy to find, so here is the link: http://www.springsource.com/products/springsource-google-download. Unfortunately, you have to fill in a form, as I can’t find any way to download it without doing so. Warning: you need STS 2.3.3M1, not release 2.3.2.
  2. Next, you have to run the installer. I had no problems with this, and at the end STS was nicely installed on my computer.
  3. Run STS and set the workspace location as you like.
  4. If you start a new workspace you’ll get something like this:
  5. Just close the Welcome page and you will get something like this:
  6. You’ll need to install the DataNucleus Eclipse Plugin and the Google Plugin for Eclipse. To do so, go to the third tab under the dashboard and select both extensions. Install them and restart STS.
  7. Now we can start creating a new project. Create a Roo project with the name ‘hello’ and top-level package ‘com.hello’. Wait a bit for Maven to download all the dependencies.
  8. You can use the Roo Shell to start adding entities to the project. Run the following commands:
    persistence setup --provider DATANUCLEUS --database HYPERSONIC_IN_MEMORY
    entity --class ~.server.domain.Employee --testAutomatically
    field string --fieldName userName --sizeMin 3 --sizeMax 30
    gwt setup
    

  9. After that, you should right-click on the project and select Google -> Web Toolkit Settings… and then just click OK. I don’t know exactly what happens, but without it the application complains that it can’t find the GWT SDK.
  10. Again, right-click on the project. Select Maven -> Enable Dependency Management.
  11. Now you can run the application by right-clicking on it and selecting Run -> Web Application. There you go: you have the GWT version of the application.
  12. Alternatively, run mvn gwt:run from the console or from Eclipse.

That’s that!