Transaction handling using an annotation processor
Jacek Dubikowski
Senior Software Engineer
Published: Jan 11, 2023 | 15 min read
Declarative transaction processing is a very popular feature of Java application frameworks, and the most popular ones all offer it in a very similar, annotation-driven form.
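For example, in Spring it boils down to a single annotation on a bean method (the class below is a made-up sketch, not code from this project; Micronaut and Quarkus expose essentially the same model):

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class TransferService {

    @Transactional // Spring wraps the method call in a database transaction
    public void transfer(long fromAccountId, long toAccountId, long amount) {
        // business logic runs inside the transaction and is rolled back on failure
    }
}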
So why not build a simplified version of your own just to learn how it works? The repository Java Own Framework – step-by-step shows how to do it purely at runtime. Today, I want to show you the compile-time version.
In the first article of the series, I jump-started creating an annotation processor-based framework by providing the first feature – dependency injection.
Note: Micronaut has heavily inspired all code examples in this text.
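As a target, imagine a simple service method like the one below (the ParticipationService class and the participate method are made-up names used only for illustration):

@Singleton
public class ParticipationService {

    @Transactional // our processor should run this method inside a transaction
    void participate(Participant participant, Event event) {
        // persistence calls would go here
        System.out.printf("Participant: '%s' takes part in event: '%s'%n", participant, event);
    }
}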
From now on, being able to write code like that is our target. We want to handle transactions by adding the @Transactional annotation to the method of our interest.
Once a method is annotated with @Transactional, we want the annotation processor to generate transaction handling code. For the sake of simplicity, the processor will generate code only for methods of concrete classes.
Since the processor can only generate new code, it will create a subclass of the class with annotated methods. Therefore, the class cannot be final. The methods cannot be final, private, or static. Non-annotated methods won’t be touched at all.
To get a better idea, please look at the example below.
Given the following class:
@Singleton
public class RepositoryA {

    @Transactional
    void voidMethod() {
    }

    int intMethod() {
        return 1;
    }
}
The annotation processor should generate the following:
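A simplified sketch of the generated class (reconstructed here from the rules described later in the article, so treat it as an illustration rather than the processor's exact output):

@Singleton
public class RepositoryA$Intercepted extends RepositoryA {

    private final TransactionManager transactionManager;

    public RepositoryA$Intercepted(TransactionManager transactionManager) {
        super();
        this.transactionManager = transactionManager;
    }

    @Override
    void voidMethod() {
        try {
            transactionManager.begin();
            super.voidMethod();
            transactionManager.commit();
        }
        catch (Exception e) {
            try {
                transactionManager.rollback();
            }
            catch (Exception innerException) {
                throw new RuntimeException(innerException);
            }
            throw new RuntimeException(e);
        }
    }

    // intMethod() is not annotated, so it is inherited untouched
}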
The example presents a simplified version of what will be generated, but you probably get the idea. The actual code generation and other issues will be shown later on. The generated code will be simple: it won’t care about transaction propagation, and it will wrap checked exceptions in a RuntimeException and rethrow them in that form.
The problem is that if you want transactions, you cannot directly create an instance of the class with annotated methods using new or any other factory method. You must rely on the framework, in our case, the one described in the previous article of the series, to provide it. This is because only the generated class will have the expected transactional code.
The only extra thing worth noticing in the example is the generated class name. For the rest of this project, whenever the annotation processor creates a replacement for some class, its name will include the word Intercepted.
As transaction handling is the main subject of this text, we will get straight to it.
Processing the @Transactional annotation is not a mandatory part of our framework. It should be used based on the user’s decision. Therefore, the code responsible for it will be called TransactionalPlugin, as this is a pluggable feature.
Let’s look at the code below (the code is also available here).
public class TransactionalPlugin implements ProcessorPlugin { // 7
Now, it is time for us to dive deeply into the provided source.
Set<? extends Element> annotated contains all Elements annotated with @Transactional.
In the first step, we use ElementFilter to extract all the methods from the annotated set of elements.
Then, the annotated elements are validated against the previously mentioned rules. I introduced the utility class TransactionalMessenger (code here). Its sole responsibility is to wrap Messager and provide a unified API for raising errors associated with @Transactional processing. Every raiseForSth method calls TransactionalMessenger, providing information about the error. The raiseForSth methods’ code is skipped to keep the example concise and manageable.
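A minimal sketch of such a wrapper (the raiseError method below only illustrates the idea; the real class exposes the more specific raiseForSth methods):

class TransactionalMessenger {

    private final Messager messager;

    TransactionalMessenger(Messager messager) {
        this.messager = messager;
    }

    // every raiseForSth method ultimately reports a compilation error against the offending element
    void raiseError(Element element, String message) {
        messager.printMessage(Diagnostic.Kind.ERROR, message, element);
    }
}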
Now, we group the annotated methods by the classes in which they are declared. In Java, a method can only be declared in a class or an interface, and the plugin accepts only concrete class methods, raising errors for everything else. Therefore, we can be sure that calling element.getEnclosingElement(), where element is the annotated method, returns the class representation – a TypeElement.
Once we have that grouping, we can write the code: we need to intercept the classes that are the keys of the grouping and write transactional versions of the methods that are its values.
The last part is to write the code. The logic is stored in TransactionalInterceptedWriter, so we can move to see its code.
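Putting those steps together, a sketch of the plugin’s process method could look roughly like this (the exact signature, the validateMethods helper, and the writer’s constructor arguments are assumptions based on the description above; the real code is in the repository):

@Override
public Collection<JavaFile> process(Set<? extends Element> annotated) {
    var methods = ElementFilter.methodsIn(annotated);                 // 1. keep only methods
    validateMethods(methods);                                         // 2. raiseForSth checks via TransactionalMessenger
    var toIntercept = methods.stream()                                // 3. group by the declaring class
            .collect(Collectors.groupingBy(method -> (TypeElement) method.getEnclosingElement()));
    return toIntercept.entrySet().stream()                            // 4. write an intercepted class per type
            .map(entry -> new TransactionalInterceptedWriter(
                    entry.getKey(),
                    entry.getValue(),
                    processingEnv.getElementUtils().getPackageOf(entry.getKey()))
                    .createDefinition(processingEnv.getMessager()))
            .toList();
}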
As the TransactionalPlugin must somehow be plugged into our framework, the class implements the ProcessorPlugin interface. How it all works will be described after we finish with the transaction handling, as it is not the main topic here.
For code generation, I will use the proven JavaPoet library.
The code of the TransactionalInterceptedWriter is quite complicated. The thing that requires special attention is writing transactional versions of void methods versus value-returning ones. Unfortunately, unlike Kotlin, Scala, Rust, and others, Java has a void type that is not a regular value, so the two cases have to be handled separately.
We will get to the mentioned part later. Now let’s start with instance fields and constructor.
Instance fields and constants
The Writer constructor is fairly simple, so it can be omitted.
class TransactionalInterceptedWriter {
    private static final String TRANSACTION_MANAGER = "transactionManager";
    private static final Modifier[] PRIVATE_FINAL_MODIFIERS = {Modifier.PRIVATE, Modifier.FINAL};

    private final TypeElement transactionalElement; // 1
    private final List<ExecutableElement> transactionalMethods; // 2
    private final PackageElement packageElement; // 3
}
The constants are fairly simple, and their names are self-explanatory.
The class instance fields are more interesting.
1. The transactionalElement stores the TypeElement representation of the class with the annotated methods. This class will be referred to as the intercepted class or the superclass.
2. The transactionalMethods stores the ExecutableElement representation of the annotated methods of the transactionalElement class.
3. The packageElement stores the PackageElement representation of the package in which transactionalElement is defined.
Intercepting class definition
We will start with the most high-level thing. Let’s see how the intercepting class is written, but without going into details.
class TransactionalInterceptedWriter {

    public JavaFile createDefinition(Messager messager) {
First of all, the class must have a name. As mentioned before, the generated class is named after the intercepted one, with an extra Intercepted part. For example, Repository will be turned into Repository$Intercepted. Therefore, we know that the type before the $ is the one intercepted by the generated class.
The created class must be annotated with @Singleton, so the DI solution from the first part will pick it up.
To fulfill its role, the generated class will extend the class with methods annotated with @Transactional. We have already talked about it above.
The class will also implement the Intercepted interface, which will be covered later. The interface is related to the provisioning of the intercepted instances. This requires the generated class to implement an extra method. I will describe how it works at the end of the article, as this is unrelated to transactions.
To handle transactions, the class needs a TransactionManager field. Adding the field is very straightforward.
The class must have a constructor that will call super(requiredDependencies) and set the transactionManager field.
The class will override the methods annotated in its superclass.
The generated code will be stored in the JavaFile object to be written in a real file later.
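Putting all of that together, a sketch of createDefinition could look like this (constructor(), transactionalMethod() and interceptedTypeMethod() are helper names assumed for the sketch; the real code is in the repository):

public JavaFile createDefinition(Messager messager) {
    var interceptedName = transactionalElement.getSimpleName() + "$Intercepted";    // Repository -> Repository$Intercepted
    var typeSpec = TypeSpec.classBuilder(interceptedName)
            .addAnnotation(Singleton.class)                                  // so the DI container picks it up
            .superclass(TypeName.get(transactionalElement.asType()))         // extend the intercepted class
            .addSuperinterface(Intercepted.class)                            // used later for provisioning
            .addField(TransactionManager.class, TRANSACTION_MANAGER, PRIVATE_FINAL_MODIFIERS)
            .addMethod(constructor())                                        // super(...) call plus field assignment
            .addMethods(transactionalMethods.stream().map(this::transactionalMethod).toList())
            .addMethod(interceptedTypeMethod())                              // implements Intercepted.interceptedType()
            .build();
    return JavaFile.builder(packageElement.getQualifiedName().toString(), typeSpec).build();
}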
Now, having the high-level view, we can dive into the details where needed. So let’s start with writing the constructor.
Constructor
To provide the transactional capability, the constructor must call the constructor of its superclass via the super keyword, passing the parameters in the correct order. The transactionManager field of the intercepting class also must be populated.
The first thing that is done to create a constructor is to find out what the dependencies of the intercepted class are. To do it in a convenient way, we will reuse TypeDependencyResolver, created for the DI solution. You can read more about it here.
Having the dependencies of the superclass, we can create the parameters for the constructor. The transactionManager is the first parameter, and the remaining ones are named after their type and their position in the constructor parameter list.
Having the intercepting class constructor parameters, we can prepare the argument list for the super call and add it to the constructor.
The last thing to do is to also set up the transactionManager field.
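In JavaPoet terms, the whole thing could be sketched like this (superConstructorParameterTypes() stands in for the TypeDependencyResolver call, and parameterName() for the naming scheme described above – both are assumptions made for the sketch):

private MethodSpec constructor() {
    var dependencies = superConstructorParameterTypes();                     // resolved via TypeDependencyResolver
    var builder = MethodSpec.constructorBuilder()
            .addParameter(TransactionManager.class, TRANSACTION_MANAGER);    // transactionManager always comes first
    var superCallArguments = new ArrayList<String>();
    for (int i = 0; i < dependencies.size(); i++) {
        var parameterName = parameterName(dependencies.get(i), i);           // "type + position" naming
        builder.addParameter(TypeName.get(dependencies.get(i)), parameterName);
        superCallArguments.add(parameterName);
    }
    return builder
            .addStatement("super($L)", String.join(", ", superCallArguments)) // call the superclass constructor
            .addStatement("this.$N = $N", TRANSACTION_MANAGER, TRANSACTION_MANAGER) // set the field
            .build();
}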
The generated constructor may look like the code below:
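For a hypothetical superclass RepositoryB with a single EventRepository dependency (both names are made up, used only to show the shape), that means something like:

public RepositoryB$Intercepted(TransactionManager transactionManager, EventRepository eventRepository1) {
    super(eventRepository1);                       // superclass dependencies are forwarded in declaration order
    this.transactionManager = transactionManager;
}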
An overridden transactional method, in turn, ends up looking like this (shown here for a value-returning intMethod; a void method simply skips the return value):

@Override
int intMethod() {
    try {
        transactionManager.begin();
        var intMethodReturnValue = (int) super.intMethod();
        transactionManager.commit();
        return intMethodReturnValue;
    }
    catch (Exception e) {
        try {
            transactionManager.rollback();
        }
        catch (Exception innerException) {
            throw new RuntimeException(innerException);
        }
        throw new RuntimeException(e);
    }
}
TransactionManager’s methods are declared with a checked exception. Therefore, the transactional methods need try/catch blocks: begin and commit are called in the try block, and rollback in the catch clause. If the return type isn’t void, the result of the super call must also be stored in a variable and returned after the commit.
So the high-level method to generate such a call looks like this:
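A sketch of that method (the tryBlock and catchBlock helpers are assumed names, covered in the next section; MethodSpec.overriding copies the signature of the annotated method):

private MethodSpec transactionalMethod(ExecutableElement method) {
    return MethodSpec.overriding(method)       // same name, parameters, return type and modifiers as the original
            .addCode(tryBlock(method))         // begin(), super call, commit(), return value if any
            .addCode(catchBlock())             // rollback() plus RuntimeException wrapping
            .build();
}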
In this and the previous article of the series, I have shown a lot of code, mostly JavaPoet usage. My hope is that by now you get how JavaPoet works. From now on, I will try to minimize the boilerplate JavaPoet code by omitting it in the examples or sharing it as gists. The full code is, of course, still present in the repository.
Catch and try blocks
The try and catch block generation is quite simple, so, as mentioned before, it is shared as gists rather than walked through here.
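As a rough outline of the idea (hypothetical helper code, not the exact gist contents; value-returning methods additionally store the result in a variable and return it after commit()):

private CodeBlock tryBlock(ExecutableElement method) {
    var superCall = method.getParameters().stream()
            .map(parameter -> parameter.getSimpleName().toString())
            .collect(Collectors.joining(", ", "super." + method.getSimpleName() + "(", ")"));
    return CodeBlock.builder()
            .beginControlFlow("try")
            .addStatement("$N.begin()", TRANSACTION_MANAGER)
            .addStatement(superCall)
            .addStatement("$N.commit()", TRANSACTION_MANAGER)
            .build();
}

private CodeBlock catchBlock() {
    return CodeBlock.builder()
            .nextControlFlow("catch ($T e)", Exception.class)               // closes the try opened above
            .beginControlFlow("try")
            .addStatement("$N.rollback()", TRANSACTION_MANAGER)
            .nextControlFlow("catch ($T innerException)", Exception.class)
            .addStatement("throw new $T(innerException)", RuntimeException.class)
            .endControlFlow()
            .addStatement("throw new $T(e)", RuntimeException.class)
            .endControlFlow()                                               // closes the outer catch
            .build();
}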
However, we must be sure to get the expected instance at runtime, right? Let us check how to make it all work within our framework. To reach the goal, we need two more things:
1. The TransactionalPlugin must be used during compilation.
2. Our framework must provide only the intercepted instances.
We have seen the code that handles transaction processing. Now, we have to make use of the TransactionalPlugin in our framework. To keep everything simple, I created an interface ProcessorPlugin which will be a way to register extensions. Thanks to that, the whole transaction processing code is held in separate classes.
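A sketch of the interface (the exact process signature is an assumption; the rest matches the implementation shown below):

public interface ProcessorPlugin {
    void init(ProcessingEnvironment processingEnv);                   // called once by the main processor
    Class<? extends Annotation> reactsTo();                           // which annotation the plugin handles
    Collection<JavaFile> process(Set<? extends Element> annotated);   // generates code for the annotated elements
}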
In the main part of the article, I omitted some code of the TransactionalPlugin related to the ProcessorPlugin implementation. Now, you can see the missing parts of the code below.
public class TransactionalPlugin implements ProcessorPlugin {

    private ProcessingEnvironment processingEnv;
    private TransactionalMessenger transactionalMessenger;

    @Override
    public void init(ProcessingEnvironment processingEnv) { // 1
        this.processingEnv = processingEnv;
        this.transactionalMessenger = new TransactionalMessenger(processingEnv.getMessager());
    }

    @Override
    public Class<? extends Annotation> reactsTo() { // 2
        return Transactional.class;
    }
}
The init method implementation is fairly simple. It just requires setting processingEnv and creating TransactionalMessenger.
The reactsTo method implementation states that the plugin is interested in the @Transactional annotation. Who would have guessed, right?
There is nothing big in the provided code; the most interesting part was the process method shown before.
In the “production” code, the framework must be able to provision the intercepted instances. To make this possible, I introduced the interface below.
public interface Intercepted {
    Class<?> interceptedType();
}
This is very simple, yet very important. Thanks to the interface, we can tell which type has an intercepted version. You may remember from the main part that our $Intercepted classes implement this interface. So how is it done?
Implementing the interface
The implementation of the interface is quite simple. For RepositoryA, the generated RepositoryA$Intercepted gets one extra method:
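@Override
public Class<?> interceptedType() {
    return RepositoryA.class;   // points back at the class being intercepted
}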
Now, we can differentiate Intercepted classes from regular ones and point out types that have their intercepted versions.
Using only intercepting classes during provisioning
To get only the intercepted version and not the original one, we need to update the BaseBeanProvider. The simplified code was shown in the previous part about DI. Now, it needs an extra step.
class BaseBeanProvider implements BeanProvider {
    @Override
    public <T> List<T> provideAll(Class<T> beanType) {
        var allBeans = definitions.stream().filter(def -> beanType.isAssignableFrom(def.type()))
                .map(def -> beanType.cast(def.create(this)))
                .toList(); // 1
        // the rest is a sketch of the extra step: find the intercepted types and drop their originals
        var interceptedTypes = allBeans.stream().filter(bean -> Intercepted.class.isAssignableFrom(bean.getClass()))
                .map(bean -> ((Intercepted) bean).interceptedType())
                .toList();
        return allBeans.stream().filter(bean -> !interceptedTypes.contains(bean.getClass())).toList();
    }
}
Now the whole solution works as expected. The framework provides the $Intercepted instances that handle transactions for us. I think that is enough for today. Class dismissed.
In the next and final part, we will take a look at RestControllers, so stay tuned!