Building Reactive REST APIs in Java : Part 2

Kalpa Senanayake
8 min read · Feb 10, 2019

Introduction

In Building Reactive REST APIs in Java : Part 1 we identified the difference between reactive systems and reactive programming, and laid a solid foundation on reactive programming. If you have not read it yet, please consider reading it first and then coming back. Most of the technical details discussed here build on the foundation laid in that article.

In this article, we are going to build a RESTful API using the Reactor reactive library and the Spring WebFlux stack.

The Reactor

The main reason for selecting Reactor as the reactive library for this article is its ability to integrate seamlessly with the Spring 5 framework. This makes it easier for readers to understand the reactive programming model compared to the blocking model, and it allows them to start building reactive RESTful APIs straight away using the well-known Spring framework.

There are other great reactive libraries, such as RxJava, and nothing stops you from using one of them to build your API. That is why it is important to understand the fundamentals of the paradigm rather than focus on a particular library or framework.

Application server

We will use Netty as the application server. Both of these (Reactor and Netty) are the default setup for Spring WebFlux. We could use Jetty, Undertow or Tomcat for the same purpose, and each has its own mechanism to support reactive execution. Spring WebFlux abstracts away those details, but it does not stop us from digging into the nitty-gritty details if we want to.

Reactor is a fully non-blocking reactive programming implementation for the JVM. It provides mechanisms to enforce back-pressure and composable representations of asynchronous event sequences, known as Flux and Mono.

Flux : A Flux object represents a reactive sequence of 0…N items.

Mono: A Mono object represents a reactive single value (0…1).

The reason for having two types is that the creators of the library realised that many of the operators (for manipulating asynchronous events) which apply to Flux do not apply to Mono. So they kept it simple by creating two types.
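
To make the two types concrete, here is a small, illustrative sketch (the class name and the emitted values are my own examples, not part of the article’s project):

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class FluxMonoExample {

    public static void main(String[] args) {
        // A Flux represents 0…N items; here it emits three titles.
        Flux<String> titles = Flux.just("Book One", "Book Two", "Book Three");

        // A Mono represents at most one item; count() collapses the Flux into a single value.
        Mono<Long> count = titles.count();

        // Nothing is emitted until a subscriber is attached.
        titles.map(String::toUpperCase).subscribe(System.out::println);
        count.subscribe(c -> System.out.println("count = " + c));
    }
}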

Implementation : Phase 1 : Connecting to the database in a reactive way

Let’s start with a simple example and gradually improve it for more advanced use cases. In the first phase of the implementation, the application will connect to MongoDB and save some data.

You might wonder why MongoDB and not MySQL or any other relational database. This is because in February 2019, at the time of writing this article, Java’s JDBC implementation still communicates with relational databases in a blocking manner. We cannot get the benefit of reactive programming if our data source does not support a non-blocking way of communicating.

Initialisation:

Initialise a project using https://start.spring.io/. During the initialisation, select “Reactive Web” and “Reactive Mongo”.

https://start.spring.io/

The “Reactive Web” dependency brings Spring WebFlux into your project. The “Reactive Mongo” dependency brings in the drivers and libraries to communicate with MongoDB in a reactive manner.

In addition to that, please add the Lombok library as a dependency, which saves us from writing boilerplate code. The dependencies section should look like the following.

Dependencies section

Setting up MongoDB

To set up MongoDB we will use the official MongoDB Docker image. The following commands will pull and run MongoDB in your local Docker environment.

docker pull mongo
# run it as a daemon (-d), expose the port 27017 (-p),
# set it to auto start (--restart)
# and with mongo authentication (--auth)
docker run --name mongo_docker --restart=always -d -p 27017:27017 mongo mongod --auth

Now you should be able to log in to the MongoDB console using the following commands.

# bash into the container
sudo docker exec -i -t mongo_docker bash
#connect to local mongo
mongo

Create the admin user.

# create the first admin user
use admin
db.createUser({user:"r_user",pwd:"r_pass",roles:[{role:"root",db:"admin"}]})
# exit the mongo shell
exit
# exit the container
exit

Now you should be able to connect to MongoDB as the admin user.

sudo docker exec -i -t mongo_docker bash
# login as admin user
mongo -u "r_user" -p "r_pass" localhost --authenticationDatabase "admin"
# create book_db
use book_db
# create book_user
db.createUser({user: "book_user", pwd: "pass", roles: [{ role: "readWrite", db: "book_db" }], passwordDigestor: "server"})

Load data into MongoDB.

In phase one, we are going to save some books into book_db in a reactive manner.

Create Book model class.

Book.java
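
A minimal sketch of what Book.java might look like, assuming Lombok and Spring Data MongoDB’s @Document annotation (the exact fields are illustrative):

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// Maps to a MongoDB document; Lombok generates getters, setters and constructors.
@Data
@NoArgsConstructor
@AllArgsConstructor
@Document
public class Book {

    @Id
    private String id;
    private String title;
    private String author;
}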

Create the BookRepository

BookRepository.java

Note that this repository extends ReactiveMongoRepository, which provides reactive support for MongoDB. The repository-based programming model is well understood and I am not going to explain it here, but it is worth noting that unlike the non-reactive MongoRepository, which returns java.util.List<T>, the reactive equivalent returns Mono and Flux where applicable.
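
A minimal sketch of BookRepository.java, assuming the Book model sketched above:

import org.springframework.data.mongodb.repository.ReactiveMongoRepository;

// Spring Data provides the implementation at runtime; the inherited CRUD
// methods return Mono and Flux instead of blocking types such as List<T>.
public interface BookRepository extends ReactiveMongoRepository<Book, String> {
}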

Initialise data and save

Configure the database url in application.yml

application.yml

To save some data to MongoDB, we use the CommandLineRunner functional interface.
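
A minimal sketch of the idea, assuming the Book and BookRepository sketches above (the application class name and the book data are illustrative):

import lombok.extern.slf4j.Slf4j;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import reactor.core.publisher.Flux;

@Slf4j
@SpringBootApplication
public class BookServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(BookServiceApplication.class, args);
    }

    @Bean
    CommandLineRunner init(BookRepository bookRepository) {
        return args -> bookRepository
                .deleteAll()                    // delete existing data, emits a Mono<Void>
                .thenMany(Flux.just(            // then emit three Book objects as events
                        new Book(null, "Book One", "Author One"),
                        new Book(null, "Book Two", "Author Two"),
                        new Book(null, "Book Three", "Author Three")))
                .flatMap(bookRepository::save)  // save each event, flattening the resulting Monos
                .subscribe(
                        null,
                        null,
                        () -> log.info("done initialization..."));
    }
}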

Inside the runner we start using the reactive way of dealing with the database. It deletes the existing data in the database and inserts three book records. The delete operation emits a Mono, and we then chain another publisher onto it using .thenMany().

.deleteAll()
.thenMany()

The key element to understand here is the way we convert plain Book objects into event publishers.

Flux.just()

Flux.just() is used to create the event emitter, so every Book object becomes an event in a reactive stream.

.flatMap(this.bookRepository::save)

Every event emitted from this Flux gets saved to the database using the save() method, which returns a Mono. This is where the tools provided by the framework come in handy to convert these Monos into a Flux<Book>. This is done by the flatMap operation.

The flatMap operator transforms Book events into Monos by invoking the save operation, then merges those Monos into a single Flux. In other words, it flattens the events and emits the merged results.

The following diagram shows what flatMap does to event streams.

flatMap()
.subscribe(
null,
null,
() -> log.info("done initialization...")
);

This is another important part. Every publisher needs to be subscribed to; a publisher does nothing until we subscribe to it.

This is all we need to do for phase 1. Let’s start the application by running it as a Spring Boot app.

startup logs

Have a look at the startup logs. As we can see, the main thread started the Netty server on port 8080, invoked the onSubscribe() method and then went away.

The key part comes later, from a set of threads named in the following format.

[ntLoopGroup-2-2]
[ntLoopGroup-2-3]
[ntLoopGroup-2-3]
[ntLoopGroup-2-2]
[ntLoopGroup-2-4]
[ntLoopGroup-2-4]
[ntLoopGroup-2-4]
[ntLoopGroup-2-4]

These are the Netty event loop group threads, which actually do the work for us. These threads work asynchronously on the reactive stream events.

And this is the living, breathing reactive model in front of you.

  1. The main thread does not do the (actual) work.
  2. Only a few threads (the event loop threads) do the actual work.
  3. They operate in an asynchronous manner.
  4. Program execution happens on events like onSubscribe(), onNext(), onComplete().

Implementation : Phase 2 : Fetching data

So far we have only managed to insert some data into the database in a reactive manner, but that is not enough for a useful application. The next phase is to improve the application by adding a RESTful API to fetch data from the database.

Create BookController
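
A minimal sketch of the controller, assuming the BookRepository above (the handler name is illustrative):

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class BookController {

    private final BookRepository bookRepository;

    public BookController(BookRepository bookRepository) {
        this.bookRepository = bookRepository;
    }

    // WebFlux subscribes to the returned Flux and writes the books out reactively.
    @GetMapping("/books")
    public Flux<Book> getBooks() {
        return bookRepository.findAll();
    }
}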

When you invoke the /books resource to fetch all books, the logs look like the following.

Invoking /books

Here, a new type of thread appears.

[ctor-http-nio-3]

This is the HTTP NIO request thread which handles the HTTP request; note that it does not wait for the data-fetching operation to complete. From then onwards, the work is carried out by the Netty event loop threads as usual.

But if you invoke this using a blocking HTTP client, you will not see any major difference. It just returns a JSON payload, and the behaviour seems the same as if it were implemented using Spring MVC (the blocking stack). Let’s explore how we can improve this to be truly reactive.

Implementation : Phase 3 : Service chaining with network latency

This is the final phase, where we get to see the real power of the reactive programming model. In this phase we are going to invoke a remote service and simulate its network latency by adding an artificial delay.

The remote service is the Shop service, which offers a RESTful endpoint to query the shops where a given book id is available.

Create Shop bean.

Shop bean
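
A minimal sketch of the Shop bean; the fields are illustrative:

import lombok.AllArgsConstructor;
import lombok.Data;

// A simple value object describing a shop that stocks a given book.
@Data
@AllArgsConstructor
public class Shop {

    private String id;
    private String name;
}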

Create ShopService

ShopService

Note that in the ShopServiceImpl.getshopRemote() method I have introduced an artificial random delay for the Shop events. In a real-world application this could be an asynchronous HTTP client which receives Shop JSON elements with real network delay.

Create ShopService Implementation
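
A minimal sketch of the service and its implementation, assuming the Shop bean above; the shop data and the delay logic are illustrative stand-ins for a real remote call:

import java.time.Duration;
import java.util.Random;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public interface ShopService {
    Flux<Shop> getshopRemote(String bookId);
}

@Service
class ShopServiceImpl implements ShopService {

    private final Random random = new Random();

    @Override
    public Flux<Shop> getshopRemote(String bookId) {
        // Stand-in for a remote call: emit a few Shop events, delaying each
        // one by a random amount to simulate network latency.
        return Flux.just(
                        new Shop("1", "Shop One"),
                        new Shop("2", "Shop Two"),
                        new Shop("3", "Shop Three"))
                .concatMap(shop -> Mono.just(shop)
                        .delayElement(Duration.ofMillis(500 + random.nextInt(1500))));
    }
}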

Upgrade the controller with a new endpoint to get a book’s availability in shops.

In this controller upgrade, I have added a new endpoint, /books/{id}/shops, which takes id (the bookId) as a path variable and uses it to query the remote shop service to check the availability of a book.
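
A minimal sketch of the upgraded controller, assuming the ShopService sketch above (the handler names are illustrative):

import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class BookController {

    private final BookRepository bookRepository;
    private final ShopService shopService;

    public BookController(BookRepository bookRepository, ShopService shopService) {
        this.bookRepository = bookRepository;
        this.shopService = shopService;
    }

    @GetMapping("/books")
    public Flux<Book> getBooks() {
        return bookRepository.findAll();
    }

    // Streams each Shop as an individual JSON item as soon as it is emitted.
    @GetMapping(value = "/books/{id}/shops", produces = MediaType.APPLICATION_STREAM_JSON_VALUE)
    public Flux<Shop> getShopsForBook(@PathVariable String id) {
        return shopService.getshopRemote(id);
    }
}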

The most important part here is the produces attribute in the @GetMapping annotation.

produces = MediaType.APPLICATION_STREAM_JSON_VALUE

It tells Spring to treat the resulting events of the Flux as a stream of individual JSON items. Whenever an item is emitted, it gets serialised and the HTTP response buffer is flushed; more importantly, the connection from the server to the client stays open until the event sequence is completed.

The GIF below shows how the endpoint emits events with a random delay.

Reactive JSON Stream

The client of this service should be capable of processing JSON streams.

This endpoint harnesses the power of reactive programming: it uses fewer computational resources while remaining non-blocking for data over the network. If the client has a reactive HTTP client to consume these JSON streams, the results get delivered as they become available.

This kind of feature is achieved in the non-reactive stack by pagination.

Conclusion

The focus of this article series is to serve as a solid foundation for building reactive REST APIs. The first part of the series focused on the fundamental concepts of reactive systems and reactive programming, and cleared the way for the implementation.

The second part focused more on implementing a reactive RESTful API using Spring WebFlux and Reactor.

I hope both serve as a good starting point for people who want to learn and build responsive applications.

I know this is a bit of a lengthy read, but quality applications always go hand in hand with a solid understanding of the underlying technology. I hope this article was worth your time and helps you build good knowledge of reactive RESTful APIs.

