Eclipse Vert.x Application Configuration (Part 2 of Introduction to Vert.x)

March 22, 2018
Clement Escoffier
Related topics: Java, Microservices


    In my previous post, Introduction to Eclipse Vert.x, we developed a very simple Vert.x application and saw how this application can be tested, packaged, and executed. That was nice, wasn’t it? Well, that was only the beginning. In this post, we are going to enhance our application to support external configuration, and learn how to deal with different configuration sources.

    Just to remind you, we have an application that starts an HTTP server on port 8080 and replies with a polite “Hello” message to all HTTP requests. The code from the previous post is available in the post-1 directory of https://github.com/redhat-developer/introduction-to-eclipse-vertx. The code developed in this post is in the post-2 directory.

    So, Why Do We Need Configuration?

    That’s a good question. The application works right now, but let’s say you want to deploy it on a machine where port 8080 is already taken. You would need to change the port in the application code and in the test, just for that machine. That would be sad. Fortunately, Vert.x applications are configurable.

    There are several ways to configure a Vert.x application:

    1. Using a simple JSON file.
    2. Using Vert.x Config.

    In both cases, the application code manipulates the configuration as a JsonObject.
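
    If you have not used it before, here is a minimal, standalone sketch of the JsonObject API relied on in the rest of this post (the values are only for demonstration):

    // A tiny sketch of the JsonObject API used throughout this post.
    JsonObject conf = new JsonObject().put("HTTP_PORT", 8082);
    // getInteger takes a default value that is returned when the key is absent.
    int port = conf.getInteger("HTTP_PORT", 8080);   // 8082
    int other = conf.getInteger("OTHER_PORT", 8080); // falls back to 8080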

    When using a simple JSON file, the verticle receives the configuration directly. The configuration can be passed on the command line or provided through an API. Let’s have a look.

    No ‘8080’ anymore

    The first step is to modify the io.vertx.intro.first.MyFirstVerticle class so that it no longer hardcodes port 8080, but reads it from the configuration:

    package io.vertx.intro.first;

    import io.vertx.core.AbstractVerticle;
    import io.vertx.core.Future;

    public class MyFirstVerticle extends AbstractVerticle {

      @Override
      public void start(Future<Void> fut) {
        vertx
          .createHttpServer()
          .requestHandler(r ->
            r.response()
              .end("<h1>Hello from my first " +
                "Vert.x application</h1>"))
          .listen(config().getInteger("HTTP_PORT", 8080),
            result -> {
              if (result.succeeded()) {
                fut.complete();
              } else {
                fut.fail(result.cause());
              }
            });
      }
    }

     

    So, the only difference from the previous version is config().getInteger("HTTP_PORT", 8080). Here, our code requests the configuration and checks whether the HTTP_PORT property is set. If not, port 8080 is used as a fallback. The retrieved configuration is a JsonObject.

    Because port 8080 is still used by default, you can package our application and run it as before:

    mvn clean package
    java -jar target/my-first-app-1.0-SNAPSHOT.jar

    Simple, right?

    API-based Configuration - Random Port for the Tests

    Now that the application is configurable, let’s try to provide a configuration. In our test, we are going to configure our application to use port 8081. Previously, we were deploying our verticle with:

    vertx.deployVerticle(MyFirstVerticle.class.getName(), 
      context.asyncAssertSuccess());

    Let’s now pass some deployment options:

    private Vertx vertx;
    // New field storing the port.
    private int port = 8081;
    
    @Before
    public void setUp(TestContext context) {
      vertx = Vertx.vertx();
      // Create deployment options with the chosen port
      DeploymentOptions options = new DeploymentOptions()
          .setConfig(new JsonObject().put("HTTP_PORT", port));
      // Deploy the verticle with the deployment options
      vertx.deployVerticle(MyFirstVerticle.class.getName(), 
        options, context.asyncAssertSuccess());
    }

     

    The DeploymentOptions object lets us customize various parameters. In particular, it lets us inject the JsonObject retrieved by the verticle when using the config() method.
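
    As an aside, DeploymentOptions exposes other parameters beyond the configuration. For example (not needed for this application, just a sketch), setInstances deploys several instances of the same verticle:

    // Illustration only: deploy two instances of the verticle,
    // each receiving the same configuration.
    DeploymentOptions options = new DeploymentOptions()
        .setConfig(new JsonObject().put("HTTP_PORT", port))
        .setInstances(2);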

    Obviously, the test connecting to the server needs to be slightly modified to use the right port (port is a field):

    vertx.createHttpClient().getNow(port, "localhost", 
      "/", response -> {
        response.handler(body -> {
          context.assertTrue(body.toString()
            .contains("Hello"));
          async.complete();
        });
    });

    Well, this does not really fix our issue. What happens when port 8081 is also in use? Let’s pick a random port instead:

    // Pick an available and random port.
    ServerSocket socket = new ServerSocket(0);
    port = socket.getLocalPort();
    socket.close();
    
    DeploymentOptions options = new DeploymentOptions()
                .setConfig(new JsonObject()
                  .put("HTTP_PORT", port));
    vertx.deployVerticle(MyFirstVerticle.class.getName(), 
         options, context.asyncAssertSuccess());

    So, the idea is very simple. We open a server socket that picks a random port (that’s why we pass 0 to the ServerSocket constructor). We retrieve the chosen port and close the socket. Be aware that this method is not perfect and may fail if the picked port is taken by another process between the call to close and the start of our HTTP server. However, it works fine in the vast majority of cases.

    With this in place, our tests now use a random port. Execute them with:

    mvn clean test

    External Configuration - Let’s Run on Another Port

    OK, a random port is not what we want in production. Could you imagine the face of your ops team if you told them that your application picks a random port? It could actually be funny, but we should never mess with the ops team.

    For the actual execution of your application, let’s pass the configuration in an external JSON file.

    Create src/main/conf/my-application-conf.json with the following content:

    {
      "HTTP_PORT" : 8082
    }

    Now, to use this configuration, launch your application with:

    java -jar target/my-first-app-1.0-SNAPSHOT.jar \
      -conf src/main/conf/my-application-conf.json

    Open a browser to http://localhost:8082, and there it is!

    How does that work? Our fat jar uses the Launcher class (provided by Vert.x) to start our application. This class reads the -conf parameter and creates the corresponding deployment options when deploying our verticle.
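
    If you are curious, the following hypothetical Main class sketches roughly what the Launcher does with -conf: it reads the JSON file, wraps it in DeploymentOptions, and deploys the verticle (the class name and argument handling are illustrative, not part of the project):

    import io.vertx.core.DeploymentOptions;
    import io.vertx.core.Vertx;
    import io.vertx.core.json.JsonObject;

    // Hypothetical launcher replacement, for illustration only.
    public class Main {
      public static void main(String[] args) {
        Vertx vertx = Vertx.vertx();
        // args[0] would be the path to the configuration file.
        String content = vertx.fileSystem().readFileBlocking(args[0]).toString();
        DeploymentOptions options = new DeploymentOptions()
            .setConfig(new JsonObject(content));
        vertx.deployVerticle("io.vertx.intro.first.MyFirstVerticle", options);
      }
    }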

    12-Factor Apps and Other Configuration Stores

    While storing the configuration in a JSON file is pretty convenient, it does not always fit the requirements. For instance, the 12-Factor App principles recommend that the application read its configuration from environment variables. What about using Consul or Vault to store secrets? To handle all these cases, Vert.x provides a convenient module named vertx-config. In this section, we change how we retrieve the HTTP port so that it can come from environment variables, system properties, or the provided configuration file.

    First, add the following dependency to your pom.xml file:

    <dependency>
      <groupId>io.vertx</groupId>
      <artifactId>vertx-config</artifactId>
      <version>${vertx.version}</version>
    </dependency>

    In your verticle class, update the content of the start method to become:

    @Override
    public void start(Future<Void> fut) {
      ConfigRetriever retriever = ConfigRetriever.create(vertx);
      retriever.getConfig(
          config -> {
            if (config.failed()) {
                fut.fail(config.cause());
            } else {
                vertx
                    .createHttpServer()
                    .requestHandler(r ->
                        r.response().end("<h1>Hello from my first " +
                            "Vert.x application</h1>"))
                    .listen(config.result().getInteger("HTTP_PORT", 8080),
                        result -> {
                          if (result.succeeded()) {
                              fut.complete();
                          } else {
                              fut.fail(result.cause());
                          }
                        });
            }
          }
      );
    }

     

    The vertx-config module provides the ConfigRetriever. This object is responsible for retrieving the different configuration chunks and computing the final configuration. Since this process is asynchronous, the result is passed to a handler that executes the rest of the startup logic.

    With this in place, the port is now chosen from 3 different locations:

    • The configuration file, as seen previously (using -conf).
    • System properties. For instance, launch the application with -DHTTP_PORT=8081 to use the port 8081.
    • Environment variables. For instance, launch the application with:
    export HTTP_PORT=8081
    java -jar target/my-first-app-1.0-SNAPSHOT.jar

    vertx-config offers many more features and configuration stores. Check out its documentation.
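
    For example, if you want to control exactly which stores are consulted and in which order, you can configure the retriever explicitly. The following is only a sketch (the default retriever created above already covers our three locations); later stores override earlier ones:

    // Illustration only: declare the configuration stores explicitly.
    ConfigStoreOptions file = new ConfigStoreOptions()
        .setType("file")
        .setConfig(new JsonObject()
          .put("path", "src/main/conf/my-application-conf.json"));
    ConfigStoreOptions sys = new ConfigStoreOptions().setType("sys");
    ConfigStoreOptions env = new ConfigStoreOptions().setType("env");

    ConfigRetrieverOptions options = new ConfigRetrieverOptions()
        .addStore(file)
        .addStore(sys)
        .addStore(env);

    ConfigRetriever retriever = ConfigRetriever.create(vertx, options);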

    Conclusion

    Having developed your first Vert.x application in the previous post, we have now seen how to make it configurable, without adding any complexity to the application. In the next post, we are going to see how we can use vertx-web to develop a small application serving static pages and a REST API. A bit fancier, but still very simple.

    If you are eager to see more, check out the Eclipse Vert.x website. If you want to deploy a Vert.x application on OpenShift right now, check out http://launch.openshift.io.

    Happy coding and stay tuned for my next Vert.x article!

    Last updated: April 17, 2018
