
Building reactive systems with Node.js

August 31, 2021
Alexandros Alykiotis
Related topics:
Event-Driven, Kafka, Node.js
Related products:
Red Hat OpenShift


    If you do a web search for computing terms that go with the word reactive, you'll find a wealth of phrases: Reactive streams, reactive systems, reactive messaging, and reactive programming are examples. The word reactive is also associated with other popular concepts like non-blocking I/O, functional programming, and backpressure.

    Although those are all interesting topics, studying reactive systems is a good place to start. This concept was defined by the Reactive Manifesto as an architectural style for distributed systems that are responsive, elastic, resilient, and message-driven. Other constructs such as reactive streams (an asynchronous and non-blocking backpressure protocol) and reactive programming (such as reactive extensions) are implementation details.

    Although the Reactive Manifesto is language-agnostic and framework-agnostic, Node.js is an excellent framework for carrying out its principles. This article provides general background on Node.js in reactive systems, then takes you step-by-step through a reactive service built with Node.js and Apache Kafka.

    Node.js in reactive systems

    The Reactive Manifesto was initially released in 2013 by a group of developers led by Jonas Bonér. In this section, we'll look at the four crucial characteristics the manifesto defines for a reactive system, and how Node.js facilitates them.

    Note: Another compendium white paper, The Reactive Principles (2020), explains in detail the patterns and techniques for building reactive systems.

    Responsiveness

    Reactive systems need to stay responsive even under fluctuating load and when facing failures. Responsiveness is not only about answering calls but doing so in a timely and efficient manner. This last point is essential. Components forming a reactive system must adapt to the available resources and use them carefully. Non-blocking I/O meets this requirement by providing a way to handle multiple concurrent requests with just a few threads. Using non-blocking I/O results in much better resource usage.

    Node.js is based on non-blocking I/O, and Node.js developers already know that they must avoid the trap of "blocking the event loop." Having a background in non-blocking I/O means that Node.js developers are thinking about how to ensure that components respond quickly without blocking for a long period of time. As a result, it's easy for Node.js implementations to be more responsive than those in other languages and frameworks.
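
    To make the contrast concrete, here is a minimal sketch (not taken from the demo, and reading a hypothetical big-report.json file) of the same Fastify route written first in a blocking and then in a non-blocking style:

    const fs = require('fs');
    const Fastify = require('fastify');

    const fastify = Fastify();

    // Blocks the event loop: every other request waits until the file is read.
    fastify.get('/blocking', (_, reply) => {
      const data = fs.readFileSync('./big-report.json', 'utf8');
      reply.send(JSON.parse(data));
    });

    // Non-blocking: the event loop stays free to serve other requests
    // while the file is read in the background.
    fastify.get('/non-blocking', async () => {
      const data = await fs.promises.readFile('./big-report.json', 'utf8');
      return JSON.parse(data);
    });

    fastify.listen(3000);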

    Resilience

    Resilience is not about avoiding failures, because they are inevitable. Resilience is about handling them gracefully. Replication is a crucial approach when handling failures. It avoids service disruption by relying on multiple instances of a service. If an instance crashes, others can handle the requests.

    Nowadays, resilience is achieved by running multiple copies of an application in parallel. Its small size and short startup time make Node.js a great fit for building applications as small, isolated components and deploying them with multiple copies. These qualities limit the scope of failures, allow fast recovery when a failure occurs, and impose low overhead when running multiple copies.

    Elasticity

    Replication is not only a key pattern for handling failures, it is also the cornerstone of elasticity. While non-blocking I/O allows application instances to handle more load than traditional approaches, the ability to scale up and down is essential for adapting the system to the current demand.

    Elasticity is a prerequisite for responsive and resilient systems because they must scale to meet the request load. Node.js is a good fit for elastic systems because it can handle a large number of requests with low overhead. Its small size and fast startup allow the number of instances running the Node.js component to scale up and down efficiently.

    Message-driven

    Node.js uses a non-blocking, event-driven design for everything it does, which makes it a great fit for message-driven systems. That means you don't need extra libraries or abstractions to achieve good performance when using a message-driven approach: You get it for free.
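
    As a tiny illustration (not part of the demo), the core events module is all you need to pass messages between components within a single process; between services, a broker such as Kafka plays the same role, as the demo below shows.

    const { EventEmitter } = require('events');

    const bus = new EventEmitter();

    // A consumer reacts to messages whenever they arrive...
    bus.on('order', (order) => {
      console.log(`Preparing ${order.beverage} for ${order.customer}`);
    });

    // ...while producers emit them without waiting for the consumer to finish.
    bus.emit('order', { customer: 'Ada', beverage: 'espresso' });
    bus.emit('order', { customer: 'Linus', beverage: 'latte' });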

    Reactive systems using Node.js

    We are going to use the reactive koffeeshop demo to demonstrate the creation of a reactive system, the limits of using HTTP to communicate between our components (aka services), and how to build a message-driven reactive system with Kafka. Although this example uses the popular Kafka event streaming platform, any service that implements a modern messaging protocol, such as RabbitMQ, NATS, or ActiveMQ, would work.

    Because we're building a reactive system that includes multiple services, we can choose whatever programming language we prefer. This example sticks with Node.js for all services, but a multi-language example of the koffeeshop demo is also available.

    Services in the koffeeshop demo

    The koffeeshop demo consists of three services:

    • koffeeshop-service: This is the application front end and the service that initially takes customer orders.
    • barista-http: This service uses the HTTP protocol to communicate with every other service. Its purpose is to prepare a beverage for every given order.
    • barista-kafka: This service does exactly the same thing as barista-http, but uses a messaging protocol to communicate.

    Building the demo

    To run the demo, you will need Docker, Docker Compose, Node.js, and Kafka. You can download Kafka or run brew install kafka if you are a macOS user.
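
    The demo also expects two Kafka topics, orders and queue; the create-topics.sh script you'll run below creates them for you. If you ever need to create them by hand against a local Kafka installation, something along these lines should work (kafka-topics.sh ships with Kafka, and localhost:9092 is an assumed default broker address):

    $ kafka-topics.sh --create --topic orders --bootstrap-server localhost:9092
    $ kafka-topics.sh --create --topic queue --bootstrap-server localhost:9092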

    Install the demo's services as follows:

    $ git clone git@github.com:nodeshift-starters/reactive-koffeeshop-demo.git

    $ cd reactive-koffeeshop-demo

    $ cd koffeeshop-service/ && npm install

    $ cd ../barista-http/ && npm install

    $ cd ../barista-kafka/ && npm install
    

    Running the example

    Use the following commands to run the demo:

    # terminal-1 - this will spin up the Kafka cluster
    $ docker-compose up
    
    # terminal-2
    $ ./create-topics.sh
    $ cd koffeeshop-service
    $ npm start
    
    # terminal-3
    $ cd barista-http
    $ npm start
    
    # terminal-4
    $ cd barista-kafka
    $ npm start
    

    Now you should have four terminals open. Each of the three services is running in a separate terminal. If you visit the URL http://localhost:8080, you should be presented with the order screen in Figure 1.

    Figure 1: The initial user interface of the koffeeshop demo, where you choose an order method and a product.

    The koffeeshop-service

    Let's take a quick look at the code for the koffeeshop-service:

    const path = require('path');
    const { EventEmitter } = require('events');
    const Fastify = require('fastify');
    const FastifySSEPlugin = require('fastify-sse');
    const { nanoid } = require('nanoid');
    const { Kafka } = require('kafkajs');
    const axios = require('axios');
    
    const { createFallbackBeverage, inQueue } = require('./models/beverage');
    
    require('dotenv').config();
    
    const fastify = Fastify({ logger: { prettyPrint: true } });
    
    fastify.register(require('fastify-static'), {
      root: path.join(process.cwd(), 'public')
    });
    
    fastify.register(FastifySSEPlugin);
    
    fastify.post('/http', async (request, reply) => {
      // if we get an order through http just forward it to the barista-http-services
      const { name, product } = request.body;
      const order = { orderId: nanoid(), customer: name, beverage: product };
      try {
        const response = await axios.post('http://localhost:8081', order);
        reply.send(response.data);
      } catch (err) {
        reply.send(createFallbackBeverage(order));
      }
    });
    
    const kafka = new Kafka({
      clientId: 'koffeeshop-services',
      brokers: [process.env.KAFKA_BOOTSTRAP_SERVER]
    });
    
    const queue = new EventEmitter();
    
    const producer = kafka.producer(); // orders
    const consumer = kafka.consumer({ groupId: 'koffeeshop' }); // beverages
    
    fastify.get('/queue', (_, reply) => {
      queue.on('update', (data) => {
        reply.sse(data);
      });
    });
    
    fastify.post('/messaging', (request, reply) => {
      const { name, product } = request.body;
      const order = { orderId: nanoid(), customer: name, beverage: product };
      producer.send({
        topic: 'orders',
        messages: [{ value: JSON.stringify({ ...order }) }]
      });
      queue.emit('update', inQueue(order));
      reply.send(order);
    });
    
    const start = async () => {
      // connect the consumer and producer instances to Kafka
      await consumer.connect();
      await producer.connect();
    
      // subscribe to the `queue` topic
      await consumer.subscribe({ topic: 'queue', fromBeginning: true });
    
      // start the fastify server
      fastify.listen(8080, '0.0.0.0', async (err) => {
        if (err) {
          console.error(err);
          process.exit(1);
        }
      });
    
      // start listening for kafka messages
      consumer.run({
        eachMessage: ({ message }) => {
          const beverage = JSON.parse(message.value.toString());
          queue.emit('update', beverage);
        }
      });
    };
    
    start();

    This service uses the Fastify framework to build a simple server and the kafkajs library to communicate with the Kafka cluster. The server is responsible for:

    1. Serving our web application to the client's browser. The application is written with HTML and jQuery.
    2. Receiving orders at the /http endpoint and forwarding them to the barista-http service.
    3. Receiving orders at the /messaging endpoint and sending them to Kafka for later consumption by the barista-kafka service (see the curl examples after this list).
    4. Listening for finished orders and notifying the client about them (using server-sent events).
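
    To exercise the two order endpoints from steps 2 and 3 without the browser UI, you could post an order directly with curl. This is just a sketch: the field names come from the handler code above, but the values are examples and the exact responses depend on which services are running.

    $ curl -X POST http://localhost:8080/http \
        -H 'Content-Type: application/json' \
        -d '{"name": "Ada", "product": "espresso"}'

    $ curl -X POST http://localhost:8080/messaging \
        -H 'Content-Type: application/json' \
        -d '{"name": "Ada", "product": "espresso"}'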

    The barista-kafka service

    Now let's look at the barista-kafka service:

    const pino = require('pino');
    const { Kafka } = require('kafkajs');
    const Beverage = require('./models/beverage');
    
    require('dotenv').config();
    
    const logger = pino({
      prettyPrint: true
    });
    
    const kafka = new Kafka({
      clientId: 'barista-kafka-node',
      brokers: [process.env.KAFKA_BOOTSTRAP_SERVER]
    });
    
    const producer = kafka.producer();
    const consumer = kafka.consumer({ groupId: 'baristas' });
    
    const run = async () => {
      // connect the consumer and producer instances to Kafka
      await consumer.connect();
      await producer.connect();
    
      // subscribe consumer to the `orders` topic
      await consumer.subscribe({ topic: 'orders', fromBeginning: true });
    
      // start listening for messages
      await consumer.run({
        eachMessage: async ({ message }) => {
          // get the order from kafka and prepare the beverage
          const order = JSON.parse(message.value.toString());
          const beverage = await Beverage.prepare(order);
          // debug statement
          logger.info(`Order ${order.orderId} for ${order.customer} is ready`);
          // create a kafka-message from a JS object and send it to kafka
          producer.send({
            topic: 'queue',
            messages: [{ value: JSON.stringify({ ...beverage }) }]
          });
        }
      });
    };
    
    run().catch((err) => logger.error(err));
    
    process.once('SIGINT', consumer.disconnect);
    process.once('SIGINT', producer.disconnect);
    

    Using the kafkajs library, we create a consumer instance that will be used to receive orders registered to Kafka by the koffeeshop-service. We also create a producer instance to send back notifications to Kafka when a beverage has been prepared.
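
    The beverage model itself isn't shown here. As a purely hypothetical sketch of what models/beverage.js might contain (the delay, field names, and status value are assumptions, not the demo's actual code), Beverage.prepare could look something like this:

    // Hypothetical sketch of models/beverage.js -- not the demo's actual code.
    const { nanoid } = require('nanoid');

    // Simulate the time a barista needs to prepare a drink.
    function prepare(order) {
      return new Promise((resolve) => {
        setTimeout(() => {
          resolve({
            ...order,
            beverageId: nanoid(),
            preparedBy: 'barista-kafka-node',
            status: 'READY'
          });
        }, 3000);
      });
    }

    module.exports = { prepare };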

    Using HTTP in a Node.js reactive system

    The first time we try to place an order, we use the HTTP order method. When we select the HTTP option, as shown in Figure 2, the koffeeshop-service notifies the barista-http service (which is responsible for preparing our beverage) about the new order using plain HTTP.

    Figure 2: Display after placing a single order using HTTP; this order method allows only one order at a time.

    As the figure shows, the HTTP method allows us to place only one order at a time. If we were in a real coffee shop, that would mean a lot of waiting while the barista prepared each beverage. Using the HTTP method also means that, if for some reason the barista-service became unavailable, our order would be completely lost.

    Using Kafka messaging in a Node.js reactive system

    This time we choose the Messaging/Kafka order method, as shown in Figure 3. The koffeeshop-service sends our order to Kafka (for later consumption by a barista) rather than sending it directly to the barista-service.

    Figure 3: Display after placing multiple orders using Messaging/Kafka; orders queue up and can be handled later.

    The difference between the two options should now be clear. With the Messaging/Kafka option, we decoupled our system and made every service independent: we can place multiple orders, and the barista-service will process them one by one whenever it can.

    What will happen, though, if the barista-service ever goes down?

    Because the orders are sent to Kafka and not to the barista service directly, no orders are lost. Kafka keeps them queued until the barista comes back up and starts pulling orders again. This is a great example of the resilience of reactive systems.

    In case the coffee shop gets so many people wanting coffee that just one barista isn't enough, we can simply spawn another barista service in a new terminal. Scaling the baristas so easily illustrates the principle of elasticity.
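
    Because every barista-kafka instance uses the same consumer group ID (baristas in the code above), Kafka automatically rebalances the partitions of the orders topic across the instances (assuming the topic has more than one partition); no configuration changes are needed. Starting a second barista is just:

    # terminal-5 - a second barista joins the 'baristas' consumer group
    $ cd barista-kafka
    $ npm start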

    Conclusion

    Reactive systems offer great advantages over traditional monolithic systems in an enterprise application. If you build systems following reactive principles, you will gain a great deal of flexibility. Node.js is also a great framework for reactive systems because its architecture is closely aligned with the reactive principles documented in the Reactive Manifesto and The Reactive Principles. If you want to go deeper into reactive systems, check out the video presentation Reactive with Node.js by Clement Escoffier and me.

    Related Posts

    • Part 1: An introduction to reactive programming and Vert.x

    • Part 2: Building a reactive system

    • Part 3: A reactive system in action

    • Building resilient event-driven architectures with Apache Kafka
