Simplify secure connections to PostgreSQL databases with Node.js

March 28, 2022
Michael Dawson
Related topics: Containers, Databases, Kubernetes, Node.js, Secure Coding
Related products: Red Hat OpenShift, Red Hat OpenShift Container Platform


    PostgreSQL is an advanced open source relational database that is commonly used by applications to store structured data. Before accessing a database, the application must connect and provide security credentials. As a Node.js developer, how can you safely share and provide those credentials in JavaScript code without a lot of work? This article introduces service bindings and the kube-service-bindings package, along with a convenient graphical interface in Red Hat OpenShift.

    When using a database, the four basic operations are create, read, update, and delete (CRUD, for short). Our team maintains an example CRUD application on GitHub that shows how to connect to a PostgreSQL database and execute the four basic operations. We use that example to illustrate the security model in this article.

    Security risks when connecting to the PostgreSQL database

    The information you need to connect to a PostgreSQL database is:

    • User
    • Password
    • Host
    • Database
    • Port

    You definitely need to be careful about who has access to the user and password, and ideally, you don't want any of these values to be public. This section looks at some simple methods that fail to protect this sensitive information adequately.

    Setting environment variables explicitly

    Using environment variables is the easiest way to configure a connection and is often used in examples like the following JavaScript code:

    const { Pool } = require('pg');

    const serviceHost = process.env.MY_DATABASE_SERVICE_HOST;
    const user = process.env.DB_USERNAME;
    const password = process.env.DB_PASSWORD;
    const databaseName = process.env.POSTGRESQL_DATABASE;
    const connectionString =
      `postgresql://${user}:${password}@${serviceHost}:5432/${databaseName}`;
    const connectionOptions = { connectionString };

    const pool = new Pool(connectionOptions);

    Unfortunately, using environment variables is not necessarily secure. If you set the environment variables from the command line, anybody with access to the environment can see them. Tools and frameworks also often make it easy to access environment variables for debugging purposes. For example, in OpenShift, you can view the environment variables from the console, as shown in Figure 1. So you need to find a way to provide connection credentials while keeping them hidden from interlopers.

    Figure 1: Pod details in the OpenShift console reveal the environment variables set in the pod.

    Loading environment variables from dotenv

    Instead of setting the credentials in the environment directly, a safer way is to use a package such as dotenv to get the credentials from a file and provide them to the Node.js application environment. The benefit of using dotenv is that the credentials don't show up in the environment outside of the Node.js process.
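
    As a rough sketch of this approach (the .env file contents and variable names here are assumptions that mirror the earlier example, not part of the CRUD sample), the application loads the file at startup and then reads the values from process.env as before:

    // Contents of a .env file kept out of version control (e.g., via .gitignore):
    //   DB_USERNAME=myuser
    //   DB_PASSWORD=mypassword
    //   MY_DATABASE_SERVICE_HOST=localhost
    //   POSTGRESQL_DATABASE=mydatabase

    require('dotenv').config(); // loads .env into process.env for this process only

    const { Pool } = require('pg');

    const pool = new Pool({
      connectionString:
        `postgresql://${process.env.DB_USERNAME}:${process.env.DB_PASSWORD}` +
        `@${process.env.MY_DATABASE_SERVICE_HOST}:5432/${process.env.POSTGRESQL_DATABASE}`,
    });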

    Although this approach is better, the credentials still might be exposed if you dump the Node.js environment for debugging through a Node.js diagnostic report. You are also left with the question of how to get the dotenv file securely to the application. If you are deploying to Kubernetes, you can map a file into deployed containers, but that will take some planning and coordination for deployments.

    By this point, you are probably thinking that this seems like a lot of work and are wondering whether you need to configure the connection information for each type of service and set of credentials that are needed by an application. The good news is that for Kubernetes environments, this problem has already been solved. We cover the solution, service binding, in the next section.

    Passing the credentials securely: Service binding in Kubernetes

    Service binding is a standard approach to map a set of files into containers to provide credentials in a safe and scalable way. You can read more about the Service Binding specification for Kubernetes on GitHub.

    The specification does not define what files are mapped in for a given service type. In OpenShift, binding to a PostgreSQL database instance (created using either the Crunchy or the Cloud Native PostgreSQL Operators, as described in an overview of the Service Binding Operator) results in mapping the following files into the application container:

    $SERVICE_BINDING_ROOT/<postgresql-instance-name>
    ├── user
    ├── host
    ├── database
    ├── password
    ├── port
    ├── ca.crt
    ├── tls.key
    └── tls.crt

    SERVICE_BINDING_ROOT is passed to the application through the environment.

    The last three files contain the keys and certificates needed to connect over the widely used Transport Layer Security (TLS) standard and are present only if the database is configured to use TLS.

    Consuming service bindings easily with kube-service-bindings

    Now that you have the credentials available to the application running in the container, the remaining work is to read the credentials from those files and provide them to the PostgreSQL client used within your Node.js application. But wait—that still sounds like a lot of work, and it's also tied to the client you are using.
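
    To see why, here is a rough sketch of what that manual work could look like. The helper function, the instance directory name, and the mapping to pg option names are assumptions for illustration; they are not code from kube-service-bindings:

    const fs = require('fs');
    const path = require('path');

    // Hypothetical manual approach: read each binding file under
    // $SERVICE_BINDING_ROOT/<instance-name> and build pg connection options.
    function readPostgresBinding(instanceName) {
      const dir = path.join(process.env.SERVICE_BINDING_ROOT, instanceName);
      const read = (name) => fs.readFileSync(path.join(dir, name), 'utf8').trim();

      return {
        user: read('user'),
        password: read('password'),
        host: read('host'),
        port: Number(read('port')),
        database: read('database'),
        // The TLS files are present only when the database is configured for TLS,
        // so reading them would also need an existence check.
      };
    }

    // Example with a made-up instance name:
    // const pool = new Pool(readPostgresBinding('my-postgresql'));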

    To make this easier, we've put together an npm package called kube-service-bindings, which makes it easy for Node.js applications to consume these secrets without requiring developers to be familiar with service bindings.

    The package provides the getBinding() method, which does roughly the following:

    1. Look for the SERVICE_BINDING_ROOT variable in order to determine whether bindings are available.
    2. Read the connection information from the files.
    3. Map the names of the files to the option names needed by the Node.js clients that will connect to the service.

    Figure 2 shows the steps.

    Figure 2: The getBinding() method involves three main steps.

    Let's assume you connect to PostgreSQL using the popular pg client, a library that provides all the basic commands to interact with the database. In this case, you call the getBinding() method with POSTGRESQL and pg to tell kube-service-bindings which client the application is using, and then pass the object returned by getBinding() when you create a Pool object. Minus error checking, the code is as simple as this:

    const serviceBindings = require('kube-service-bindings');
    const { Pool } = require('pg');

    let connectionOptions;
    try {
      connectionOptions = serviceBindings.getBinding('POSTGRESQL', 'pg');
    } catch (err) {
      // No service bindings available; handle the error or fall back here.
    }

    const pool = new Pool(connectionOptions);

    The first parameter to getBinding() is POSTGRESQL, to specify that you are connecting to a PostgreSQL database. The second parameter, pg, tells kube-service-bindings that you are using the pg client, so the call returns the information as an object that can be passed when creating a pg Pool object.

    The CRUD example, and more specifically the lib/db/index.js file, has been updated so that it can get the credentials from the environment, or automatically using kube-service-bindings when credentials are available through service bindings.
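
    A rough sketch of that kind of fallback logic (not a copy of lib/db/index.js) might look like the following, with the environment variable names assumed from the earlier example:

    const serviceBindings = require('kube-service-bindings');
    const { Pool } = require('pg');

    let connectionOptions;
    try {
      // Prefer credentials provided through service bindings...
      connectionOptions = serviceBindings.getBinding('POSTGRESQL', 'pg');
    } catch (err) {
      // ...and fall back to the environment variables shown earlier.
      const user = process.env.DB_USERNAME;
      const password = process.env.DB_PASSWORD;
      const serviceHost = process.env.MY_DATABASE_SERVICE_HOST;
      const databaseName = process.env.POSTGRESQL_DATABASE;
      connectionOptions = {
        connectionString:
          `postgresql://${user}:${password}@${serviceHost}:5432/${databaseName}`,
      };
    }

    const pool = new Pool(connectionOptions);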

    With kube-service-bindings, it's easy for Node.js developers to use credentials made available through service bindings. The second part is to set up the service bindings themselves: install the Service Binding Operator as described in the overview article mentioned earlier, install an Operator that helps you create databases, create the database for your application, and finally apply some YAML that tells the Service Binding Operator to bind the database to your application.

    Setting up service bindings in OpenShift

    With the release of OpenShift 4.8, you can use the OpenShift user interface (UI) to do the service binding. Thus, administrators and operators of a cluster can easily set up the PostgreSQL database instance for an organization. Developers can then connect their applications without needing to know the credentials. You can use the UI for convenience during initial development, and then YAML for more automated or production deployments.

    The UI steps are quite simple:

    1. Create a database using one of the PostgreSQL Operators.

    2. Deploy your application to the same namespace using kube-service-bindings. Figure 3 shows the topology view of the namespace.

      Figure 3: The namespace contains the PostgreSQL database and Node.js application.
    3. Drag a link from the application to the database until you see the "Create a binding connector" box pop up (Figure 4).

      Figure 4: Create a binding from the Node.js application to the PostgreSQL database.
    4. Finally, release the mouse button. The binding is created (Figure 5) and the credentials are automatically mapped into your application pods. If you've configured your application to retry the connection until service bindings are available (see the sketch after this list), it should then get the credentials and connect to the database.

      Figure 5: The binding is established.
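
    Such retry logic is not part of the code shown above; a minimal sketch, with an assumed helper name and retry interval, could look like this:

    const serviceBindings = require('kube-service-bindings');
    const { Pool } = require('pg');

    // Hypothetical helper: keep trying until the binding files are mapped into the pod.
    async function connectWithRetry(retryDelayMs = 5000) {
      for (;;) {
        try {
          const connectionOptions = serviceBindings.getBinding('POSTGRESQL', 'pg');
          const pool = new Pool(connectionOptions);
          await pool.query('SELECT 1'); // verify that the connection actually works
          return pool;
        } catch (err) {
          console.log('Database binding not ready yet, retrying...');
          await new Promise((resolve) => setTimeout(resolve, retryDelayMs));
        }
      }
    }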

    Further resources

    This article introduced you to the credentials needed to connect to a PostgreSQL database and how they can be safely provided to your Node.js applications. To learn more, try the following:

    1. Install and experiment with the CRUD example to explore the code and kube-service-bindings. (If you are really adventurous, you can create your own files and set SERVICE_BINDING_ROOT to point to them.)
    2. Work through how to set up service bindings for a PostgreSQL database using the instructions in the Service Binding Operator overview.
    3. Connect the CRUD example to the PostgreSQL database you created using the UI.

    We hope you found this article informative. To stay up to date with what else Red Hat is up to on the Node.js front, check out our Node.js topic page.

    Last updated: November 9, 2023
