How to enable OpenTelemetry traces in React applications

March 22, 2023
Purva Naik
Related products:
Red Hat OpenShift

    The main focus of this article is to demonstrate how to instrument React applications to make them observable. For a good overview of observability and OpenTelemetry, please take a look at the article, Observability in 2022: Why it matters and how OpenTelemetry can help.

    10-step OpenTelemetry demonstration

    For OpenTelemetry, we are using the following:

    • Auto-instrumentation via sdk-trace-web and a plugin that provides auto-instrumentation for fetch.
    • OpenTelemetry Collector (also known as OTELCOL).
    • Jaeger
    • Basic collector deployment pattern. For more information about OTELCOL deployment patterns, please take a look at OpenTelemetry Collector Deployment Patterns.

    Step 1. Set up prerequisites

    In this demo, we are going to use Docker and docker-compose; refer to the Docker and docker-compose documentation to learn more.

    Step 2. Run the React application example

    You will use a front-end React application that contains the sample code we will instrument. Please note that the repository also contains an Express application as a back end, but the focus of this tutorial is to instrument the front end only.

    The front-end application contains a button that calls the back end using Express, and a scroll component that calls the free public API at https://randomuser.me/. We will delegate the job of capturing traces for the button and the scroll component to the OpenTelemetry libraries, so every time the user clicks the button or scrolls the page, the auto-instrumentation plugin will generate traces.
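    The exact component code lives in the repository, but conceptually the instrumented calls look like the following sketch (the function names and the back-end URL here are illustrative assumptions, not taken from the repo):

    ```javascript
    // Hypothetical sketch of the two kinds of calls the sample app makes.
    // FetchInstrumentation wraps window.fetch, so both calls below would
    // automatically produce spans once tracing.js has been loaded.

    async function callBackend() {
      // The button triggers a call to the Express back end.
      const res = await fetch('http://localhost:5000/');
      return res.text();
    }

    async function loadMoreUsers() {
      // The infinite-scroll component fetches random users from the
      // free public API every time the user scrolls the page.
      const res = await fetch('https://randomuser.me/api/');
      const data = await res.json();
      return data.results;
    }
    ```

    Because the instrumentation patches fetch itself, no tracing code appears in the components; that separation is the main appeal of auto-instrumentation.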

    Clone the following GitHub repository from the command line:

    git clone https://github.com/obs-nebula/frontend-react.git

    Step 3. Instrument the React application

    The following list shows the dependencies we added. You may want to use newer versions, depending on when you are reading this article:

    "@opentelemetry/exporter-trace-otlp-http": "^0.35.0",
    
    "@opentelemetry/instrumentation": "^0.35.0",
    
    "@opentelemetry/instrumentation-fetch": "^0.35.0",
    
    "@opentelemetry/resources": "^1.9.1",
    
    "@opentelemetry/sdk-trace-web": "^1.8.0",
    
    "@opentelemetry/semantic-conventions": "^1.9.1"

    Create a file named tracing.js that will load OpenTelemetry. We are going to share more details in the following subsections. The content of the front-end/src/tracing.js file is as follows:

    const { Resource } = require('@opentelemetry/resources');
    const { SemanticResourceAttributes } = require('@opentelemetry/semantic-conventions');
    const { WebTracerProvider, SimpleSpanProcessor, ConsoleSpanExporter } = require('@opentelemetry/sdk-trace-web');
    const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-http');
    const { registerInstrumentations } = require('@opentelemetry/instrumentation');
    const { FetchInstrumentation } = require('@opentelemetry/instrumentation-fetch');

    // Export spans both to the browser console and to the OTLP collector.
    const consoleExporter = new ConsoleSpanExporter();
    const collectorExporter = new OTLPTraceExporter({
      headers: {}
    });

    const provider = new WebTracerProvider({
      resource: new Resource({
        [SemanticResourceAttributes.SERVICE_NAME]: process.env.REACT_APP_NAME
      })
    });

    const fetchInstrumentation = new FetchInstrumentation({});
    fetchInstrumentation.setTracerProvider(provider);

    provider.addSpanProcessor(new SimpleSpanProcessor(consoleExporter));
    provider.addSpanProcessor(new SimpleSpanProcessor(collectorExporter));
    provider.register();

    registerInstrumentations({
      instrumentations: [
        fetchInstrumentation
      ],
      tracerProvider: provider
    });

    // The component only renders its children; the tracing setup above
    // runs once as a side effect of importing this module.
    export default function TraceProvider({ children }) {
      return (
        <>
          {children}
        </>
      );
    }

    Step 4. Import the required modules

    Next, you will need to import the OpenTelemetry modules. Note that ConsoleSpanExporter and SimpleSpanProcessor come from @opentelemetry/sdk-trace-web, so we don't need an extra dependency for them.

    const { Resource } = require('@opentelemetry/resources');
    const { SemanticResourceAttributes } = require('@opentelemetry/semantic-conventions');
    const { WebTracerProvider, SimpleSpanProcessor, ConsoleSpanExporter } = require('@opentelemetry/sdk-trace-web');
    const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-http');
    const { registerInstrumentations } = require('@opentelemetry/instrumentation');
    const { FetchInstrumentation } = require('@opentelemetry/instrumentation-fetch');

    Step 5. Initialize the tracer

    To initialize the OpenTelemetry tracer, wrap your root React component with the TraceProvider component you just created. You can do this by adding the following code to your main application file; in our case, that is index.js.

    import ReactDOM from 'react-dom/client';
    import App from './App';
    import TraceProvider from './tracing';

    const root = ReactDOM.createRoot(document.getElementById('root'));
    root.render(
      <TraceProvider>
        <App />
      </TraceProvider>
    );
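    Note that simply importing tracing.js is what activates tracing: provider.register() runs as a module side effect on the first import, and later imports reuse the cached module. A minimal sketch of that pattern (names here are hypothetical stand-ins, not from the repo):

    ```javascript
    // Minimal sketch of the side-effect-on-import pattern used by tracing.js.
    // A module body runs exactly once, on the first require/import; any
    // later import reuses the cached module, so registration never repeats.
    let registrations = 0;

    function register() {
      // Stands in for provider.register() in the real tracing.js.
      registrations += 1;
    }

    register(); // runs at import time, before any component renders

    function getRegistrations() {
      return registrations;
    }
    ```

    This is why TraceProvider can be an empty wrapper around its children: the component renders nothing special, and the setup has already happened by the time React mounts the tree.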

    Step 6. Create an OTELCOL exporter instance

    To export the traces to OTELCOL, you will need to create an instance of OTLPTraceExporter.

    Note that we are adding a workaround to use XHR instead of sendBeacon, as described in this OpenTelemetry JS upstream issue. With that, we can fix the CORS problem when exporting.

    const collectorExporter = new OTLPTraceExporter({
      headers: {}
    });
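    The browser exporter prefers navigator.sendBeacon, which cannot set request headers; supplying a headers object (even an empty one) therefore switches it to XMLHttpRequest. The following is an illustrative sketch of that decision, not the library's actual source:

    ```javascript
    // Illustrative sketch (not the library's actual code) of why passing
    // headers: {} forces the XHR transport: any headers object disables
    // sendBeacon, because the Beacon API cannot attach request headers.
    function usesXhrTransport(config) {
      const sendBeaconAvailable =
        typeof navigator !== 'undefined' &&
        typeof navigator.sendBeacon === 'function';
      return Boolean(config.headers) || !sendBeaconAvailable;
    }
    ```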

    Step 7. Create the otel-collector-config file

    To configure OTELCOL, create a new file called otel-collector-config.yaml in your root directory. In this file, we configure the receiver, the processor, and the exporters (using Jaeger and logging as exporters).

    receivers:
      otlp:
        protocols:
          http:
            cors:
              allowed_origins: ["*"]
              allowed_headers: ["*"]

    exporters:
      logging:
        verbosity: Detailed
      jaeger:
        endpoint: jaeger-all-in-one:14250
        tls:
          insecure: true

    processors:
      batch:

    service:
      telemetry:
        logs:
          level: "debug"
      pipelines:
        traces:
          receivers: [otlp]
          exporters: [logging, jaeger]
          processors: [batch]

    Step 8. Create a docker compose file

    Create a docker-compose file and define the services for OTELCOL, Jaeger, and the application as follows:

    version: "2"
    
    services:
    
      front-end:
    
        build:
    
          context:./front-end
    
        depends_on:
    
          - express-server
    
        ports:
    
          - "3000:3000"
    
        env_file:
    
         -./front-end/src/.env
    
      express-server:
    
        build:
    
          context:./express-server
    
        ports:
    
          - "5000:5000"
    
      collector:
    
        image: otel/opentelemetry-collector:latest
    
        command: ["--config=/otel-collector-config.yaml"]
    
        volumes:
    
          - './otel-collector-config.yaml:/otel-collector-config.yaml'
    
        ports:
    
          - "4318:4318"
    
        depends_on:
    
          - jaeger-all-in-one
    
       # Jaeger
    
      jaeger-all-in-one:
    
        hostname: jaeger-all-in-one
    
        image: jaegertracing/all-in-one:latest
    
        ports:
    
          - "16685"
    
          - "16686:16686"
    
          - "14268:14268"
    
          - "14250:14250"

    Step 9. Start the services

    Once you have created the OTELCOL config and Docker Compose files, you can start the services by running the following command in your terminal:

    $ docker-compose up

    Once the services have started, you can access the React application at http://localhost:3000.

    As mentioned previously, every time the user scrolls the page, the application fetches random data through an API call, and OpenTelemetry generates traces for the scrolling activity.

    Step 10. View the traces in Jaeger

    You can view the traces in the Jaeger UI by navigating to http://localhost:16686. The horizontal circles in the chart represent button clicks, and the vertical circles represent scrolling activity, as shown in Figure 1.

    Figure 1: Application tracing in Jaeger.

    Let's click on one of the horizontal items and expand the trace detail as shown in Figure 2:

    Figure 2: The expanded trace detail shows that the Express back end was called, along with the name of the OpenTelemetry library responsible for generating the trace.

    Now let's click on one of the vertical circles, as shown in Figure 3:

    Figure 3: The trace information shows that the scrolling activity calls an external API.

    Collecting and analyzing telemetry data

    You have successfully enabled OpenTelemetry in your React application using the OpenTelemetry Collector and Jaeger, and you can now start collecting and analyzing telemetry data. Use the Jaeger UI to view traces, identify performance bottlenecks, and gain a deeper understanding of what the React application is doing when it calls external systems.

    Further reading

    Want to learn more about observability and OpenTelemetry? Check out these articles:

    • Observability in 2022: Why it matters and how OpenTelemetry can help
    • Distributed tracing with OpenTelemetry, Knative, and Quarkus
    • A guide to the open source distributed tracing landscape
    Last updated: August 14, 2023
