
How Tekton Pipelines provide seamless information flow

November 2, 2023
Ritesh Shah
Related topics: CI/CD, Containers
Related products: Red Hat OpenShift Container Platform

    Allow me to begin with a brief anecdote. A few years ago, while explaining Tekton to one of my colleagues, I came to a realization. The concept of information flow plays a crucial role in the CI/CD process. However, grasping this concept can be challenging for newcomers entering the world of Tekton and continuous integration. Over time, I’ve had numerous conversations with colleagues within and outside my organization, and my belief in the necessity of simplifying the explanation of information flow within a pipeline has been reaffirmed.

    In today’s dynamic and competitive business environment, the need to efficiently streamline and optimize your software delivery processes has become paramount. Tekton, a powerful open source framework, empowers teams to automate their workflows effectively. However, one particular challenge that often arises is understanding how information can seamlessly pass from one task to another within Tekton Pipelines. This article will demystify this concept, highlighting its significant business value.

    The importance of information flow

    In a Tekton pipeline, efficient information flow is essential for orchestrating complex tasks and ensuring a smooth development process.

    When team members can easily transfer data between tasks, it leads to the following:

    1. Increased productivity: Streamlined information flow reduces manual intervention and accelerates pipeline execution, allowing your team to focus on more critical tasks.
    2. Consistency: Ensuring that the right data is available at each step of the pipeline guarantees consistent and reliable results.
    3. Error reduction: Minimizing data-handling errors reduces downtime and potential issues in production.

    Simplifying the process

    To make the concept of information flow in Tekton Pipelines more accessible, let’s break it down into manageable steps. Here’s a straightforward guide to passing information between tasks:

    1. Project and persistent volume claim (PVC) creation:

    • Begin by creating a project and the PVC that provides the shared storage for your tasks.

    2. Task and Pipeline Creation:

    • Define your tasks, specifying inputs and outputs.
    • Construct your pipeline, orchestrating the tasks in the desired order.

    3. Task Runs and Pipeline Execution:

    • Create the run YAML files, indicating how data should flow from one task to another.
    • Execute the pipeline and watch the seamless information transfer in action.
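
    At the command line, the whole flow amounts to applying a handful of YAML manifests in order. Here is a minimal sketch, assuming each manifest shown later in this article is saved under the hypothetical file name in the comments:

    # Create the project and the shared PVC
    oc apply -f project.yaml
    oc apply -f pvc.yaml
    # Create the tasks and the pipeline that connects them
    oc apply -f task1.yaml
    oc apply -f task2.yaml
    oc apply -f pipeline.yaml
    # Start a run; the PipelineRun creates the TaskRuns for you
    oc create -f pipelinerun.yaml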

    Real-world Tekton Pipeline implementation

    To further assist you in grasping this concept, let's look at a test use case that demonstrates a simple way of passing information within a Tekton pipeline. You’ll find detailed YAML files for each step of the process, from setting up the project and volumes to executing the pipeline. This example serves as a foundation that you can extend and adapt to your specific use cases, enhancing your development efficiency.

    This example was tested with Red Hat OpenShift Pipelines 1.7.0 and later, running on Red Hat OpenShift Container Platform 4.10 and later. Before you begin, make sure OpenShift Pipelines (which provides Tekton) is installed; on OpenShift Container Platform, you can install the Red Hat OpenShift Pipelines Operator from OperatorHub.
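
    If you prefer a declarative install over the OperatorHub UI, a minimal Subscription sketch follows; the channel and package names are taken from the redhat-operators catalog and may vary with your cluster version, so verify them first:

    apiVersion: operators.coreos.com/v1alpha1
    kind: Subscription
    metadata:
      name: openshift-pipelines-operator
      namespace: openshift-operators
    spec:
      # Confirm the package and channel with:
      #   oc get packagemanifests openshift-pipelines-operator-rh
      name: openshift-pipelines-operator-rh
      channel: latest
      source: redhat-operators
      sourceNamespace: openshift-marketplace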

    Create a test project

    kind: Project
    apiVersion: project.openshift.io/v1
    metadata:
      name: test
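
    Note that creating a Project object directly usually requires cluster-admin rights; as a regular user, you would typically request the project instead:

    oc new-project test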

    Before you create the tasks and the pipeline, ensure that a PVC exists in the project where you are creating them. You also need to know which storage class to use; check with your storage provider or your company's storage administrator.

    Create a PVC named test:

    kind: PersistentVolumeClaim
    apiVersion: v1
    metadata:
      name: test
      namespace: test
    spec:
      accessModes:
        - ReadWriteOnce
      resources:
        requests:
          storage: 5Gi
      storageClassName: gp2
      volumeMode: Filesystem
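
    The storageClassName above (gp2) is typical of an AWS-backed cluster. To see which storage classes your cluster actually offers, list them first and substitute the appropriate name:

    oc get storageclass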

    In the first task, add a workspace to the spec as shown below. The workspace is named source here, and the second task mounts the same workspace. Capture whatever you need to pass along and redirect it to a file on that workspace, for example ee_data.json, which the second task then reads. (The task also declares a result named data, although in this example the value travels through the workspace file.)

    apiVersion: tekton.dev/v1beta1
    kind: Task
    metadata:
      name: task1
    spec:
      description: >-
        Add execution environment to automation controller
      workspaces:
        - name: source
      results:
        - name: data
          description: ID to be passed to the next task
      steps:
        - name: task1
          image: quay.io/rshah/jq
          workingDir: $(workspaces.source.path)
          resources: {}
          script: |
            #!/usr/bin/env bash
            # Write the value to a file on the shared workspace so task2 can read it
            data="This is the output from task 1"
            printf "%s" "${data}" > ee_data.json
            AC_EE_ID=$(cat ee_data.json)
            printf "%s" "${AC_EE_ID}"

    In the next task, task2, reference ee_data.json as shown below:

    apiVersion: tekton.dev/v1beta1
    kind: Task
    metadata:
      name: task2
    spec:
      workspaces:
        - name: source
      steps:
        - name: task2
          image: quay.io/rshah/jq
          workingDir: $(workspaces.source.path)
          resources: {}
          script: |
            #!/usr/bin/env bash
            # Read the value that task1 left on the shared workspace
            AC_EE_ID=$(cat ee_data.json)
            printf "%s" "${AC_EE_ID}"

    When you run task1 and task2 in a pipeline, both print the same output.
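
    Passing the value through a workspace file works for data of any size. For small strings, Tekton also offers a built-in alternative: task results. Here is a minimal sketch of the same hand-off using a result instead of a file; the task name task1-results is hypothetical:

    apiVersion: tekton.dev/v1beta1
    kind: Task
    metadata:
      name: task1-results
    spec:
      results:
        - name: data
          description: Value to pass to the next task
      steps:
        - name: emit
          image: quay.io/rshah/jq
          script: |
            #!/usr/bin/env bash
            # Writing to $(results.data.path) publishes the value as a Tekton result
            printf "%s" "This is the output from task 1" > $(results.data.path)

    A consuming task then receives the value as a parameter in the pipeline, via $(tasks.task1-results.results.data). Results are intended for small values; larger artifacts still belong on a workspace.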

    Create a pipeline

    Create a YAML file for the pipeline that ties the two tasks together, then apply it in your OpenShift or Kubernetes environment.

    apiVersion: tekton.dev/v1beta1
    kind: Pipeline
    metadata:
      name: value-pass-pipeline
    spec:
      workspaces:
        - name: source
      params:
        - description: Verify the TLS on the registry endpoint (for push/pull to a non-TLS registry)
          name: TLSVERIFY
          type: string
          default: "false"
        - description: Dummy parameter for task1
          name: task1
          type: string
          default: "task1"
        - description: Dummy parameter for task2
          name: task2
          type: string
          default: "task2"
      tasks:
        - name: task1
          taskRef:
            kind: Task
            name: task1
          workspaces:
            - name: source
              workspace: source
        - name: task2
          taskRef:
            kind: Task
            name: task2
          # runAfter ensures task2 starts only after task1 has written its output
          runAfter:
            - task1
          workspaces:
            - name: source
              workspace: source
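
    With the pipeline in place, you can start it from the command line with the tkn CLI, binding the source workspace to the PVC created earlier. A sketch, assuming the tkn client that ships with OpenShift Pipelines:

    tkn pipeline start value-pass-pipeline \
      --workspace name=source,claimName=test \
      --use-param-defaults \
      --showlog \
      -n test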

    When you run the pipeline, both tasks print the same output, which confirms that the information written by task 1 is picked up by task 2:

    STEP-TASK1 This is the output from task 1
    STEP-TASK2 This is the output from task 1

    Task runs and pipeline run

    A pipeline in execution is a PipelineRun; for each task in the pipeline, the PipelineRun creates a corresponding TaskRun. The objects below show what that looks like for this example.

    apiVersion: tekton.dev/v1beta1
    kind: TaskRun
    metadata:
      name: test-0ij91k-task1
      namespace: test
    spec:
      resources: {}
      serviceAccountName: pipeline
      taskRef:
        kind: Task
        name: task1
      timeout: 59m59.989014151s
      workspaces:
        - name: source
          persistentVolumeClaim:
            claimName: test
    ---
    apiVersion: tekton.dev/v1beta1
    kind: TaskRun
    metadata:
      name: test-0ij91k-task2
      namespace: test
    spec:
      resources: {}
      serviceAccountName: pipeline
      taskRef:
        kind: Task
        name: task2
      timeout: 59m59.989014151s
      workspaces:
        - name: source
          persistentVolumeClaim:
            claimName: test
    ---
    apiVersion: tekton.dev/v1beta1
    kind: PipelineRun
    metadata:
      name: test-0ij91k
      namespace: test
    spec:
      pipelineRef:
        name: value-pass-pipeline
      serviceAccountName: pipeline
      timeout: 1h0m0s
      workspaces:
        - name: source
          persistentVolumeClaim:
            claimName: test
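
    You normally don't write the TaskRun objects by hand; creating the PipelineRun is enough, and Tekton generates the TaskRuns from it. You can then follow the run from the CLI, assuming the run name shown above:

    tkn pipelinerun logs test-0ij91k -f -n test
    tkn pipelinerun describe test-0ij91k -n test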

    This simple example explained how you can pass information from one task to another.

    Summary

    Mastering the art of information flow within Tekton Pipelines can significantly benefit your organization. It empowers your team to work more efficiently, reduce errors, and deliver software faster. By following the steps outlined in this article, you’ll be well on your way to harnessing the full potential of Tekton for your business needs.

    Don’t let the complexity of information transfer hold your development pipeline back. Embrace Tekton’s capabilities and unlock a world of possibilities for your software delivery processes.

    Last updated: November 9, 2023
