In the previous article of this series, Deploy your API from a Jenkins Pipeline, we discovered how the 3scale toolbox can help you deploy your API from a Jenkins Pipeline on Red Hat OpenShift/Kubernetes. In this article, we will improve that pipeline to make it more robust and less verbose, and to offer more features, by using the 3scale toolbox Jenkins Shared Library.
What needs to be improved
Although it was not perfect, the pipeline we designed in the previous article was simple and self-contained. To support production workloads, however, some minor aspects need to be improved:
- The "runToolbox" helper method was duplicated in each pipeline.
- There are several delays used in the "runToolbox" method and tuning those delays proved to be tricky.
- Everything is hardcoded: if we need to change some API metadata, the Pipeline code needs to be updated. A "manifest" would separate the code from the configuration.
- The test credentials are hardcoded. To match most companies' security policies, we would need to generate those test credentials dynamically.
- The "runToolbox" helper method has not been designed to handle multiple parallel runs of the 3scale toolbox.
There is nothing hard or complicated about those improvements; that's the daily bread of every Jenkins Pipeline writer!
Introducing the Toolbox Jenkins Shared Library
To help the Jenkins Pipeline writer, the community around Red Hat Integration came up with a Jenkins Shared Library named 3scale-toolbox-jenkins. It features the following improvements:
- All the code has been factored out into a Jenkins Shared Library and can be reused in all your pipelines.
- A polling loop is used everywhere the pipeline has to wait for an action to complete. No more delays to tune.
- The Toolbox Jenkins Shared Library is fed with an API metadata manifest. You can change the API metadata without having to change your pipeline code.
- The test credentials are generated dynamically using a hash-based message authentication code (HMAC) function. HMAC is used instead of random data in order to remain idempotent: no matter how many times the Jenkins pipeline runs, the test credentials stay the same yet remain unguessable (see the sketch after this list).
- Multiple parallel runs of the toolbox are possible because all the Kubernetes objects are prefixed and labeled with the Jenkins build name and number.
- It uses the Jenkins OpenShift Client plugin underneath, which makes it more reliable than the bare oc command.
- Semantic versioning is implemented to simplify the management of multiple API versions.
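To make the HMAC approach concrete, here is a minimal Groovy sketch of how stable yet unguessable credentials can be derived. The hmacApiKey helper, the HMAC_SECRET environment variable, and the message format are illustrative assumptions, not the library's actual code:

// Illustrative sketch only: derive a deterministic, unguessable key from an HMAC.
// The helper name, secret source, and message format are assumptions, not the library's code.
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec

String hmacApiKey(String secret, String message) {
    Mac mac = Mac.getInstance("HmacSHA256")
    mac.init(new SecretKeySpec(secret.getBytes("UTF-8"), "HmacSHA256"))
    return mac.doFinal(message.getBytes("UTF-8")).encodeHex().toString()
}

// The same inputs always produce the same key, so repeated pipeline runs stay idempotent.
def testUserKey = hmacApiKey(env.HMAC_SECRET, "my_service:my-test-app")

Because the key is a pure function of its inputs, re-running the pipeline does not invalidate credentials already provisioned in 3scale.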
First steps with the toolbox Jenkins Shared Library
At the beginning of your Jenkins Pipeline, import the toolbox Jenkins Shared Library:
library identifier: '3scale-toolbox-jenkins@master', retriever: modernSCM([$class: 'GitSCMSource', remote: 'https://github.com/rh-integration/3scale-toolbox-jenkins.git'])
Declare a global variable that will hold the ThreescaleService object, so you can use it from the different stages of your pipeline:
def service = null
From an early stage of your Jenkins Pipeline, you can create the ThreescaleService object from your API metadata manifest:
service = toolbox.prepareThreescaleService(
    openapi: [ filename: "swagger.json" ],
    environment: [ baseSystemName: "my_service" ],
    toolbox: [ openshiftProject: "toolbox", destination: "3scale-tenant", secretName: "3scale-toolbox" ],
    service: [:],
    applications: [
        [ name: "my-test-app", description: "This is used for tests", plan: "test", account: "john" ]
    ],
    applicationPlans: [
        [ systemName: "test", name: "Test", defaultPlan: true, published: true ],
        [ systemName: "silver", name: "Silver" ],
        [ artefactFile: "https://raw.githubusercontent.com/redhatHameed/API-Lifecycle-Mockup/master/testcase-01/plan.yaml" ]
    ]
)
In this example, the API metadata manifest has been inlined in the pipeline, but you can store it in a YAML file in your Git repository and load it using the readYaml step from the Pipeline Utility Steps plugin. This way, your API metadata can change while your pipeline code remains the same.
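For instance, here is a minimal sketch of that approach, assuming a hypothetical api-manifest.yaml file at the root of your repository and that prepareThreescaleService accepts the same structure passed as a plain map:

// Load the API metadata manifest from the Git repository (readYaml is provided
// by the Pipeline Utility Steps plugin) and hand it to the Shared Library.
def manifest = readYaml(file: "api-manifest.yaml")
service = toolbox.prepareThreescaleService(manifest)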
You can then create all the API management objects anywhere relevant in your pipeline:
service.importOpenAPI()
echo "Service with system_name ${service.environment.targetSystemName} created !"
service.applyApplicationPlans()
service.applyApplication()
Running end-to-end tests is easy, too. Notice how the test credentials are managed automatically:
def proxy = service.readProxy("sandbox")
sh """
curl -vfk ${proxy.sandbox_endpoint}/api/beer -H 'api-key: ${service.applications[0].userkey}'
curl -vfk ${proxy.sandbox_endpoint}/api/beer/Weissbier -H 'api-key: ${service.applications[0].userkey}'
curl -vfk ${proxy.sandbox_endpoint}/api/beer/findByStatus/available -H 'api-key: ${service.applications[0].userkey}'
"""
Finally, you can promote the new configuration to the production gateway:
service.promoteToProduction()
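To see how those calls fit together, here is a minimal scripted-pipeline sketch assembled only from the snippets above; the stage names, the empty node() selector, and the inlined manifest are illustrative choices, not requirements of the library:

// A minimal scripted-pipeline sketch combining the previous snippets.
library identifier: '3scale-toolbox-jenkins@master',
        retriever: modernSCM([$class: 'GitSCMSource',
                              remote: 'https://github.com/rh-integration/3scale-toolbox-jenkins.git'])

def service = null

node() {
  stage("Prepare") {
    // Build the ThreescaleService object from the API metadata manifest
    service = toolbox.prepareThreescaleService(
      openapi: [ filename: "swagger.json" ],
      environment: [ baseSystemName: "my_service" ],
      toolbox: [ openshiftProject: "toolbox", destination: "3scale-tenant", secretName: "3scale-toolbox" ],
      service: [:],
      applications: [
        [ name: "my-test-app", description: "This is used for tests", plan: "test", account: "john" ]
      ],
      applicationPlans: [
        [ systemName: "test", name: "Test", defaultPlan: true, published: true ]
      ]
    )
  }
  stage("Import OpenAPI") {
    service.importOpenAPI()
  }
  stage("Create the Application Plans") {
    service.applyApplicationPlans()
  }
  stage("Create the Application") {
    service.applyApplication()
  }
  stage("Run end-to-end tests") {
    def proxy = service.readProxy("sandbox")
    sh "curl -vfk ${proxy.sandbox_endpoint}/api/beer -H 'api-key: ${service.applications[0].userkey}'"
  }
  stage("Promote to production") {
    service.promoteToProduction()
  }
}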
Jenkins Pipeline examples
We prepared a series of five Jenkins pipelines that showcase the use of the 3scale toolbox Jenkins Shared Library in different contexts:
- A very simple API secured with API Keys and deployed in 3scale hosted.
- An open API (no security) deployed in a hybrid architecture: 3scale hosted and on-premises.
- An API secured with OpenID Connect deployed in the same hybrid architecture.
- The same API deployed in three different environments (DEV, TEST, and PROD).
- An API deployed in those three environments and with semantic versioning applied (four versions released, combining different security schemes).
You can find those examples in the 3scale-toolbox-jenkins-samples repository.
If you prefer a real-world example, the IntegrationApp-Automation repository contains a composite application that showcases an API deployed through a Jenkins Pipeline.
Conclusion
In this article, we presented a convenient way for Jenkins Pipeline writers to publish their APIs using the 3scale toolbox. The Jenkins Shared Library is offered as a set of best practices and sample code for Jenkins Pipeline writers to use in their daily work. You can reuse this library as-is and contribute to the upstream community, or copy it and make it yours!