July 19th DevNation Live: Container pipeline master: Continuous integration + continuous delivery with Jenkins

Join us for the next online DevNation Live on Thursday, July 19th at 12pm EDT for Container pipeline master: Continuous integration + continuous delivery with Jenkins, presented by Siamak Sadeghianfar, principal technical product marketing manager for Red Hat OpenShift.

In this session, we’ll take a detailed look at how you can build a super slick, automated continuous integration and continuous delivery (CI/CD) Jenkins pipeline that delivers your application payloads onto the enterprise Kubernetes platform, Red Hat OpenShift. You’ll see how zero-downtime deployment patterns can be part of your release process when you are using a container platform based on Kubernetes.

Automating your build, test, and deployment processes can improve reliability and reduce the need for rollbacks. However, we’ll show you how rollbacks can be handled too.

Register now and join the live presentation at 12pm EDT, Thursday, July 19th.

Session Agenda:

Continue reading “July 19th DevNation Live: Container pipeline master: Continuous integration + continuous delivery with Jenkins”

Announcing updated Red Hat Developer Studio and Container Development Kit

I’m extremely pleased to announce the release of Red Hat Container Development Kit (CDK) 3.5 and Red Hat Developer Studio 12. Whether you are developing traditional or cloud-based applications and microservices, you can run these tools on your Windows, macOS, or Red Hat Enterprise Linux laptop to streamline development:

  • Red Hat Container Development Kit provides a pre-built container development environment to help you develop container-based applications quickly using Red Hat OpenShift and Kubernetes.
  • Red Hat Developer Studio (previously named JBoss Developer Studio) provides a desktop IDE with superior support for your entire development lifecycle. It includes a broad set of tooling capabilities and support for multiple programming models and frameworks. Developer Studio provides broad support for working with Red Hat products and technologies including middleware, business automation, and integration, notably Camel and Red Hat Fuse. Developer Studio is based on Eclipse 4.8 (Photon).

A number of Red Hat Enterprise Linux (RHEL) development tools have been updated. These include Rust 1.26.1, Go 1.10.2, Cargo 1.26, and Eclipse 4.8 (Photon).

Our goals are to improve the usability of our tools for developers while adding the new features that matter most to users of Red Hat platforms and technologies.

Overview of new features:

Continue reading “Announcing updated Red Hat Developer Studio and Container Development Kit”

Announcing Red Hat Developer Studio 12.0.0.GA and JBoss Tools 4.6.0.Final for Eclipse Photon

Attention desktop IDE users: Red Hat Developer Studio 12.0 and the community edition, JBoss Tools 4.6.0 for Eclipse Photon, are now available. You can download the Developer Studio bundled installer, which installs Eclipse 4.8 with all of the JBoss Tools already configured. Or, if you have an existing Eclipse 4.8 (Photon) installation, you can download the JBoss Tools package. This article highlights some of the new features in both JBoss Tools and Eclipse Photon, covering WildFly, Spring Boot, Camel, Maven, and many Java-related improvements, including full Java 10 support.

Developer Studio / JBoss Tools provides a desktop IDE with a broad set of tooling covering multiple programming models and frameworks. If you are doing container or cloud development, there is integrated functionality for working with Red Hat OpenShift, Kubernetes, Red Hat Container Development Kit, and Red Hat OpenShift Application Runtimes. For integration projects, there is tooling covering Camel and Red Hat Fuse that can be used in both local and cloud deployments.

Continue reading “Announcing Red Hat Developer Studio 12.0.0.GA and JBoss Tools 4.6.0.Final for Eclipse Photon”

How to call the OpenShift REST API from C#

When you want to automate build and deployment tasks for Red Hat OpenShift, you might want to take advantage of the OpenShift REST API. In scripts, you can use the oc CLI, which talks to the REST API. However, there are times when it is more convenient to call the API directly from your C# code without having to invoke an external program. This is the value of having an infrastructure platform that is exposed as services with an open API.

If you want to call the API from your C# code, you have to create a request object, call the API, and parse the response object. The upstream project, OpenShift Origin, provides a Swagger 2.0 specification from which you can generate a client library for each programming language. Of course, C# is supported. This isn’t a new approach; the Kubernetes project has a client repository that is generated by Swagger Codegen.

For C#, we can use Microsoft Visual Studio to generate a C# client library for a REST API. In this article, I’ll walk you through the process of generating the library from the definition.
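
Purely to illustrate the raw request/response pattern that such a generated client wraps, here is a minimal sketch using Java’s built-in HTTP client (Java 11+) rather than the C# approach the article follows. The cluster URL, namespace, and token handling below are placeholder assumptions; you can obtain a token with `oc whoami -t`.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ListPods {
    public static void main(String[] args) throws Exception {
        // Placeholder cluster URL and namespace; the bearer token is read from an environment variable.
        String apiServer = "https://openshift.example.com:8443";
        String token = System.getenv("OPENSHIFT_TOKEN");

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(apiServer + "/api/v1/namespaces/myproject/pods"))
                .header("Authorization", "Bearer " + token)
                .header("Accept", "application/json")
                .GET()
                .build();

        // The response body is JSON; a generated client library parses it into typed objects for you.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```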

Continue reading “How to call the OpenShift REST API from C#”

Smart-Meter Data Processing Using Apache Kafka on OpenShift

There is a major push in the United Kingdom to replace aging mechanical electricity meters with connected smart meters. New meters allow consumers to more closely monitor their energy usage and associated cost, and they enable the suppliers to automate the billing process because the meters automatically report fine-grained energy use.

This post describes an architecture for processing a stream of meter readings using Strimzi, which offers support for running Apache Kafka in a container environment (Red Hat OpenShift). The data has been made available through a UK research project that collected data from energy producers, distributors, and consumers from 2011 to 2014. The TC1a dataset used here contains data from 8,000 domestic customers at half-hour intervals in the following form:
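
(The sample record layout itself is shown in the full post.) As a rough, illustrative sketch rather than the article’s actual code, the producer side of such a pipeline could publish each half-hour reading to a Kafka topic with the standard Java client. The topic name, bootstrap address, and record format below are assumptions; with Strimzi, the bootstrap address is typically the Kafka cluster’s bootstrap service.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MeterReadingProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed Strimzi-style bootstrap service name; adjust for your cluster.
        props.put("bootstrap.servers", "my-cluster-kafka-bootstrap:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by meter ID keeps all readings for one customer in the same partition, in order.
            String meterId = "meter-0001";
            String reading = "2013-06-01T00:30:00,0.214"; // timestamp and kWh for the half-hour slot
            producer.send(new ProducerRecord<>("meter-readings", meterId, reading));
        }
    }
}
```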

Continue reading “Smart-Meter Data Processing Using Apache Kafka on OpenShift”

A Beginner’s Guide to Kubernetes (PodCTL Podcast #38)

If you aren’t following the OpenShift Blog, you might not be aware of the PodCTL podcast. It’s a free weekly tech podcast covering containers, Kubernetes, and OpenShift, hosted by Red Hat’s Brian Gracely (@bgracely) and Tyler Britten (@vmtyler). I’m reposting this episode here on the Red Hat Developer Blog because I think their realization is spot on: while early adopters might be deep into Kubernetes, many are just starting and could benefit from some insights.

Original Introduction from blog.openshift.com:

The Kubernetes community now has 10 releases (2.5 years) of software and experience. We just finished KubeCon Copenhagen, the OpenShift Commons Gathering, and Red Hat Summit, and we heard lots of companies talk about their deployments and journeys. But many of them took a while (12–18 months) to get to where they are today. This feels like the “early adopters” phase, and we’re beginning to get to the “crossing the chasm” part of the market. So we thought we’d discuss some of the basics, lessons learned, and other things people could use to “fast-track” what they need to be successful with Kubernetes.

The podcast will always be available on the Red Hat OpenShift blog (search: #PodCTL), as well as via RSS feeds, iTunes, Google Play, Stitcher, TuneIn, and all your favorite podcast players.

Continue reading “A Beginner’s Guide to Kubernetes (PodCTL Podcast #38)”

Using OpenShift to deploy .NET Core applications

Containers are the new way of deploying applications. They provide an efficient mechanism to deploy self-contained applications in a portable way across clouds and OS distributions. In this blog post, we’ll look at what OpenShift brings to .NET Core specifically.

Kubernetes and OpenShift

Kubernetes is the de facto orchestrator for managing containerized applications. Google open-sourced Kubernetes in 2014, and Red Hat was one of the first companies to work with Google on it. Red Hat is the second-leading contributor to the Kubernetes upstream project.

OpenShift is an open-source DevOps platform built on top of Kubernetes. It integrates directly with your application’s source code, which enables continuous integration/continuous deployment (CI/CD) workflows. Tools are available to scale and monitor your applications. The OpenShift Catalog makes it easy to set up middleware and databases. OpenShift comes with comprehensive documentation for installing and managing your cluster. It can run on-premises and on public clouds such as AWS, GCP, and Azure.

Continue reading “Using OpenShift to deploy .NET Core applications”

Why Kubernetes is The New Application Server

Have you ever wondered why you are deploying your multi-platform applications using containers? Is it just a matter of “following the hype”? In this article, I’m going to ask some provocative questions to make my case for Why Kubernetes is the new application server.

You might have noticed that the majority of languages are interpreted and use “runtimes” to execute your source code. In theory, most Node.js, Python, and Ruby code can be easily moved from one platform (Windows, macOS, Linux) to another. Java applications go even further: Java classes are compiled into bytecode, capable of running anywhere that has a JVM (Java Virtual Machine).
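
As a trivial illustration of that portability claim, the class below compiles once to bytecode and runs unchanged on any operating system with a JVM:

```java
public class Portable {
    public static void main(String[] args) {
        // The same compiled .class (or .jar) runs on Windows, macOS, or Linux without changes.
        System.out.println("Running on " + System.getProperty("os.name")
                + " with Java " + System.getProperty("java.version"));
    }
}
```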

The Java ecosystem provides standard formats to distribute all of the Java classes that are part of the same application. You can package these classes as a JAR (Java Archive), WAR (Web Archive), or EAR (Enterprise Archive) that contains the front end, back end, and embedded libraries. So I ask you: why do you use containers to distribute your Java application? Isn’t it already supposed to be easily portable between environments?

Continue reading “Why Kubernetes is The New Application Server”

Using Red Hat Data Grid to power a multi-cloud real-time game

The scavenger hunt game developed for the audience to play during the Red Hat Summit 2018 demo used Red Hat Data Grid as storage for everything except the pictures taken by the participants. Data was stored across three different cloud environments using cross-site replication. In this blog post, we will look at how data flowed through Data Grid and explain the Data Grid features powering different aspects of the game’s functionality.
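
By way of a minimal, illustrative sketch (the cache name, server address, and value type here are assumptions, not the demo’s actual code), client services typically read and write Data Grid state through the Java Hot Rod client along these lines; with cross-site replication, writes made against one site are replicated to the backup sites configured on the server.

```java
import org.infinispan.client.hotrod.RemoteCache;
import org.infinispan.client.hotrod.RemoteCacheManager;
import org.infinispan.client.hotrod.configuration.ConfigurationBuilder;

public class ScoreStore {
    public static void main(String[] args) {
        // Connect to a Data Grid (Infinispan) server over the Hot Rod protocol.
        ConfigurationBuilder builder = new ConfigurationBuilder();
        builder.addServer().host("datagrid.example.com").port(11222);

        try (RemoteCacheManager manager = new RemoteCacheManager(builder.build())) {
            RemoteCache<String, Integer> scores = manager.getCache("game-scores");
            scores.put("player-42", 1500);               // store a player's score
            System.out.println(scores.get("player-42")); // read it back
        }
    }
}
```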

Continue reading “Using Red Hat Data Grid to power a multi-cloud real-time game”

Red Hat Data Grid on Three Clouds (the details behind the demo)

If you saw or heard about the multi-cloud demo at Red Hat Summit 2018, this article details how we ran Red Hat Data Grid in active-active-active mode across three cloud providers. This setup enabled us to demonstrate failover between cloud providers in real time with no loss of data. In addition to Red Hat Data Grid, we used Vert.x (reactive programming), OpenWhisk (serverless), and Red Hat Gluster Storage (software-defined storage).

This year’s Red Hat Summit was quite an adventure for all of us. A trip to San Francisco is probably on the bucket list of IT geeks from all over the world. We were also able to meet many other Red Hatters who, like us, work remotely for Red Hat. However, the best part was that we had something important to say: “we believe in the hybrid/multi-cloud,” and we got to prove it live on stage.

Photo credit: Bolesław Dawidowicz

Continue reading “Red Hat Data Grid on Three Clouds (the details behind the demo)”
