Editor’s note: Today, we announced the close of Red Hat’s acquisition by IBM.
In the months since the Red Hat acquisition by IBM was announced, I have been asked numerous times if this deal changes things for Red Hat’s Developer Program and Developer Tools group.
My answer then and now is “no.”
Continue reading “Preserving the Red Hat developer experience”
As a frequent contributor to open source projects (both within and beyond Red Hat), I find one of the most common time-wasters is dealing with code reviews of my submitted code that are negative or obstructive and yet essentially subjective or argumentative in nature. I see this most often when submitting to projects where the maintainer doesn’t like the change, for whatever reason. In the best case, this kind of code review strategy can lead to time wasted in pointless debates; at worst, it actively discourages contribution and diversity in a project and creates an environment that is hostile and elitist.
A code review should be objective and concise and should deal in certainties whenever possible. It’s not a political or emotional argument; it’s a technical one, and the goal should always be to move forward and elevate the project and its participants. A change submission should always be evaluated on the merits of the submission, not on one’s opinion of the submitter.
Continue reading “10 tips for reviewing code you don’t like”
In this series, I’ve been covering new developments of Shenandoah GC coming up in JDK 13. In part 1, I looked at the switch to load reference barriers, and, in part 2, I looked at plans for eliminating an extra word per object. In this article, I’ll look at a new architecture and a new operating system that Shenandoah GC will be working with.
Continue reading “Shenandoah GC in JDK 13, Part 3: Architectures and operating systems”
In this series of articles, I’ll be discussing new developments of Shenandoah GC coming up in JDK 13. In part 1, I looked at the switch of Shenandoah’s barrier model to load reference barriers and what that means.
Continue reading “Shenandoah GC in JDK 13, Part 2: Eliminating the forward pointer word”
Building responsive applications is a never-ending task. With the rise of powerful multicore CPUs, more raw power is available for applications to consume. In Java, threads are used to make an application work on multiple tasks concurrently. A developer starts a Java thread in the program, and tasks are assigned to that thread for processing. Threads can perform a variety of tasks, such as reading from a file, writing to a database, taking input from a user, and so on.
In this article, we’ll explain more about threads and introduce Project Loom, which supports high-throughput, lightweight concurrency in Java to help simplify writing scalable software.
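As a minimal sketch of the classic model described above (the class name `ThreadDemo` is my own), a task is handed to a platform thread, which the operating system then schedules:

```java
public class ThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        // A task is just a Runnable; here it reports which thread runs it.
        Runnable task = () ->
            System.out.println("task running on: " + Thread.currentThread().getName());

        // Create a platform thread and assign the task to it.
        Thread worker = new Thread(task);
        worker.start(); // the OS schedules the thread
        worker.join();  // wait for the task to finish

        // With Project Loom, the same task could instead run on a lightweight
        // virtual thread (in later JDKs: Thread.startVirtualThread(task)),
        // letting the JVM multiplex many such tasks over few OS threads.
    }
}
```

Each platform thread maps to an operating-system thread, which is exactly the cost Project Loom’s lightweight threads aim to avoid.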
Continue reading “Project Loom: Lightweight Java threads”
Container-native development is primarily about consistency, flexibility, and scalability. Legacy Application Lifecycle Management (ALM) tooling often offers none of these, leading to situations where it:
- Places artificial barriers on development speed, and therefore time to value,
- Creates single points of failure in the infrastructure, and
- Stifles innovation through inflexibility.
Ultimately, developers are expensive, but they are the domain experts in what they build. With development teams often being treated as product teams (who own the entire lifecycle and support of their applications), it becomes imperative that they control the end-to-end process on which they rely to deliver their applications into production. This means decentralizing both the ALM process and the tooling that supports that process. In this article, we’ll explore this approach and look at a couple of implementation scenarios.
Continue reading “Application lifecycle management for container-native development”
Since starting to update my free online rules and process automation workshops that showcase how to get started using modern business logic tooling, we’ve come a long way with process automation. The updates started with moving from JBoss BPM to Red Hat Decision Manager and from JBoss BPM Suite to Red Hat Process Automation Manager.
In previous labs, we showed how to install Red Hat Decision Manager on your laptop, how to create a new project, and how to create a domain model. This article highlights the newest lab update for Red Hat Process Automation Manager, where you learn to design a process.
Continue reading “Modern business logic tooling workshop, lab 4: Create a process”
The Linux perf tool was originally written to allow access to the performance monitoring hardware that counts hardware events, such as instructions executed, processor cycles, and cache misses. However, it can also be used to count software events, which can be useful in gauging how frequently some part of the system software is executed.
Recently, someone at Red Hat asked whether there was a way to get a count of the system calls being executed on a system. The kernel has a predefined software trace point, raw_syscalls:sys_enter, which collects exactly that information: it counts each time a system call is made. To use trace point events, the perf command needs to be run as root.
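As a sketch of what that looks like in practice (available event names can be listed with `perf list`), the following counts system calls using that trace point; both invocations need to be run as root:

```shell
# Count every system call entered, system-wide (-a), for 5 seconds.
# raw_syscalls:sys_enter fires once per system call.
perf stat -e raw_syscalls:sys_enter -a sleep 5

# The same event can also be counted for a single command, e.g.:
perf stat -e raw_syscalls:sys_enter ls
```

In both cases, `perf stat` prints the event count when the workload (here `sleep 5` or `ls`) exits.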
Continue reading “How to use the Linux perf tool to count software events”
Since starting to update my free online rules and process automation workshops that showcase how to get started using modern business logic tooling, we’ve come a long way with process automation. The updates started with moving from JBoss BPM to Red Hat Decision Manager and from JBoss BPM Suite to Red Hat Process Automation Manager.
The first lab update showed how to install Red Hat Decision Manager on your laptop, and the second lab showed how to create a new project. This article highlights the newest lab update for Red Hat Process Automation Manager, where you’ll learn how to create a domain model.
Let’s take a look at the lab, shall we?
Continue reading “Modern business logic tooling workshop, lab 3: Create a domain model”
In Part 6 of this series, we looked at the details that determine how your integration becomes the key to transforming your customer experience. The series started by laying out how I approached the use case: researching successful customer portfolio solutions as the basis for a generic architectural blueprint.
Having completed our discussions on the blueprint details, it’s time to look at a few specific examples. This article walks you through an example integration scenario showing how expanding the previously discussed details provides blueprints for your own integration scenarios.
Continue reading “Integration blueprint example for process automation (part 7)”