Developing at the edge: Best practices for edge computing

July 16, 2020
Ishu Verma
Related topics: Artificial intelligence, Kubernetes, Microservices
Related products: Red Hat Ansible Automation Platform for Edge, Red Hat build of MicroShift, Red Hat Enterprise Linux for Edge


    Edge computing continues to gain momentum as more and more companies increase their investments in it, even if they're only dipping their toes in with small-scale pilot deployments. Emerging use cases like the Internet of Things (IoT), augmented and virtual reality (AR/VR), robotics, and telecommunications network functions are often cited as key drivers for companies moving computing to the edge. Traditional enterprises are also looking at edge computing to better support their remote offices, retail locations, manufacturing plants, and more. At the network edge, service providers can deploy an entirely new class of services that take advantage of their proximity to customers.

    In this article, we consider edge computing from the perspective of application developers. The developer perspective is vital because the applications being developed today—leveraging emerging technologies like artificial intelligence and machine learning (AI/ML)—reveal new opportunities to deliver services and optimize costs.

    Note: For more about why companies are increasingly looking at edge computing, see my blog post We're headed to edge computing.

    Edgy applications

    The term edge computing has been used to describe everything from actions performed by tiny IoT devices to datacenter-like infrastructure. At the conceptual level, edge computing refers to the idea of bringing computing closer to where it's consumed or closer to the sources of data.

    Although the underlying infrastructure is a key enabler, the benefits of edge computing are realized through the applications. If done right, edge applications can enable new experiences across a range of industries:

    • Healthcare: Doctors and nurses can advance patient care by integrating live data from patient fitness trackers, medical equipment, and environmental conditions into medical records.
    • Smart infrastructure: Cities can leverage real-time data from roadside sensors and cameras to improve traffic flow by synchronizing traffic lights and reducing or increasing traffic lanes; improve traffic safety by detecting wrong-way drivers and dynamically updating the speed limit; and optimize shipping-port utilization by monitoring the loading and unloading of cargo ships.
    • Autonomous driving: Self-driving cars can use real-time data to safely navigate a range of driving conditions.
    • Industry 4.0: Managers on the factory floor can use AI/ML analytics to improve equipment utilization and maintenance.
    • Far edge services: Service providers can use their proximity to customers to offer low-latency (sub-1ms), high-bandwidth, and location-based services for use cases like AR/VR or virtual desktop infrastructure (VDI).

    Technology best practices for edge development

    Edge computing gives companies the flexibility and simplicity of cloud computing for a distributed pool of resources across a large number of locations. In the context of IoT, edge computing's approach to application development differs from the embedded systems of the past. Embedded applications required heavily customized operating systems that were dependent on the underlying hardware. It follows that developers working on embedded applications needed to deeply understand the hardware and interfaces used by their applications. These development tools lacked the flexibility and capabilities we see in tools used for edge computing.

    Consider these technology best practices for edge development:

    • Consistent tooling: Developers need to be able to use the same tools regardless of where the application is deployed. As a result, creating edge applications requires no special skills, or at least no more than non-edge applications do. As an example of edge tooling, Red Hat CodeReady Workspaces, built on Eclipse Che, provides a Kubernetes-native development environment with an in-browser IDE. This tooling supports rapid application development, and the resulting applications can be easily deployed at the edge or in the cloud.
    • Open APIs: Well-defined and open APIs make it possible to access real-time data programmatically so that businesses can offer new services (and classes of services) that were previously impossible. Developers use open APIs to create standards-based solutions that can access data without concern for the underlying hardware interfaces.
    • Accelerated application development: Edge architectures are still evolving, but the design decisions made today will have a lasting impact on future capabilities. Instead of offerings purpose-built for the edge, which limits developer agility, it is better to invest in technologies that can work anywhere—cloud, on-premises, and at the edge. Containers, Kubernetes, and lightweight application services are all examples of technologies that accelerate application development from cloud to edge.
    • Containerization: Most new applications are built as containers because containerized applications are easy to deploy and manage at scale. Containers are an especially good fit for the edge application requirements of modularity, segregation, and immutability. Applications will need to be deployed on many different edge tiers, each with its own unique resource characteristics. Combined with microservices, containers representing function instances can be scaled up or down depending on changing resources and other conditions; a minimal sketch of this pattern follows this list.
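
    The following is a minimal sketch of that scale-up/scale-down idea, assuming a Kubernetes cluster at the edge site and the official Kubernetes Python client. The deployment name (sensor-ingest), namespace, and backlog threshold are illustrative placeholders, not part of any specific Red Hat product workflow.

    # Minimal sketch: adjust the replica count of a containerized edge
    # microservice based on a locally observed condition.
    # Assumes a reachable Kubernetes cluster at the edge site and the
    # official Python client (pip install kubernetes). The deployment
    # name, namespace, and threshold are illustrative.
    from kubernetes import client, config

    def scale_edge_workload(pending_items: int) -> None:
        # Use the local kubeconfig; a pod running in-cluster would call
        # config.load_incluster_config() instead.
        config.load_kube_config()
        apps = client.AppsV1Api()

        # Scale out when the local backlog grows; scale back in when it drains.
        replicas = 3 if pending_items > 1000 else 1
        apps.patch_namespaced_deployment_scale(
            name="sensor-ingest",      # hypothetical containerized microservice
            namespace="edge",          # hypothetical namespace
            body={"spec": {"replicas": replicas}},
        )

    if __name__ == "__main__":
        scale_edge_workload(pending_items=1500)

    In practice, a Kubernetes HorizontalPodAutoscaler would usually make this decision automatically; the sketch only illustrates that containerized workloads can be resized programmatically wherever they run.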

    Note: For more about how resource requirements vary between edge and cloud computing, see No more illusions of infinite capacity.

    Additional technology considerations

    For developers, it is important to know that edge computing and centralized computing are not an either/or choice. As edge computing gains greater adoption in the marketplace, the best solutions will often combine the two. In such a hybrid computing model, centralized computing is used for compute-intensive workloads, data aggregation and storage, AI/ML, coordinating operations across geographies, and traditional back-end processing. Edge computing, on the other hand, helps solve problems at the source, in near real time. Distributed architectures allow us to place applications at any tier from cloud to edge, wherever it makes the most sense.
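
    As a concrete illustration of this hybrid split (and of the open-API practice above), here is a minimal sketch in which an edge service summarizes raw sensor readings locally and forwards only a compact aggregate to a central service over HTTP. The endpoint URL, payload shape, and filtering rule are hypothetical.

    # Minimal sketch of the hybrid pattern: process raw readings at the
    # edge in near real time, and send only a compact aggregate to a
    # central service over an open HTTP API. The URL, payload shape, and
    # validity range below are hypothetical.
    import statistics
    import requests

    CENTRAL_API = "https://central.example.com/api/v1/aggregates"  # hypothetical endpoint

    def process_at_edge(readings: list[float]) -> dict:
        # Near-real-time work stays at the source: filter obvious noise, then summarize.
        valid = [r for r in readings if 0.0 <= r <= 200.0]
        return {
            "count": len(valid),
            "mean": statistics.mean(valid) if valid else None,
            "max": max(valid, default=None),
        }

    def forward_to_central(aggregate: dict) -> None:
        # Only the aggregate crosses the network, not the raw stream.
        response = requests.post(CENTRAL_API, json=aggregate, timeout=5)
        response.raise_for_status()

    if __name__ == "__main__":
        forward_to_central(process_at_edge([21.4, 22.1, 999.0, 20.8]))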

    Monolithic edge solutions that require custom tooling and don't integrate with the overall IT infrastructure could cause major pain when edge computing achieves mass deployment. Open source is an obvious choice for providing flexibility, while also future-proofing our current investments in edge computing.

    Conclusion

    As computing moves increasingly to the edge, the benefits of the edge will be realized through applications. Instead of treating edge computing as a separate computing paradigm, a preferable approach is a hybrid computing model that combines the best of centralized and edge computing. Developers building edge applications should leverage modern application development principles: consistent tooling (regardless of where the application is deployed), open APIs, and highly modular yet scalable technologies such as containers, Kubernetes, and lightweight application services. Open source provides flexibility while also future-proofing current investments in edge computing.

    Last updated: November 28, 2023
