
When the ENIAC computer was introduced in 1946, it was housed in a huge room—1,800 square feet—and weighed 30 tons. It had to be assembled in place, and it wasn't going to be moved. The era of electronic computers had arrived, but only for an elite few. The idea of edge computing was science fiction—unbelievable science fiction at that. My, how things have changed.

Mainframes

The IBM mainframe computers, introduced in 1952, became the standard of computing for corporations and government agencies in the 1960s and 1970s. Those of us old enough can remember, for example, getting our home water bill in the form of a punched card with the words "Do not fold, spindle or mutilate" on it. These mainframe computers moved processing to the corporate headquarters. Sales data from cash registers, for example, would be sent to headquarters on punched paper tape, where it could be read into the mainframes for reporting.

(Author's note: My first job in IT was processing paper tape into a mainframe.)

Midrange computers

In the 1970s, minicomputers (also called midrange computers) became very popular. The Digital VAX, Data General Nova, and the hugely popular IBM System/3x-400 series (System/3, System/32, System/34, System/36, System/38, and AS/400) moved computing power even closer to the action. Midrange systems started in air-conditioned rooms with raised flooring, then moved to corners in offices, then eventually under desks as they grew in power and shrank in physical size. Remote offices and small businesses now had a computer in their building. Larger midrange computers started to move into the spaces formerly occupied by the mainframes.

The PC

The 1980s saw the dawn of the desktop PC, and the trend of computing moving closer and closer to the action continued. A small desktop PC, for example, could be integrated into a production environment on a factory floor to record data and control machines. The PC would, typically, send data to a host (often a midrange computer) and, likewise, get data from the host. This happened over a network: Twinax or Ethernet cabling with an associated protocol, with SNA, Novell NetWare, and Ethernet being the common choices.

Portables

Adam Osborne brought the popular portable PC to the world in 1981 with the 24.5-pound Osborne 1. While more "luggable" than truly portable, it allowed the computer to move around more easily. Again, the processing power was moving closer to the action.

Laptops and notebook computers followed and continue to evolve to this day, with a WiFi connection now being a requirement.

Tablets and phones

In the early 1990s, the personal digital assistant, or PDA, arrived. The star was the Palm Pilot, a small device that could store and record information. Users could take notes, make voice recordings, or manage their schedules. The device was synced to a PC via a cable. This meant that, while not real time, processing could now be carried around in a pocket.

The PDA was the genesis of, and eventually gave way to, today's "must-have" item, the smartphone. Using WiFi and 5G cellular technology, the smartphone enables real-time data processing in a small form factor. This is one example of computing at the edge.

Likewise, small tablets such as the iPad allow users to carry processing power with them, along with ample screen real estate and advanced communication functionality.

Embedded systems and closer to the edge

But it goes further, and deeper.

Very small sensors and controllers can now be embedded into everyday items. These Internet of Things (IoT) devices range from thermostats to watches (the Apple Watch weighs in at just over 30 grams, hence the title of this article) to, well, almost everything. You can even buy a ring that senses and reports data.

These IoT devices might (or might not) have some computing ability beyond just collecting data, but they often communicate with an edge computer or a cloud-based system that has more advanced capabilities. Processing at or near the edge is where the greatest challenges arise: enforcing security and installing updates are paramount because this is where the action takes place.
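To make that pattern concrete, here is a minimal sketch in Python of a device doing a little processing locally and forwarding only meaningful changes to an upstream collector. The gateway URL, sensor name, and read_temperature() stand-in are hypothetical; a real device would use its own sensor driver and whatever protocol (MQTT, HTTPS, and so on) its edge gateway expects.

```python
import json
import random
import time
import urllib.request

# Hypothetical edge gateway endpoint; a real deployment would supply its own.
EDGE_GATEWAY_URL = "https://edge-gateway.example.com/readings"


def read_temperature() -> float:
    """Stand-in for a real sensor driver; returns degrees Celsius."""
    return 20.0 + random.uniform(-0.5, 0.5)


def publish(reading: dict) -> None:
    """Send one JSON reading upstream over HTTPS."""
    body = json.dumps(reading).encode("utf-8")
    request = urllib.request.Request(
        EDGE_GATEWAY_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        response.read()


def main() -> None:
    last_sent = None
    while True:
        temperature = read_temperature()
        # Filter locally: only forward readings that changed meaningfully,
        # so the device itself does part of the processing.
        if last_sent is None or abs(temperature - last_sent) >= 0.25:
            publish({
                "sensor": "thermostat-01",
                "celsius": round(temperature, 2),
                "timestamp": time.time(),
            })
            last_sent = temperature
        time.sleep(10)


if __name__ == "__main__":
    main()
```

The design point is the local filter: the device decides what is worth sending, rather than streaming every raw reading to a distant data center.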

All of this makes up "the edge." Sensing and processing move closer and closer to where the events occur. Communication capabilities have also improved, with WiFi, 5G, NFC, and more making it easier and more likely that edge devices will communicate with each other. An in-car network, for example, can improve automotive travel; Red Hat is working with GM on that very technology.

Edge computing example: The modern automobile

Let's consider the advanced automobile as a use case for edge computing.

Sensors report current speed, location, road conditions, outside temperature, lane edges, surrounding vehicles, and much more. These readings are reported to the driver and the drivetrain. The driver can use the information to make decisions, while an onboard computer can use the data to make adjustments: keep the car within its lane, reduce speed based on front-facing radar, reduce torque based on road conditions, and much more.
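As a rough illustration of why this processing has to happen in the car rather than in a remote data center, here is a simplified sketch of that kind of decision loop. The field names, thresholds, and adjustment values are purely illustrative; real vehicle control systems are far more sophisticated and safety-certified.

```python
from dataclasses import dataclass


@dataclass
class SensorFrame:
    speed_kph: float      # current vehicle speed
    radar_gap_m: float    # distance to the vehicle ahead
    lane_offset_m: float  # drift from lane center (negative = left)
    road_friction: float  # 0.0 (ice) to 1.0 (dry pavement)


def compute_adjustments(frame: SensorFrame) -> dict:
    """Decide small corrections locally, with no round trip to the cloud."""
    adjustments = {"throttle_delta": 0.0, "steering_delta": 0.0, "torque_limit": 1.0}

    # Front-facing radar: ease off the throttle when the gap closes.
    if frame.radar_gap_m < 30 and frame.speed_kph > 60:
        adjustments["throttle_delta"] = -0.1

    # Lane keeping: nudge steering back toward the lane center.
    if abs(frame.lane_offset_m) > 0.3:
        adjustments["steering_delta"] = -0.05 if frame.lane_offset_m > 0 else 0.05

    # Road conditions: limit torque on low-friction surfaces.
    if frame.road_friction < 0.4:
        adjustments["torque_limit"] = 0.6

    return adjustments


# Example: a wet road with a slower vehicle ahead.
print(compute_adjustments(
    SensorFrame(speed_kph=95, radar_gap_m=25, lane_offset_m=0.4, road_friction=0.35)
))
```

Every one of these decisions is latency-sensitive, which is exactly why the computing lives at the edge, in the vehicle itself.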

These capabilities could be expanded when "smart highways" are introduced. Sensors can keep track of traffic density and speed, and accident reports can be fed to cars as data so they can adjust their speed and, perhaps, plot a new route around the congestion.

All this edge computing will need to be secure, and systems will need to be updated. We already have cars that can receive software updates while parked, i.e., not on the road. I can check the fuel level of my MINI from my smartphone.

The future: Bright or dark?

This is all just a start, only the beginning of more advanced systems running closer and closer to the action. One can easily wonder: Is Kurzweil's Singularity at hand?

The future, truly, is at the edge.
