Commentary

Beyond Cloud Computing: Why We’re Watching This Digital Trend

Written by Paul Chi | Nov 29, 2019

Editor’s note: This piece was originally published in November 2019. It has been lightly edited and updated as of July 2023.

In a letter to investors, 1623 Pro Fund portfolio manager Jeff Fischer wrote:

One of our key pillars to generate strong returns is to invest in growth companies that we believe are benefitting from major trends that should serve as tailwinds for years to come—think 10-year trends and longer, such as data and data analytics, connectivity, and mobility. One trend still early in the making is the world’s digital transformation. Replacing paper, inefficient offline processes, susceptible record keeping, and slow response times with a completely digital—and all but immediate—solution is steadily becoming necessary to compete. … Any company that hopes to thrive needs to get there shortly, because customers demand it, as do employees.

Today, I want to drill down on one theme within the “digital transformation” trend Jeff mentioned. It’s called edge computing.

Looking beyond the cloud

To understand edge computing, I want to first talk about its parent: cloud computing.

Cloud computing is all around us, and quite often it’s what powers the “digital transformation” trend. At the most basic level, cloud computing involves accessing programs and data over the internet rather than on local hardware. It’s the difference between streaming a movie and renting a physical DVD to play on your home DVD player.

If you watch shows on Netflix, store files on Dropbox, or subscribe to software like Office 365 (or use a free service like Google Docs), you’re already using cloud computing. Keeping hardware on site can be a costly endeavor, so cloud computing can save companies a significant amount of money, since the hardware purchases and maintenance burden shift to the cloud vendor.

Because of these potential savings, cloud computing vendors offer things like software-as-a-service, platform-as-a-service, and infrastructure-as-a-service. The thing to keep in mind is that these are all subscription services for various computing or storage resources delivered over the internet.

Typically, these services are served up to you from large, centralized data centers. These facilities have tons of servers, storage equipment, networking equipment, and more. So when you’re receiving Netflix video or editing a Google Doc, you’re sending signals to and from these data centers.

Geographically speaking, these data centers are all over the place, so the chances are high that you’re communicating with a data center that isn’t far away. According to the U.S. International Trade Commission, there are more than 2,600 data centers in the United States (as of May 2021).

Why edge computing?

Cloud computing can be a great productivity enhancer and has already saved money and time for many businesses and people. However, while there’s so much that cloud computing enables, it also has its limitations. Edge computing fixes some of those limitations.

Edge computing does its computing at the “edge” of the network. As you may have guessed, that means it happens closer to where the data is generated than in a faraway cloud facility. One way this happens is within an edge data center, a smaller facility located much closer to the end user. These smaller data centers can be attached to a building, a cell tower, or another nearby structure.

Another way to utilize edge computing is to put computing capability into the device itself. Rather than send a constant stream of raw data to the cloud, the device processes the data locally and sends only the bits and pieces that are relevant and important (see the sketch below). When we joke that our cars are basically computers now, this is actually not that far from the truth. And the more connected devices we add to our lives, the more computers we’re adding.
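To make that idea concrete, here’s a minimal sketch of device-side filtering. Everything in it is illustrative: the simulated sensor, the “normal” range, and the send_to_cloud() function are hypothetical stand-ins, not any particular vendor’s API.

```python
import random
import time

# Hypothetical "normal" range for a temperature sensor, in degrees Celsius.
NORMAL_RANGE = (18.0, 24.0)

def read_sensor():
    """Simulate a temperature reading; a stand-in for real hardware."""
    return random.gauss(21.0, 3.0)

def send_to_cloud(reading):
    """Hypothetical stand-in for a network call to a cloud endpoint."""
    print(f"uploading anomalous reading: {reading:.1f} C")

def run(samples=20):
    sent = 0
    for _ in range(samples):
        reading = read_sensor()
        # The edge decision: transmit only readings outside the normal
        # range, instead of streaming every raw sample to the cloud.
        if not NORMAL_RANGE[0] <= reading <= NORMAL_RANGE[1]:
            send_to_cloud(reading)
            sent += 1
        time.sleep(0.01)  # pretend sampling interval
    print(f"sent {sent} of {samples} readings")

run()
```

The point is the ratio: most samples never leave the device, which is exactly the savings edge computing promises.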

There are many reasons that edge computing makes sense as a technology. Two of the most common problems it solves are latency and bandwidth.

Latency

Latency is the delay that occurs when sending and receiving data. Sending data to and from a data center, in most instances, is just fine. However, there are times when an application is much more time-sensitive and we can’t afford to wait for even a small delay. A good example of this is autonomous vehicles.

Cars these days are equipped with numerous sensors and cameras that enable them to detect nearby and faraway objects and react to them. Imagine if cars had to send signals to a data center and back before deciding to brake or steer. It would be a disaster! Because of this, cars carry the computing power and software to interpret the data from their own sensors and cameras. This way, cars can react to obstacles on the road in real time, rather than sending data to the cloud and waiting for a response. A quick back-of-the-envelope calculation (below) shows why.
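Here’s some rough arithmetic on how far a car travels while waiting for a decision. The latency figures are illustrative assumptions, not measurements:

```python
# How far does a car travel while waiting on a decision?
# The latency figures below are illustrative assumptions, not measurements.

SPEED_MPH = 65
METERS_PER_SECOND = SPEED_MPH * 1609.344 / 3600  # about 29 m/s

for label, latency_s in [("cloud round trip", 0.200), ("on-board compute", 0.010)]:
    meters = METERS_PER_SECOND * latency_s
    print(f"{label}: {latency_s * 1000:.0f} ms -> {meters:.1f} m traveled")
```

Even with generous assumptions, a cloud round trip costs the car several meters of travel before it can react; on-board processing cuts that to a fraction of a meter.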

Because of the numerous sensors, cameras, and computing power found on self-driving cars and cars equipped with advanced driver-assistance systems (ADAS), these vehicles are often referred to as “data centers on wheels.” The same principle applies to autonomous drones and quadcopters, sometimes called “flying data centers”: they, too, carry sensors and cameras along with the computing power to interpret the data those sensors generate.

Latency is also an issue in other high-performance areas such as manufacturing. Fast-moving manufacturing lines, highly efficient consumer-goods distribution centers, and many other applications require devices that can interpret signals from sensors and cameras much faster than cloud-connected devices can.

Bandwidth

You can think of bandwidth as the amount of data that can be sent at once. You can’t send an infinite amount of data at one time, since you’re limited by how much can pass through fiber-optic cables, routers, and other equipment at any given moment. Internet connections have become a lot faster over time. However, with each increase in bandwidth, humans have figured out ways to become even more data-hungry.

Because of our data-gobbling nature, bandwidth is also a limiting factor with cloud computing. One example is the surveillance camera. In the past, surveillance cameras were, for lack of a better term, dumb. You turned them on and they recorded video; you turned them off and they stopped. Maybe you set one to record to a VCR and regularly changed out the tapes. Small business owners often re-used the same tape if nothing notable happened during a shift.

Over time, picture quality has gotten better and better. Today’s high-definition cameras can record in 8K resolution, which generates far more data than the cameras of old. These newer cameras can also stream video to the internet or to cloud storage. It’s not uncommon for a large building to have many of these cameras, and an organization can have many such buildings, so the number of cameras can really add up. Combined, they produce a heck of a lot of data. Even with a commercial-grade internet connection, that many cameras can start to eat up available bandwidth, as the rough numbers below suggest.
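Some illustrative arithmetic (the per-camera bitrate and camera counts are assumptions, not vendor specifications) shows how quickly always-on streams can swamp a connection:

```python
# Aggregate bandwidth for always-on camera streams. All figures here are
# assumptions for illustration, not vendor specifications.

CAMERAS_PER_BUILDING = 50
BUILDINGS = 4
MBPS_PER_CAMERA = 25      # assumed bitrate for one high-resolution stream
UPLINK_MBPS = 1000        # a 1 Gbps commercial-grade connection

total_mbps = CAMERAS_PER_BUILDING * BUILDINGS * MBPS_PER_CAMERA
print(f"aggregate camera traffic: {total_mbps} Mbps")
print(f"uplink capacity:          {UPLINK_MBPS} Mbps")
print(f"oversubscribed by:        {total_mbps / UPLINK_MBPS:.1f}x")
```

In this hypothetical, the cameras alone demand five times what the uplink can carry.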

That’s where today’s smart cameras come in. Using sensors and basic on-board computing, smart cameras can decide to record only when there’s motion, which cuts down the amount of data that needs to be sent (a simple sketch of that decision logic follows). It’s unlikely that dozens of motion-triggered cameras will all turn on at once, so this is an example of edge computing solving a bandwidth problem. And this doesn’t apply just to cameras. Each year, the number of internet-connected devices grows, and cutting down on bandwidth usage is a good idea for all devices (not just security cameras!).
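Here’s a toy version of that decision. The frames are simulated and upload() is a hypothetical stand-in for a real network call; an actual camera would use proper motion-detection hardware or software, but the logic is the same: compare frames, and transmit only when enough has changed.

```python
import random

# Toy sketch of a smart camera's edge decision: compare consecutive frames
# and upload only when enough pixels changed. Frames are simulated and
# upload() is a hypothetical stand-in for a network call.

WIDTH, HEIGHT = 64, 48
MOTION_THRESHOLD = 0.02   # fraction of pixels that must change to trigger

def capture_frame(moving):
    """Simulate a grayscale frame; 'moving' injects changed pixels."""
    frame = [120] * (WIDTH * HEIGHT)
    if moving:
        for i in random.sample(range(len(frame)), len(frame) // 10):
            frame[i] = random.randint(0, 255)
    return frame

def changed_fraction(prev, cur):
    changed = sum(1 for a, b in zip(prev, cur) if abs(a - b) > 15)
    return changed / len(cur)

def upload(frame_id):
    print(f"motion detected -> uploading clip around frame {frame_id}")

prev = capture_frame(moving=False)
for frame_id in range(10):
    cur = capture_frame(moving=(frame_id == 6))  # motion only at frame 6
    if changed_fraction(prev, cur) > MOTION_THRESHOLD:
        upload(frame_id)  # data leaves the camera only on a change
    prev = cur            # static frames never leave the camera
```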

Remember, there’s only so much data we can stuff through the internet’s plumbing at once.

An accelerating shift

According to Exploding Topics, 55% of website traffic comes from mobile devices. What's more, 92% of internet users access the internet using a mobile phone. Over time, these mobile users will see faster and faster speeds. By 2025, the world is expected to have more than 1 billion 5G connections, led by China, the United States, and Japan. [Ed.: Data in this paragraph is as of July 2023.]

According to a 2018 Deutsche Telekom report, 5G is expected to reach speeds of up to 10 gigabits per second under ideal conditions, which is much faster than 4G. In addition to faster speeds, 5G is expected to provide much lower latency—meaning 5G will be an important enabler of edge computing. With improved latency available on a widespread basis, we will likely see companies devising all sorts of devices to take advantage of the improved mobile connectivity.

All signs point to an increasingly connected world with many more devices of all kinds. This means cities with smart traffic lights and smart street lights, autonomous drones and vehicles, more complex wearable devices, smart meters and other types of monitoring systems utilized by utilities, smart oil rigs, and much more. Many of these devices will require processing at the device level or at nearby edge data centers in order to function efficiently. As this shift happens, we believe edge computing should become much more of a household term. 

Paul Chi, Research Analyst, 1623 Capital

*Mentions of specific companies are for illustrative purposes only and do not constitute a recommendation with respect to those securities.