One of the primary trends in modern cloud computing is the concentration of compute within centralized data centers. The edge of the cloud is typically the entry point into the cloud's computing services. This is usually the case whether your SaaS is Office 365, ServiceNow, SAP, or something else: the endpoint is usually the access point where end users query and interact with back-end computing systems. Now, however, the edge of the cloud is shifting because of the many devices that can be classified as next-generation Internet of Things, a reality that appears to be one of the driving factors in the push of compute into edge devices.

Let’s start by taking a look at the most basic IoT devices on the market today. Fitbit is a good example. A Fitbit is a wearable device that acts as an extension of a mobile device: the Fitbit collects data, and a smartphone or other mobile device gathers that data and computes values about the person from it. For all practical purposes, in this scenario, the Fitbit is an input device for the smartphone, and the smartphone is the edge device that performs the compute. This is just the beginning of what is yet to come. As the IoT world continues to grow, there is a belief and, in some cases, an expectation that “edge computing” will become the de facto way the cloud will soon work by default.
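To make that division of labor concrete, here is a minimal Python sketch of the pattern: the wearable acts as a simple input device, and the phone is the edge node that does the compute and only syncs a small summary upstream. All class and field names here are hypothetical and purely illustrative; this is not the Fitbit API.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class WearableSample:
    """One raw reading pushed from the wearable to the phone."""
    steps: int
    heart_rate_bpm: int


class PhoneEdgeNode:
    """The smartphone as the edge device: it aggregates raw samples locally,
    and only the computed summary would need to leave the edge."""

    def __init__(self) -> None:
        self.samples: list[WearableSample] = []

    def ingest(self, sample: WearableSample) -> None:
        # The wearable simply hands data over; no compute happens on it.
        self.samples.append(sample)

    def daily_summary(self) -> dict:
        # The compute happens here, at the edge, not in a central data center.
        return {
            "total_steps": sum(s.steps for s in self.samples),
            "avg_heart_rate_bpm": round(
                mean(s.heart_rate_bpm for s in self.samples), 1
            ),
        }


phone = PhoneEdgeNode()
for reading in [WearableSample(120, 72), WearableSample(300, 88), WearableSample(90, 65)]:
    phone.ingest(reading)

print(phone.daily_summary())  # only this small summary would be synced to the cloud
```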

One of the next big technology milestones is on the verge of becoming a commonplace reality, and it is a prime example of the need for edge computing in the modern world. What technology am I referring to? Driverless cars and other autonomous devices—drones, for example. After all, driverless cars and drones make up at least part of the IoT technology platform.

Unlike the Fitbit, driverless cars are going to require far more processing power than what is available in current-day smart devices, even though today’s smart devices already contain more processing power than the computers that put a man on the moon.

Although the development of driverless cars is not rocket science (actually, it is quite a different science: the science of deep learning and artificial intelligence), the driverless car of the future may end up being the most powerful computer that most people will ever own. NVIDIA appears to be the leader in the processing chips currently used in the auto industry. We may very well see something close to two hundred independent processors packed into an area the size of a car’s stereo system.

That is the very nature of what driverless cars do: process dynamic variables independently and in a completely self-contained way. For other systems in a driverless car, it will be acceptable to rely on the cloud. GPS is one example of data that could be pulled from the cloud; satellite radio is another. But for the most part, the driverless car, or mobile data center, must be able to act completely independently of everything else. As such, the driverless car is one of the best examples of edge computing and of how this technology will contribute to the concept of cloud computing in general.
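A rough way to picture that split is as a routing rule: anything safety-critical stays on the car’s own processors, while convenience services may use the cloud when it happens to be reachable. The sketch below is purely illustrative; the workload names are assumptions of mine, not any vendor’s autonomous-driving stack.

```python
# Hypothetical workload names, for illustration only.
LOCAL_ONLY = {"obstacle_detection", "lane_keeping", "emergency_braking"}  # must run on-board
CLOUD_OK = {"gps_map_tiles", "satellite_radio", "traffic_updates"}        # can tolerate cloud latency


def route_workload(name: str, cloud_reachable: bool) -> str:
    """Decide where a workload runs. Safety-critical perception and control
    never leave the vehicle; convenience data may come from the cloud when
    a connection happens to be available."""
    if name in LOCAL_ONLY:
        return "on-board processors"
    if name in CLOUD_OK and cloud_reachable:
        return "cloud service"
    # Degrade gracefully: if the cloud is unreachable, fall back to local/cached data.
    return "on-board cache"


for workload in ["emergency_braking", "gps_map_tiles", "satellite_radio"]:
    print(workload, "->", route_workload(workload, cloud_reachable=False))
```

The design point is simply that the car keeps working, safely, when the network does not.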

In my opinion, driverless cars are a byproduct of the push to the edge. One of the main arguments for pushing compute to the edge is simply the amount of data involved: the sheer size of the data volumes and everything that comes with them, in addition to the limited bandwidth available. Latency is a huge problem, and there is no sign that the data is going to get any smaller.
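To see why bandwidth alone forces the issue, consider a quick back-of-envelope comparison. The figures in the sketch below are assumptions chosen for illustration, not measurements of any particular vehicle or network, but they show the shape of the problem: raw sensor output dwarfs any realistic uplink.

```python
# Back-of-envelope only: both figures below are assumptions for illustration.
sensor_output_gbps = 4.0  # assumed raw sensor output of an autonomous vehicle, in gigabits/s
uplink_mbps = 50.0        # assumed sustained cellular uplink, in megabits/s

raw_mbps = sensor_output_gbps * 1000
backlog_ratio = raw_mbps / uplink_mbps
print(f"Data produced is ~{backlog_ratio:.0f}x what the uplink can carry.")
# With numbers in this ballpark, each second of driving generates more data than
# the network can ship in over a minute -- hence the push to process it at the edge.
```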

I foresee the future having more of a hybrid approach, with more compute and storage being handled on edge devices and some services and data collection remaining inside centralized computing systems. The only question left is whether the future of the cloud will be defined by the edge of the cloud.