Internet of Things: how the Edge will eat the Cloud


As we head towards an increasingly interconnected world, with the recent growth of the Internet of Things and the shift towards richer user interfaces, the focal point of data processing, computing and storage will move away from centralised data centers towards endpoint devices at the edge of the network. A recent and unconventional “Maverick” research note by Gartner contradicts prevailing opinions on the future of cloud computing, sparking new and unconstrained ideas about how a network-wide distribution of resources will naturally occur, easing the pressure on data centers and changing the way the cloud operates today.

“New technologies, new requirements, the explosion of connected things and more real-time user interfaces will fundamentally change computing topologies over the next four to five years,” the study says, predicting that the entire IT infrastructure will take a 180-degree turn as processing power and storage capacity move from today’s data centers to endpoint devices.

Defining the edge

We can visualise the edge by looking at a network. At its core sit the enterprise data centers and cloud providers, while spread around them we find the endpoint devices: smartphones, laptops, TVs, home devices, routers, local cell towers and many more. The cloud currently operates through its data centers, which store and process the full sets of information used in business, commercial or personal activities and transmit the results to endpoint devices. So the edge is not new; it has always been there, and it consists of the devices that benefit from cloud computing and processing.

What is new, instead, is the potential for these devices, the edge, to be transformed into data processing nodes in order to achieve a more uniform distribution of resources.

The timeline

Just like economic trends, information technology trends follow recognisable patterns that allow analysis over time and create the conditions for future predictions. In a quick glimpse through history, Gartner notes:

  • In the ‘80s, mainframes and terminals started to fade away and be replaced by PCs and servers, pulling computing power towards the edge of the networks;
  • Throughout the mid 2000s, mobile devices and software applications started surpassing PCs, pulling computing to more flexible points;
  • In the early 2010s, physical devices, vehicles, home appliances and other items embedded with electronics started to become interconnected and be referred to as “things”; their connections quickly outnumbered the number of connected people at the time.

Now, it is expected that in the early 2020s human-machine interfaces (HMIs) will become more interactive and immersive and will gradually displace the smart devices that rose in the decades before and have relied only on apps and browsers; in very simple terms, an HMI is any device that allows interaction with a machine, be it a house system or a facility and production line.

We see the trend of new technologies taking over from previous ones. It is important to mention that the previous technology does not completely disappear, and neither will the cloud. Mainframes did not fade away to PCs, and neither did PCs to mobile devices; they simply became less dominant and slowly declined in sales.

Similarly, it is expected that the cloud will not fade away once the explosion of connected things becomes the standard; it will simply become part of a broader concept.

The Internet of Things is inevitable

Growing noisy and needy, the Internet of Things brings more and more physical objects into a networked environment, equipping them with embedded technology that allows communication and interaction. By 2020, it is expected that “20 billion ‘things’ will be connected to the internet” and “100 million consumers will shop in AR”, driven by the desire for more real-time machine interactions and augmented information delivered faster, Gartner predicts.

Immersive technologies will soon become the default, through which people will want to interact more with other people, and consumers more closely with businesses. This will ultimately shape the competitiveness of enterprises and be reflected in economic growth.

But why?

The cloud has its advantages and played a key role in the digital revolution, solving major problems; but, as technology evolves, its downsides are starting to show. Cloud is an automated computing style centred mostly on consolidating storage and processing in data centers, leveraging economies of scale. In time, with the shift towards more interconnected things, the main requirement will be agility rather than scale, with instant interactions and data produced and analysed right at the edge. The Maverick research sees four major shortcomings of the cloud that the edge can readily solve:

  • latency
  • bandwidth
  • autonomy
  • privacy and security
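The latency and bandwidth points can be made concrete with a small sketch. The idea, under my own illustrative assumptions (the `filter_at_edge` function, the threshold and the sample readings are all hypothetical, not taken from the research), is that an edge node processes raw sensor data locally and forwards only the readings that matter, instead of streaming everything to a data center:

```python
# Hypothetical sketch: an edge node filters raw sensor readings locally
# and forwards only anomalies to the cloud, saving bandwidth and avoiding
# a round trip for every reading. Names and values are illustrative.

THRESHOLD = 75.0  # e.g. a temperature limit chosen by the operator

def filter_at_edge(readings):
    """Keep only the readings worth sending upstream."""
    return [r for r in readings if r > THRESHOLD]

readings = [21.3, 22.1, 80.4, 21.9, 91.0, 22.5]
to_cloud = filter_at_edge(readings)

print(f"raw readings: {len(readings)}, sent to cloud: {len(to_cloud)}")
```

In this toy example only two of six readings leave the device; the decision is also made instantly on the node itself, which is the agility the research argues the cloud alone cannot provide.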

Even when the edge starts to grow and overtake the cloud, the latter will still have important roles to play: coordinating processes, aggregating things, archiving edge activity, developing machine learning models and handling traditional back-office processing. Beyond all of these, the cloud will ultimately also act as a fallback for the edge, stepping in where the edge is unavailable, though with fewer capabilities.
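The "cloud as fallback" arrangement can be sketched in a few lines. This is a minimal illustration under assumed names (`edge_handler`, `cloud_handler` and the simulated outage are all mine, not part of the research); the point is simply that requests go to the edge first and degrade gracefully to the cloud:

```python
# Illustrative "edge first, cloud as fallback" pattern. The edge node is
# simulated as being offline so the request falls through to the cloud,
# which serves it with reduced capabilities.

def edge_handler(request):
    # Simulate an edge outage; a real node would process locally here.
    raise ConnectionError("edge node unavailable")

def cloud_handler(request):
    return f"cloud processed '{request}' (fewer capabilities)"

def handle(request):
    try:
        return edge_handler(request)
    except ConnectionError:
        return cloud_handler(request)

print(handle("sensor-update"))
```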

But how will the Edge come to be? It will be built by everything: home devices, gaming consoles, routers, mobile devices, transportation devices, cell towers, workplace systems and even manufacturing plants. All of them will contribute, in an outside-in approach, to defining the Edge in the years to come.


The Maverick research by Gartner can be explored in full under the following link: https://www.gartner.com/doc/reprints?id=1-4GH0FTL&ct=171004&st=sb