By Richard Arneson
Just when everybody got comfortable bandying about the cloud, along comes another meteorology-related tech term―fog. Yes, we now have Fog Computing. While it's still in its infancy (in fact, the OpenFog Consortium was created only three short years ago), it will likely become another oft-used word in the networking vernacular.
The consortium was founded in 2015 by Cisco (which coined the term), ARM Holdings, Dell EMC, Intel, Microsoft, and Princeton University in response to the sheer number and precipitous growth of IoT devices. To accommodate those growing numbers (over 9 billion currently in use, estimated to exceed 21 billion by 2020), its members saw the need to extend cloud computing to the edge. And as the consortium sees it, moving to the edge is best described as moving into the fog.
Fog Computing sounds suspiciously like Edge Computing
Yes, fog and edge computing sound like they’re one and the same, but they are indeed different. They both manage, store and process data at the edge, but, according to Cisco’s Helder Antunes, who is an OpenFog Consortium member, “Edge computing is a component, or a subset of Fog Computing. Think of Fog Computing as the way data is processed from where it is created to where it will be stored. Edge computing refers just to data being processed close to where it is created. Fog Computing encapsulates not just that edge processing, but also the network connections needed to bring that data from the edge to its end point.”
The benefits of Fog Computing
With Fog Computing, organizations have more options for processing data, which is beneficial for applications that require data to be processed more quickly―for instance, an IoT device that needs to respond instantaneously, or as close to that as possible.
By creating low-latency connections between devices, Fog Computing can reduce the amount of bandwidth needed compared to sending all of that data to the cloud for processing. It can even be used when there's no network connection at all, which, of course, means data must be processed very close to where it was created. And if security is a concern, which it always is, Fog Computing can be protected by virtual firewalls.
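To make the bandwidth savings concrete, here's a minimal sketch of how a fog node might aggregate raw sensor readings locally and forward only compact summaries upstream. The function name, window size, and data are all illustrative assumptions, not part of any OpenFog specification.

```python
# Hypothetical sketch: a fog node summarizes raw IoT readings locally,
# so far fewer messages travel over the network to the cloud.
# All names and numbers here are illustrative, not a real fog API.
from statistics import mean

def fog_aggregate(readings, window=10):
    """Group raw readings into fixed-size windows; emit one summary per window."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "count": len(chunk),
            "mean": round(mean(chunk), 2),
            "max": max(chunk),
        })
    return summaries

# 100 simulated temperature readings from an IoT sensor
raw = [20 + (i % 7) * 0.5 for i in range(100)]

# 10 summary messages leave the fog node instead of 100 raw values
to_cloud = fog_aggregate(raw)
print(len(raw), "raw readings ->", len(to_cloud), "messages sent to the cloud")
```

The design choice is the whole point of fog: the latency-sensitive work (and the bulk of the data) stays near the devices, while the cloud receives only what it actually needs for long-term storage and analytics.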
The OpenFog Consortium’s three goals for Fog Computing
The OpenFog Consortium’s mission is to create an open reference architecture for Fog Computing, build test beds and operational models, define and advance the technology, educate the marketplace, and promote business development around Fog Computing. It has outlined three goals that Fog Computing needs to address and support:
- Provide horizontal scalability, meaning it should serve the needs of multiple industries.
- Operate across the continuum that exists between IoT devices and the cloud.
- Function as a system-level technology that extends from IoT devices, across the network edge, through to the cloud, and across an array of network protocol layers.
Before you get too comfortable using the term Fog Computing, get ready for another one that’s slowly gaining steam―Mist Computing.
For more information about Cloud, Edge, or Fog―even Mist―Computing, contact one of the tenured networking professionals at GDT. They maintain the highest certification levels in the industry and have helped companies of all sizes, from all industries, realize their digital transformation goals. You can reach them at Engineering@gdt.com. They’d love to hear from you.