Fog computing: fog and cloud along the Cloud-to-Thing continuum

Fog computing is an important evolution in, among others, IoT, especially Industrial IoT (IIoT), and many connected applications in areas such as smart manufacturing, smart buildings, smart grids, oil and gas, and Industry 4.0 at large.

Fog computing is on a growth trajectory to play a crucial role in IoT, 5G and other advanced distributed and connected systems (Christian Renaud, 451 Research PR)

As the term fog already suggests, there is an important link between fog computing and cloud computing. Fog computing is often called an extension of the cloud to where connected IoT ‘things’ are or, in the broader scope of “the Cloud-to-Thing continuum”, to where data-producing sources are. Fog computing has kept evolving since we first wrote about it here (and it is still early days). As you’ll read and see below, fog computing is seen as a necessity not just for IoT but also for 5G, embedded artificial intelligence (AI) and ‘advanced distributed and connected systems’.

Fog computing is designed to deal with the challenges traditional cloud-based IoT systems face in managing IoT data and data generated by sources along this cloud-to-thing continuum. It does so by decentralizing not just data analytics but also applications and management into the network, using a distributed and federated compute model.

Fog computing

Fog computing versus edge computing

Fog computing (the term was coined by Cisco) is not a network technology. It’s a hybrid, system-level architecture approach whereby the possibilities of cloud computing and distributed processing and analytics power are brought to the edge of a network.

Fog is hierarchical, where edge tends to be limited to a small number of peripheral layers

It does so in a different way than edge computing. In edge computing the aim is to bring the intelligence, analytics, computing, communications and so forth very close to, and increasingly into, devices at the edge, such as programmable logic controllers and other ever more powerful and smaller devices, and to send the results of that analysis on to the appropriate system or to the cloud (or a data center).

For those who want a really formal answer on what the difference between fog computing and edge computing is, NIST (more below) has an answer: “fog computing works with the cloud, whereas edge is defined by the exclusion of cloud and fog. Fog is hierarchical, where edge tends to be limited to a small number of peripheral layers. Moreover, in addition to computation, fog also addresses networking, storage, control and data-processing acceleration”.

The role of fog nodes in fog computing 

In fog computing the aim is also to bring data analysis and so forth as close as possible to the data source, but in this case to fog nodes, fog aggregation nodes or, when the fog (IoT) application so decides, to the cloud. That’s an essential difference with edge computing.

Fog nodes may be either physical or virtual elements and are tightly coupled with the smart end-devices or access networks. Fog nodes typically provide some form of data management and communication service between the peripheral layer where smart end-devices reside and the Cloud. Fog nodes, especially virtual ones, also referred to as cloudlets, can be federated to provide horizontal expansion of the functionality over disperse geolocations (NIST fog node definition in the 2017 NIST fog computing definition draft)

In other words: in fog computing the fog IoT application decides, depending on the data, what the best place for data analysis is and then sends the data there.

If the data is highly time-sensitive (typically needing action within a second, or far less), it is sent to the fog node closest to the data source for analysis. If it is less time-sensitive (typically seconds to minutes), it goes to a fog aggregation node, and if it essentially can wait it goes to the cloud for, among others, big data analytics.
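The tiering logic described above can be sketched in a few lines. This is an illustrative simplification, not an actual fog framework API; the function name and the exact thresholds are hypothetical stand-ins for the sub-second / seconds-to-minutes / can-wait distinction the text makes.

```python
# Hypothetical sketch of the fog application's routing decision: pick a
# processing tier for a reading based on how quickly it must be acted upon.

def route_reading(latency_tolerance_s: float) -> str:
    """Return the tier a sensor reading should be sent to.

    latency_tolerance_s: how long the application can wait for analysis.
    """
    if latency_tolerance_s < 1.0:      # highly time-sensitive: analyze nearby
        return "nearest fog node"
    elif latency_tolerance_s < 60.0:   # seconds to minutes: aggregate first
        return "fog aggregation node"
    else:                              # can wait: batch / big data analytics
        return "cloud"

print(route_reading(0.05))    # → nearest fog node
print(route_reading(10.0))    # → fog aggregation node
print(route_reading(3600.0))  # → cloud
```

A real fog application would weigh more than latency (data volume, priority, connectivity), but the tier split follows this shape.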

A fog node can take many shapes. As long as a device has the capabilities (computing, storage and connectivity) to do what it needs to do at the edge, it can be a fog node. It could be a switch, a router, an industrial controller or even a video surveillance camera at some industrial location, to name a few.

Fog computing visually explained – source Cisco blog post announcing the launch of the OpenFog Consortium

A fog node can also sit at many places: on the factory floor, on an oil rig or inside a car, for example, provided there is connectivity of course.

It’s clear that if a fog node needs to do its job in milliseconds, or at least under a second, that’s typically because an action, automated or otherwise, needs to follow. And without connectivity that is pretty hard.

Fog computing in action

The actions taken based upon the analysis of (IoT) data in a fog node (if that’s where the fog application sent the data from the sensors or IoT end devices) can also take many shapes.

Just like a transducer (a sensor or actuator), a fog node sets something in motion (output) based upon an input, with the input data analyzed very rapidly. That output or action could be almost anything: automatically lowering a temperature, changing parameters in a system, closing or opening a valve or door, sounding an alarm, alerting an engineer via a message, triggering a change in a visualized display or chart, for instance in a SCADA/HMI system, and more.
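The transducer analogy above (input analyzed rapidly, output set in motion) can be illustrated with a minimal rule. The function, thresholds and action strings are hypothetical examples, not part of any real fog or SCADA API.

```python
# Illustrative only: a fog-node rule that turns an analyzed input (a
# temperature reading) into an output action, in the spirit of the
# transducer analogy. Thresholds and action names are made up.

def handle_temperature(reading_c: float) -> str:
    if reading_c > 90.0:
        return "close valve and sound alarm"  # immediate automated action
    elif reading_c > 75.0:
        return "alert engineer"               # human-in-the-loop action
    return "log to SCADA/HMI"                 # routine visualization update

print(handle_temperature(95.0))  # → close valve and sound alarm
print(handle_temperature(80.0))  # → alert engineer
```

The point is the shape of the loop, not the specifics: sub-second analysis at the node enables an action to follow immediately, without a round trip to the cloud.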

Fog Computing on the Cisco Technology Radar – before and after – full image and source

Simply said: instead of transporting all data over the network and then processing it, for instance in the cloud, some operations, mainly analytical ones, are performed close to the IoT device where the data is gathered, at the edge of the network or the endpoint. This processes IoT data faster, which matters for a myriad of possible reasons, while also saving bandwidth that would otherwise be wasted.

There’s a bit more to it but in a nutshell that is what it does. Of course not all IoT data needs to be analyzed so fast that you need your analysis and computing power this close to the source, and it isn’t just about bandwidth and latency. It’s also about priorities.
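A minimal sketch of the bandwidth argument: instead of shipping every raw sample to the cloud, a fog node forwards only a summary plus the out-of-range samples that need attention. The function and field names here are hypothetical illustrations, not a real protocol.

```python
# Hypothetical edge-side reduction: summarize a batch of readings locally
# and forward only the compact summary and any anomalies upstream.

def summarize(samples: list[float], lo: float, hi: float) -> dict:
    anomalies = [s for s in samples if not lo <= s <= hi]
    return {
        "count": len(samples),                  # raw volume stays local
        "mean": sum(samples) / len(samples),    # compact aggregate for the cloud
        "anomalies": anomalies,                 # only these need immediate attention
    }

raw = [20.1, 20.3, 19.8, 42.7, 20.0]            # five raw readings
payload = summarize(raw, lo=15.0, hi=25.0)
print(payload["anomalies"])                     # → [42.7]
```

Five readings collapse to one small payload; at industrial data rates that reduction is where the bandwidth savings come from.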

Fog computing offers many benefits: avoiding the cost of ever more bandwidth, lower network latency, fewer bottlenecks, a reduced risk of connectivity failures, faster analysis and action, and more.

The OpenFog Consortium and OpenFog Reference Architecture

At the end of 2015 a range of IoT leaders launched the OpenFog Consortium. The aim, as the press release stated: accelerate the deployment of fog technologies through the development of an open architecture. The founding partners were not the smallest and included (obviously) Cisco, as well as ARM, Dell, Intel, Microsoft and Princeton University.

Calling fog computing the distributed cloud technology that enables many of the real-time, data-intensive capabilities of the Internet of Things, 5G mobile technology, and artificial intelligence applications, Cisco’s Helder Antunes, who chairs the Consortium, wrote a blog post in early 2018 looking back at some of the accomplishments of the OpenFog Consortium in 2017.

The major fog computing milestone was no doubt the release of the OpenFog Reference Architecture, as depicted below, which describes the various interrelationships of fog computing components. You can also learn more about that OpenFog Reference Architecture framework in the video at the bottom of this post.

OpenFog Reference Architecture – the various components of the fog computing framework and how they are interconnected – source and courtesy

A second noteworthy fact for those interested in the usage of fog computing was the publication of the OpenFog Security Requirements and Approaches, which expands on the security aspects of the OpenFog architecture and the security challenges in a distributed cloud environment, and which you can check out in PDF here.

Also in October 2017, the IEEE Standards Association (Institute of Electrical and Electronics Engineers) announced it would use the OpenFog Reference Architecture as the basis for its work on fog standards. As Helder Antunes writes, the newly formed IEEE P1934 Standards Working Group on Fog Computing and Networking Architecture Framework expects to complete the first iteration of its work by April 2018.

The definition of fog computing – what is fog computing?

Here is how the consortium defines fog computing:

“Fog computing is a system-level horizontal architecture that distributes resources and services of computing, storage, control and networking anywhere along the continuum from Cloud to Things. By extending the cloud to be closer to the things that produce and act on IoT data, fog enables latency sensitive computing to be performed in proximity to the sensors, resulting in more efficient network bandwidth and more functional and efficient IoT solutions. Fog computing also offers greater business agility through deeper and faster insights, increased security and lower operating expenses”.

However, let’s also add that NIST, known from its work on, among others, cloud computing and its recently published draft on blockchain, also joined the fog computing evolution, seeking a formal definition of fog computing, just as it did before for the various cloud service models and so forth.

With the comments closed on September 21, 2017, NIST Special Publication 800-191 (Draft) defines fog computing as “a horizontal, physical or virtual resource paradigm that resides between smart end-devices and traditional cloud or data centers. This paradigm supports vertically-isolated, latency-sensitive applications by providing ubiquitous, scalable, layered, federated, and distributed computing, storage, and network connectivity”.

The image from the NIST fog computing definition draft below shows fog computing in the broader scope of a cloud-based ecosystem serving smart end-devices.

Fog computing in the broader context of a cloud-based ecosystem serving smart end-devices according to NIST, which adds that it is important to note that, in the authors’ view, fog computing is not perceived as a mandatory layer for such an ecosystem – source

NIST also came up with a formal definition of a fog node in its draft document, defining fog nodes as intermediary compute elements of the smart end-devices’ access network that are situated between the cloud and the smart end-devices.

  • From a service-level model perspective, since fog computing is an extension of cloud computing, the NIST document adopted the well-known service models SaaS, PaaS and IaaS for fog computing too.
  • From a node deployment model perspective, it identifies four types which also ring a few bells when compared with cloud: private fog node, public fog node, hybrid fog node and community fog node.

The fog computing market: $18 billion by 2022

According to research released at the end of October 2017 on the occasion of the Fog World Congress, the global fog computing market is expected to exceed $18 billion by 2022.

The OpenFog Consortium, which deems fog necessary for IoT, 5G and embedded artificial intelligence, commissioned 451 Research to dive deeper into the main markets for fog computing and networking, compare cloud versus on-premises spend and break down the market into various segments (hardware, fog applications and services).

As the chart below, released on that occasion, shows, 51.6 percent of revenue goes to hardware, followed by fog applications (19.9 percent) and services (15.7 percent).

The biggest markets are transportation, industrial, energy/utilities and healthcare. Cloud revenue is expected to go up by 147 percent by 2022 and fog is expected to go into existing devices and software, working with new single-purpose fog nodes. Fog as a Service (FaaS) should double its growth between 2018 and 2022.

Fog computing and networking – fog computing outlook 2022 by 451 Research for the OpenFog Consortium – source

While the main takeaways were shared at the Fog World Congress, an additional finding reported in the press release is that the major market changes driving the growth of fog include investment in the modernization of energy infrastructure, demographic shifts, and regulations in healthcare and transportation.

Christian Renaud, research director, Internet of Things, at 451 Research and lead author of the report, said in the press release: “Through our extensive research, it’s clear that fog computing is on a growth trajectory to play a crucial role in IoT, 5G and other advanced distributed and connected systems. It’s not only a technology path to ensure the optimal performance of the cloud-to-things continuum, but it’s also the fuel that will drive new business value.”

Below is the video that dives deeper into the OpenFog Reference Architecture which puts some other terms and pillars in perspective.

Fog computing: many benefits but not always the best solution

We’ll leave the last word, how could we not, to someone from Cisco who blogged on the occasion of the launch of the OpenFog Consortium and gave some examples of the possibilities of fog computing, going back to the roots and the first version of this overview.

Fog computing effectively addresses issues related to security, cognition, agility, latency and efficiency (OpenFog Consortium PR)

We quote: “Fog computing can provide immense value across all industries. For example, it might take 12 days via satellite to transmit one day’s worth of data to the cloud from a remote oil rig. With fog computing the data is processed locally, and safety or equipment alerts can be acted upon immediately”. More examples in the blog post.

Is fog always the best solution? No, there are circumstances where cloud computing is a better fit. It’s about striking the right balance and picking the best mix for the purpose of each different scenario. As it always is.

Fog computing is the system-level architecture that brings computing, storage, control, and networking functions closer to the data-producing sources along the cloud-to-thing continuum (OpenFog Consortium PR)

Moreover, in an often-cited white paper from, yes, Cisco, entitled “Fog Computing and the Internet of Things: Extend the Cloud to Where the Things Are”, the authors offer an overview of when to consider fog computing, as you can read in the paper (PDF opens): for example, when data is collected at the extreme edge, such as on ships, roadways or, closer to home, factory floors; when data is generated at high volume across a large geographic area; and, of course, when analyzing this data and acting upon it needs to happen really fast.

Top image: Shutterstock – Copyright: phoenixman – All other images are the property of their respective mentioned owners.