Fog computing

IoT, the internet of things, has changed the IT landscape worldwide. Even more than the connectedness of everyday items, so-called smart manufacturing poses new challenges for traditional cloud architectures. IoT will become a key technology for industrial facilities within the framework of Industry 4.0. “Smart manufacturing” is a public-private initiative led by the nonprofit Smart Manufacturing Leadership Coalition (SMLC). The vision: manufacturing facilities and logistics systems that communicate independently to organize work processes, improving both productivity and energy efficiency.

However, so far many states are lagging behind in Industry 4.0 readiness, with rural manufacturing sites often lacking the ultra-fast internet access necessary for interconnectivity. The nonprofit group Connected Nation describes the challenges of current broadband expansion plans for rural America, which calls into question whether the smart manufacturing concept can be implemented comprehensively. After all, a fully connected industrial facility could generate several hundred gigabytes of data every day – vast amounts that can neither be uploaded to the cloud over such connections nor processed centrally using established technologies.

Possible solutions to these obstacles in the implementation of IoT are being developed in the field of fog computing.

What is fog computing? A definition

Fog computing is a cloud technology in which data generated by end devices is not uploaded directly to the cloud, but is instead preprocessed in decentralized mini data centers. The concept adds a layer of processing between the network’s outer perimeter, where IoT devices generate the data, and the central data endpoint – either a public cloud or a private data processing center (private cloud).

Note

A network’s boundary is called the “edge” in IT jargon. So a device located at the edge of a network (such as a sensor on a production line) is called an “edge device.”

The marketing term “fog computing” originates from the technology company Cisco, one of the leading manufacturers of network solutions. The expression is actually a clever metaphor that refers to the difference between fog and clouds. Both result from an accumulation of fine water droplets but are differentiated by the height at which the phenomenon occurs. In terms of IT architecture, fog computing brings the processing of data back to ground level. This is implemented with computer nodes called fog nodes that act as intermediaries between the cloud and numerous IoT terminal devices.

The aim of so-called “fogging” is to shorten communication paths and reduce the volume of data transmitted over external networks. Fog nodes form an intermediate layer in the network where it is decided which data is processed locally and which is forwarded to the cloud or to a central data center for further analysis or processing.
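
The routing decision made by a fog node can be illustrated with a short sketch. The following Python snippet is purely illustrative: the temperature sensor, the threshold value, and the batch-based aggregation are assumptions chosen for demonstration, not part of any specific fog platform.

```python
# Minimal sketch of the decision a fog node might make for a batch of readings,
# assuming a hypothetical temperature sensor and a simple threshold rule.

from statistics import mean

LOCAL_THRESHOLD = 80.0  # °C; readings above this trigger an immediate local action (assumed value)

def handle_readings(readings):
    """Decide whether a batch of readings is handled locally or summarized for the cloud."""
    if max(readings) > LOCAL_THRESHOLD:
        # Time-critical: act locally without waiting for the cloud.
        return {"action": "local_alert", "value": max(readings)}
    # Not time-critical: forward only an aggregate to the cloud.
    return {"action": "forward_to_cloud",
            "summary": {"mean": round(mean(readings), 2), "count": len(readings)}}

print(handle_readings([71.2, 73.5, 72.8]))   # forwarded as an aggregate
print(handle_readings([71.2, 96.1, 72.8]))   # handled locally in real time
```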

The following overview shows the three layers of a fog computing architecture (a minimal code sketch of the resulting data flow follows the list):

  • Edge layer: The edge layer includes all of an IoT architecture’s “smart” devices (edge devices). Data generated from the edge layer is either processed on the device directly or transmitted to a server (fog node) in the fog layer.
  • Fog layer: The fog layer includes a number of powerful servers that receive data from the edge layer, preprocessing and uploading it to the cloud as needed.
  • Cloud layer: The cloud layer is the central data endpoint of a fog computing architecture.
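
To make the data flow through these layers concrete, here is a minimal Python sketch. The class names, the batch size of three readings, and the in-memory “upload” are assumptions made for illustration; a real deployment would use actual network protocols and persistent storage.

```python
# Illustrative model of the three layers as plain Python classes.

class CloudLayer:
    """Central data endpoint that stores whatever the fog layer uploads."""
    def __init__(self):
        self.storage = []
    def upload(self, record):
        self.storage.append(record)

class FogNode:
    """Fog-layer server: receives edge data, preprocesses it, uploads as needed."""
    def __init__(self, cloud):
        self.cloud = cloud
        self.buffer = []
    def receive(self, reading):
        self.buffer.append(reading)
        if len(self.buffer) >= 3:                       # preprocess in small batches (assumed size)
            avg = sum(self.buffer) / len(self.buffer)
            self.cloud.upload({"avg": round(avg, 2)})   # only the aggregate leaves the site
            self.buffer.clear()

class EdgeDevice:
    """Edge-layer device that generates data and sends it to its fog node."""
    def __init__(self, fog):
        self.fog = fog
    def measure(self, value):
        self.fog.receive(value)

cloud = CloudLayer()
fog = FogNode(cloud)
sensor = EdgeDevice(fog)
for value in (20.1, 20.4, 20.2, 21.0, 20.8, 20.9):
    sensor.measure(value)
print(cloud.storage)   # two aggregated records instead of six raw readings
```

Running the sketch shows only two aggregated records reaching the cloud layer instead of six raw readings, which is exactly the reduction in cloud traffic the fog layer is meant to provide.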

A reference architecture for fog systems is being developed by the OpenFog Consortium.

Note

The OpenFog Consortium was founded in 2015 by Cisco Systems, Intel, Microsoft, Dell, ARM Holdings, and Princeton University as a cross-sector collaboration between industry and academia. The initiative aims to standardize fog computing technologies and today comprises 57 high-tech companies and academic institutions.


Differentiation from cloud computing

What sets fog and cloud computing apart is the provision of resources and how data is processed. Cloud computing usually takes place in centralized data centers. Resources such as processing power and storage are bundled by backend servers and made available through the network for use by clients. Communication between two or more end devices always takes place via a server in the background.

Such an architecture is pushed beyond its limits with a concept like “smart manufacturing,” because it requires that data be continuously exchanged between countless end devices. Fog computing makes use of intermediate processing close to the data source in order to reduce data throughput to the data center.
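
A rough back-of-the-envelope calculation illustrates how much intermediate processing close to the data source can reduce the volume sent to the data center. All figures in the following Python sketch are assumed example values, not measurements from a real facility.

```python
# Back-of-the-envelope sketch of how local preprocessing cuts data throughput.
# All figures are assumed example values for illustration only.

sensors = 500                     # sensors in one facility (assumed)
sample_rate_hz = 10               # readings per sensor per second (assumed)
bytes_per_reading = 64            # payload size per reading (assumed)

raw_per_day = sensors * sample_rate_hz * bytes_per_reading * 86_400
print(f"raw data per day: {raw_per_day / 1e9:.1f} GB")

# If a fog node forwards only one aggregate per sensor per minute:
aggregates_per_day = sensors * (86_400 // 60) * bytes_per_reading
print(f"uploaded to the cloud: {aggregates_per_day / 1e6:.1f} MB")
print(f"reduction factor: {raw_per_day / aggregates_per_day:.0f}x")
```

With these assumed figures, local aggregation cuts the upload volume by a factor of roughly 600.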

Differentiation from edge computing

It’s not only the data throughput of large-scale IoT architectures that pushes cloud computing to its limits. Another problem is latency. Centralized data processing always involves a time delay due to long transmission paths: end devices and sensors have to communicate with each other via the server in the data center, which delays both the external processing of a request and the return of its response. Such latency becomes problematic in IoT-supported production processes, where real-time information processing is a must so that machines can react immediately when an incident occurs.

One solution to the latency problem is offered by edge computing, a concept within the framework of fog computing in which data processing is not only decentralized but takes place directly on the end device, and thus at the edge of the network. Each “smart device” is equipped with its own microcontroller, enabling basic data processing and communication with other IoT devices and sensors.
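
The following Python sketch illustrates the idea of on-device processing and direct device-to-device communication. The device names, the threshold, and the notify() callback are hypothetical and serve only to show the principle.

```python
# Minimal sketch of edge computing: the device applies a rule locally and
# notifies a neighbouring device directly, without any round trip to a server.

class SmartSensor:
    def __init__(self, name, limit):
        self.name = name
        self.limit = limit
        self.neighbours = []          # other edge devices it can talk to directly

    def link(self, other):
        self.neighbours.append(other)

    def read(self, value):
        # Local, on-device decision: no data center involved.
        if value > self.limit:
            for neighbour in self.neighbours:
                neighbour.notify(self.name, value)

    def notify(self, sender, value):
        print(f"{self.name}: received alert from {sender} (value={value})")

vibration = SmartSensor("vibration-sensor", limit=5.0)      # assumed limit
conveyor = SmartSensor("conveyor-controller", limit=float("inf"))
vibration.link(conveyor)
vibration.read(7.3)   # conveyor-controller reacts immediately, at the edge
```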

Fog computing in practice

Today’s IoT can’t be compared with tomorrow’s. According to a Cisco study, the internet of things is expected to include around 50 billion devices worldwide by 2020. The amount of data that will need to be stored, analyzed, and prepared for further processing will be correspondingly high.

Smart manufacturing is by no means the only area in which fog computing can be applied to unburden systems and facilitate data transfer. Other future projects, such as connected cars – semi-autonomous or fully self-driving vehicles – or connected cities with smart utility grids, also require real-time data analysis, which can’t be achieved with classic cloud computing. For example, a self-driving car collects data on weather conditions, road closures or detours, and traffic levels, all of which need to be assessed with minimal latency so that the vehicle can respond to unforeseeable incidents in time. In such a scenario, fog computing enables data to be processed both in the vehicle and via the service provider.

Advantages and disadvantages at a glance

Fog computing offers solutions to a variety of problems associated with cloud-based IT infrastructures. It places priority on strategies that provide easily accessible communication channels and keep uploading to the cloud to a minimum. That being said, decentralized processing at the edge of the network also has disadvantages, mainly the cost and complexity of maintenance and administration of a widely dispersed infrastructure.

The most important advantages and disadvantages of a fog computing architecture:

Advantages:

  • Less network traffic: Fog computing reduces traffic between IoT devices and the cloud.
  • Cost savings in third-party networks: Network providers bear high costs for high-speed uploads to the cloud; fog computing reduces them.
  • Offline availability: In a fog computing architecture, IoT devices remain available even when offline.
  • Lower latency: Fog computing shortens communication paths, accelerating automated analysis and decision-making processes.
  • Data security: In fogging, device data is preprocessed on the local network. Sensitive data can therefore remain within the company, or be encrypted or anonymized before being uploaded to the cloud (a small sketch of such pseudonymization follows this overview).

Disadvantages:

  • Higher hardware costs: IoT devices and sensors must be equipped with additional processing units to enable local data processing and device-to-device communication.
  • Limited protection against failure or misuse: Controllers distributed across the edge of the network, for example in manufacturing facilities, are difficult to secure against breakdown or misuse.
  • Increased maintenance requirements: Decentralized data processing requires more maintenance, since controllers and storage locations are distributed across the entire network and, unlike cloud solutions, can’t be maintained or administered centrally.
  • Additional network security requirements: Fog computing is vulnerable to man-in-the-middle attacks.
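
As mentioned under data security, a fog node can anonymize or pseudonymize data before it leaves the local network. The following Python sketch shows one possible approach, assuming a salted SHA-256 hash of the device ID; key and salt management are deliberately simplified for illustration.

```python
# Sketch of how a fog node could pseudonymize a record before cloud upload.
# The salt value and record layout are assumptions for demonstration only.

import hashlib

SALT = b"site-local-secret"   # assumed to stay on the local network

def pseudonymize(record):
    """Replace the real device ID with a salted hash before upload."""
    device_id = record["device_id"].encode()
    hashed = hashlib.sha256(SALT + device_id).hexdigest()[:16]
    return {**record, "device_id": hashed}

raw = {"device_id": "press-line-3/sensor-7", "temp": 74.2}
print(pseudonymize(raw))   # the real device identity never leaves the site
```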

Outlook: fog computing and 5G mobile communications

The new 5G mobile communications standard, with median download speeds of 1.4 Gbps, will begin rolling out in certain cities in late 2018, while other areas won’t follow until 2020. Experts predict enormous data growth, especially in the professional sector. With 5G, the bandwidth and speed of mobile data transmission will increase dramatically, opening up entirely new application possibilities in the industry and service sectors. 5G promises users latency in the sub-millisecond range. However, the benefits of the new mobile communications standard can’t be fully exploited with current cloud technologies. One reason is the number of hops a data packet needs to travel from its source to its destination in the cloud. This makes direct data upload to the cloud unsuitable for applications that require real-time data processing.

Note

Hop refers to the path of a data packet from one network node to the next. The more network nodes a data packet passes (the hop count), the longer the delay time (latency) of the data transmission.
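
A toy calculation makes the relationship between hop count and latency tangible. The per-hop delay used in the following Python sketch is an assumed illustrative value, not a measurement.

```python
# Toy model of how hop count drives latency.

PER_HOP_MS = 2.5   # assumed average delay added by each network node

def path_latency(hop_count):
    """Rough end-to-end delay for a packet crossing hop_count nodes."""
    return hop_count * PER_HOP_MS

print(f"edge device -> local fog node (2 hops):  {path_latency(2):.1f} ms")
print(f"edge device -> central cloud (12 hops): {path_latency(12):.1f} ms")
```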

This is where fog computing provides an option that makes 5G usable for industry. A decentralized fog system provides computing power and storage capacity at the edge of the network. Data generated by business applications can thus be evaluated locally, then selected and aggregated for the cloud. This ensures that important outcomes – such as a command for the emergency shutdown of a production facility – are executed in real time. Only data that can’t be evaluated locally (or only with difficulty), or that requires more detailed analysis, is then transferred to the cloud. This might include unexpected measurements indicating that a machine is due for maintenance.
