Industry experts have warned that the cloud-computing models deployed in many IoT systems today are ill-equipped to handle the volume of data generated by the billions of IoT devices slated to go online in the next few years. The need for instant decision making, coupled with concerns over data security, has led early adopters of the IoT to consider alternative computing models. In this article, we look at the key distinctions between the cloud and fog computing models and discuss which is more suitable for your IoT applications.
What is Fog Computing?
Fog computing is a term coined by Cisco to describe computing on devices in an intermediate layer, called the fog layer, between the cloud and the IoT edge devices. The fog layer consists of fog nodes, which are essentially industrial controllers, gateway computers, switches, and I/O devices that provide computing, storage, and connectivity services. The fog computing model extends the cloud closer to the edge of your network where the devices reside, and facilitates edge intelligence.
Cloud Computing vs. Fog Computing
The cloud computing model in the IoT is about centralized data processing. In contrast, fog computing focuses on moving computational power, storage capacity, device-control capability, and networking power closer to the devices.
As the IoT evolves and proliferates into virtually every business domain, high-speed data processing, big-data analytics, and shorter response times are becoming the norm. Meeting these requirements through the current centralized cloud-based model is proving to be difficult, whereas the decentralized architecture of the fog computing model can bring computing resources and application services closer to the edge, thereby enabling faster response times.
Fog computing works best in IoT-based systems with geographically dispersed end devices, where connectivity to cloud-based systems is irregular but low latency is a key requirement. IoT applications that periodically generate data on the order of terabytes, where sending data to the cloud and back is not feasible, are also good candidates for the fog computing model.
IoT applications that process large volumes of data at distributed on-site locations and require quick response times are better served by a hybrid model that consists of multiple cloud and fog resources.
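As a rough illustration of such a hybrid model, a deployment might route each workload by its latency requirement: time-critical decisions are handled on a nearby fog node, while bulk analytics are queued for the cloud. The function name, task fields, and latency cutoff below are hypothetical, chosen only to make the idea concrete.

```python
# Hypothetical dispatcher for a hybrid cloud/fog deployment.
# Latency-critical tasks run on the fog node; the rest go to the cloud.

FOG_LATENCY_BUDGET_MS = 100  # assumed cutoff for "needs a local response"

def dispatch(task):
    """Return which tier should handle the given task."""
    if task["max_latency_ms"] <= FOG_LATENCY_BUDGET_MS:
        return "fog"    # e.g. shutting down a machine on an anomaly
    return "cloud"      # e.g. fleet-wide trend analysis

tasks = [
    {"name": "valve-control", "max_latency_ms": 20},
    {"name": "weekly-report", "max_latency_ms": 60_000},
]
routing = {t["name"]: dispatch(t) for t in tasks}
print(routing)  # {'valve-control': 'fog', 'weekly-report': 'cloud'}
```

In practice the routing decision would also weigh connectivity, data volume, and security policy, but latency alone already captures the core trade-off between the two tiers.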
The Role of Fog Nodes in Fog Computing
At the heart of the fog computing model are fog nodes. Fog nodes are geographically distributed resource-rich devices that can be deployed anywhere in a network.
Discussion on what the ideal profile of a fog node should be is ongoing, but the critical responsibilities of a fog node are:
• Receives real-time data from IoT devices
• Runs IoT-enabled applications for real-time analytics
• Responds to requests, typically within milliseconds
• Provides temporary data storage until the necessary data is transferred to the cloud
• Sends periodic summaries of the data collected from the devices to the cloud
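The responsibilities above can be sketched as a simple processing loop. Everything here is illustrative: the class and method names are invented for this sketch, and a real fog node would use proper messaging, persistence, and scheduling.

```python
import statistics

class FogNode:
    """Illustrative fog node: buffer device readings, answer local
    requests immediately, and periodically summarize to the cloud."""

    def __init__(self):
        self.buffer = []  # temporary local storage until data moves to the cloud

    def ingest(self, reading):
        """Receive real-time data from an IoT device."""
        self.buffer.append(reading)

    def latest(self):
        """Answer a local request without a round trip to the cloud."""
        return self.buffer[-1] if self.buffer else None

    def summarize(self):
        """Build the periodic summary sent upstream, then free local storage."""
        if not self.buffer:
            return None
        summary = {
            "count": len(self.buffer),
            "mean": statistics.mean(self.buffer),
            "max": max(self.buffer),
        }
        self.buffer.clear()  # data has been handed off to the cloud
        return summary

node = FogNode()
for value in (21.0, 21.5, 30.0):
    node.ingest(value)
print(node.latest())     # 30.0
print(node.summarize())  # count, mean, and max of the buffered readings
```

Note that only the small summary dictionary travels upstream; the raw readings stay on the node, which is exactly how fog computing reduces cloud bandwidth and response times.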
A framework based on a smart gateway (computer or router) with industrial-strength reliability, running open Linux together with Docker containers and embedding the vendor's own proprietary applications, is being touted as an ideal solution. The open Linux platform enables easy porting of IoT applications to the IT infrastructure while providing multi-vendor support and programmability. Some solution providers are proposing a layer of abstraction between the OS and the applications to facilitate easy deployment and management of applications on the fog node. Powered by these features, a fog node can intelligently process large volumes of data received from sensors and field monitors and send only critical data, or a summary of the data, to the cloud.
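The "send only critical data" behavior described above could look like the following threshold filter. The field names and the alarm limit are assumptions made purely for this sketch.

```python
# Illustrative filter: forward only out-of-range readings to the cloud,
# keeping routine traffic on the fog node.

TEMP_LIMIT_C = 80.0  # assumed alarm threshold for this example

def critical_only(readings, limit=TEMP_LIMIT_C):
    """Return just the readings that warrant immediate cloud attention."""
    return [r for r in readings if r["temp_c"] > limit]

readings = [
    {"sensor": "pump-1", "temp_c": 45.2},
    {"sensor": "pump-2", "temp_c": 91.7},  # over the limit: critical
    {"sensor": "pump-3", "temp_c": 60.1},
]
print(critical_only(readings))  # [{'sensor': 'pump-2', 'temp_c': 91.7}]
```

A production fog node would apply richer analytics than a fixed threshold, but the principle is the same: the node decides locally what is worth sending upstream.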
Moxa’s fog computing solution consists of a powerful data-acquisition and device-control platform based on Moxa Industrial Linux. For additional details, download the Fog Computing White Paper.

Moxa joins industry leaders, such as Codethink, Hitachi, Plat’Home, Renesas, Siemens, and Toshiba, to create a reliable and secure Linux-based embedded software platform that can be sustained for more than 10 years through the Civil Infrastructure Platform (CIP), an open source project hosted by The Linux Foundation. The CIP project aims to provide a base layer of industrial-grade open source software components, tools, and methods to enable long-term management of critical systems. For additional information on the CIP, visit the CIP website.