DEV Community


Cloud: Fog or Edge Computing

Julia
US Army Veteran | B.S. in Computer Science | Full-stack developer | INTJ-T

WHAT IS FOG COMPUTING?

As we learned, cloud computing as a concept has been around since the 1960s. In recent years, however, a new technology has emerged, called edge or fog computing. In cloud computing, web applications and information processing are centralized at data centers situated in a limited number of locations. This model has numerous technological and cost advantages, yet not all applications are suitable for migration to the cloud. For example, some applications require nearby nodes to meet their delay requirements and therefore demand tight control over hardware locations. In addition, a wide deployment of IoT (Internet of Things) devices requires mobility support and geo-distribution in addition to location control.

As per IoT Agenda, TechRadar’s blog post, we can define fog computing as a decentralized computing infrastructure in which data, compute, storage, and applications are located somewhere between the data source and the cloud. Fog computing brings the advantages and power of the cloud closer to where data is created and acted upon, and is envisioned as a highly distributed instantiation of cloud computing. The goal of fog computing is to add to the existing value of cloud computing with advantages such as improved power efficiency, reduced latency, enhanced security, and much higher scalability for mobile and distributed devices. Below is a list of features that illustrate the contrast between cloud and fog computing:

• Edge location, location awareness, and low latency. Wireless access points and cellular mobile gateways are examples of fog network nodes.
• Support for online analytics and real-time interactions. The fog plays a significant role in the data processing close to the source.
• Geographical distribution. The fog is suited for applications and services that involve widely distributed deployments.
• Support for mobility. It is essential for fog applications to communicate directly with mobile devices.
• Scalability. The fog plays an important role in scaling up Internet services by several orders of magnitude.
• Heterogeneity. Fog nodes come in different form factors and are built upon heterogeneous platforms.
• Interoperability and federation. Seamless support of certain services, such as video streaming, requires cooperation of different providers.

CHALLENGES

While fog computing may seem like an ideal solution for many computing problems, it also comes with its own challenges.

Fog Networking

As we previously stated, the fog network is heterogeneous. It is responsible for connecting every component of the fog; however, managing such a network while maintaining connectivity and providing services is not easy. Two techniques have been proposed to create a flexible and easily maintainable fog environment.

  1. SDN (Software Defined Networking). In the fog, each node should be able to act as a router for nearby nodes, and the network should be resilient to node mobility. The challenge of integrating SDN into the fog is accommodating these dynamic conditions. Proposed designs of SDN-based mobile architectures have shown feasibility by achieving a high packet delivery ratio at the cost of some overhead, adapting SDN from wired ports to heterogeneous wireless interfaces to support fog applications.
  2. NFV (Network Function Virtualization). NFV replaces dedicated network hardware with virtual machine instances. It benefits fog computing in many aspects by virtualizing gateways, switches, load balancers, firewalls, and intrusion detection devices and placing those instances on fog nodes.
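To make the NFV idea concrete, here is a minimal sketch (all node names, capacities, and the first-fit heuristic are invented for illustration, not taken from a specific NFV system) of placing virtualized network functions on fog nodes:

```python
from dataclasses import dataclass, field

# Hypothetical model: each fog node hosts virtualized network
# functions (VNFs) instead of dedicated hardware appliances.
@dataclass
class FogNode:
    name: str
    cpu_capacity: int                        # remaining vCPUs
    vnfs: list = field(default_factory=list)

    def deploy(self, vnf_name: str, cpu_needed: int) -> bool:
        """Place a VNF instance on this node if capacity allows."""
        if cpu_needed <= self.cpu_capacity:
            self.cpu_capacity -= cpu_needed
            self.vnfs.append(vnf_name)
            return True
        return False

def place_vnfs(nodes, requests):
    """First-fit placement: try each node in order for every VNF."""
    placement = {}
    for vnf_name, cpu in requests:
        for node in nodes:
            if node.deploy(vnf_name, cpu):
                placement[vnf_name] = node.name
                break
    return placement

nodes = [FogNode("gateway-1", cpu_capacity=4),
         FogNode("gateway-2", cpu_capacity=2)]
requests = [("firewall", 2), ("load-balancer", 2), ("ids", 2)]
placement = place_vnfs(nodes, requests)
print(placement)
```

Real NFV orchestrators solve a much harder optimization problem; first-fit is only meant to show the idea of treating a firewall or load balancer as a schedulable software instance.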

Quality of Service (QoS)

QoS is an important metric for fog computing and can be divided into four categories:

  1. Connectivity. Fog computing provides new opportunities for cost reduction through network relaying, partitioning, and clustering. However, the end user’s selection of fog nodes can drastically impact performance.
  2. Reliability. Normally, reliability can be improved through periodic checkpointing to resume after failure, rescheduling of failed tasks, or replication to exploit parallel execution. These methods may not suit fog computing, however, because the fog is highly dynamic: checkpointing and rescheduling add latency and cannot adapt quickly to changes.
  3. Capacity. Capacity has two aspects: network bandwidth and storage capacity. To achieve high bandwidth and efficient storage use, it is important to know how data is placed in the fog network. This brings new challenges to fog computing, such as the difficulty of computing over data that is spread across several nodes. Other challenges come from the design interplay between fog and cloud to accommodate different workloads.
  4. Delay. Latency-sensitive applications, such as streaming, have their own challenges. To avoid delay, a few solutions have been proposed; one of them is real-time stream processing rather than batch processing.
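A tiny sketch of how connectivity, reliability, and delay interact in node selection (node names, latencies, and the deadline rule are all made up for illustration): pick the lowest-latency healthy node that meets the deadline, and signal the caller when none does so the task can be rescheduled, echoing the rescheduling idea above.

```python
# Hypothetical fog node list: (name, round-trip latency in ms, healthy?)
nodes = [
    ("cell-tower-a", 12, True),
    ("wifi-ap-b", 5, False),    # currently unreachable
    ("cell-tower-c", 25, True),
]

def pick_node(nodes, deadline_ms):
    """Choose the lowest-latency healthy node that meets the deadline.

    Returns None when no node qualifies, so the caller can fall back
    to rescheduling or to the cloud.
    """
    candidates = [(lat, name) for name, lat, ok in nodes
                  if ok and lat <= deadline_ms]
    return min(candidates)[1] if candidates else None

print(pick_node(nodes, deadline_ms=20))  # cell-tower-a
print(pick_node(nodes, deadline_ms=3))   # None -> caller reschedules
```

Note that the nearest node (wifi-ap-b) loses despite the best latency because it is down, which is exactly the connectivity/reliability tension the list above describes.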

Resource Management

Cloud provisioning and resource management remain interesting topics in the fog computing environment.

• Application-aware provisioning plans operator migration ahead of time, meeting end-to-end latency constraints while reducing network utilization.
• Resource discovery and sharing are critical for application performance in the fog. Dynamically selecting between centralized and flooding strategies can save energy in heterogeneous networks.
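The centralized-versus-flooding choice can be sketched with a toy cost model (the heuristic and all parameters below are invented assumptions, not the published algorithm): query a central registry when it is reachable and cheaper, otherwise flood the neighborhood.

```python
# Toy heuristic: contacting a registry has a fixed cost, while
# flooding costs roughly one unit of energy per neighbor woken up.
def discovery_strategy(neighbor_count: int, registry_reachable: bool,
                       flood_cost_per_node: float = 1.0,
                       registry_cost: float = 5.0) -> str:
    """Pick the lower-energy resource-discovery strategy."""
    if not registry_reachable:
        return "flooding"           # no registry: flooding is the only option
    flood_cost = neighbor_count * flood_cost_per_node
    return "centralized" if registry_cost < flood_cost else "flooding"

print(discovery_strategy(neighbor_count=20, registry_reachable=True))  # centralized
print(discovery_strategy(neighbor_count=3, registry_reachable=True))   # flooding
```

In a dense neighborhood one registry round trip beats waking twenty neighbors; in a sparse one, flooding a handful of nodes is cheaper than maintaining registry connectivity.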

Security and Privacy

There are several security and privacy issues, and quite a few works have been published on the topic. Four topics have been extensively studied:

  1. Authentication. With the emergence of biometric authentication in mobile and cloud computing, applying biometric authentication to fog computing will be beneficial.
  2. Access control. Access control has been a reliable tool on smart devices and in the cloud, ensuring system security. Extending the data owner’s control into the fog can be achieved by combining several encryption schemes to build efficient, fine-grained data access control.
  3. Intrusion detection. Intrusion detection techniques have been applied to cloud infrastructures to mitigate attacks such as insider attacks, flooding attacks, port scanning, and attacks on the VM or hypervisor. These detection systems are deployed on the host machine, a VM, or the hypervisor; they can also be deployed on the network side to detect malicious activity. The fog offers new opportunities to investigate how fog computing can help with intrusion detection.
  4. Privacy. Users are concerned about the risk of privacy leaks, such as data, location, or usage leaks on the Internet. In the fog network, privacy-preserving algorithms can run between the fog and the cloud, since both sides have sufficient computation and storage, whereas such algorithms are usually too resource-intensive for end devices.
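As a hedged sketch of the privacy point (the Laplace-noise mechanism below stands in for a real differential-privacy algorithm; the sensor names and parameters are invented): devices report raw readings to a nearby fog node, which forwards only a noised aggregate to the cloud, so individual readings never leave the edge.

```python
import random

def fog_aggregate(readings, epsilon=1.0, sensitivity=1.0):
    """Sum readings at the fog node and add Laplace noise before
    forwarding to the cloud. Larger epsilon means less noise."""
    total = sum(readings)
    scale = sensitivity / epsilon
    # A Laplace sample is the difference of two exponential samples.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return total + noise

device_readings = [21.5, 22.0, 20.8, 21.9]   # e.g. temperature sensors
noised_sum = fog_aggregate(device_readings)
# The cloud sees only noised_sum, never the individual readings.
print(noised_sum)
```

The point of running this at the fog layer, per the article, is that the fog node has enough compute to afford the privacy mechanism while the constrained end devices do not.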
