What is it about?

With recent developments in the Internet of Things (IoT), big data, and machine learning, the number of services has increased dramatically. These services are heterogeneous in terms of their resource demands and quality of service (QoS) requirements. To cope with the limitations of Cloud infrastructure providers (CIPs) for latency-sensitive services, many Fog infrastructure providers (FIPs) have recently emerged, and their number continues to grow. Because of the differing requirements of services, the locations of end users, and the cost profiles of infrastructure providers (IPs), distributing services across multiple FIPs and CIPs has become a fundamental challenge. Motivated by this, this work proposes FLEX, a flexible and scalable platform for the service placement problem (SPP) in multi-Fog and multi-Cloud environments. For each service, FLEX broadcasts the service’s requirements to the resource managers (RMs) of all providers and then, based on the RMs’ responses, selects the most suitable provider for that service. The platform is flexible and scalable because it leaves each RM free to apply its own service placement policy. The problem is formulated as an optimization problem, and an efficient heuristic algorithm is proposed to solve it. Our simulation results show that the proposed algorithm can meet the requirements of services.
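
To give a feel for the broadcast-and-select flow described above, here is a minimal Python sketch. It is not the authors' implementation; the names (`Service`, `Offer`, `ResourceManager`, `place_service`) and the "cheapest feasible offer" selection rule are assumptions made purely for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Service:
    """A service request with its resource and QoS requirements (illustrative fields)."""
    cpu: float          # required CPU cores
    memory: float       # required memory (GB)
    max_latency: float  # latency bound (ms)

@dataclass
class Offer:
    """A resource manager's response to a broadcast request (illustrative fields)."""
    provider: str
    feasible: bool      # can this provider meet the service's requirements?
    cost: float         # provider-specific cost estimate for hosting the service
    latency: float      # expected latency for the service's end users

class ResourceManager:
    """Each Fog/Cloud provider runs its own RM with its own placement policy."""
    def __init__(self, provider: str):
        self.provider = provider

    def respond(self, service: Service) -> Offer:
        # A real RM would check its local capacity, pricing model, and
        # proximity to the service's end users.
        raise NotImplementedError

def place_service(service: Service, rms: List[ResourceManager]) -> Optional[str]:
    """Broadcast the service's requirements to all RMs and pick the best offer."""
    offers = [rm.respond(service) for rm in rms]          # broadcast step
    feasible = [o for o in offers
                if o.feasible and o.latency <= service.max_latency]  # QoS filter
    if not feasible:
        return None                                       # no provider can host the service
    best = min(feasible, key=lambda o: o.cost)            # e.g. cheapest feasible offer
    return best.provider
```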


Why is it important?

FLEX provides two important features: flexibility and scalability. It is flexible because it allows Fog and Cloud infrastructure providers to implement their own service placement strategies, and scalable because new Fog and Cloud providers can easily be added to the platform.
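
Continuing the illustrative sketch above, these two features could correspond to subclassing the hypothetical `ResourceManager` with a provider-specific policy (flexibility) and simply appending new providers' RMs to the list passed to `place_service` (scalability). The numbers below are made up for the example.

```python
class FogRM(ResourceManager):
    """Example provider-specific policy: a Fog RM offering low latency at higher cost."""
    def respond(self, service: Service) -> Offer:
        return Offer(self.provider, feasible=True, cost=2.0, latency=5.0)

class CloudRM(ResourceManager):
    """Example provider-specific policy: a Cloud RM offering lower cost but higher latency."""
    def respond(self, service: Service) -> Offer:
        return Offer(self.provider, feasible=True, cost=1.0, latency=40.0)

# Adding a provider is just adding its RM to the list (scalability);
# each RM decides for itself how to answer the broadcast (flexibility).
rms = [FogRM("fog-1"), CloudRM("cloud-1")]
print(place_service(Service(cpu=2, memory=4, max_latency=20), rms))  # -> "fog-1"
```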

Perspectives

Writing this article was a great pleasure, as it was written together with expert co-authors in Fog and Cloud computing systems.

Sadoon Azizi
University of Kurdistan

Read the Original

This page is a summary of: FLEX: A Platform for Scalable Service Placement in Multi-Fog and Multi-Cloud Environments, February 2022, ACM (Association for Computing Machinery), DOI: 10.1145/3511616.3513105.
