
Edge Computing Architecture

Updated: Apr 11, 2023

The edge computing architecture identifies the key layers of the edge: the device edge (which includes the edge devices themselves), the local edge (which includes the application and network layers), and the cloud edge. But we’re just getting started. Our next article in this series will dive deeper into the different layers and tools that developers need to implement an edge computing architecture.


An edge computing architecture consists of three main nodes:

  1. Device edge, where the edge devices sit

  2. Local edge, which includes the infrastructure that supports both the application and the network workloads

  3. Cloud, the nexus of the environment, which hosts the workloads and management layers that the other nodes cannot provide


Each of these nodes is an important part of the overall architecture. In more detail:

  1. Device Edge: This node includes physical devices like cameras and sensors that gather data or act on it at the edge. Some devices only gather or transmit data, while others have the processing power to perform additional activities. Applications such as video analytics and real-time processing are deployed and managed on these devices using IBM Edge Computing solutions.

  2. Local Edge: The systems running on-premises or at the edge of the network. This node consists of two sublayers: the application layer, which runs applications that require more resources than the device edge can provide, and the network layer, which includes the virtualized or containerized network devices required to run the local edge. The edge network layer and the edge cluster/servers can be separate physical or virtual servers in different physical locations, or they can be combined in a hyper-converged system.

  3. Cloud: This node runs on-premises or in the public cloud and is the source for workloads - applications that require processing power beyond what the other edge nodes and management layers can provide. These workloads are deployed to different edge nodes using the appropriate orchestration layers.

Edge Computing Architecture in Detail

The figure below illustrates a more detailed architecture and shows which components are relevant within each edge node.


1. Device Edge

At the Device Edge, data flows through different types of devices and equipment, including industry-specific solutions, edge devices, gateway-class devices, and gateway servers.


Industry-specific solutions are devices designed for specific applications, such as cameras for surveillance, sensors for measuring temperature and humidity, or machines for manufacturing. These devices collect data at the edge, where the data is generated, and pass it along to other nodes for further processing.


Edge devices are devices that can process data at the edge, such as Raspberry Pi or Arduino boards. They can perform simple data processing or analysis on the data they collect and transmit it to other nodes in the network.
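
To make this concrete, here is a minimal sketch, in Python, of the kind of simple processing an edge device might perform: reading a sensor, filtering out uninteresting values, and forwarding the rest to a gateway. The sensor read function, device ID, threshold, and gateway URL are illustrative assumptions, not part of any specific product.

```python
# Minimal sketch of device-edge processing, assuming a hypothetical
# temperature sensor and an upstream gateway reachable over HTTP.
import json
import random
import time
import urllib.request

GATEWAY_URL = "http://gateway.local:8080/readings"  # hypothetical endpoint
THRESHOLD_C = 30.0  # only report readings above this temperature

def read_temperature_c() -> float:
    """Stand-in for a real sensor driver (e.g., a GPIO/I2C library call)."""
    return 20.0 + random.random() * 15.0

def publish(reading: dict) -> None:
    """Send one reading to the local gateway as JSON."""
    body = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(
        GATEWAY_URL, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    while True:
        temp = read_temperature_c()
        # Simple on-device processing: drop uninteresting readings so only
        # relevant data leaves the device, reducing upstream traffic.
        if temp >= THRESHOLD_C:
            publish({"device_id": "rpi-01", "temperature_c": temp, "ts": time.time()})
        time.sleep(1.0)
```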


Gateway-class devices act as a bridge between the edge devices and other nodes in the network. They can collect data from multiple edge devices, aggregate it, and perform more complex processing tasks, such as filtering or analyzing the data. Gateway-class devices can also provide additional functionality, such as data encryption or compression.
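
The sketch below suggests what a gateway-class device might do with the readings it collects: aggregate a batch from several devices, filter implausible values, and compress the result before forwarding it upstream. The plausibility range, device IDs, and upstream URL are assumptions made for illustration.

```python
# Minimal sketch of a gateway-class device aggregating readings from several
# edge devices, filtering obvious outliers, and compressing the batch before
# forwarding it upstream. Values and endpoints are illustrative only.
import gzip
import json
import urllib.request

UPSTREAM_URL = "http://gateway-server.local:9090/batches"  # hypothetical

def aggregate(readings: list[dict]) -> list[dict]:
    """Keep only physically plausible readings (simple range filter)."""
    return [r for r in readings if -40.0 <= r["temperature_c"] <= 85.0]

def forward(readings: list[dict]) -> None:
    """Compress the filtered batch and send it to the gateway server."""
    payload = gzip.compress(json.dumps(readings).encode("utf-8"))
    req = urllib.request.Request(
        UPSTREAM_URL,
        data=payload,
        headers={"Content-Type": "application/json", "Content-Encoding": "gzip"},
    )
    urllib.request.urlopen(req, timeout=10)

if __name__ == "__main__":
    batch = [
        {"device_id": "rpi-01", "temperature_c": 31.2},
        {"device_id": "rpi-02", "temperature_c": 999.0},  # faulty sensor, dropped
    ]
    forward(aggregate(batch))
```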


Gateway servers are larger and more powerful devices that can perform advanced data processing and analysis tasks. They can store and manage data from multiple edge devices and gateway-class devices, and transmit it to other nodes in the network. Gateway servers can also run complex applications, such as machine learning algorithms, to analyze data and make predictions.
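
As a stand-in for the heavier machine-learning workloads a gateway server might run, the sketch below flags anomalous readings using simple rolling statistics. A real deployment would more likely use a trained model and a proper time-series store; the sample values are made up.

```python
# Minimal sketch of a gateway-server analysis step: a rolling-statistics
# anomaly check standing in for heavier machine-learning workloads.
from statistics import mean, stdev

def is_anomaly(history: list[float], new_value: float, k: float = 3.0) -> bool:
    """Flag a reading that deviates more than k standard deviations from recent history."""
    if len(history) < 2:
        return False
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(new_value - mu) > k * sigma

if __name__ == "__main__":
    recent = [21.0, 21.3, 20.9, 21.1, 21.2]
    print(is_anomaly(recent, 21.4))  # False: within normal variation
    print(is_anomaly(recent, 35.0))  # True: likely an event worth escalating
```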


In general, data flows from the edge devices to the gateway-class devices, and then to the gateway servers, where it can be analyzed, stored, or transmitted to other nodes in the network. The Device Edge is an important part of the edge computing architecture, as it allows data to be collected, processed, and analyzed closer to where it is generated, enabling faster decision-making and reducing the amount of data that needs to be transmitted to other nodes in the network.


2. Local Edge

The local edge is a key component of edge computing architecture that resides at the edge of the network, closer to where data is generated and consumed. This node is responsible for processing data that is too sensitive, too large, or too time-sensitive to be processed in the cloud.


Industry Solution Cloud: This refers to cloud solutions that cater to specific industries such as healthcare, retail, manufacturing, and transportation. The data generated by devices at the edge is collected and analyzed using industry-specific solutions to gain insights and improve operations.


Multi-Cloud Management: This refers to the management of multiple cloud services and providers that are used to process and store data. Data from devices at the edge can be sent to multiple clouds for processing and storage, and the local edge is responsible for managing the flow of data between different cloud services.


Virtualization Network: This layer is responsible for the virtualization of network devices such as routers, switches, and firewalls. These virtualized network devices enable the local edge to manage the flow of data more efficiently and securely.


Routing and Network Services: These services are responsible for routing the data through the local edge and ensuring that it reaches its destination securely and efficiently. vRAN (Virtual Radio Access Network) is used to virtualize the radio access network in 5G mobile networks, while vFirewall and vEPC (Virtual Evolved Packet Core) are used to provide firewall and packet core functions in the virtualized network.


Orchestration Performance: This layer is responsible for managing the performance of the local edge by ensuring that the right amount of resources is allocated to different applications and services. The local edge uses orchestration tools to manage the performance of the different applications running on it.
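
The toy sketch below illustrates the kind of placement decision an orchestration layer makes: assign each application to an edge node that has enough spare CPU and memory. In practice this is delegated to an orchestrator such as Kubernetes; the node names, capacities, and first-fit strategy are assumptions for illustration only.

```python
# Toy sketch of the resource-allocation decision an orchestration layer makes
# at the local edge: place each workload on a node with enough spare capacity.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    cpu_free: float   # cores
    mem_free: float   # GiB

@dataclass
class Workload:
    name: str
    cpu: float
    mem: float

def place(workloads: list[Workload], nodes: list[Node]) -> dict[str, str]:
    """Greedy first-fit placement; returns workload -> node assignments."""
    placement = {}
    for w in workloads:
        for n in nodes:
            if n.cpu_free >= w.cpu and n.mem_free >= w.mem:
                n.cpu_free -= w.cpu
                n.mem_free -= w.mem
                placement[w.name] = n.name
                break
        else:
            placement[w.name] = "unschedulable"  # not enough edge capacity
    return placement

if __name__ == "__main__":
    nodes = [Node("edge-server-1", cpu_free=4.0, mem_free=8.0),
             Node("edge-server-2", cpu_free=2.0, mem_free=4.0)]
    apps = [Workload("video-analytics", cpu=3.0, mem=6.0),
            Workload("vFirewall", cpu=2.0, mem=2.0)]
    print(place(apps, nodes))
```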


In summary, the local edge plays a crucial role in managing the flow of data between devices at the edge and cloud services. It uses virtualized network devices and orchestration tools to ensure that data is processed and routed efficiently and securely.


3. Cloud

Data can flow through cloud environments in different ways, depending on the architecture and deployment model.

Containerized workloads are self-contained units of software that include all the necessary dependencies and configurations to run an application. They are designed to be portable and easily moved between different environments, including public cloud, private cloud, and on-premises data centers.
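
The sketch below shows one common pattern for such a workload: a small Python service that reads its environment-specific settings (port, deployment region) from environment variables, so the same container image can run unchanged in a public cloud, a private cloud, or an on-premises data center. The variable names and endpoint are illustrative assumptions.

```python
# Minimal sketch of a portable containerized workload: all environment-specific
# configuration comes from environment variables, so the same image can run in
# any of the environments described above. Names and defaults are made up.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

PORT = int(os.environ.get("PORT", "8080"))
REGION = os.environ.get("DEPLOYMENT_REGION", "unspecified")

class Health(BaseHTTPRequestHandler):
    def do_GET(self):
        # A trivial health/info endpoint; real workloads would do useful work here.
        body = json.dumps({"status": "ok", "region": REGION}).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", PORT), Health).serve_forever()
```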


In the public cloud, data is stored and processed in the data centers of cloud service providers like Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP). Users can access these resources over the internet, and pay for what they use on a subscription basis. The cloud provider manages the infrastructure and security of the data center, while users are responsible for managing their own applications and data.


In a private cloud, the cloud infrastructure is operated solely for a single organization, either on-premises or in a third-party data center. Private clouds offer more control and customization over the infrastructure and security but require more upfront investment and maintenance.


In an IT data center, data is stored and processed on-premises using traditional hardware and software infrastructure. IT departments are responsible for managing the infrastructure, security, and maintenance of the data center, including the hardware, operating systems, and applications.


Partner ecosystems are networks of third-party providers who offer complementary products or services that can be integrated with cloud solutions. Cloud providers often have marketplaces where users can browse and purchase partner offerings, such as security tools, analytics software, or developer tools. Partner ecosystems can help users customize and extend their cloud solutions, while also providing additional revenue streams for partners.
