ZEDEDA Edge Kubernetes Service and ZEDEDA Edge Kubernetes App Flows Overview

Introduction

The ZEDEDA Edge Kubernetes Service and ZEDEDA Edge Kubernetes App Flows extend the power of VM and container orchestration to resource-constrained edge devices. This platform-native implementation enables the deployment, scaling, and management of applications across diverse edge locations. It leverages open-source Kubernetes' extensible architecture and adapts it for edge environments, providing a robust, scalable, and user-friendly platform that addresses the unique challenges of edge computing.  

[Image: EdgeKubernetesService.png]

ZEDEDA develops and maintains both the native Kubernetes solution and the platform used to deploy and manage Kubernetes applications and workloads.

Our Kubernetes solution addresses the following edge-specific challenges:

  • Intermittent connectivity - Especially when dealing with large fleets, the chances of every single node/cluster being available at a given time are slim. The solution caches configurations going to the clusters and state coming from them, so you do not have to worry about the current connectivity state of each cluster in order to interact with it. 
  • On-premise, disconnected/air-gapped use cases - You can deploy in scenarios where a cluster may be disconnected for extended periods of time, yet you still need a way to manage the cluster operationally. In the context of Kubernetes, this requirement extends beyond our current Edge-Sync/Local-operator console approach, where ZEDEDA Cloud is the single source of truth. Because a Kubernetes API endpoint is available to you, you can use kubectl to change the state of a cluster, in both connected and disconnected use cases; see the sketch after this list. 
  • Large scale - The system is designed to operate at a massive scale, supporting tens of thousands of clusters. Aside from the implications of intermittent connectivity on scaling the orchestrator, this also changes how you think about orchestration itself. At the edge, the objective is ‘fleet management’ of clusters, not the typical datacenter use case of ‘fleet management’ of nodes inside a cluster. The API/UI therefore aggregates clusters, enabling you to operate at a fleet-of-clusters level. 
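
As a minimal illustration of the direct cluster access described above (a sketch, not ZEDEDA-specific tooling), the following uses the official Kubernetes Python client against a cluster's API endpoint. The kubeconfig path, namespace, and deployment name are illustrative assumptions.

```python
# A minimal sketch: once a cluster's Kubernetes API endpoint is reachable,
# standard clients work against it directly, whether or not the cluster is
# currently connected to ZEDEDA Cloud.
from kubernetes import client, config

# Load credentials for the edge cluster (the kubeconfig path is hypothetical).
config.load_kube_config(config_file="edge-cluster.kubeconfig")

core = client.CoreV1Api()

# Inspect node state directly.
for node in core.list_node().items:
    ready = next(
        (c.status for c in node.status.conditions if c.type == "Ready"),
        "Unknown",
    )
    print(f"{node.metadata.name}: Ready={ready}")

# Change cluster state the same way, for example scale a workload
# (namespace and deployment name are hypothetical).
apps = client.AppsV1Api()
apps.patch_namespaced_deployment_scale(
    name="sensor-gateway",
    namespace="default",
    body={"spec": {"replicas": 2}},
)
```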

Uses

There are a variety of ways that you can use the Kubernetes solution. 

GitOps for app management

You need Kubernetes cluster orchestration but rely on GitOps for app management. The solution sets up and manages the clusters, while the application configurations (manifests) are maintained in a Git repository. The Kubernetes workflows agent installs a component on the cluster that monitors the Git repository for changes; when a change is detected, the new state is pulled from Git into the cluster and activated, as sketched in the example below. Use this Kubernetes solution if you want GitOps for app management.
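
The following is a minimal sketch of that reconcile pattern, not the actual ZEDEDA workflows agent: it polls a Git repository of manifests and applies changes to the cluster when a new commit appears. The repository URL, clone directory, manifest path, and polling interval are illustrative assumptions.

```python
# A minimal GitOps reconcile loop (sketch only): watch a Git repo of
# Kubernetes manifests and apply new commits to the cluster.
import time

import git                                    # GitPython
from kubernetes import client, config, utils

REPO_URL = "https://example.com/acme/edge-apps.git"   # hypothetical repo
CLONE_DIR = "/var/lib/gitops/edge-apps"
POLL_SECONDS = 60

config.load_kube_config()                     # credentials for the target cluster
k8s = client.ApiClient()

repo = git.Repo.clone_from(REPO_URL, CLONE_DIR)
last_applied = None

while True:
    repo.remotes.origin.pull()                # fetch the latest desired state
    head = repo.head.commit.hexsha
    if head != last_applied:
        # Apply every manifest under manifests/ (create-only here; a real
        # agent would also diff and update existing objects).
        for path in repo.git.ls_files("manifests/*.yaml").splitlines():
            try:
                utils.create_from_yaml(k8s, f"{CLONE_DIR}/{path}")
            except utils.FailToCreateError:
                pass                          # object already exists in the cluster
        last_applied = head
    time.sleep(POLL_SECONDS)
```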

Fleet management

You want to consume Kubernetes and need both cluster and app management (all API sets), that is, the infrastructure to deploy applications at the edge using Kubernetes. This involves all functional parts of the solution. If you operate a large set of edge nodes with Kubernetes and want full fleet management, use this Kubernetes solution.

Next Steps

This article is part of a series. You will likely follow the articles in this order.

  1. ZEDEDA Edge Kubernetes Service and ZEDEDA Edge Kubernetes App Flows Overview - You are here!
  2. Create and Manage ZEDEDA Edge Kubernetes Service and App Flows using the API
  3. Create and Manage a ZEDEDA Edge Kubernetes Service Cluster Using the GUI
  4. Manage an App from the ZEDEDA Edge Kubernetes App Flows Marketplace Using the GUI 
  5. Manage ZEDEDA Edge Kubernetes App Flows Installed Applications Using the GUI
  6. Create and Manage ZEDEDA Edge Kubernetes App Flows Cluster Groupings Using the GUI
  7. Create ZEDEDA Edge Kubernetes App Flows GitOps Repositories Using the GUI
  8. Troubleshoot ZEDEDA Edge Kubernetes Service and App Flows