
GKE vs AKS: Compared on Availability, Security, Tooling & More

 

What is Google Kubernetes Engine (GKE)?

GKE is an orchestration and management system designed by Google to help you run Docker containers and Kubernetes clusters on Google Cloud. It was the first large-scale managed Kubernetes service, and is the most mature offering on the market.

 

GKE is based on Kubernetes. To administer GKE, you use the Google Cloud Platform Console or the gcloud command-line interface, but you can manage workloads directly using standard Kubernetes interfaces such as kubectl.
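For illustration, here is a minimal sketch of creating a GKE cluster and connecting to it with gcloud and kubectl; the cluster name, zone, and node count are placeholder values, not recommendations:

    # Create a small GKE cluster (name and zone are placeholders)
    gcloud container clusters create demo-cluster \
        --zone us-central1-a \
        --num-nodes 3

    # Fetch credentials so kubectl can talk to the cluster
    gcloud container clusters get-credentials demo-cluster --zone us-central1-a

    # Manage workloads directly with standard Kubernetes tooling
    kubectl get nodes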

What is Azure Kubernetes Service (AKS)?

AKS offers fully managed container orchestration, based on Kubernetes, in the Azure cloud. AKS can help organizations handle software deployment, resource provisioning, and scaling of Docker containers and containerized applications.

 

To create AKS clusters, you can use Azure PowerShell, the Azure portal, or the Azure command-line interface (CLI). Additionally, you can create template-driven Kubernetes deployments by using Azure Resource Manager templates.
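As a rough sketch, a CLI-driven cluster creation might look like the following; the resource group, cluster name, and region are placeholders:

    # Create a resource group and a basic three-node AKS cluster (names are placeholders)
    az group create --name demo-rg --location eastus
    az aks create --resource-group demo-rg --name demo-aks \
        --node-count 3 --generate-ssh-keys

    # Download credentials for kubectl
    az aks get-credentials --resource-group demo-rg --name demo-aks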

In this article, you will learn:

  • AKS vs GKE
  • AKS vs GKE: How to Choose
  • Kubernetes Storage with NetApp Cloud Volumes ONTAP

AKS vs GKE

Upgrades and Availability

AKS supports minor version patches and takes a structured approach to updates, encouraging customers to move to the latest version of Kubernetes. AKS offers automated upgrades for nodes, but the overall process still requires some manual work; for example, the core control plane component must be upgraded manually. AKS supports node auto-repair functionality, which can be used alongside auto-scaling node pools. AKS offers an uptime SLA of 99.95%.
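A sketch of that manual flow, reusing the placeholder names from earlier and a hypothetical target version:

    # List the Kubernetes versions this cluster can upgrade to
    az aks get-upgrades --resource-group demo-rg --name demo-aks --output table

    # Upgrade the control plane first, then upgrade a node pool separately
    az aks upgrade --resource-group demo-rg --name demo-aks \
        --kubernetes-version 1.27.3 --control-plane-only
    az aks nodepool upgrade --resource-group demo-rg --cluster-name demo-aks \
        --name nodepool1 --kubernetes-version 1.27.3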

GKE supports multiple recent versions of Kubernetes, typically the last several minor versions and their patch releases. GKE offers automated upgrades for both the control plane and worker nodes. It also offers detection and remediation of unhealthy nodes, as well as release channels that automate the testing and rollout of new versions. GKE enables node auto-repair by default, providing automated cluster health maintenance. GKE offers an uptime SLA of 99.95% for regional clusters.
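A minimal sketch of opting into those automated upgrades by enrolling a new cluster in a release channel (the name and zone are placeholders):

    # Create a GKE cluster on the regular release channel with auto-upgrade and auto-repair
    gcloud container clusters create demo-cluster \
        --zone us-central1-a \
        --release-channel regular \
        --enable-autoupgrade --enable-autorepair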

Monitoring

You can set up monitoring for AKS by using two Microsoft Azure solutions. Azure Monitor can help you assess the health of containers. To monitor Kubernetes components with Application Insights, you also need to configure Istio, a service mesh framework.
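For example, container monitoring can typically be switched on as a cluster add-on; a hedged sketch using the placeholder names from earlier:

    # Enable the Azure Monitor for containers add-on on an existing AKS cluster
    az aks enable-addons --resource-group demo-rg --name demo-aks --addons monitoring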

You can set up monitoring for GKE by using Stackdriver, now part of Google Cloud's operations suite (Cloud Monitoring and Cloud Logging). This solution monitors both master and worker nodes and integrates logging for Kubernetes components into the platform.
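A sketch of enabling this on an existing cluster, assuming a recent gcloud release and the placeholder names used above:

    # Turn on Cloud Logging and Cloud Monitoring (formerly Stackdriver) for the cluster
    gcloud container clusters update demo-cluster --zone us-central1-a \
        --logging=SYSTEM,WORKLOAD \
        --monitoring=SYSTEM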

Security

AKS and GKE provide similar security functionality through different tools. Both AKS and GKE deploy Kubernetes with role-based access control (RBAC) enabled by default. To set up network policies on AKS and GKE, you can use Calico; AKS also offers its own Azure network policy implementation.
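A rough sketch of enabling network policy enforcement on each service at cluster creation time (all names are placeholders):

    # AKS: create a cluster with Calico network policy enforcement
    az aks create --resource-group demo-rg --name demo-aks \
        --network-plugin azure --network-policy calico

    # GKE: create a cluster with network policy enforcement (Calico) enabled
    gcloud container clusters create demo-cluster --zone us-central1-a \
        --enable-network-policy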

As for encryption, each service works differently (a CLI sketch follows this list):

  • AKS offers encryption with Azure Key Vault. You can either have Azure manage encryption keys for you or opt to maintain customer-managed keys.
  • GKE offers data-at-rest encryption with Cloud KMS. To configure encryption keys for GKE, you create and manage them in Cloud KMS and reference them when configuring the cluster.
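A minimal sketch of both approaches, assuming the keys already exist and using placeholder resource names and key IDs:

    # AKS: encrypt Kubernetes secrets in etcd with a customer-managed key from Azure Key Vault
    az aks create --resource-group demo-rg --name demo-aks \
        --enable-azure-keyvault-kms \
        --azure-keyvault-kms-key-id "https://demo-kv.vault.azure.net/keys/demo-key/1234567890abcdef"

    # GKE: encrypt Kubernetes secrets at the application layer with a Cloud KMS key
    gcloud container clusters create demo-cluster --zone us-central1-a \
        --database-encryption-key \
        projects/demo-project/locations/us-central1/keyRings/demo-ring/cryptoKeys/demo-key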

Serverless Compute

AKS offers serverless functionality through virtual nodes. Instead of running full virtual machines (VMs), this feature lets you run Kubernetes pods with Azure Container Instances (ACI). This provides more granular and faster scaling. Virtual nodes are a compute option you can use alongside regular Azure VMs. You can simply target specific workloads to run on virtual nodes, and seamlessly add more nodes as needed.
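A hedged sketch of turning on virtual nodes for an existing cluster; the add-on requires Azure CNI networking and a dedicated subnet, and all names below are placeholders:

    # Enable the virtual nodes (ACI) add-on; the subnet must already exist in the cluster's VNet
    az aks enable-addons --resource-group demo-rg --name demo-aks \
        --addons virtual-node --subnet-name demo-aci-subnet

    # Pods are then steered to virtual nodes with a nodeSelector (for example, type=virtual-kubelet)
    # plus a toleration for the virtual-kubelet.io/provider taint in the pod spec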

Related content: Read our guide to Azure Container Instances

GKE provides serverless functionality through a feature called Cloud Run for Anthos. Cloud Run is a managed serverless container platform that offers highly flexible deployments and can scale to zero using per-request scaling; with Cloud Run for Anthos, you can dynamically choose whether workloads run on fully managed Cloud Run infrastructure or on your own GKE clusters.
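A hedged sketch of deploying to Cloud Run for Anthos on an existing GKE cluster with the add-on enabled; the service name, image, and cluster details are placeholders:

    # Deploy a container image as a Cloud Run for Anthos service on a GKE cluster
    gcloud run deploy demo-service \
        --image gcr.io/demo-project/demo-app \
        --platform gke \
        --cluster demo-cluster \
        --cluster-location us-central1-a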

Related content: Read our guide to Google Anthos

Development Tools

AKS lets you access Kubernetes extensions in Visual Studio Code. Additionally, AKS works with Bridge to Kubernetes, a feature that lets code running on your local machine behave as if it were a service inside the cluster. You can use it to run and debug local code without replicating its dependencies locally.

For GKE, Google provides Cloud Code, an extension for IntelliJ and Visual Studio Code that enables you to deploy, manage, and debug your cluster from within the IDE or code editor. It integrates directly with Cloud Run as well as Cloud Run for Anthos.


AKS vs GKE: How to Choose

When making your final choice of a managed Kubernetes service, consider these pros and cons of AKS and GKE.

AKS Pros

  • Excellent support for Windows and integration with Azure Active Directory
  • Convenient command line tooling
  • Easy configuration of virtual networks and subnets
  • Integrated logging and monitoring

AKS Cons

  • A less mature solution compared to GKE
  • Nodes can only run Ubuntu Linux or Windows Server, with limited ability to customize the OS
  • You cannot change server types after deploying a node pool
  • Nodes are not automatically updated and do not auto-recover from kubelet failure

GKE Pros

  • Easy deployment of clusters, both via web interface and command line
  • One-click or fully automated updates of cluster versions and node pools
  • Self-healing when nodes fail due to kubelet issues
  • The service automatically applies selected Kubernetes patches and CVE fixes

GKE Cons

  • Nodes can only run Container-Optimized OS or Ubuntu Linux, with no control over operating system or kernel versions.
  • Cluster utilities like kube-dns and ip-masq-agent are not customizable.
  • By default (for zonal clusters), the control plane runs in a single availability zone (AZ), and this cannot be changed after creation. For production clusters, you must explicitly create a regional cluster to replicate the control plane across multiple AZs (see the sketch below).
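A minimal sketch of creating a regional cluster, so that control plane replicas and nodes are spread across the region's zones (names are placeholders):

    # Regional GKE cluster: one node per zone, control plane replicated across zones
    gcloud container clusters create demo-regional-cluster \
        --region us-central1 \
        --num-nodes 1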

Kubernetes Storage with NetApp Cloud Volumes ONTAP

NetApp Cloud Volumes ONTAP, the leading enterprise-grade storage management solution, delivers secure, proven storage management services on AWS, Azure and Google Cloud. Cloud Volumes ONTAP capacity can scale into the petabytes, and it supports various use cases such as file services, databases, DevOps or any other enterprise workload, with a strong set of features including high availability, data protection, storage efficiencies, Kubernetes integration, and more.

In particular, Cloud Volumes ONTAP supports Kubernetes Persistent Volume provisioning and management requirements of containerized workloads.

Learn more about how Cloud Volumes ONTAP helps to address the challenges of containerized applications in these Kubernetes Workloads with Cloud Volumes ONTAP Case Studies.

Yifat Perry, Technical Content Manager
