Now available
GPU Workloads on Northflank
Northflank now supports GPU workloads! Provision compute nodes with the latest GPU models across cloud providers. Northflank does all of the heavy lifting, so your AI, machine learning, and HPC workloads can be up and running with minimal effort.
Request access
We’re onboarding users to GPU workloads on Northflank in batches to ensure a quality developer experience and to iterate on customer feedback promptly. Enter your details below to register your interest and we’ll be in touch shortly.
Multi-cloud
Northflank supports GPU workloads on AWS, Azure, Civo, and GCP. Wherever you prefer to deploy, get started in just a couple of clicks – Northflank abstracts away any awkward set-up and configuration.
Time slicing & MIG
Securely run multiple independent workloads on each provisioned GPU with time slicing and NVIDIA Multi-Instance GPU (MIG), and avoid the cost of over-provisioning hardware you don't need.
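A workload scheduled onto a time-sliced or MIG-partitioned GPU still sees an ordinary CUDA device. As a rough illustration (using NVIDIA's nvidia-ml-py bindings, not a Northflank-specific API), a containerised job could inspect the device it has been allocated like this:

```python
# Inspect the GPU (or MIG slice) visible to this workload.
# Generic sketch using NVIDIA's nvidia-ml-py (pynvml) bindings; assumes the
# container has the NVIDIA driver libraries mounted by the container runtime.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        memory = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: {name}, {memory.total / 1024**3:.1f} GiB total")
        try:
            current_mode, _pending = pynvml.nvmlDeviceGetMigMode(handle)
            print(f"  MIG mode: {'enabled' if current_mode else 'disabled'}")
        except pynvml.NVMLError_NotSupported:
            # Older GPUs (e.g. T4) don't support MIG; time slicing still applies.
            print("  MIG not supported on this device")
finally:
    pynvml.nvmlShutdown()
```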
Wide range of GPUs
Select from a broad range of GPU models such as NVIDIA H100, A100, T4, and many more to optimise for your specific use case and budget requirements.
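Once a workload is running, you can sanity-check which GPU model it landed on. The snippet below is a generic PyTorch example (PyTorch is assumed to be in your container image), not part of the Northflank platform:

```python
# Minimal runtime check of the allocated GPU model and memory.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")
else:
    print("No CUDA-capable GPU visible to this workload")
```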
Run workloads where you need them
Your preferred cloud providers
Deploy your GPU workloads to instances on your preferred cloud providers while leveraging the power of the Northflank platform.
Google Cloud Platform
Deploy in your own GCP account with Google Kubernetes Engine (GKE)
Amazon Web Services
Deploy in your own AWS account with Elastic Kubernetes Service (EKS)
Microsoft Azure
Deploy in your own Azure account with Azure Kubernetes Service (AKS)
Civo
Deploy in your own Civo account with Civo Kubernetes
Availability of GPU models across regions and clouds
Supported GPUs
Deploy onto instances with the following GPUs.