Best HBM2e Cloud GPUs — May 2026

HBM2e (the A100 generation) is the most cost-effective HBM class available in the cloud today.

Updated May 2026 · 3 GPU models · HBM2e memory

A100 SXM (80GB) vs A100 SXM (40GB) vs A30 — top picks from this guide

Specifications

| Specification | A100 SXM (80GB) | A100 SXM (40GB) | A30 |
|---|---|---|---|
| Manufacturer | NVIDIA | NVIDIA | NVIDIA |
| Architecture | Ampere | Ampere | Ampere |
| VRAM | 80 GB HBM2e | 40 GB HBM2e | 24 GB HBM2e |
| Memory Bandwidth | 2,039 GB/s | 1,555 GB/s | 933 GB/s |
| FP16 (Tensor) | 312 TFLOPS | 312 TFLOPS | 165 TFLOPS |
| FP32 | 19.5 TFLOPS | 19.5 TFLOPS | 10.3 TFLOPS |
| TDP | 400 W | 400 W | 165 W |
| Release Year | 2020 | 2020 | 2021 |
| Segment | Data center | Data center | Data center |

Cloud Pricing

| Pricing | A100 SXM (80GB) | A100 SXM (40GB) | A30 |
|---|---|---|---|
| Cheapest On-Demand | $1.10/hr | $0.80/hr | $0.25/hr |
| Providers | 6 | 2 | 2 |
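One way to read the pricing figures above is to normalize each GPU's memory bandwidth by its cheapest on-demand rate. The sketch below does exactly that using the numbers from this guide; the dictionary layout and the `bandwidth_per_dollar` metric are illustrative choices, not part of any provider's API.

```python
# Sketch: rank this guide's GPUs by memory bandwidth per dollar-hour.
# Prices are the "Cheapest On-Demand" rates listed above; bandwidth
# figures are the listed spec-sheet values.
gpus = {
    "A100 SXM (80GB)": {"bandwidth_gbps": 2039, "price_hr": 1.10},
    "A100 SXM (40GB)": {"bandwidth_gbps": 1555, "price_hr": 0.80},
    "A30":             {"bandwidth_gbps": 933,  "price_hr": 0.25},
}

def bandwidth_per_dollar(spec):
    # GB/s of memory bandwidth delivered per $/hr of on-demand cost
    return spec["bandwidth_gbps"] / spec["price_hr"]

ranked = sorted(gpus.items(), key=lambda kv: bandwidth_per_dollar(kv[1]),
                reverse=True)
for name, spec in ranked:
    print(f"{name}: {bandwidth_per_dollar(spec):,.0f} GB/s per $/hr")
```

By this metric the A30 comes out well ahead (933 / 0.25 ≈ 3,732 GB/s per $/hr), which is why low-cost HBM parts remain attractive for bandwidth-bound workloads despite their smaller VRAM.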
