Cloud GPUs on 3+ Providers — May 2026

This guide lists GPU models offered by three or more cloud providers. Multi-vendor availability reduces lock-in risk and makes regional placement easier.

Updated May 2026 · 10 GPU models, each available from 3+ providers

H200 SXM vs H100 SXM vs A100 SXM (80GB) — top picks from this guide

                     H200 SXM        H100 SXM        A100 SXM (80GB)

Specifications
Manufacturer         NVIDIA          NVIDIA          NVIDIA
Architecture         Hopper          Hopper          Ampere
VRAM                 141 GB HBM3e    80 GB HBM3      80 GB HBM2e
Memory Bandwidth     4,800 GB/s      3,350 GB/s      2,039 GB/s
FP16 (Tensor)        990 TFLOPS      990 TFLOPS      312 TFLOPS
FP32                 67 TFLOPS       67 TFLOPS       19.5 TFLOPS
TDP                  700 W           700 W           400 W
Release Year         2024            2023            2020
Segment              Data center     Data center     Data center

Cloud Pricing
Cheapest On-Demand   $2.05/hr        $1.57/hr        $1.10/hr
Providers            3               7               6
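The hourly rates above are easier to compare once normalized by what you actually buy: compute and memory. Below is a minimal sketch that derives dollars per PFLOPS-hour (FP16 Tensor) and dollars per GB-hour of VRAM from the table's figures. The prices are the cheapest on-demand rates listed here; real quotes vary by provider and region, so treat the ratios as rough guides, not billing math.

```python
# Price-performance ratios derived from the comparison table above.
# Figures (price, FP16 TFLOPS, VRAM) are copied from the table; nothing
# here is fetched live, and actual provider pricing will differ.

gpus = {
    "H200 SXM":        {"price_hr": 2.05, "fp16_tflops": 990, "vram_gb": 141},
    "H100 SXM":        {"price_hr": 1.57, "fp16_tflops": 990, "vram_gb": 80},
    "A100 SXM (80GB)": {"price_hr": 1.10, "fp16_tflops": 312, "vram_gb": 80},
}

for name, g in gpus.items():
    usd_per_pflops_hr = g["price_hr"] / g["fp16_tflops"] * 1000  # $ per PFLOPS-hour
    usd_per_gb_hr = g["price_hr"] / g["vram_gb"]                 # $ per GB of VRAM-hour
    print(f"{name:18s} ${usd_per_pflops_hr:.2f}/PFLOPS-hr  ${usd_per_gb_hr:.4f}/GB-hr")
```

On these numbers the H100 SXM is the cheapest per unit of FP16 compute, the A100 is the cheapest per GB of VRAM, and the H200's premium buys bandwidth and capacity rather than raw FLOPS.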
