Best Hopper Cloud GPUs — May 2026
Hopper-class GPUs (H100, H200, GH200) are still the workhorses of large-model training and inference. This guide compares every Hopper GPU available in the cloud.
H200 SXM vs GH200 Superchip vs H100 SXM: top picks from this guide

| | H200 SXM (Hopper · 141 GB) | GH200 Superchip (Hopper · 96 GB) | H100 SXM (Hopper · 80 GB) |
|---|---|---|---|
| Specifications | | | |
| Manufacturer | NVIDIA | NVIDIA | NVIDIA |
| Architecture | Hopper | Hopper | Hopper |
| VRAM | 141 GB HBM3e | 96 GB HBM3 | 80 GB HBM3 |
| Memory Bandwidth | 4,800 GB/s | 4,000 GB/s | 3,350 GB/s |
| FP16 (Tensor) | 990 TFLOPS | 990 TFLOPS | 990 TFLOPS |
| FP32 | 67 TFLOPS | 67 TFLOPS | 67 TFLOPS |
| TDP | 700 W | 700 W | 700 W |
| Release Year | 2024 | 2023 | 2022 |
| Segment | Data center | Data center | Data center |
| Cloud Pricing | | | |
| Cheapest On-Demand | $2.05/hr | — | $1.57/hr |
| Providers | 3 | 0 | 7 |
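Raw hourly price is only half the story: for memory-bandwidth-bound inference, cost per unit of bandwidth often matters more than cost per FLOP. A minimal sketch of that comparison, using only the spec and price figures from the table above (a snapshot from this guide, not live pricing; the `gpus` dict and its field names are illustrative, not from any provider API):

```python
# Price/performance snapshot built from the table above.
# Values are copied from this guide; they change as cloud pricing moves.
gpus = {
    "H200 SXM": {"bw_gbs": 4800, "fp16_tflops": 990, "usd_hr": 2.05},
    "H100 SXM": {"bw_gbs": 3350, "fp16_tflops": 990, "usd_hr": 1.57},
}

for name, g in gpus.items():
    # Cost per petaFLOP-hour of FP16 tensor compute.
    usd_per_pflop_hr = g["usd_hr"] / g["fp16_tflops"] * 1000
    # Cost per TB/s of memory bandwidth per hour.
    usd_per_tbs_hr = g["usd_hr"] / (g["bw_gbs"] / 1000)
    print(f"{name}: ${usd_per_pflop_hr:.2f}/PFLOP-hr (FP16), "
          f"${usd_per_tbs_hr:.2f}/TB/s-hr")
```

With these numbers the H100 SXM is cheaper per FP16 FLOP, but the H200 SXM comes out cheaper per unit of memory bandwidth despite its higher hourly rate, which is the relevant axis for large-batch LLM decoding.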