Best HBM3 Cloud GPUs — May 2026

HBM3 powers the H100, GH200, and MI300X, the workhorses of frontier AI training right now.

Updated May 2026 · Showing 3 GPU models with HBM3 memory

MI300X vs GH200 Superchip vs H100 SXM — top picks from this guide

- MI300X · CDNA 3 · 192 GB
- GH200 Superchip · Hopper · 96 GB
- H100 SXM · Hopper · 80 GB
Specifications

|  | MI300X | GH200 Superchip | H100 SXM |
| --- | --- | --- | --- |
| Manufacturer | AMD | NVIDIA | NVIDIA |
| Architecture | CDNA 3 | Hopper | Hopper |
| VRAM | 192 GB HBM3 | 96 GB HBM3 | 80 GB HBM3 |
| Memory Bandwidth | 5,300 GB/s | 4,000 GB/s | 3,350 GB/s |
| FP16 (Tensor) | 1,307 TFLOPS | 989 TFLOPS | 990 TFLOPS |
| FP32 | 163.4 TFLOPS | 494.5 TFLOPS | 67 TFLOPS |
| TDP | 750 W | 700 W | 700 W |
| Release Year | 2023 | 2023 | 2023 |
| Segment | Data center | Data center | Data center |

Cloud Pricing

|  | MI300X | GH200 Superchip | H100 SXM |
| --- | --- | --- | --- |
| Cheapest On-Demand | $1.85/hr | N/A | $1.57/hr |
| Providers | 2 | 0 | 7 |
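One way to read the spec and pricing numbers together is to compute rough FP16 throughput per dollar-hour, and to check whether a model's FP16 weights fit in a single GPU's VRAM. The sketch below is a back-of-envelope illustration using only the figures from this guide (the 70B-parameter example model is an assumption for illustration; GH200 has no listed on-demand price, so it is skipped in the cost comparison):

```python
# Back-of-envelope comparison built from the figures in this guide.
# GH200 has 0 listed providers, so its on-demand price is None.
gpus = {
    "MI300X":          {"vram_gb": 192, "fp16_tflops": 1307, "usd_per_hr": 1.85},
    "GH200 Superchip": {"vram_gb": 96,  "fp16_tflops": 989,  "usd_per_hr": None},
    "H100 SXM":        {"vram_gb": 80,  "fp16_tflops": 990,  "usd_per_hr": 1.57},
}

# Illustrative workload: a 70B-parameter model stored in FP16
# (2 bytes per parameter ~= 140 GB of weights, ignoring activations/KV cache).
MODEL_GB = 70e9 * 2 / 1e9

for name, g in gpus.items():
    fits = g["vram_gb"] >= MODEL_GB
    if g["usd_per_hr"] is not None:
        perf_per_usd = g["fp16_tflops"] / g["usd_per_hr"]
        print(f"{name}: {perf_per_usd:.0f} FP16 TFLOPS per $/hr, "
              f"fits 70B FP16 weights: {fits}")
    else:
        print(f"{name}: no on-demand price listed, "
              f"fits 70B FP16 weights: {fits}")
```

By these list figures, the MI300X delivers more FP16 TFLOPS per dollar-hour than the H100 SXM, and it is the only model here whose 192 GB of HBM3 holds 70B FP16 weights on a single GPU.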
