Rent an NVIDIA H100 SXM in the Cloud — Compare 7 Providers
NVIDIA's flagship data center GPU for AI training, and the industry standard for large language model training, with 80 GB of HBM3 memory and NVLink interconnects.
- Cheapest On-Demand: $1.57/hr
- Average On-Demand: $2.44/hr
- Cheapest Spot: $1.49/hr
- Providers: 7
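A quick sketch of what the gap between these rates means in practice. The prices come from the summary above; the 8-GPU, 72-hour run is a hypothetical example, not a benchmark from any provider.

```python
# Estimate run cost and spot savings for an H100 SXM job.
# Rates taken from the price summary above; run shape is hypothetical.
cheapest_on_demand = 1.57  # $/GPU/hr
average_on_demand = 2.44   # $/GPU/hr
cheapest_spot = 1.49       # $/GPU/hr

gpus = 8
hours = 72  # hypothetical 3-day training run

on_demand_cost = cheapest_on_demand * gpus * hours
spot_cost = cheapest_spot * gpus * hours
savings_vs_average = 1 - cheapest_spot / average_on_demand

print(f"On-demand (cheapest): ${on_demand_cost:,.2f}")   # $904.32
print(f"Spot (cheapest):      ${spot_cost:,.2f}")        # $858.24
print(f"Spot vs. average on-demand: {savings_vs_average:.0%} cheaper")
```

Note that spot capacity can be preempted, so the headline savings only hold for jobs that checkpoint and resume cheaply.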
NVIDIA H100 SXM Cloud Price Comparison — 7 Providers
On-Demand
| Provider | Price / GPU / hour | Availability | Notes | Link |
|---|---|---|---|---|
| — | $1.57/hr | Available | GPU VM, NVLink | Visit Provider ↗ |
| — | $1.99/hr | Available | 24-month contract | Visit Provider ↗ |
| — | $2.20/hr | Available | Marketplace avg | Visit Provider ↗ |
| — | $2.35/hr | Available | PCIe | Visit Provider ↗ |
| — | $2.59/hr | Available | — | Visit Provider ↗ |
| — | $2.99/hr | Available | Secure Cloud, SXM | Visit Provider ↗ |
| — | $3.39/hr | Available | HGX H100 | Visit Provider ↗ |
Spot / Preemptible
| Provider | Price / GPU / hour | Availability | Notes | Link |
|---|---|---|---|---|
| — | $1.49/hr | Available | Spot, 1 hr guaranteed | Visit Provider ↗ |
| — | $2.69/hr | Available | Community Cloud | Visit Provider ↗ |
Reserved
| Provider | Price / GPU / hour | Availability | Notes | Link |
|---|---|---|---|---|
| — | $2.50/hr | Available | 12-month, 8-GPU | Visit Provider ↗ |
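To put the reserved row in context, here is a rough sketch of the total commitment it implies, set against running the same capacity at the cheapest on-demand rate from the table above. The comparison assumes 24/7 utilization for the full term; a reservation buys guaranteed capacity, which the cheaper on-demand provider may not be able to deliver continuously.

```python
# Total cost of the 12-month, 8-GPU reservation listed above,
# vs. the same capacity at the cheapest on-demand rate (different provider).
reserved_rate = 2.50       # $/GPU/hr, reserved
on_demand_rate = 1.57      # $/GPU/hr, cheapest on-demand
gpus = 8
hours_per_year = 24 * 365  # 8,760 hours in the 12-month term

reserved_total = reserved_rate * gpus * hours_per_year
on_demand_total = on_demand_rate * gpus * hours_per_year

print(f"Reserved, 12 months:  ${reserved_total:,.0f}")   # $175,200
print(f"On-demand equivalent: ${on_demand_total:,.0f}")  # $110,026
```

The reserved rate here is not the cheapest way to buy H100 hours; its value is the capacity guarantee over the contract term.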
Prices last verified: April 13, 2026
NVIDIA H100 SXM Technical Specifications
| Manufacturer | NVIDIA |
|---|---|
| Architecture | Hopper |
| VRAM | 80 GB HBM3 |
| Memory Bandwidth | 3,350 GB/s |
| FP16 (Tensor) | 990.0 TFLOPS |
| FP32 | 67.0 TFLOPS |
| TDP | 700W |
| Release Year | 2023 |
| Segment | Data center |
| Best For | Large-scale AI training, distributed workloads, LLM pre-training |
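Taking the spec table's numbers at face value, a useful derived figure is the roofline-style ratio of tensor compute to memory bandwidth: how many FP16 FLOPs the chip can perform per byte moved from HBM3. Workloads with lower arithmetic intensity than this ratio are memory-bound on this GPU.

```python
# Roofline-style ratio from the spec table above:
# FP16 tensor FLOPs per byte of HBM3 bandwidth.
fp16_tflops = 990.0   # FP16 (Tensor) TFLOPS, per the spec table
bandwidth_gbs = 3350  # memory bandwidth, GB/s

flops_per_byte = (fp16_tflops * 1e12) / (bandwidth_gbs * 1e9)
print(f"{flops_per_byte:.0f} FP16 FLOPs per byte")  # 296 FP16 FLOPs per byte
```

This high ratio is why large matrix multiplies (LLM training) saturate the H100 well, while bandwidth-heavy inference at small batch sizes often cannot.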