Rent the NVIDIA A100 SXM (80GB) in the Cloud — Compare 6 Providers
A previous-generation workhorse for AI training and inference, still widely available at a lower cost than the H100.
Cheapest On-Demand
$1.10/hr
Typical On-Demand
$1.67/hr
Cheapest Spot
$0.79/hr
Providers
6
Compare NVIDIA A100 SXM (80GB) Cloud Pricing — 6 Providers
On-Demand
| Provider | Price / GPU / hour | Availability | Notes | |
|---|---|---|---|---|
| — | $1.10/hr | Available | Marketplace avg | Visit Provider ↗ |
| — | $1.23/hr | Available | SXM4 | Visit Provider ↗ |
| — | $1.49/hr | Available | Secure Cloud, SXM | Visit Provider ↗ |
| — | $1.60/hr | Available | SXM | Visit Provider ↗ |
| — | $2.18/hr | Available | Bare metal | Visit Provider ↗ |
| — | $2.40/hr | Available | PCIe | Visit Provider ↗ |
Spot / Preemptible
| Provider | Price / GPU / hour | Availability | Notes | |
|---|---|---|---|---|
| — | $0.79/hr | Available | Spot, 1hr guaranteed | Visit Provider ↗ |
| — | $1.39/hr | Available | Community Cloud | Visit Provider ↗ |
Prices last verified: April 13, 2026
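To put the on-demand vs. spot gap in concrete terms, here is a minimal cost sketch using the cheapest prices listed above ($1.10/hr on-demand, $0.79/hr spot). The GPU count and run length are illustrative assumptions, not figures from this page; spot capacity can also be preempted, so a real spot run may take longer wall-clock time.

```python
# Rough cost comparison for a multi-GPU run, using the cheapest prices
# listed on this page. GPU count and duration are hypothetical examples.

def run_cost(price_per_gpu_hour: float, gpus: int, hours: float) -> float:
    """Total cost in USD for `gpus` GPUs running for `hours` hours."""
    return price_per_gpu_hour * gpus * hours

ON_DEMAND = 1.10  # $/GPU/hr, cheapest on-demand price above
SPOT = 0.79       # $/GPU/hr, cheapest spot price above

gpus, hours = 8, 72  # assumed 8x A100 job running for 3 days
od = run_cost(ON_DEMAND, gpus, hours)
sp = run_cost(SPOT, gpus, hours)
print(f"on-demand: ${od:,.2f}")                  # 8 * 72 * 1.10 = $633.60
print(f"spot:      ${sp:,.2f}")                  # 8 * 72 * 0.79 = $455.04
print(f"savings:   {100 * (1 - sp / od):.0f}%")  # ~28% cheaper on spot
```

The percentage saving depends only on the two hourly rates, so it holds for any job size at these prices.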
NVIDIA A100 SXM (80GB) Technical Specifications
| Manufacturer | NVIDIA |
|---|---|
| Architecture | Ampere |
| VRAM | 80 GB HBM2e |
| Memory Bandwidth | 2,039 GB/s |
| FP16 (Tensor) | 312.0 TFLOPS |
| FP32 | 19.5 TFLOPS |
| TDP | 400W |
| Release Year | 2020 |
| Segment | Data center |
| Best For | AI training, fine-tuning, HPC, inference |