Rent NVIDIA A100 SXM (40GB) in the Cloud — Compare 2 Providers
A budget A100 variant with 40 GB of VRAM, well suited to training models under 13B parameters and to batch inference.
- Cheapest On-Demand: $0.80/hr
- Average On-Demand: $0.97/hr
- Providers: 2
Compare NVIDIA A100 SXM (40GB) Cloud Pricing — 2 Providers
On-Demand
| Provider | Price / GPU / hr | Availability | Notes |
|---|---|---|---|
|  | $0.80/hr | Available | Marketplace avg |
|  | $1.14/hr | Available | Secure Cloud, SXM |
Prices last verified: April 13, 2026
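The summary figures above are simple arithmetic over the listed rates; a minimal sketch of how they combine into a rental bill (the dictionary keys and job sizes here are hypothetical, only the two hourly rates come from the table):

```python
# On-demand rates from the pricing table above, in USD per GPU-hour.
# Keys are illustrative labels, not actual provider names.
rates = {"marketplace": 0.80, "secure_cloud": 1.14}

# The "Average On-Demand" stat is just the mean of the listed rates.
average_rate = sum(rates.values()) / len(rates)
print(f"${average_rate:.2f}/hr")  # -> $0.97/hr, matching the figure above

def job_cost(rate_per_gpu_hr: float, gpus: int, hours: float) -> float:
    """Total cost of renting `gpus` GPUs for `hours` at a flat hourly rate."""
    return rate_per_gpu_hr * gpus * hours

# e.g. a hypothetical 8-GPU fine-tuning run for 24 hours at the cheapest rate
print(f"${job_cost(rates['marketplace'], gpus=8, hours=24):.2f}")  # -> $153.60
```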
NVIDIA A100 SXM (40GB) Technical Specifications
| Manufacturer | NVIDIA |
|---|---|
| Architecture | Ampere |
| VRAM | 40 GB HBM2 |
| Memory Bandwidth | 1,555 GB/s |
| FP16 (Tensor) | 312.0 TFLOPS |
| FP32 | 19.5 TFLOPS |
| TDP | 400W |
| Release Year | 2020 |
| Segment | Data center |
| Best For | AI training, fine-tuning, inference on smaller models |
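The "under 13B parameters" guidance follows from the 40 GB VRAM figure; a back-of-envelope sketch, using standard bytes-per-parameter rules of thumb rather than measured numbers:

```python
# Rough VRAM estimates for fitting a model on a single A100 40 GB.
# Byte-per-parameter counts are common rules of thumb, not measurements.

def weights_gb(params_b: float, bytes_per_param: float) -> float:
    """Memory for model state in GB (1 GB = 1e9 bytes), given billions of params."""
    return params_b * 1e9 * bytes_per_param / 1e9

# fp16/bf16 inference: 2 bytes per parameter for the weights alone.
print(weights_gb(13, 2))   # -> 26.0 GB: a 13B model fits, ~14 GB left for KV cache

# Mixed-precision Adam training: roughly 16 bytes per parameter
# (fp16 weights + gradients, fp32 master weights + two optimizer moments),
# so full 13B training needs sharding or offload across several GPUs.
print(weights_gb(13, 16))  # -> 208.0 GB of training state in total
```

This is why the card handles 13B-class inference comfortably on one GPU, while training at that scale typically means smaller models per GPU or multi-GPU sharding.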