Rent NVIDIA A100 SXM (40GB) in the Cloud — Compare 2 Providers

The lower-cost A100 variant with 40 GB of VRAM, suited to training models under 13B parameters and to batch inference.
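As a rough sanity check on the "under 13B parameters" guidance, a common rule of thumb is about 2 bytes per parameter for fp16 weights, plus some headroom for activations and KV cache. The sketch below is an illustrative estimate, not a benchmark; the 20% overhead factor is an assumption, and real memory use varies with batch size, sequence length, and framework.

```python
def fits_in_vram(params_billion: float, vram_gb: float = 40.0,
                 bytes_per_param: float = 2.0, overhead: float = 1.2) -> bool:
    """Rough check: fp16 weights (~2 bytes/param) plus ~20% overhead
    for activations/KV cache (assumed factor, not measured)."""
    needed_gb = params_billion * bytes_per_param * overhead
    return needed_gb <= vram_gb

# 13B fp16 ≈ 13 * 2 * 1.2 = 31.2 GB -> fits on a 40 GB A100
print(fits_in_vram(13))   # True
# 30B fp16 ≈ 72 GB -> does not fit on a single 40 GB card
print(fits_in_vram(30))   # False
```

Full training with optimizer states needs several times more memory than this, which is why larger models typically require multi-GPU setups or parameter-efficient fine-tuning.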

VRAM 40 GB HBM2e
Bandwidth 1,555 GB/s
FP16 312.0 TFLOPS
FP32 19.5 TFLOPS
TDP 400W
Architecture Ampere
Cheapest On-Demand $0.80/hr
Average On-Demand $0.97/hr
Providers 2

Compare NVIDIA A100 SXM (40GB) Cloud Pricing — 2 Providers

On-Demand

Provider   Price / GPU / hr   Availability   Notes
Vast.ai    $0.80/hr           Available      Marketplace avg
RunPod     $1.14/hr           Available      Secure Cloud, SXM

Prices last verified: April 13, 2026
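For budgeting, the hourly rates above can be projected to a monthly figure. A minimal sketch, using the two rates from the table and an assumed 730-hour month (24/7 usage); `HOURS_PER_MONTH` and the provider dict are illustrative, not an official calculator:

```python
HOURS_PER_MONTH = 730  # ~24/7 usage; assumption, adjust for your duty cycle

# $/GPU/hr, taken from the on-demand table above
providers = {"Vast.ai": 0.80, "RunPod": 1.14}

def monthly_cost(rate_per_hr: float, hours: int = HOURS_PER_MONTH) -> float:
    """Project an hourly GPU rate to a monthly cost."""
    return rate_per_hr * hours

for name, rate in sorted(providers.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${monthly_cost(rate):,.2f}/mo")
```

At these rates, continuous use runs roughly $584/month on Vast.ai versus about $832/month on RunPod, so the gap compounds quickly for long training jobs.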

NVIDIA A100 SXM (40GB) Technical Specifications

Manufacturer NVIDIA
Architecture Ampere
VRAM 40 GB HBM2e
Memory Bandwidth 1,555 GB/s
FP16 (Tensor) 312.0 TFLOPS
FP32 19.5 TFLOPS
TDP 400W
Release Year 2020
Segment Data center
Best For AI training, fine-tuning, inference on smaller models