GPU Profile · 80 GB VRAM · Data Center GPU

A100 PCIE

The NVIDIA A100 PCIE features the Ampere architecture with 80 GB of HBM2e VRAM in a PCIe form factor.


Best live route via RunPod

Market Snapshot

  • Lowest: $1.1900/hr
  • Spread: $0.0000
  • Offers Live: 1
  • Providers: 1

  • Live Floor: $1.1900 (RunPod)
  • Average Price: $1.1900 (across 1 live offer)
  • Reliability: 90.0% (based on 1 scored host)
  • Last Day Delta: +$0.0000 (latest daily low: 2026-03-31)
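The snapshot metrics above (lowest price, spread, average, and last-day delta) can be sketched as a small computation over the list of live offers. This is a minimal illustration only; the offer structure, field names, and `market_snapshot` function are assumptions for the sketch, not this site's actual data model.

```python
# Sketch of the Market Snapshot metrics, assuming each live offer is a
# (provider, hourly_price) pair. All names here are illustrative.

def market_snapshot(offers, prev_daily_low):
    """Compute floor, spread, average, and last-day delta from live offers."""
    prices = [price for _, price in offers]
    lowest = min(prices)
    spread = max(prices) - lowest          # $0.0000 when only one offer is live
    average = sum(prices) / len(prices)
    providers = len({provider for provider, _ in offers})
    delta = lowest - prev_daily_low        # signed change vs. the previous daily low
    return {
        "lowest": lowest,
        "spread": spread,
        "average": average,
        "offers_live": len(prices),
        "providers": providers,
        "last_day_delta": delta,
    }

# With a single RunPod offer at $1.19/hr, spread and delta collapse to zero,
# matching the snapshot shown on this page.
snap = market_snapshot([("RunPod", 1.19)], prev_daily_low=1.19)
print(snap)
```

With only one live offer, the floor, average, and lowest price coincide, which is why the spread and delta both read $0.0000 here.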

30-Day Price Trend

[Chart: daily lowest price over the past 30 days]

Available Offers

A100 PCIE from all providers

Sorted by cheapest hourly rate

Provider  Price/hr  VRAM   GPUs  Reliability  Region
RunPod    $1.1900   80 GB  1x    90.0%        --

GPU Overview

About A100 PCIE

The NVIDIA A100 PCIE features the Ampere architecture with 80 GB of HBM2e VRAM in a PCIe form factor. It offers datacenter scalability with straightforward integration into standard server platforms, reducing deployment complexity for organizations building AI infrastructure.

Its 80 GB of VRAM supports large language model (LLM) inference and fine-tuning across a range of model sizes. With a benchmark score of 92, the A100 PCIE delivers strong performance for enterprise-grade AI training and inference, particularly in environments where PCIe compatibility is a deployment requirement.

Key Features

  • VRAM: 80 GB
  • Category: Data Center GPU
  • Benchmark Score: 92

Best For

LLM Inference · AI Training · Enterprise AI · Scalable Deployment · Fine-Tuning · Rendering

Provider Split

  • RunPod: 1 offer

Recent Daily Lows

  • 2026-03-31: $0.2948
  • 2026-03-30: $0.2948
  • 2026-03-29: $0.2948
  • 2026-03-28: $0.2948
  • 2026-03-27: $0.2948

Related GPUs

B200 · B100 · GB200 · H100 SXM5 · H100 PCIE · H100