The NVIDIA H200 is a data center GPU based on the Hopper architecture. Released on 2024-11-01 at a launch price of $35,000.00, it pairs 16,896 CUDA cores with 141 GB of HBM3e memory.
Specification | Value |
---|---|
Architecture | Hopper (GH200) |
Market Segment | Data Center |
Release Date | 2024-11-01 |
Launch Price | $35,000.00 |
Manufacturing Process | 4 nm |
CUDA Cores | 16,896 |
Base Clock Speed | 1450 MHz |
Boost Clock Speed | 1900 MHz |
Transistor Count | 90 billion |
VRAM Capacity | 141 GB HBM3e |
Memory Bus Width | 5120-bit |
Memory Bandwidth | 4800 GB/s |
TDP | 700 W |
The headline compute parameters of the NVIDIA H200 are its 16,896 CUDA cores, 1450 MHz base clock, and 1900 MHz boost clock, the figures that set its raw throughput for data center work.
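If an H200 host is available, these figures can be read back at runtime. The sketch below is a quick sanity check, not vendor tooling, and assumes PyTorch with CUDA support is installed and that the H200 is visible as device 0.

```python
import torch

# Minimal sketch (assumes PyTorch with CUDA and an H200 visible as device 0):
# read back the device properties that correspond to the spec table above.
props = torch.cuda.get_device_properties(0)
print(f"Name:               {props.name}")
print(f"Total memory:       {props.total_memory / 1024**3:.0f} GiB")
print(f"Multiprocessors:    {props.multi_processor_count}")  # 132 SMs x 128 FP32 lanes = 16,896 CUDA cores
print(f"Compute capability: {props.major}.{props.minor}")    # Hopper reports 9.0
```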
The NVIDIA H200 is rated at 75 TFLOPS of FP32 compute, making it well suited to demanding data center workloads.
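As a rough cross-check, peak FP32 throughput is conventionally estimated as 2 FLOPs (one fused multiply-add) per CUDA core per clock. With the core count and boost clock listed above, that back-of-the-envelope figure lands near 64 TFLOPS, so quoted TFLOPS numbers depend on which sustained clock is assumed.

```python
# Back-of-the-envelope peak FP32 estimate from the spec-sheet numbers above.
# One FMA per CUDA core per cycle counts as 2 FLOPs; the figure a vendor or
# benchmark quotes depends on the clock it assumes the GPU sustains.
cuda_cores = 16_896
boost_clock_ghz = 1.9
peak_fp32_tflops = 2 * cuda_cores * boost_clock_ghz / 1000
print(f"Estimated peak FP32: {peak_fp32_tflops:.1f} TFLOPS")  # ~64.2 TFLOPS
```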
The NVIDIA H200 is not designed for gaming. It is optimized for AI training, machine learning, and high-performance computing workloads.
For system interconnect, the H200 supports NVLink 4 and PCIe Gen5.
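On multi-GPU nodes, whether two devices can reach each other directly over NVLink or PCIe peer-to-peer can be checked from software. The sketch below uses PyTorch's CUDA helpers (assumed installed) and reports peer-access capability only, not which link type carries the traffic.

```python
import torch

# Minimal sketch: report which GPU pairs in the node can use peer-to-peer
# access (over NVLink or PCIe). Assumes PyTorch with CUDA and >= 2 devices.
count = torch.cuda.device_count()
for src in range(count):
    for dst in range(count):
        if src != dst:
            ok = torch.cuda.can_device_access_peer(src, dst)
            print(f"GPU {src} -> GPU {dst}: peer access {'yes' if ok else 'no'}")
```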
Dimension | Value |
---|---|
Length | 10.5 in |
Width | 4.4 in |
Height | 2-slot |
Metric | Value |
---|---|
Ranking Position | #1 |
Popularity Ranking | #4 |
Cost Effectiveness | 0.9/5 |
Power Efficiency | 1.3/5 |
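The cost-effectiveness and power-efficiency scores above can be sanity-checked against simple ratios of the figures quoted on this page (FP32 throughput per watt of TDP and per launch dollar). The sketch below is spec-sheet arithmetic only, not measured data.

```python
# Spec-sheet ratios only, using the figures quoted on this page; real
# efficiency depends on the workload (often memory-bandwidth or Tensor
# Core bound) and on street pricing rather than launch price.
fp32_tflops = 75.0
tdp_watts = 700
launch_price_usd = 35_000

print(f"FP32 per watt:   {fp32_tflops / tdp_watts * 1000:.0f} GFLOPS/W")        # ~107 GFLOPS/W
print(f"FP32 per dollar: {fp32_tflops / launch_price_usd * 1000:.1f} GFLOPS/$")  # ~2.1 GFLOPS/$
```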
Category | Rank 1 | Rank 2 | Rank 3 |
---|---|---|---|
Best for Training | NVIDIA H200 | NVIDIA H100 | NVIDIA B200 |
Best for Inference | NVIDIA A40 | NVIDIA A100 | NVIDIA A10 |