The NVIDIA GH200 Grace Hopper is a data center GPU built on the Hopper architecture. Released on 2023-05-29 at a launch price of $50,000.00, it features 14,592 CUDA cores and 138 GB of HBM3 memory.
| Specification | Value |
|---|---|
| Architecture | Hopper (GH100) |
| Market Segment | Data Center |
| Release Date | 2023-05-29 |
| Launch Price | $50,000.00 |
| Manufacturing Process | 4 nm |
| CUDA Cores | 14,592 |
| Base Clock Speed | 1410 MHz |
| Boost Clock Speed | 1830 MHz |
| Transistor Count | 80 billion |
| VRAM Capacity | 138 GB HBM3 |
| Memory Bus Width | 5120-bit |
| Memory Bandwidth | 3000 GB/s |
| TDP | 1000 W |
Key technical parameters of the NVIDIA GH200 Grace Hopper include its 14,592 CUDA cores, 1410 MHz base clock, and 1830 MHz boost clock, delivering high performance for data center applications.
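If the card is accessible from Python, the specification table can be cross-checked at runtime. The sketch below assumes PyTorch with CUDA support and a single visible device; the CUDA-core figure is an estimate that assumes 128 FP32 cores per Hopper SM.

```python
# Minimal sketch: read back device properties to cross-check the table above.
# Assumes PyTorch built with CUDA support and one visible GPU (device 0).
import torch

props = torch.cuda.get_device_properties(0)
print(f"Name:              {props.name}")
print(f"SM count:          {props.multi_processor_count}")
# Hopper SMs expose 128 FP32 CUDA cores each (assumption used for this estimate).
print(f"CUDA cores (est.): {props.multi_processor_count * 128}")
print(f"Total memory:      {props.total_memory / 1024**3:.1f} GiB")
```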
The NVIDIA GH200 Grace Hopper achieves 67 TFLOPS in FP32 performance, making it ideal for data center workloads.
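For a rough sanity check of that figure on a live system, a large FP32 matrix multiply can be timed. This is only a sketch and assumes PyTorch with CUDA; achieved throughput depends on clocks, matrix size, and whether TF32 is allowed, and will not match the peak specification exactly.

```python
# Rough FP32 matmul throughput check (sketch). Achieved numbers depend on
# clocks and problem size and will fall short of the quoted peak.
import time
import torch

torch.backends.cuda.matmul.allow_tf32 = False   # keep this a pure FP32 test
n = 8192
a = torch.randn(n, n, device="cuda")
b = torch.randn(n, n, device="cuda")

for _ in range(3):                               # warm-up
    torch.matmul(a, b)
torch.cuda.synchronize()

iters = 10
t0 = time.perf_counter()
for _ in range(iters):
    torch.matmul(a, b)
torch.cuda.synchronize()
elapsed = time.perf_counter() - t0

flops = 2 * n ** 3 * iters                       # ~2*N^3 FLOPs per matmul
print(f"Achieved FP32 throughput: {flops / elapsed / 1e12:.1f} TFLOPS")
```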
The NVIDIA GH200 Grace Hopper is not designed for gaming. It is optimized for AI training, machine learning, and high-performance computing workloads.
Interconnect: NVLink 4, PCIe Gen 5.
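To confirm which NVLink links are active on a given system, NVML can be queried. The sketch below assumes the nvidia-ml-py package (imported as `pynvml`); `nvidia-smi nvlink --status` reports the same information from the command line.

```python
# Sketch: list active NVLink links on device 0 via NVML.
# Assumes the nvidia-ml-py package (imported as pynvml) is installed.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
for link in range(pynvml.NVML_NVLINK_MAX_LINKS):
    try:
        state = pynvml.nvmlDeviceGetNvLinkState(handle, link)
        print(f"NVLink link {link}: {'active' if state else 'inactive'}")
    except pynvml.NVMLError:
        pass  # link not present or not supported on this part
pynvml.nvmlShutdown()
```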
| Dimension | Value |
|---|---|
| Length | 12 in |
| Width | 5 in |
| Height | 3-slot |
| Metric | Rating |
|---|---|
| Ranking Position | #2 |
| Popularity Ranking | #8 |
| Cost Effectiveness | 0.7/5 |
| Power Efficiency | 1.4/5 |
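The cost and efficiency ratings above can be related to the raw figures on this page. The arithmetic below uses only the launch price, TDP, and FP32 throughput quoted here; real-world value depends on cloud pricing and measured power draw.

```python
# Back-of-the-envelope value metrics from the figures on this page
# (launch price, TDP, FP32 TFLOPS). Cloud pricing and measured power differ.
launch_price_usd = 50_000
tdp_w = 1_000
fp32_tflops = 67

print(f"Launch price per FP32 TFLOPS: ${launch_price_usd / fp32_tflops:,.0f}")
print(f"FP32 GFLOPS per watt (peak):  {fp32_tflops * 1000 / tdp_w:.0f}")
```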
| Category | Rank 1 | Rank 2 | Rank 3 |
|---|---|---|---|
| Best for Training | NVIDIA H200 | NVIDIA H100 | NVIDIA B200 |
| Best for Inference | NVIDIA A40 | NVIDIA A100 | NVIDIA A10 |