Overview
A 4U rackmount server optimized for AI inference and enterprise compute workloads.
Key Features
- NVIDIA H200 GPU with 141 GB of HBM3e memory
- PCIe 5.0 expansion support
- 4U rackmount form factor
Ideal For
Enterprise teams, developers, and research institutions running large-scale AI training, multi-GPU workloads, and enterprise inference deployments.
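For multi-GPU training or inference deployments, it is worth confirming that the framework actually sees every installed accelerator and its full memory before scheduling work. The snippet below is a minimal sketch, assuming PyTorch with CUDA support is installed on the server; it simply enumerates visible GPUs and prints each device's total memory (an H200 should report on the order of 141 GB).

```python
# Minimal sketch: enumerate the GPUs PyTorch can see on a server like this
# and report their total memory. Device names and counts depend on the
# actual hardware configuration and driver setup.
import torch

def list_gpus() -> None:
    if not torch.cuda.is_available():
        print("No CUDA-capable GPU visible to PyTorch.")
        return
    for idx in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(idx)
        total_gib = props.total_memory / (1024 ** 3)
        print(f"GPU {idx}: {props.name}, {total_gib:.0f} GiB total memory")

if __name__ == "__main__":
    list_gpus()
```

On a multi-GPU build, a quick check like this also confirms that all devices are visible before restricting workloads with CUDA_VISIBLE_DEVICES.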
Category
4U Server
Pricing
Contact for pricing.
Prices may vary. Verify on the vendor site.
Quick Specs
- Category: 4U Server
Tags
llm-training, generative-ai
