AIME GX8-H200

Overview

A 2U rackmount server with 8x NVIDIA H200 141GB GPUs, designed for large-scale AI training and multi-GPU deep learning workloads.

Key Features

  • 8x H200 with 141GB VRAM
  • NVIDIA NVLink GPU interconnect
  • 2U rackmount form factor

Ideal For

Enterprise teams, developers, and startups running large-scale AI training, multi-GPU workloads, and enterprise inference deployments.
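
Below is a minimal sketch of the kind of multi-GPU training job this class of system targets, assuming PyTorch with CUDA is installed; the model, data, and hyperparameters are placeholders, not part of the AIME listing. Launched with torchrun, one process is started per GPU and gradients are synchronized over NCCL, which uses the NVLink interconnect where available.

    # train_ddp.py - launch with: torchrun --nproc_per_node=8 train_ddp.py
    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    def main():
        # torchrun sets LOCAL_RANK for each of the eight per-GPU processes
        local_rank = int(os.environ["LOCAL_RANK"])
        torch.cuda.set_device(local_rank)
        dist.init_process_group(backend="nccl")

        # placeholder model and optimizer
        model = DDP(torch.nn.Linear(1024, 1024).cuda(local_rank),
                    device_ids=[local_rank])
        optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

        for step in range(10):  # placeholder training loop
            x = torch.randn(32, 1024, device=local_rank)
            loss = model(x).square().mean()
            optimizer.zero_grad()
            loss.backward()   # gradients all-reduced across GPUs by NCCL
            optimizer.step()

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()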

Contact for pricing

Prices may vary. Verify on vendor site.

View on AIME →

Quick Specs

Category
HGX Server

Tags

llm-training, generative-ai, hpc, simulation