The new ANNA Ampere L1 server features the power of NVIDIA A100 Tensor Core GPUs and the HGX A100 4-GPU baseboard. The system supports PCI-E Gen 4 for fast CPU-GPU connection and high-speed networking expansion cards.
The platform supports GPUDirect RDMA with 1:1 mapping between network interconnects and GPUs, and uses NVLink to provide GPU-to-GPU communication in a mesh topology at speeds of up to 200GB/s.
With 1+1 power redundancy, the system is ideal for HPC and AI workloads. Designed with speech recognition, computer vision and data science in mind, the ANNA Ampere L1 is an excellent choice for these applications.
It is certified by NVIDIA for use with the NVIDIA AI Enterprise suite and comes with access to NGC.
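The GPUDirect and NVLink capabilities described above can be exercised directly from software. The sketch below is a minimal, illustrative example using the standard CUDA runtime API (not any vendor-specific tooling) to query and enable peer-to-peer access between the GPUs, so that device-to-device transfers travel over NVLink rather than through host memory; the device count and topology are assumed to match the HGX A100 4-GPU baseboard, and error handling is abbreviated.

```cpp
// Minimal sketch: enumerate GPUs and enable peer-to-peer access between every
// pair that supports it. On an NVLink-connected HGX A100 4-GPU baseboard this
// allows direct GPU-to-GPU copies and loads/stores over the NVLink mesh.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);

    for (int src = 0; src < deviceCount; ++src) {
        cudaSetDevice(src);
        for (int dst = 0; dst < deviceCount; ++dst) {
            if (src == dst) continue;

            int canAccess = 0;
            cudaDeviceCanAccessPeer(&canAccess, src, dst);
            if (canAccess) {
                // Enables direct peer copies (e.g. cudaMemcpyPeer) from src to dst.
                cudaError_t err = cudaDeviceEnablePeerAccess(dst, 0);
                if (err != cudaSuccess && err != cudaErrorPeerAccessAlreadyEnabled) {
                    printf("GPU %d -> GPU %d: enable failed (%s)\n",
                           src, dst, cudaGetErrorString(err));
                    continue;
                }
            }
            printf("GPU %d -> GPU %d: peer access %s\n",
                   src, dst, canAccess ? "enabled" : "not available");
        }
    }
    return 0;
}
```

Once peer access is enabled, calls such as cudaMemcpyPeerAsync and direct kernel loads/stores between device buffers are routed over the NVLink links rather than staged through system memory.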
Chipset: System-On-Chip (SoC)
Drive Bays: 4 x 2.5" hot-swap drive bays
Drive Support: SATA/NVMe hybrid, or SAS with optional HBA
Expansion Slots: 4 x PCI-E Gen 4 x16 (LP) slots, 1 x PCI-E Gen 4 x8 (LP) slot
Form Factor: 2U rackmount
GPU Manufacturer: NVIDIA
GPU Configuration: NVIDIA HGX A100 4-GPU (80GB) - SXM4
Manufacturer: Supermicro
Memory (Maximum): Up to 8TB 3DS ECC DDR4-3200MHz SDRAM
Memory Slots: 32 DIMM slots
Memory Type: 3200MHz ECC DDR4 SDRAM
Network Connectivity: Dual RJ45 10GbE host LAN
Power Supply: 2200W redundant power supplies with PMBus (Platinum)
To help our clients make informed decisions about new technologies, we have opened up our research & development facilities and actively encourage customers to try the latest platforms using their own tools and, where necessary, alongside their existing hardware. Remote access is also available.