The IPU-M2000 is a revolutionary next-generation system solution built with the Colossus MK2 IPU. It packs 1 PetaFlop of AI compute and up to 450GB Exchange-Memory™ in a slim 1U blade for the most demanding machine intelligence workloads. Designed from the ground up for high performance training and inference workloads, the IPU-M2000 unifies your AI infrastructure for maximum datacentre utilisation.
A core new building block for AI infrastructure, the IPU-M2000 is powered by 4 x Colossus Mk2 GC200 processors, Graphcore's second-generation 7nm IPU. Alongside its 1 PetaFlop of AI compute and up to 450GB of Exchange Memory, it delivers 2.8Tbps of IPU-Fabric bandwidth for ultra-low-latency communication between systems.
The IPU-M2000 has a flexible, modular design, so you can start with one system and scale to thousands. Directly connect a single system to an existing CPU server, link up to eight IPU-M2000s together, or grow to supercomputing scale with racks of 16 tightly interconnected IPU-M2000s in IPU-POD64 systems, thanks to the high-bandwidth, near-zero-latency IPU-Fabric™ interconnect architecture built into the box.
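As a rough illustration of how the per-blade figures quoted above aggregate across these configurations, the sketch below multiplies out the compute, Exchange Memory, and IPU counts for a single blade, an eight-blade group, and a 16-blade IPU-POD64. It uses only the headline numbers from this page; deliverable performance in practice depends on the workload and the interconnect, and the configuration names in the list are taken directly from the text.

```python
# Back-of-envelope scaling for IPU-M2000 configurations, using the
# per-blade figures quoted on this page: 1 PetaFlop of AI compute,
# up to 450GB Exchange Memory, and 4 x Colossus Mk2 GC200 IPUs.

PFLOPS_PER_M2000 = 1.0   # AI compute per blade (PetaFlops)
EXCHANGE_MEM_GB = 450    # Exchange Memory per blade ("up to" figure)
IPUS_PER_M2000 = 4       # GC200 IPUs per blade

def aggregate(num_blades):
    """Return (total PFLOPs, total Exchange Memory in GB, total IPUs)."""
    return (num_blades * PFLOPS_PER_M2000,
            num_blades * EXCHANGE_MEM_GB,
            num_blades * IPUS_PER_M2000)

# Configuration sizes follow the text: one blade, a directly attached
# group of up to eight, and an IPU-POD64 rack of 16 blades.
for name, n in [("single blade", 1), ("8-blade group", 8), ("IPU-POD64", 16)]:
    pflops, mem_gb, ipus = aggregate(n)
    print(f"{name}: {pflops:.0f} PFLOPs, {mem_gb} GB Exchange Memory, {ipus} IPUs")
```

For example, the IPU-POD64 line works out to 16 PetaFlops, 7,200GB of Exchange Memory, and 64 IPUs.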
An IPU solution also requires a head node, such as the Boston Graphcore Poplar Server, and you may want to consider larger configurations such as the Graphcore IPU-POD16 or IPU-POD64. You can test IPU solutions remotely via our Boston Labs facility, where our experts will gladly discuss your requirements and organise testing accordingly.
Created to provide you with in-depth information on our bespoke solutions, our product datasheets give you all of the necessary technical information about a product.
Download Data Sheet
To help our clients make informed decisions about new technologies, we have opened up our research and development facilities, and we actively encourage customers to try the latest platforms using their own tools and, where necessary, alongside their existing hardware. Remote access is also available.