AMD MI350 GPUs Pack 288GB HBM3E at 1400W

Quick Report

Details of AMD's upcoming Instinct MI350 series accelerators have surfaced at ISC 2025 in Hamburg, two days before the official announcement. The accelerators feature an impressive 288GB of HBM3E memory and the new CDNA4 architecture, and the higher-end MI355X variant will draw up to 1,400 watts and is designed primarily for liquid-cooled environments.

Key Highlights

  • Massive Memory: 288GB of HBM3E memory with 8TB/s memory bandwidth per OAM (OCP Accelerator Module)
  • New Architecture: Built on the improved CDNA4 architecture with enhanced data format support
  • Data Format Support: Now includes FP16, FP8, FP6, and FP4 formats, with FP6 and FP4 potentially delivering double the throughput of FP8 (see the back-of-the-envelope sketch after this list)
  • Two Variants:
    • MI350X: Standard version with 1,000W power consumption
    • MI355X: Higher-clocked version with up to 1,400W, designed for direct liquid cooling (DLC) environments
  • Performance Questions: A display at ISC 2025 showed theoretical performance figures approximately double what AMD had previously announced, raising questions about the final specifications
  • Scalability Limits: Unlike NVIDIA's systems that can scale to 36 or 72 GPUs, AMD's current solution is limited to 8 GPUs per system, with MI400 series expected to address this limitation in 2026

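For a rough sense of how those headline figures relate, the sketch below computes how long it takes to stream the full 288GB of HBM3E at 8TB/s and scales a nominal FP8 baseline across the supported data formats. The FP8 baseline and the FP16 ratio are illustrative assumptions; only the FP6/FP4 doubling reflects the reported claim, and none of these values are confirmed specifications.

```python
# Back-of-the-envelope numbers for one MI350-series OAM, based on the figures
# reported above. The FP8 baseline of 1.0 is an arbitrary placeholder; only the
# relative scaling matters, and the FP6/FP4 doubling is the reported claim,
# not a confirmed specification.

HBM_CAPACITY_GB = 288      # HBM3E capacity per OAM
HBM_BANDWIDTH_TBS = 8.0    # memory bandwidth per OAM, in TB/s

# Time to stream the entire HBM once -- a common roofline-style sanity check.
full_sweep_ms = HBM_CAPACITY_GB / (HBM_BANDWIDTH_TBS * 1000) * 1000
print(f"Full HBM sweep: {full_sweep_ms:.0f} ms")   # 288 / 8000 s = 36 ms

# Relative peak-throughput scaling across data formats (FP8 = 1.0).
# FP16 at half the FP8 rate follows the usual bit-packing argument (assumed);
# FP6 and FP4 at twice the FP8 rate follows the claim in the report.
relative_throughput = {"FP16": 0.5, "FP8": 1.0, "FP6": 2.0, "FP4": 2.0}
for fmt, scale in relative_throughput.items():
    print(f"{fmt:>4}: {scale:.1f}x FP8 peak")
```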
While the MI350 series represents a significant upgrade to AMD's AI accelerator lineup, it does not yet fully challenge NVIDIA's GB200 or GB300 series in every respect, particularly in scaling beyond 8 GPUs in a single system, a limitation the MI400 series is slated to address in 2026.
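As a quick aggregate, the short sketch below simply multiplies the per-OAM figures by the 8-GPU system limit; the resulting totals are straightforward arithmetic on the reported numbers, not disclosed platform specifications.

```python
# Per-node totals for an 8-GPU MI350-series system, obtained by multiplying the
# per-OAM figures reported above (illustrative arithmetic, not official specs).
GPUS_PER_SYSTEM = 8
HBM_PER_GPU_GB = 288
POWER_PER_GPU_W = {"MI350X": 1000, "MI355X": 1400}

print(f"Total HBM3E per system: {GPUS_PER_SYSTEM * HBM_PER_GPU_GB} GB")    # 2304 GB
for variant, watts in POWER_PER_GPU_W.items():
    print(f"{variant} GPU power per system: {GPUS_PER_SYSTEM * watts} W")  # 8000 W / 11200 W
```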

AMD will officially unveil the complete specifications and capabilities of the Instinct MI350 series during a livestream on Thursday, June 12, 2025, at 18:30 CEST.

Written using GitHub Copilot Claude 3.7 Sonnet in agentic mode, instructed to follow the current codebase style and conventions for writing articles.

Source(s)

  • ComputerBase
  • TPU