TAIPEI, May 19, 2025 /PRNewswire/ -- MSI, a global leader in high-performance server solutions, returns to COMPUTEX 2025 (Booth #J0506) with its most comprehensive lineup yet. Showcasing rack-level integration, modular cloud infrastructure, AI-optimized GPU systems, and enterprise server platforms, MSI presents fully integrated EIA, OCP ORv3, and NVIDIA MGX racks, DC-MHS-based Core Compute servers, and the new NVIDIA DGX Station. Together, these systems underscore MSI's growing capability to deliver deployment-ready, workload-tuned infrastructure across hyperscale, cloud, and enterprise environments.
"The future of data infrastructure is modular, open, and workload-optimized," said Danny Hsu, General Manager of MSI's Enterprise Platform Solutions. "At COMPUTEX 2025, we're showing how MSI is evolving into a full-stack server provider, delivering integrated platforms that help our customers scale AI, cloud, and enterprise deployments with greater efficiency and flexibility."
Full-Rack Integration from Cloud to AI Data Centers
MSI demonstrates its rack-level integration expertise with fully configured EIA 19", OCP ORv3 21", and NVIDIA MGX-powered AI racks, engineered to support modern infrastructure from cloud-native compute to AI-optimized deployments. Pre-integrated and thermally optimized, each rack is deployment-ready and tuned for specific workloads. Together, they highlight MSI's capability to deliver complete, workload-optimized infrastructure from design to deployment.
Core Compute and Open Compute Servers for Modular Cloud Infrastructure
MSI expands its Core Compute lineup with six DC-MHS servers powered by AMD EPYC 9005 Series and Intel Xeon 6 processors in 2U4N and 2U2N configurations. Designed for scalable cloud deployments, the portfolio includes high-density nodes with liquid or air cooling and compact systems optimized for power and space efficiency. With support for OCP DC-SCM, PCIe 5.0, and DDR5 DRAM, these servers enable modular, cross-platform integration and simplified management across private, hybrid, and edge cloud environments.
To further enhance Open Compute deployment flexibility, MSI introduces the CD281-S4051-X2, a 2OU 2-Node ORv3 Open Compute server based on DC-MHS architecture. Optimized for hyperscale cloud infrastructure, it supports a single AMD EPYC 9005 processor per node, offers high storage density with twelve E3.S NVMe bays per node, and integrates efficient 48V power delivery and OpenBMC-compatible management, making it ideal for software-defined and power-conscious cloud environments.
AMD EPYC 9005 Series Processor-Based Platform for Dense Virtualization and Scale-Out Workloads
Intel Xeon 6 Processor-Based Platform for Containerized and General-Purpose Cloud Services
AI Platforms with NVIDIA MGX & DGX Station for AI Deployment
MSI presents a comprehensive lineup of AI-ready platforms, including NVIDIA MGX-based servers and the DGX Station built on the NVIDIA Grace Blackwell architecture. The MGX lineup spans 4U and 2U form factors optimized for high-density AI training and inference, while the DGX Station delivers data center-class performance in a desktop chassis for on-premises model development and edge AI deployment.
AI Platforms with NVIDIA MGX
DGX Station
The CT60-S8060 is a high-performance AI station built on the NVIDIA GB300 Grace Blackwell Ultra Desktop Superchip, delivering up to 20 PFLOPS of AI performance and 784GB of unified memory. It also features the NVIDIA ConnectX-8 SuperNIC, enabling up to 800Gb/s networking for high-speed data transfer and multi-node scaling. Designed for on-premises model training and inference, the system supports multi-user workloads and can operate as a standalone AI workstation or as a centralized compute resource for R&D teams.