A Shanghai-based startup is betting that the next bottleneck in artificial intelligence infrastructure will not be chips, but electricity. It has now raised fresh funding to pursue that thesis.

Matrix Power Technologies, led by former Huawei executives, has completed a Series A funding round, raising an eight-figure RMB sum exclusively from StarCharge. Zen Advisory served as the company’s long-term financial adviser.

The startup previously secured angel funding from an unnamed publicly listed company, followed by a pre-Series A round led by Yushan Partners. Proceeds from the latest financing will support continued R&D in AI data center power systems, overseas expansion, and working capital.

Founded in 2020, Matrix develops power infrastructure for AI data centers. Its products focus on high-efficiency, high-density architectures, including traditional 54-volt DC in-rack supply systems, next-generation high-voltage direct current designs operating at 400 volts and 800 volts outside the rack, and emerging technologies such as solid-state transformers.

The shift reflects the changing energy profile of AI computing. As compute demand accelerates, single-rack power consumption is rising from tens of kilowatts toward megawatt-scale levels. Conventional designs built around uninterruptible power supply (UPS) systems paired with lead-acid batteries are increasingly strained by the space constraints, efficiency targets, and fast response times that dense GPU clusters demand.

Industry estimates cited by the company project the global market for data center power equipment will approach RMB 700 billion (USD 101.3 billion) by 2028. Growth patterns differ significantly by region. According to 36Kr, the US leads in both adoption of new architectures and capital investment, and its market is projected to reach USD 40 billion by 2030.

China’s market is expanding quickly but faces structural constraints tied to limited access to high-end GPUs. As a result, the company estimates China’s market at roughly one quarter of the size of the US market.

The competitive landscape is dominated by established power electronics companies. Delta Electronics, drawing on telecommunications power technologies it previously acquired, controls roughly 80% of the global market, according to the company. Lite-On Technology ranks among the next largest suppliers.

Many other vendors remain focused on traditional UPS-based designs or are transitioning from industrial power supply segments, often without deep expertise in high-voltage DC architectures designed specifically for AI data centers.

Matrix said it is attempting to close that gap by building a full-stack power platform that supports both current and next-generation systems.

Its existing and developing products include 54V DC in-rack systems with power supply units supporting 5.5 kW and 12 kW configurations, battery backup units equipped with integrated lithium battery management systems, capacitor backup units capable of 500-microsecond response times through integrated supercapacitors, and 240V DC battery backup systems.
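To put the backup specifications above in perspective, a back-of-envelope calculation shows how little stored energy a capacitor backup unit needs to bridge a brief interruption. The 5.5 kW and 12 kW PSU ratings come from the article; the 10 ms hold-up window is an illustrative assumption, not a Matrix specification.

```python
# Energy a backup unit must deliver to carry a PSU load through a short
# power gap. PSU ratings are from the article; the hold-up window is an
# assumed, illustrative figure.

def holdup_energy_joules(load_w: float, holdup_s: float) -> float:
    """Energy (J) the backup source must supply: power x duration."""
    return load_w * holdup_s

HOLDUP_S = 0.010  # assumed 10 ms ride-through window

for psu_watts in (5_500, 12_000):
    e = holdup_energy_joules(psu_watts, HOLDUP_S)
    print(f"{psu_watts / 1000:.1f} kW PSU x {HOLDUP_S * 1000:.0f} ms = {e:.0f} J")
```

Even at 12 kW, only about 120 J must be delivered, which is why supercapacitors, with their very fast discharge, suit the sub-millisecond response role better than batteries alone.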

The company is also preparing higher-voltage systems intended for future AI clusters. Matrix plans to launch cabinet-external 400V and 800V high-voltage direct current platforms. Its HVDC sidecar module is designed to deliver up to 1.2 megawatts per cabinet, with system efficiency of 97.3% to 97.7% and a dynamic load response time of about 500 microseconds.
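At these power levels, fractions of a percentage point of efficiency translate into kilowatts of waste heat per cabinet. A rough sketch using the article's figures (and assuming, for illustration, that 1.2 MW is delivered output power):

```python
# Heat dissipated in power conversion when delivering p_out_w at a given
# efficiency: input = output / efficiency, loss = input - output.
# The 1.2 MW and 97.3-97.7% figures are from the article; treating
# 1.2 MW as output power is an assumption for this sketch.

def conversion_loss_watts(p_out_w: float, efficiency: float) -> float:
    """Power (W) lost in conversion for a given delivered load."""
    return p_out_w * (1.0 / efficiency - 1.0)

P_OUT = 1_200_000  # 1.2 MW per cabinet

for eta in (0.973, 0.977):
    loss_kw = conversion_loss_watts(P_OUT, eta) / 1000
    print(f"at {eta:.1%} efficiency: ~{loss_kw:.1f} kW dissipated per cabinet")
```

Under these assumptions, the gap between the low and high ends of the stated range is on the order of 5 kW of heat per cabinet, which the cooling system must also remove.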

Further ahead, the company has begun developing solid-state transformer technology designed to convert electricity directly from a 10 kV medium-voltage grid to 800V DC. By removing several intermediate conversion stages, the approach aims to reduce energy losses and enable what the company describes as an “ultimate power architecture” for gigawatt-scale AI factories.
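The logic behind removing conversion stages is multiplicative: end-to-end efficiency is the product of each stage's efficiency, so fewer stages compound fewer losses. The sketch below illustrates this with hypothetical stage lists and efficiency numbers; none of these per-stage figures come from Matrix or 36Kr.

```python
# End-to-end efficiency as a product of per-stage efficiencies.
# Both chains and all stage efficiencies below are illustrative
# assumptions, not vendor data.
from math import prod

# Hypothetical conventional path from a 10 kV feed to the rack
conventional = {
    "MV transformer (10 kV -> 400 V AC)": 0.99,
    "UPS (double conversion)": 0.94,
    "PDU / distribution": 0.995,
    "rack PSU (AC -> 54 V DC)": 0.955,
}

# Hypothetical direct path built around a solid-state transformer
sst_based = {
    "solid-state transformer (10 kV -> 800 V DC)": 0.975,
    "in-rack DC/DC (800 V -> 54 V)": 0.98,
}

for name, chain in (("conventional", conventional), ("SST-based", sst_based)):
    print(f"{name}: end-to-end efficiency ~ {prod(chain.values()):.1%}")
```

With these placeholder numbers, the shorter chain comes out several points ahead end to end, which is the shape of the argument for converting directly from medium-voltage AC to 800V DC.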

Before focusing on AI infrastructure, Matrix built its engineering and delivery capabilities in telecommunications power supply systems. It developed a customer base across Southeast Asia and currently holds nearly RMB 100 million (USD 14.5 million) in orders.

Following the Series A financing, the company plans to accelerate expansion into North America.

Technological iteration and time to market remain key challenges. Matrix said it will accelerate development of AI-focused power products while pushing commercialization of its 800V high-voltage DC systems and solid-state transformer platform.

The roadmap aligns with a broader industry shift led by Nvidia and other computing vendors toward large-scale deployment of 800V DC-native servers beginning around 2027.

According to Jackie Ding, vice president of business development at Matrix, the company is currently in discussions with Google and Meta in North America on AI compute power supply solutions. At the same time, it is working with domestic server manufacturers to design power systems for ultra-high compute servers operating at roughly 860 kW.

Matrix’s leadership team draws heavily from Huawei’s power electronics ecosystem.

Founder Li Xiaohua previously served as vice president for North America at Huawei’s Network Energy division. Over more than two decades working on telecommunications power systems, he helped lead Huawei’s participation in the Open Compute Project power architecture working group.

Within that group, the team designed what it describes as the first distributed OCP data center power architecture built around a 54V DC and lithium battery design. According to the company, the architecture was deployed for customers including Google and Meta.

Matrix’s CTO previously held executive roles in Huawei Digital Power’s module power supply and integrated smart energy businesses. Its technical director, formerly a senior technical executive at Delta Electronics, has more than 22 years of experience in the power electronics sector.

Core members of the technical team bring decades of experience across companies including Huawei, Delta Electronics, Emerson, Eltek, and Tyco. The startup said its engineers are among the few China-based teams to have participated deeply in designing Open Compute Project power architectures.

KrASIA features translated and adapted content that was originally published by 36Kr. This article was written by Yuan Silai for 36Kr.

Note: RMB figures are converted to USD at a rate of RMB 6.91 to USD 1 based on estimates as of March 16, 2026, unless otherwise stated. USD conversions are presented for ease of reference and may not fully match prevailing exchange rates.