Microsoft has announced a new chip cooling system based on microfluidics that could be up to three times more effective than current methods. This breakthrough is significant given the enormous energy consumption and heat generation of AI, both of which contribute to greenhouse gas emissions.
Many data centres currently use cold plates to cool GPUs, but this technology is limited because the cooling element is separated from the heat source by several layers. According to Microsoft, its new microfluidic system brings the coolant much closer to the source. The prototype uses a liquid that flows through tiny, thread-like channels etched directly into the back of the chip. The design was inspired by patterns found in nature, resembling the veins of a leaf or the wings of a butterfly. Microsoft also used AI to optimize the flow of the coolant through these channels.
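To see why proximity matters, think of heat flow as a chain of thermal resistances in series: every layer between the silicon and the coolant adds resistance, and the temperature rise is simply the chip's power multiplied by the total resistance. The sketch below illustrates the idea; the layer names and resistance values are illustrative assumptions, not figures from Microsoft's announcement.

```python
# Illustrative back-of-envelope model of conduction cooling.
# All layer names and resistance values are assumptions chosen for
# demonstration; none come from Microsoft's announcement.

CHIP_POWER_W = 700.0  # assumed GPU power dissipation

# Heat crossing layers in series: thermal resistances (K/W) simply add.
cold_plate_stack = {
    "die -> thermal interface material": 0.010,
    "TIM -> integrated heat spreader":   0.015,
    "heat spreader -> second TIM":       0.015,
    "second TIM -> cold plate base":     0.010,
    "cold plate base -> coolant":        0.020,
}

# With channels etched into the back of the die, only the silicon itself
# and the liquid boundary layer separate the heat source from the coolant.
microfluidic_stack = {
    "die -> channel wall":     0.010,
    "channel wall -> coolant": 0.015,
}

def temp_rise(stack, power_w):
    """Temperature rise above coolant = power x sum of series resistances."""
    return power_w * sum(stack.values())

rise_cold = temp_rise(cold_plate_stack, CHIP_POWER_W)
rise_micro = temp_rise(microfluidic_stack, CHIP_POWER_W)

print(f"cold plate:   {rise_cold:5.1f} K above coolant")
print(f"microfluidic: {rise_micro:5.1f} K above coolant")
print(f"reduction in temperature rise: {1 - rise_micro / rise_cold:.0%}")
```

With these made-up numbers the reduction lands near Microsoft's claimed figure, but the real gain depends on chip power, channel geometry, and coolant flow rate.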
Microsoft claims this new technique can reduce the maximum temperature rise inside a GPU by as much as 65 percent. This would enable developers to safely overclock chips for increased performance without the risk of overheating. The improved cooling could also allow Microsoft to pack servers closer together, reducing latency.
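To put that figure in concrete terms, note that it describes the temperature rise above the coolant, not the chip's absolute temperature. A quick calculation, using assumed baseline numbers, shows where the overclocking headroom comes from:

```python
# The coolant temperature and baseline rise here are assumptions for
# illustration; only the 65 percent reduction comes from Microsoft's claim.
coolant_temp_c = 35.0     # assumed coolant inlet temperature
baseline_rise_c = 50.0    # assumed die temperature rise under a cold plate
claimed_reduction = 0.65  # Microsoft's "as much as 65 percent"

microfluidic_rise_c = baseline_rise_c * (1 - claimed_reduction)

print(f"cold plate die temperature:   {coolant_temp_c + baseline_rise_c:.1f} C")
print(f"microfluidic die temperature: {coolant_temp_c + microfluidic_rise_c:.1f} C")
# 85.0 C vs 52.5 C: that extra headroom is what would make overclocking safer.
```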
While the primary focus of Microsoft's announcement is performance and efficiency, the technology offers clear environmental benefits. Better cooling reduces the energy needed to run data centres, which in turn eases the strain on the power grid. As the AI industry continues to grow, innovations like this will be crucial for managing its substantial energy demands.