In 2025, competition among tech giants over AI infrastructure has grown even fiercer. Alibaba is pressing ahead with its AI infrastructure build-out, having committed 380 billion yuan and planning to increase investment further; it expects that by 2032 the energy consumption of Alibaba Cloud's global data centers will grow tenfold, with investment in computing power rising exponentially. Industry players including NVIDIA, Meta, Microsoft, and Google have also unveiled ambitious investment plans: Meta intends to spend at least 600 billion U.S. dollars on AI infrastructure by 2028, and Microsoft has announced a 4 billion U.S. dollar investment to build its second data center in Wisconsin. On the technical front, supernodes have become a focal point, with Alibaba Cloud and Huawei both recently launching supernode products. Institutional analysts argue that supernodes not only expand computing capacity but also enable global scheduling of resources through system-level innovation, opening up investment opportunities in industry-chain segments such as servers, copper connectivity, and liquid cooling.