As the artificial intelligence (AI) boom sweeps the world, the tech sector foresees seemingly limitless growth in computing capability. Yet a recent report from Yole Group warns that the expansion of AI computing power faces twofold limits: physical and economic.
On the power-supply side, the report projects that by 2030 individual data center campuses will each require 1 to 5 gigawatts of electricity, raising challenges such as long construction timelines for transmission lines and the complexity of grid interconnection.
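For a sense of scale, here is a minimal back-of-envelope sketch; the per-chip power draw below is our assumption for illustration, not a figure from the Yole report:

```python
# Back-of-envelope scale check (assumed numbers, not figures from the report):
# if one accelerator plus cooling and facility overhead draws roughly 1.5 kW,
# how many chips could a campus at the low end of the 1-5 GW range power?
site_power_w = 1e9      # 1 GW campus (low end of the projected range)
per_chip_w = 1_500      # assumption: ~1.5 kW per GPU including overhead
chips = site_power_w / per_chip_w
print(f"~{chips:,.0f} accelerators per 1 GW campus")  # ~666,667
```

Even at the low end of the range, a single campus would house hundreds of thousands of accelerators, which helps explain why transmission-line lead times and grid interconnection dominate the planning horizon.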
On the chip-manufacturing side, capacity constraints in advanced packaging and high-bandwidth memory (HBM) are limiting GPU output. Manufacturers have announced plans to expand capacity, but those forecasts remain uncertain.
Data scarcity is another major roadblock: the stock of online text is growing more slowly than the data demands of AI training. Multimodal data offers a potential supplement, but its quality is highly uneven, and open questions about the quality and cost of synthetic data remain unresolved.
Finally, latency bottlenecks constrain the speed of AI training: each training step has a minimum wall-clock time, which caps how much added hardware can accelerate a run.
