The largest AI training projects now use computing resources that cost in the single-digit millions of dollars, while the world's total computer hardware budget is about $1 trillion per year. On the trend line, AI is roughly halfway in time to claiming a majority of the world's hardware: another 300,000-fold increase in resources, matching the increase already seen, would take the largest AI projects from single-digit millions to hundreds of billions of dollars, approaching the world's total computing budget. If dedicated artificial intelligence clouds grew to about 30% of overall world hardware, the largest AI projects would need those resources around 2025, and at the current doubling pace roughly three more months of growth would push AI past 50% of the world's hardware.
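The extrapolation above can be sketched with rough arithmetic. This is an illustrative back-of-the-envelope calculation, not a forecast: the $3 million project cost and the assumption that the past 300,000-fold increase took about six years are stand-in figures chosen to match the text's round numbers.

```python
import math

# Assumed illustrative figures: largest projects cost ~$3M (single-digit
# millions), world hardware budget ~$1T/yr, and project compute has
# already grown ~300,000x over roughly six years.
project_cost = 3e6      # dollars (assumption)
world_budget = 1e12     # dollars per year
past_growth = 3e5       # ~300,000x increase already seen
past_years = 6          # assumed duration of that increase

growth_needed = world_budget / project_cost        # ~333,000x more
doublings_needed = math.log2(growth_needed)        # ~18.3 doublings
# If 300,000x took ~6 years, the implied doubling time is:
doubling_months = past_years * 12 / math.log2(past_growth)  # ~4 months
years_to_world_budget = doublings_needed * doubling_months / 12

print(f"growth needed: {growth_needed:,.0f}x")
print(f"doublings needed: {doublings_needed:.1f}")
print(f"implied doubling time: {doubling_months:.1f} months")
print(f"years until matching world budget: {years_to_world_budget:.1f}")
```

Under these assumptions the trend reaches the world's hardware budget in about six more years, and a single extra doubling (one more "about three months or so" step) is what carries a 30% share past 50%.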
AI training initially ran on GPU chips, whose performance can improve faster than Moore's law, and has been shifting to custom ASIC-style chips that can be even more energy efficient. AI has also used 16-bit precision arithmetic, which can be made faster than higher-precision computing.
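A minimal sketch of why 16-bit precision helps, using NumPy: the same tensor takes half the memory in float16, so twice as many values move per unit of memory bandwidth and hardware can pack roughly twice as many multiply units. The trade-off is precision, which is why 16-bit training needs care with small values.

```python
import numpy as np

# The same 1024x1024 weight matrix at two precisions.
weights32 = np.ones((1024, 1024), dtype=np.float32)
weights16 = weights32.astype(np.float16)

print(weights32.nbytes)  # 4194304 bytes
print(weights16.nbytes)  # 2097152 bytes: half the footprint and bandwidth

# The cost: float16 carries only ~3 decimal digits, so a small update
# can round away entirely.
updated = np.float16(1.0) + np.float16(1e-4)
print(updated == np.float16(1.0))  # True: the update was lost to rounding
```

The rounding behavior shown at the end is why mixed-precision training schemes typically keep a higher-precision copy of the weights while doing the bulk multiply-accumulate work in 16 bits.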