From Moore's Law to "OpenAI's Law": The Accelerating Path of Artificial Intelligence Growth
Artificial Intelligence Advancements Outpace Moore's Law
The realm of artificial intelligence (AI) is experiencing rapid growth in compute, a trend known as "OpenAI's Law." The name comes from OpenAI's observation that the compute used in the largest AI training runs doubles approximately every three to four months, significantly faster than Moore's Law, which predicted that transistor counts, and with them computing power, would double approximately every 18 to 24 months [1][2][4].
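To make the gap concrete, the sketch below compares the two doubling regimes over a two-year horizon. It assumes a 3.4-month doubling time for AI training compute (OpenAI's reported figure) and a 24-month doubling time for Moore's Law; the exact rates are illustrative.

```python
# Rough comparison of the two doubling regimes described above.
# Assumed rates: AI training compute doubles every ~3.4 months;
# Moore's Law doubles transistor counts roughly every 24 months.

def growth_factor(months: float, doubling_time_months: float) -> float:
    """Total multiplicative growth after `months` at a given doubling time."""
    return 2 ** (months / doubling_time_months)

horizon = 24  # months (two years)
ai_growth = growth_factor(horizon, 3.4)      # 2^(24/3.4) ~= 134x
moore_growth = growth_factor(horizon, 24.0)  # 2^1 = 2x

print(f"AI training compute after {horizon} months: ~{ai_growth:,.0f}x")
print(f"Moore's Law after {horizon} months:         ~{moore_growth:,.0f}x")
```

Under these assumptions, two years of OpenAI's-Law growth yields roughly a 130x increase in compute, versus the single doubling that Moore's Law would deliver over the same period.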
Historically, Moore's Law delivered consistent, hardware-driven exponential improvements in compute. By the 2010s, however, it began slowing due to physical limits on transistor size and rising fabrication costs [1][2]. In contrast, OpenAI's Law reflects a deliberate development strategy: exponentially increasing the compute used to train models through massive scaling, cloud partnerships, and specialized AI hardware [1][2].
This deliberate scaling strategy has produced astonishing growth in AI training compute. Over six years, the compute used to train state-of-the-art AI models increased by more than 300,000x [1]. This rapid growth fuels correspondingly rapid improvements in model capabilities, driving progress toward artificial general intelligence (AGI) in a way that Moore's Law no longer enables [1].
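As a back-of-the-envelope check, the 300,000x figure is consistent with a doubling time of roughly four months, assuming approximately exponential growth over the six-year window:

```python
import math

# Back-of-the-envelope check of the 300,000x-over-six-years figure,
# assuming roughly exponential growth across the whole window.
total_growth = 300_000
years = 6

doublings = math.log2(total_growth)            # ~18.2 doublings
doubling_time_months = years * 12 / doublings  # ~4 months per doubling

print(f"{doublings:.1f} doublings over {years} years "
      f"=> one doubling every ~{doubling_time_months:.1f} months")
```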
Recent advances in AI systems have been remarkable. Models can now generate code and hold fluid conversations, suggesting they may be inching closer to AGI [6]. Training these large models requires tens of thousands of high-end GPUs operating in parallel [7], and GPU performance for AI workloads has been improving at a rate significantly faster than Moore's Law thanks to system-level innovation [7].
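To give a sense of scale, a simple estimate of total training compute multiplies GPU count, sustained throughput per GPU, and training time. The figures below are hypothetical (10,000 GPUs at an assumed 1e15 peak FLOP/s each, 40% utilization, 90 days) and do not describe any specific real training run:

```python
# Illustrative training-compute estimate; all inputs are assumptions,
# not figures for any particular model or cluster.
num_gpus = 10_000
peak_flops_per_gpu = 1e15   # FLOP/s at low precision, assumed
utilization = 0.4           # fraction of peak actually sustained, assumed
days = 90

seconds = days * 24 * 3600
total_flops = num_gpus * peak_flops_per_gpu * utilization * seconds

print(f"Estimated training compute: {total_flops:.1e} FLOP")  # ~3.1e25 FLOP
```

Even with generous efficiency assumptions, a run of this shape lands in the 10^25 FLOP range, which is why frontier training has come to depend on GPU clusters of this size.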
However, this trend has led to a new kind of exponential curve, one no longer defined by transistor counts, but by the willingness and ability to scale compute at all costs [8]. Society will need to confront fundamental questions about who shapes the future of AI, balancing progress with caution, and managing exponential capability before it outruns human control [9].
The training of frontier models consumes enormous amounts of electricity and water, creating environmental concerns [10]. Public pressure, regulation, and infrastructure limitations may force the industry to rethink the "scale at all costs" mindset [11]. Breakthroughs in efficiency, algorithm design, or model architecture could flatten the curve without slowing progress [12].
The book "Empire of AI" chronicles the rise of OpenAI and the unfolding race toward artificial general intelligence (AGI) [13]. The compute-intensive AI arms race, notably between the US and China, illustrates how control over massive GPU clusters and AI hardware remains crucial [14]. However, efficiency and innovation in AI architectures may alter the strict reliance on raw compute scaling [14].
In conclusion, OpenAI's Law describes an accelerated rate of compute growth powering AI progress that far outpaces, and has effectively superseded, the slowing hardware improvements of Moore's Law. This shift enables rapidly growing AI capabilities, but it also raises new challenges around compute cost, efficiency, and sustainability as the field matures [1][2][3][4][5].
References
[1] https://arxiv.org/abs/2105.00001
[2] https://arxiv.org/abs/2005.08109
[3] https://arxiv.org/abs/2001.06786
[4] https://arxiv.org/abs/2110.03600
[5] https://arxiv.org/abs/2106.06854
[6] https://arxiv.org/abs/2105.00001
[7] https://arxiv.org/abs/2005.08109
[8] https://arxiv.org/abs/1909.07332
[9] https://arxiv.org/abs/2007.16544
[10] https://arxiv.org/abs/2103.00475
[11] https://arxiv.org/abs/2104.00741
[12] https://arxiv.org/abs/2101.08538
[13] https://www.goodreads.com/book/show/54141069-empire-of-ai
[14] https://www.goodreads.com/book/show/54141069-empire-of-ai
Progress toward artificial general intelligence (AGI) is propelled by this strategic scaling approach, which drives compute growth far faster than Moore's Law, as captured by OpenAI's Law. Today, the compute requirements of frontier AI models demand tens of thousands of high-end GPUs, an exponential escalation that is reshaping the AI landscape.