OpenAI is reportedly experiencing issues with its next artificial intelligence model, Orion. Model evaluations have found only moderate improvements over previous versions, sparking concerns about the rate of advancement we can expect from future generative AI models. OpenAI employees who used and tested Orion told The Information that the model's gains were minimal compared to those seen in the transition from GPT-3 to GPT-4.
While OpenAI has long held its position as the leading AI company, this development may jeopardize that standing. Additionally, slower progress in AI capability may upend our current understanding of the future of AI, the limitations of generative AI models, and the growth trajectory of generative AI companies.
AI Growing Pains
OpenAI researchers reported that although Orion's language skills have improved, the model is hitting bottlenecks in certain areas, including coding tasks. These struggles can be attributed to the dwindling availability of training data: high-quality data is required to pre-train large language models (LLMs), and its diminishing supply is a significant problem for OpenAI. One estimate predicts that AI models will run out of such material by 2028.
Without this data, advancement in certain areas of AI technology may decelerate. While companies have tried supplementing model training with synthetic data, this may not be an adequate remedy for the problem.
The additional computing needed to train these AI models will also demand more power and drive up costs. Although leading companies continue to build data centers to support their AI strategies, they struggle to find cost-effective energy sources; without them, developing future AI models may not be financially feasible.
AI energy consumption is an important consideration here as well, since the power required to run these large data centers may carry significant environmental impacts.
Previously, the prevailing assumption in generative AI was that more data and more computing power would yield better models. OpenAI's bottlenecks with Orion challenge that assumption. If gains in AI capability continue to shrink, investors may be less willing to fund these companies' AI initiatives. Users, in turn, will likely absorb the shortfall, as AI companies price their products to compensate.