OpenAI's latest large language model, known internally as Orion, has fallen short of performance targets, part of a broader slowdown in AI advancement across the industry's leading companies, Bloomberg reported Wednesday, corroborating similar media reports in recent days. The model, which completed initial training in September, showed particular weakness in novel coding tasks and failed to deliver the same magnitude of improvement over its predecessor that GPT-4 achieved over GPT-3.5.
Google's upcoming Gemini software and Anthropic's Claude 3.5 Opus are facing similar challenges: Google's project is not meeting internal benchmarks, while Anthropic has delayed its model's release, Bloomberg said. Industry insiders cited by the publication pointed to the growing scarcity of high-quality training data and mounting operational costs as key obstacles; OpenAI's Orion specifically struggled because of insufficient coding data for training, the report said. OpenAI has moved Orion into post-training refinement but is unlikely to release the system before early 2025. The report adds: [...] AI companies continue to pursue a more-is-better playbook. In their quest to build products that approach the level of human intelligence, tech firms are increasing the amount of computing power, data and time they use to train new models, driving up costs in the process. Anthropic CEO Dario Amodei has said companies will spend $100 million to train a bleeding-edge model this year and that amount will hit $100 billion in the coming years.
As costs rise, so do the stakes and expectations for each new model under development. Noah Giansiracusa, an associate professor of mathematics at Bentley University in Waltham, Massachusetts, said AI models will keep improving, but the rate at which that happens is an open question. "We got very excited for a brief period of very fast progress," he said. "That just wasn't sustainable." Further reading: OpenAI and Others Seek New Path To Smarter AI as Current Methods Hit Limitations.
Read more of this story at Slashdot.