The limits of intelligence — Why AI advancement could be slowing down

Generative artificial intelligence has developed so quickly in the past two years that massive breakthroughs seemed a question of when rather than if. But in recent weeks, Silicon Valley has grown increasingly concerned that advancement is slowing.

One early indication is the limited progress between successive models released by the biggest players in the space. The Information reports that OpenAI is facing a significantly smaller boost in quality from its next model, GPT-5, while Anthropic appears to have delayed the release of its most powerful model, Opus, based on wording that was removed from its website. Even at tech giant Google, Bloomberg reports that an upcoming version of Gemini is not living up to internal expectations.

“Remember, ChatGPT came out at the end of 2022, so now it’s been close to two years,” said Dan Niles, founder of Niles Investment Management. “You had initially a huge ramp up in terms of what all these new models can do, and what’s happening now is you really trained all these models and so the performance increases are kind of leveling off.”

If progress is plateauing, it would call into question a core assumption that Silicon Valley has treated as religion: scaling laws. The idea is that adding more computing power and more data guarantees better models, seemingly without limit. But these recent developments suggest scaling laws may be more theory than law.

The key problem could be that AI companies are running out of data for training models, hitting what experts call the “data wall.” Instead, they’re turning to synthetic data, or AI-generated data. But that’s a band-aid solution, according to Scale AI founder Alexandr Wang.  

“AI is an industry which is garbage in, garbage out,” Wang said. “So if you feed into these models a lot of AI gobbledygook, then the models are just going to spit out more AI gobbledygook.”

But some leaders in the industry are pushing back on the idea that the rate of improvement is hitting a wall.  

“Foundation model pre-training scaling is intact and it’s continuing,” Nvidia CEO Jensen Huang said on the chipmaker’s latest earnings call. “As you know, this is an empirical law, not a fundamental physical law. But the evidence is that it continues to scale.”

OpenAI CEO Sam Altman posted on X simply, “there is no wall.” 

OpenAI and Anthropic didn’t respond to requests for comment. Google says it’s pleased with its progress on Gemini and has seen meaningful performance gains in capabilities like reasoning and coding. 

If AI acceleration is tapped out, the next phase of the race is the search for use cases – consumer applications that can be built on top of existing technology without the need for further model improvements. The development and deployment of AI agents, for example, is expected to be a game-changer. 

“I think we’re going to live in a world where there are going to be hundreds of millions, billions of AI agents, eventually probably more AI agents than there are people in the world,” Meta CEO Mark Zuckerberg said in a recent podcast interview.  
