A Rebel's New Frontier: The Rise of Adaption Labs as the AI Industry's Conventional Wisdom Crumbles
In a bold move, computer scientist Sara Hooker has raised $50 million for her new startup, Adaption Labs, which is challenging the conventional wisdom that more computing power and data are the keys to building superior artificial intelligence (AI) models. Hooker, who previously worked at Google and Canadian AI startup Cohere, believes that efficient, self-learning training methods will lead to the best results, rather than relying on massive amounts of data and energy.
Hooker's stance is a radical departure from the industry's traditional approach, which prioritizes building large-scale models that can be shipped to billions of people worldwide. Instead, Adaption Labs is focused on developing models that learn continuously and adapt to different environments in real time, incorporating user feedback and exploring alternative training methods such as "gradient-free learning."
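The article mentions gradient-free learning only in passing, and Adaption Labs has not described its methods publicly. Purely for illustration, the sketch below shows one generic form of the idea, a simple (1+1) evolution-strategy-style random search that improves a model's parameters without ever computing a gradient; the toy linear-regression task, loss, and step size are assumptions for the example, not anything attributed to the company.

```python
# A minimal sketch of gradient-free optimization: perturb parameters randomly
# and keep the change only if it lowers the loss. No gradients are computed.
import numpy as np

rng = np.random.default_rng(0)

def loss(params, data, targets):
    # Toy objective: mean squared error of a linear model (illustrative only).
    return float(np.mean((data @ params - targets) ** 2))

# Synthetic data standing in for a real task.
data = rng.normal(size=(200, 5))
true_params = rng.normal(size=5)
targets = data @ true_params + 0.1 * rng.normal(size=200)

params = np.zeros(5)
best = loss(params, data, targets)
step = 0.1  # perturbation scale, chosen arbitrarily for the demo

for _ in range(2000):
    candidate = params + step * rng.normal(size=params.shape)  # random mutation
    score = loss(candidate, data, targets)
    if score < best:  # greedy acceptance: keep the mutation only if it helps
        params, best = candidate, score

print(f"final loss: {best:.4f}")
```

Real gradient-free methods (evolution strategies, population-based search, and similar) are far more sophisticated than this, but the core appeal is the same: the model can improve from feedback signals alone, without backpropagation.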
The startup's ambitious goals are not only seen as a departure from the industry's dominant assumptions but also reflect a growing chorus of voices questioning traditional scaling law principles. Yann LeCun, who recently left Meta to launch AMI Labs, has raised doubts about the industry's reliance on ever-greater computing power, while David Silver, a former Google DeepMind researcher, is working on training self-learning models through experience rather than relying on data.
Adaption Labs' fresh funding will be used primarily to build out its team, with staff offered an "Adaptive Passport" perk that allows them to take an annual trip to a new country. The company aims to operate as a global technology firm from day one, encouraging employees to explore and experience different cultures.
Hooker expects the industry's reckoning with traditional scaling laws to make this the year in which algorithmic innovation becomes the real driver of progress. "This is the year in which it will really matter," she said, as Adaption Labs stands poised to disrupt an industry she has spent her career advancing.