Former Google researchers launch startup to build nature-inspired neural networks
Two prominent artificial intelligence researchers have launched a startup, Sakana AI, that aims to build a new kind of nature-inspired neural network.
The Financial Times reported the startup’s launch this morning.
Tokyo-based Sakana AI is led by Chief Executive Officer David Ha, who was previously the head of research at Stability AI Ltd., the startup best known as the developer of the open-source Stable Diffusion image generation model. Before his stint at Stability AI, Ha led Google LLC’s AI research group in Japan.
Ha co-founded Sakana AI with Llion Jones, a fellow machine learning researcher who left Google earlier this month. Jones spent more than a decade at the Alphabet Inc. unit.
In 2017, Jones and seven Google colleagues published “Attention is All You Need,” a seminal AI research paper. The paper introduced the concept of Transformers to the world. A Transformer is a type of neural network design that underpins many of the most advanced AI models on the market, including OpenAI LP’s GPT-4.
The primary innovation in the technology is a software mechanism called attention. Using this mechanism, an AI model can take a large amount of contextual information into account when analyzing a piece of data. Furthermore, the model can prioritize that contextual information and pay more attention to the most important details.
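The attention mechanism described above can be sketched in a few lines of NumPy. This is a minimal, single-head illustration of scaled dot-product attention as introduced in the paper, not code from any Sakana AI or Google system; the variable names are the conventional query/key/value notation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weigh every position in a sequence by its relevance to each query,
    then return the weighted combination of values (single-head sketch)."""
    d_k = Q.shape[-1]
    # Similarity of each query to every key, scaled to keep softmax stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax converts scores into weights that sum to 1 per query:
    # the model "pays more attention" to the highest-weighted positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: a sequence of 3 tokens with 4-dimensional embeddings,
# attending to itself (self-attention).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(x, x, x)
```

Each row of `weights` sums to 1, so every output token is a context-aware blend of the whole sequence rather than a function of one position alone.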
Ha and Jones told the Financial Times that Sakana AI is seeking to develop generative AI models. According to the two researchers, the startup will look to nature for new approaches to building neural networks.
As part of its development roadmap, Sakana AI reportedly plans to train AI models using a technique known as evolutionary computation.
In an evolutionary computation project, researchers create many versions of a neural network that each have a slightly different configuration. They then compare those versions to find the ones that offer the highest accuracy and processing speed. From there, the researchers discard the lower-performing versions, generate new variants from the survivors and repeat the process over many generations to further refine their AI model.
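The generate-score-select loop described above can be sketched as follows. This is a generic illustration of evolutionary computation, not Sakana AI's actual method; the fitness function and mutation rule here are toy stand-ins.

```python
import random

def evolve(fitness, mutate, base_config, population_size=8, generations=5):
    """Minimal evolutionary loop: mutate configurations, keep the
    fittest half, refill from the survivors, repeat."""
    population = [mutate(base_config) for _ in range(population_size)]
    for _ in range(generations):
        # Score every candidate and keep the top half as "survivors".
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[: population_size // 2]
        # Refill the population with mutated copies of the survivors.
        children = [mutate(random.choice(survivors))
                    for _ in range(population_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

# Toy stand-in: a "configuration" is a single number, and fitness
# peaks when it reaches 10. Real projects would score trained networks.
random.seed(42)
best = evolve(fitness=lambda c: -abs(c - 10),
              mutate=lambda c: c + random.uniform(-1, 1),
              base_config=0.0)
```

In a real neural architecture search, `mutate` would perturb a model's layer sizes or connectivity and `fitness` would train and evaluate each candidate, which is why the technique pairs naturally with the smaller, cheaper models the startup says it favors.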
Jones told CNBC that one of Sakana AI’s goals is to avoid creating a “huge, humongous model.” Today’s generative AI models often require a significant amount of hardware to run and train, which makes them expensive. According to the Financial Times, the startup also hopes its neural networks will be less “brittle” and more adaptable to changing requirements than current software.
Besides Ha and Jones, Sakana AI’s team reportedly includes a part-time researcher hired from academia. The startup plans to recruit an unspecified number of additional staffers down the road.
Ha and Jones didn’t specify whether Sakana AI has raised any outside capital. The startup is entering a crowded market contested by tech giants such as Microsoft Corp. and numerous smaller, heavily funded players. A few of those market players, such as startup Cohere Inc., are led by former Google researchers who co-authored the “Attention is All You Need” paper with Jones.