Exclusive: AI breakthrough could let your next running shoes learn and adapt to how you move
A breakthrough in artificial intelligence could allow everyday objects like running shoes to adapt to the gait of the wearer in real time, or helmets to filter and adapt to outside noise more efficiently.
Experts from several leading universities and the small AI model company Aizip created a system that allows AI tools to build other versions of themselves. This self-replication enables much smaller, more efficient models that can run inside everyday objects.
Known as a fully automated AI-design pipeline, the system could lead to an AI nanofactory in which millions of specialized, efficient models are generated with minimal human involvement, then embedded in devices and adapted in response to sensor data.
What is Aizip and why is it significant?
Over the past year, since the launch of ChatGPT by OpenAI, it has felt like artificial intelligence is everywhere. Office software, operating systems and even phones have been adapted to utilize the power of generative AI.
However, it has yet to infiltrate the Internet of Things, or the everyday objects we use all the time such as our shoes, clothing or smart home items like toasters and microwaves. That is about to change as AI learns to build other AI models, and the compute and power requirements drop.
Yubei Chen, CTO of Aizip and a professor at UC Davis, said this breakthrough is the first step in transforming AI design. In the future we could see new shoes that adapt to your walk, adjusting in real time to sensor data on your body and the environment.
"With the help of large foundation models, small models will evolve faster than big ones, so the trend of improvements favors the edge,” added Brian Cheung, AI scientist at MIT and Chief Scientist of Aizip.
Sensors everywhere and AI to understand them
In the future, Dr Chen and colleagues envision trillions of sensors deployed in everything around us. Clothing will have nanosensors woven into the fabric, homes will have them in every room and even our toothbrushes will have them in the bristles.
To make sense of this vast amount of data, the team says, artificial intelligence is needed, but current models are bulky and don’t adapt quickly enough. That is where the new system comes into its own, as one AI can build another to complete a specific task.
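The article doesn't go into the training recipe, but the general shape of one model building another can be sketched with knowledge distillation, in which a large "teacher" network supervises a tiny "student" small enough for a microcontroller. The snippet below is purely illustrative: the layer sizes, the simulated sensor data and the models themselves are assumptions, not Aizip's actual pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical "teacher": a large model that already handles the sensor task.
teacher = nn.Sequential(nn.Linear(64, 512), nn.ReLU(), nn.Linear(512, 4))
teacher.eval()

# Tiny "student": a few thousand parameters, small enough for an embedded chip.
student = nn.Sequential(nn.Linear(64, 16), nn.ReLU(), nn.Linear(16, 4))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(1000):
    sensor_batch = torch.randn(32, 64)     # simulated windows of sensor readings
    with torch.no_grad():
        target = teacher(sensor_batch)     # the teacher's output acts as the label
    loss = loss_fn(student(sensor_batch), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The trained student could then be quantized and flashed onto a device
# such as a shoe insole or a helmet, running with no cloud connection.
```

In the fully automated pipeline the team describes, the larger model would also pick the student's architecture and training data with minimal human involvement, rather than a person specifying them as in this sketch.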
“We’re witnessing a revolution in human-machine interaction and brain-computer interfaces fueled by advances in brain- and body-sensing technology,” Gert Cauwenberghs, a professor at UC San Diego involved in the study, told Tom's Guide in an email.
“Making sense of the massive data streaming from these sensors despite the high levels of variability and noise in their biological operating environments is a major challenge that calls for powerful AI, down to the physiological interface,” he added.
“Brain and body sensing in a wearable format requires efficient AI models that can be deployed at the edge. The technology at Aizip enables transformative applications in bio- and neuro-engineering.”
Practical applications
Beyond adjusting the cushioning of a running shoe to match an athlete's movement, the technology has many other potential uses. Dr Chen and colleagues suggested it could improve comfort and convenience without harming the planet.
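To make the shoe example concrete, here is a toy loop showing what real-time adaptation could look like in software: a tiny model classifies each stride from accelerometer readings and nudges a cushioning setpoint accordingly. Every detail, from the feature size to the gait labels and stiffness values, is an assumption for illustration rather than a description of any real product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative tiny gait model: a single 64x3 weight matrix whose values would,
# in a real system, come from the automated design pipeline described above.
weights = rng.standard_normal((64, 3)) * 0.1
cushioning = 0.5                               # current stiffness setpoint, 0..1

def read_accelerometer_window() -> np.ndarray:
    """Stand-in for a 64-sample accelerometer window captured in the shoe."""
    return rng.standard_normal(64)

for _ in range(100):                           # roughly one iteration per stride
    features = read_accelerometer_window()
    scores = features @ weights                # tiny on-device inference
    gait = int(np.argmax(scores))              # assumed labels: 0=walk, 1=run, 2=sprint
    target = (0.3, 0.6, 0.9)[gait]             # stiffer cushioning for faster gaits
    cushioning += 0.1 * (target - cushioning)  # smooth the change between strides
```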
The team is taking cues from nature: animals with brains of fewer than a million neurons rely on remarkably efficient wiring to survive and thrive. AI systems powering tomorrow’s embedded technology will need similarly efficient solutions.
“This development is more than a technological leap; it represents the dawn of a new era in which every item can become a smart, evolving, and adapting companion,” the team wrote.
The company behind the new research, Aizip, is already working with chip makers like Arm and sensor companies to embed its technology into ever-smaller products. We could start seeing smart noise cancellation as soon as next year, with shoes and smart homes following soon after, they explained.
A really smart kitchen
Aside from smart footwear, in the research paper the team outlined an idea for a fully immersive smart kitchen where appliances learn from user behavior.
AI Coffee Maker (The Barista):
"This intelligent coffee maker employs face recognition or voice identification to recognize individuals, using their specific preferences to brew coffee. It accurately locates the cup, ensuring precise positioning, and automatically stops pouring to prevent overflows."
Smart Fridge (The Nutritionist):
"Integrated with computer vision, this fridge can identify contents and estimate their nutritional value. A barcode reader embedded in the AI camera collects detailed information about food and drinks, facilitating healthy eating habit tracking and fresh food management. It can even suggest purchases and generate shopping lists."
AI Oven (The Gourmet Chef):
"Equipped with a camera and AI, this oven can identify food types and volume, automatically setting the appropriate cooking parameters. It monitors the cooking process for optimal results and can estimate the nutritional content of meals."
Smart Dishwasher (The Efficient Cleaner):
"Similar to the smart fridge, this dishwasher uses AI to identify dish types and their placement, optimizing water flow and cycle times for efficient cleaning with minimal water use."
Smart Lights (The Illuminator):
"Voice-controlled lighting, adaptable to various accents and personal settings, enhances hand-free kitchen convenience. These lights adjust according to ambient light, optimizing energy use."
AI TV (The Personal Assistant):
"Increasingly popular in kitchens, these AI-powered TVs provide both entertainment and information like weather updates and recipes. They personalize content based on face or voice recognition and support voice commands."