Foams are everywhere, and for decades scientists believed that foams behave like glass, their microscopic components trapped in static, disordered configurations. Now, engineers at Penn’s School of Engineering and Applied Science have found that foams actually flow ceaselessly inside while holding their external shape. More strangely, from a mathematical perspective, this internal motion resembles the process of deep learning, the method typically used to train modern AI systems.
The discovery could hint that learning, in a broad mathematical sense, may be a common organizing principle across physical, biological and computational systems, and provide a conceptual foundation for future efforts to design adaptive materials.
In a paper in Proceedings of the National Academy of Sciences, the team describes using computer simulations to track the movement of bubbles in a wet foam. Rather than eventually staying put, the bubbles continued to meander through possible configurations. Mathematically speaking, the process mirrors how deep learning involves continually adjusting an AI system’s parameters—the information that encodes what an AI “knows”—during training.
“Foams constantly reorganize themselves,” says John C. Crocker, a chemical and biomolecular engineering (CBE) professor and the paper’s co-senior author. “It’s striking that foams and modern AI systems appear to follow the same mathematical principles. Understanding why that happens is still an open question, but it could reshape how we think about adaptive materials and even living systems.”
During training, modern AI systems continually adjust these parameters. Much like bubbles in foams were once thought to descend into metaphorical valleys, searching for the positions that require the least energy to maintain, early approaches to AI training aimed to fit systems as tightly as possible to their training data.
“The key insight was realizing that you don’t actually want to push the system into the deepest possible valley,” says Robert Riggleman, a CBE professor and co-senior author. “Keeping it in flatter parts of the landscape, where lots of solutions perform similarly well, turns out to be what allows these models to generalize.”
When the Penn researchers looked again at their foam data through this lens, the parallel was hard to miss. Rather than settling into “deep” positions in this metaphorical landscape, bubbles in foams also remained in motion, much like the parameters in modern AI systems, continuously reorganizing within broad, flat regions with similar characteristics. The same mathematics that explains why deep learning works turned out to describe what foams had been doing all along.
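The "flat region" picture can be made concrete with a toy simulation. The sketch below is not the team's foam model; it is a minimal, hypothetical illustration of noisy gradient descent in a broad, shallow basin: the walker never settles into a single point, yet its loss stays uniformly low, mirroring how bubbles (and network parameters) keep wandering among configurations that perform similarly well.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(x):
    # A broad, shallow quadratic basin centered at x = 3:
    # small curvature stands in for a "flat" region of the landscape.
    return 0.5 * (x - 3.0) ** 2

def grad(x):
    return x - 3.0

x = 3.0            # start at the basin's center
lr, noise = 0.1, 0.05
positions = []
for _ in range(2000):
    # Gradient step plus a small random kick (thermal or training noise).
    x = x - lr * grad(x) + noise * rng.standard_normal()
    positions.append(x)

positions = np.array(positions)
print(f"spread of visited positions: {positions.std():.3f}")
print(f"worst loss along the way:    {loss(positions).max():.3f}")
```

The spread of positions stays well above zero (the walker never stops moving), while the loss never climbs far above its minimum, the signature of ceaseless motion within a flat region.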
Read more at Penn Engineering Stories.
Ian Scheffler