Peek inside a flickering candle flame with these 3D printed shapes

New research from MIT explores fire from a host of new perspectives. The work uses deep learning to extract the vibrational characteristics of flickering flames and translate them into sounds and materials.

The 19th century physicist Michael Faraday was known not only for his seminal experimental contributions to electromagnetism, but also for his public lectures. His annual Christmas lectures at the Royal Institution became a holiday tradition that continues to this day. One of his most famous Christmas lecture series concerned the chemical history of a candle. Faraday illustrated his point with a simple experiment: he placed a candle inside a lamp glass to block any breeze and obtain “a quiet flame.” He then showed how the shape of the flame flickered and changed in response to disturbances.

“You must not imagine, because you see these tongues all at once, that the flame is of this particular shape,” Faraday observed. “A flame of that shape is never so at any one time. Never is a body of flame, like the one you just saw rising from the ball, of the shape it appears to you. It consists of a multitude of different shapes, succeeding one another so fast that the eye is only able to take cognizance of them all at once.”

Today, MIT researchers have brought Faraday’s simple experiment into the 21st century. Markus Buehler and his postdoc, Mario Milazzo, combined high-resolution imaging with deep machine learning to sonify a single candle flame. They then used this unique flame as a building block, creating “music” from its flickering dynamics and designing new structures that could be 3D printed into physical objects. Buehler described this and other related work at the American Physical Society meeting last week in Chicago.

The dynamics of a flickering candle flame. Researchers are using deep learning to first explore what the vibration of a single flame sounds like, then generalize the approach to a larger fire that creates a variety of sounds.

MIT

As we reported earlier, Buehler specializes in developing AI models to design new proteins. He is perhaps best known for using sonification to illuminate structural details that might otherwise prove elusive. Buehler discovered that the hierarchical elements of musical composition (pitch, range, dynamics, tempo) are analogous to the hierarchical elements of protein structure. Just as music has a limited number of notes and chords that can be combined in different ways to compose a piece, proteins have a limited number of building blocks (20 amino acids) that can combine in many ways to create new protein structures with unique properties. Each amino acid has a particular sound signature, similar to a fingerprint.
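To make that analogy concrete, here is a minimal sketch of the idea in Python. The pitch assignments, the sequence fragment, and the function name are hypothetical illustrations of a sequence-to-notes mapping, not Buehler’s actual sonification scheme, which is far more elaborate.

```python
# Minimal sketch (not the published mapping): assign each of the 20 standard
# amino acids a hypothetical pitch and render a protein sequence as notes.
from typing import List

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

# Hypothetical mapping: spread the residues over pitches starting at middle C (MIDI 60).
PITCH_OF = {aa: 60 + i for i, aa in enumerate(AMINO_ACIDS)}

def sequence_to_notes(sequence: str) -> List[int]:
    """Translate a one-letter amino acid sequence into a list of MIDI pitches."""
    return [PITCH_OF[aa] for aa in sequence if aa in PITCH_OF]

if __name__ == "__main__":
    # Illustrative silk-like fragment, chosen only to show the mapping.
    print(sequence_to_notes("GAGAGSGAAG"))
```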

Several years ago, Buehler led a team of MIT scientists who mapped the molecular structure of proteins in spider silk threads onto musical theory to produce the “sound” of silk, in the hope of establishing a radical new way to create designer proteins. This work inspired a sonification art exhibition, “Spider’s Canvas,” in Paris in 2018. Artist Tomas Saraceno worked with engineers from MIT to create an interactive harp-like instrument inspired by the web of a Cyrtophora citricola spider, with each strand of the “web” tuned to a different pitch. Combining those notes in various patterns across the web’s 3D structure generates melodies.

In 2019, Buehler’s team developed an even more advanced system for creating music from a protein structure and then converting the music back to create new proteins never before seen in nature. The goal was to learn how to create similar synthetic spider webs and other structures that mimic the spider’s process. And in 2020, Buehler’s team applied the same approach to model the vibrational properties of the spike protein responsible for the high contagion rate of the novel coronavirus (SARS-CoV-2).

Machine-learning rendered image of a flame and its 3D printed fabrication.

Markus Buehler

Buehler wondered if this approach could be extended to study fire. “The flames, of course, are quiet,” he told a news conference. However, “Fire has all the elements of a vibrating string or a vibrating molecule, but in a dynamic pattern, which is interesting. If we could hear them, what would they sound like? Can we materialize fire, pushing us to generate materials from it that you could actually feel and touch?”

Like Faraday before them, Buehler and Milazzo began with a simple experiment involving a single candle flame. (A larger fire has so many disturbances that it becomes too computationally difficult to model, but a single flame can be considered a basic element of fire.) The researchers lit a candle in a controlled environment, with no moving air or any other external signals: Faraday’s quiet flame. Then they played sounds from a speaker and used a high-speed camera to capture how the flame flickered and warped over time in response to those acoustic cues.
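A rough sketch of what the measurement step might look like in code is below. The OpenCV-based pipeline, the file name flame.mp4, and the bright-pixel “flicker” metric are assumptions for illustration, not the MIT team’s actual processing.

```python
# Minimal sketch (assumed pipeline, not the MIT code): read high-speed video of
# a candle flame and track how the bright flame region changes from frame to
# frame, giving a rough "flicker" signal for each acoustic stimulus.
import cv2
import numpy as np

def flame_area_trace(video_path: str, threshold: int = 200) -> np.ndarray:
    """Return the bright-pixel area of the flame in each frame of a video."""
    capture = cv2.VideoCapture(video_path)
    areas = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
        areas.append(int(np.count_nonzero(mask)))
    capture.release()
    return np.asarray(areas, dtype=float)

if __name__ == "__main__":
    trace = flame_area_trace("flame.mp4")  # hypothetical recording
    # Deviation from the quiet flame's mean area ~ how strongly it flickers.
    print(np.abs(trace - trace.mean()).max())
```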

Simulation of flames assembling into a princess in a fairy tale garden.

Markus Buehler and Mario Milazzo, MIT

“There are characteristic shapes that are created by this, but it’s not the same shapes every time,” Buehler said. “It’s a dynamic process, so what you see [in our images] is just an overview of these. In reality, there are thousands and thousands of images for each excitation by the acoustic signal – a ring of fire.”

He and Milazzo then trained a neural network to classify the original audio signals that created a given flame shape. The researchers effectively sonified the vibrational frequencies of fire. The more violently a flame deflects, the more drastically the audio signal changes. The flame becomes a kind of musical instrument that can be “played” – by exposing it to air currents, for example, to make it flicker in a particular way – a form of musical composition.
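As a rough illustration of that classification step, the sketch below defines a small convolutional network that maps a flame snapshot to one of several candidate audio stimuli. The architecture, input size, and class count are assumptions made for the example; the published model is not described in that level of detail here.

```python
# Minimal sketch (assumed architecture, not the published model): a small CNN
# that takes a grayscale flame snapshot and predicts which of N acoustic
# stimuli produced it, i.e. "reading the sound back out of the flame's shape".
import torch
import torch.nn as nn

class FlameToSoundClassifier(nn.Module):
    def __init__(self, num_audio_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, num_audio_classes),  # assumes 64x64 inputs
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = FlameToSoundClassifier()
    dummy_batch = torch.randn(4, 1, 64, 64)  # four fake 64x64 flame snapshots
    print(model(dummy_batch).shape)  # -> torch.Size([4, 8])
```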

“Fire is vibratory, rhythmic and repetitive, and continually changing, and that’s what defines music,” Buehler said. “Deep learning helps us to mine particular fire patterns and data, and with different fire patterns, you can create this orchestra of different sounds.”

Buehler and Milazzo also used the different shapes of flickering flames as building blocks to design new structures on the computer and then 3D print those structures. “It’s kind of like freezing a flame in time and being able to look at it from different angles,” Buehler said. “You can touch it, rotate it, and the other thing you can do is look inside the flames, which no human has ever seen.”
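To give a flavor of how a flickering outline can become a printable object, here is a toy sketch that lofts a per-frame flame radius into a stacked 3D surface and writes an ASCII STL file. The radius profile, layer height, and export format are illustrative assumptions, not the researchers’ fabrication workflow.

```python
# Toy sketch (illustrative only): freeze a flicker in time by turning one
# radius value per video frame into a stacked surface of rings, then write
# the surface out as an ASCII STL that a slicer could 3D print.
import numpy as np

def write_stl(path: str, profiles, layer_height: float = 0.2, sides: int = 48):
    """profiles[t] is the flame radius at frame t; each frame becomes one layer."""
    angles = np.linspace(0.0, 2.0 * np.pi, sides, endpoint=False)
    # rings[t, k] = 3D point on layer t at angle index k
    rings = np.stack([
        np.stack([r * np.cos(angles), r * np.sin(angles),
                  np.full(sides, t * layer_height)], axis=1)
        for t, r in enumerate(profiles)
    ])
    with open(path, "w") as f:
        f.write("solid flame\n")
        for t in range(len(profiles) - 1):
            for k in range(sides):
                a, b = rings[t, k], rings[t, (k + 1) % sides]
                c, d = rings[t + 1, k], rings[t + 1, (k + 1) % sides]
                for tri in ((a, b, c), (b, d, c)):  # two triangles per quad
                    f.write("facet normal 0 0 0\nouter loop\n")
                    for v in tri:
                        f.write(f"vertex {v[0]:.4f} {v[1]:.4f} {v[2]:.4f}\n")
                    f.write("endloop\nendfacet\n")
        f.write("endsolid flame\n")

if __name__ == "__main__":
    # Hypothetical flicker profile: a gently pulsing radius over 100 frames.
    radii = 5.0 + np.sin(np.linspace(0, 6 * np.pi, 100))
    write_stl("frozen_flame.stl", radii)
```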
