Sarah Kane, a fourth-year physics major in the College of Arts and Sciences, started her research in astronomy with Bhuvnesh Jain, the Walter H. and Leonore C. Annenberg Professor, and has since gone on to work on other projects that seek to characterize stars within Earth’s galaxy. Currently, she works with assistant professor Robyn Sanderson in a field known as galactic archaeology, which studies the current state of the Milky Way to reconstruct the history of its formation.
“We’re mostly interested in stars because they live a really long time so that they can act like the galactic fossil record, hence the term galactic archaeology,” says Kane, who is from Mendham, New Jersey. The project she is involved in uses machine learning to identify metal-poor stars, which formed in greater abundance in the galaxy’s early history. Kane jokes that astronomers have a loose definition of metals: to them, the term typically refers to any element that isn’t hydrogen or helium, the two most prevalent elements in the universe following the Big Bang.
A significant component of Kane’s work is to find alternative ways to represent astronomical data by using sonification. She says, “typically, when we have data, we visualize it—a plot, graph or image—but this has communication limitations for blind people, like me, who don’t have access to that data in the same way. So, instead of visualizing the data, we’re working on ways to sonify it or represent it as sound.”
For more than two years, Kane has collaborated with Astronify, a team of astronomers from the Space Telescope Science Institute (STScI) who sonify data known as light curves, a measure of how bright an object is over time. “One primary use of light curves in astronomy is to detect planets outside our solar system, or exoplanets,” says Kane. “If you measure the amount of light coming from a star and a planet passes that star, you see periodic dips in brightness, which astronomers call exoplanet transits. And that has led to the discovery of thousands of new exoplanets.”
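The transit idea Kane describes can be sketched in a few lines of code. This is an illustrative toy, not Astronify’s or STScI’s actual pipeline: the `find_transit_dips` function, the threshold value, and the synthetic light curve are all assumptions chosen to show how periodic dips in a star’s brightness can be flagged.

```python
import numpy as np

def find_transit_dips(flux, threshold=0.99):
    """Return the indices where normalized flux dips below a threshold.

    A crude stand-in for real transit-detection pipelines: we divide by
    the median brightness so the star's baseline sits near 1.0, then
    flag any sample that falls below the chosen fraction of baseline.
    """
    flux = np.asarray(flux, dtype=float)
    normalized = flux / np.median(flux)  # baseline brightness -> 1.0
    return np.where(normalized < threshold)[0]

# Synthetic light curve: constant brightness with two periodic dips.
flux = np.ones(100)
flux[20:25] -= 0.02  # first transit: a 2% drop in brightness
flux[70:75] -= 0.02  # second transit, one "orbital period" later
dips = find_transit_dips(flux)
```

In this toy example, the spacing between the two flagged dip groups would hint at the planet’s orbital period; real surveys like TESS and Kepler apply far more careful detrending and period-search methods to the same basic signal.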
As an Astronify usability tester, Kane helps ensure that the valuable archived data collected from telescopes like the Transiting Exoplanet Survey Satellite and Kepler can be converted to audio via the programming toolkits the team develops. Kane’s participation in this project fostered a deep appreciation for translating the data retrieved from telescopes into sound, and she has since become a member of the larger sonification community.
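One common sonification scheme maps each brightness measurement to the pitch of a short tone, so a transit dip becomes an audible drop in pitch. The sketch below is a minimal, hypothetical illustration of that mapping, not Astronify’s API; the function name, frequency range, and note duration are all assumed for the example.

```python
import numpy as np

SAMPLE_RATE = 44100  # audio samples per second

def sonify(values, note_dur=0.1, f_lo=220.0, f_hi=880.0):
    """Map each data point to a sine tone: larger values -> higher pitch.

    One simple pitch-mapping scheme among many used in sonification.
    Returns a mono waveform with one short tone per data point.
    """
    values = np.asarray(values, dtype=float)
    span = values.max() - values.min()
    scaled = (values - values.min()) / (span if span else 1.0)  # rescale to [0, 1]
    freqs = f_lo + scaled * (f_hi - f_lo)  # map data onto a frequency range
    t = np.linspace(0, note_dur, int(SAMPLE_RATE * note_dur), endpoint=False)
    tones = [np.sin(2 * np.pi * f * t) for f in freqs]  # one tone per point
    return np.concatenate(tones)

# A brief dip in brightness is heard as two lower-pitched notes.
flux = [1.0, 1.0, 0.98, 0.98, 1.0, 1.0]
audio = sonify(flux)
```

The resulting array could be written out with Python’s standard `wave` module or played directly; the key point is that the listener tracks the same dips a sighted astronomer would read off a plot.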
“When you think about astronomy, one of the best things about it is that we can come out with these incredible images that inspire people across the world,” says Kane. “Sonification has the potential to contribute to that sense of wonder, and to me, that’s just incredible.”
Kane recounts how she began work on data sonification through a chance correspondence with Scott Fleming of the STScI, who works on different ways to transform and explore data. “I was having trouble downloading data from STScI, so I reached out to Scott, an archive scientist there, and saw he was also the lead for Astronify, which, I found out, turns astronomical data into sound to help blind people.”
In becoming increasingly involved with sonification, Kane says, she’s come to learn that it’s not only a powerful tool for the visually impaired. “There’s plenty of evidence showing that sighted people really enjoy sonification. They can be harmonious and fascinating pieces of music to listen to,” she says. Having multiple sensory inputs for interpreting information also makes it more understandable, Kane says, and in the context of universal design, wherein products or spaces are made more accessible to people with disabilities, “you actually end up making things better for everyone.”