Total solar eclipses are wondrous events that can inspire awe and a desire to learn more about our solar system. Because solar eclipses cause changes in light as the Moon moves between the Sun and the Earth, scientists and science educators have long thought of them as primarily visual events. Not only is this assumption fundamentally untrue, it also excludes people who are blind or low vision from participating in eclipses and eclipse science. The Eclipse Soundscapes Project, an enterprise of ARISA Lab, is committed to expanding that participation with innovative technologies that allow everyone to experience the wonder of a solar eclipse.
The Eclipse Soundscapes Project’s first endeavor was a mobile application that launched in 2017. The app, which is available for iOS and Android devices, includes two tools for experiencing an eclipse with multiple senses. The first is a catalog of audio descriptions that can narrate the progression of a total solar eclipse in real time. The second is a “Rumble Map,” which allows users to hear and feel an eclipse on the touchscreen of a mobile device.
Audio description uses words to convey information from a picture or video. Eclipse Soundscapes worked with the National Center for Accessible Media at GBH (NCAM) to create audio descriptions for several images that highlight the key moments of a total solar eclipse.
“A good audio description should give the person who is reading or listening to it a good idea of what the image is without the burden of lots of extraneous information,” said Bryan Gould, director of NCAM. “It should focus on, or highlight, what’s most important given the context.”
Audio description is as much an art form as it is a specialized skill. “Everything about it is challenging,” Gould said. “At first pass, almost all the descriptions of an eclipse sound exactly the same. The moon is basically a featureless black disc, and then there’s some form of wispy light behind it.”
The team at NCAM worked closely with Eclipse Soundscapes astrophysicist Dr. Henry “Trae” Winter to identify the most important features of an eclipse — things like the solar corona, helmet streamers, and solar prominences — as well as key moments like Baily’s Beads, the diamond ring effect, and totality.
Another challenge of describing eclipses is that they are typically characterized in terms of “light,” “dark,” and “shadow.” These word choices have limitations when it comes to conveying information to people who may have a limited ability to perceive light and dark.
“We always try to include some tactile imagery as well,” Gould said. “For instance, comparing something to lace or to a flower petal. Even if you haven’t seen these things, those are tactile cues.”
Image: A photo of the Baily’s Beads stage of an eclipse. Credit: Miloslav Druckmuller.
Example of an audio description for Baily’s Beads: On the right side of the moon, orbs of glowing sunlight shimmer off the edge of the moon’s black disk. Called Baily’s Beads, these final areas of the sun’s light appear as glimmering pearls on a wire, made intensely bright by the absence of light surrounding them.
Hear the audio description for Baily’s Beads.
The team at Eclipse Soundscapes also felt it was important to include tactile information about eclipses. This led to one of the most innovative and exciting parts of the mobile application: The Rumble Map. “Utilizing multiple senses creates a more inclusive and engaging experience,” said Dr. Winter, Chief Scientist of the Eclipse Soundscapes Project. “The Rumble Map is our way of letting users ‘touch’ eclipses.”
The Rumble Map allows users to experience images using touch and sound. Eclipse Soundscapes selected images from the same eclipse features used in the audio descriptions. Users select one of those images to display on the screen, then use the touchscreen to navigate the image. When users touch areas of the image with a higher concentration of light, the Rumble Map vibrates the phone’s speakers and plays a high-pitched sound. When users touch darker areas, the vibration diminishes and the pitch lowers.
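The brightness-to-feedback mapping described above can be sketched as a small function. This is a hypothetical illustration of the concept, not the app’s actual implementation: the function name, the linear mapping, and the frequency range are all assumptions for the sake of the example.

```python
def rumble_feedback(brightness, min_hz=120.0, max_hz=880.0):
    """Map a touched pixel's brightness (0.0 = dark, 1.0 = bright)
    to a vibration strength and a tone pitch, following the Rumble
    Map idea: brighter areas produce stronger vibration and a
    higher-pitched sound; darker areas diminish both.

    The linear mapping and the 120-880 Hz range are illustrative
    assumptions, not parameters taken from the real app.
    """
    brightness = max(0.0, min(1.0, brightness))  # clamp to [0, 1]
    vibration = brightness                       # 0.0 (off) .. 1.0 (full)
    pitch_hz = min_hz + brightness * (max_hz - min_hz)
    return vibration, pitch_hz

# Touching a bright feature vs. the Moon's dark disk:
bright_vib, bright_pitch = rumble_feedback(0.95)
dark_vib, dark_pitch = rumble_feedback(0.05)
```

On a real device, the vibration value would drive the platform’s haptics API and the pitch would feed an audio oscillator; this sketch only shows how a single brightness sample could be turned into the two feedback channels the article describes.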
“We want users, whether they are sighted or blind or low vision, to be able to independently engage with the images and build their own mental maps of what an eclipse is like,” said MaryKay Severino, Education Director of the Eclipse Soundscapes project. “Using tools that integrate sight, sound, and touch allows users to interact with an eclipse using a variety of senses.”
“When I began using the Eclipse Soundscapes app, I was immediately impressed by the clear descriptions of astronomical phenomena and the ‘ear-catching’ experience of exploring the Rumble Map,” said Lindsay Yazzolino, a consultant on the project who is blind. “Not only was this eclipse experience accessible, but it engaged my senses and made me want to keep exploring the richness of the various soundscapes.”
The Eclipse Soundscapes: Mobile Application, which is now bilingual in English and Spanish, is currently being updated to include information about annular eclipses. ARISA Lab also received funding and is in the initial stages of organizing an accessible citizen science project for the upcoming 2023 and 2024 eclipses. The Eclipse Soundscapes: Citizen Science Project will recruit participants to collect audio recordings before, during, and after the 2023 and 2024 eclipses to measure how eclipses impact soundscapes here on Earth.
“Heliophysics has traditionally focused on visual representations of data and information,” said Dr. Winter. “This excludes members of the blind and low vision community as well as others who would benefit from more multi-sensory experiences. Eclipses do not have to be amazing events that a person sees. Eclipses are amazing events to experience.”
The Eclipse Soundscapes: Mobile Application (ES 2.0) is supported in part by the NASA Heliophysics Education Activation Team. The Eclipse Soundscapes: Citizen Science Project (ES:CSP) is supported by NASA award No. 80NSSC21M0008. For more information, visit EclipseSoundscapes.org.
By Kelsey Perrett – ARISA Lab