This article was written as part of a series of Design Retrospectives on the prototypes commissioned by Cooper Hewitt’s Interaction Lab for the Activating Smithsonian Open Access Challenge. It was co-authored by the ButtARfly team: Jonathan Lee, Project Lead and Animation Programmer; Rianne Trujillo, Web Developer; Lauren Addario, Audio Advisor and Content Developer; Miriam Langer, Content Developer.

Before we share how we arrived at the inspiration to create ButtARfly, we should explain a few things. First, that we live and work in Las Vegas, New Mexico. Second, that there is, in fact, a Las Vegas, New Mexico, and that it’s actually the original Las Vegas. Third, that we work at New Mexico Highlands University (no, not the University of New Mexico). Finally, that our program, the Cultural Technology Development Lab (CTDL), has been developing technology solutions for New Mexico museums and cultural institutions since 2005 as part of NMHU’s Media Arts & Technology department.

CTDL is a research and development program where university faculty and students, museum professionals, and other partners work together on technology and design solutions for cultural institutions. Our mission is to cultivate a homegrown pool of multimedia specialists working with cultural content and committed to serving their communities. Students in our degree program can focus on software development, physical computing, web development and design, video and audio production, graphic design, or photography. They have the opportunity to do paid internships with regional museums and cultural institutions as a way to hone skills learned in the classroom. Because we work for a university and this is part of our specialized research, we are able to do everything in-house. However, being located in one of the poorest counties in one of the poorest states in the country means working with fewer resources than most university-embedded programs, which requires us to rely on collaboration and creative problem-solving.

Creating prototypes is what we do, so when the call for proposals went out for Activating Smithsonian Open Access, it was a natural fit. We started to talk about all the ways we could use Smithsonian’s seemingly endless collection to create something fun, immersive, accessible, and engaging. It was difficult to narrow down which collection to engage with, but after a few discussions we decided on the butterfly collection, because of the range of colors and patterns it offered.

When thinking about how we could make the 2D images in the collection more interactive, we were excited by the idea of bringing them to life in 3D, like in a butterfly pavilion. This runs contrary to the typical experience of looking at butterflies in a museum, where they are dead and pinned into a shadowbox. Riffing on this idea, we came up with the concept that the user would see subtly moving butterflies in a shadowbox collection that could then be released into their camera feed, making it appear that the butterflies were flying around them in their environment. We chose a markerless approach, augmenting the camera feed directly, so that users would have more freedom in where they experienced the butterflies. And because we were augmenting the camera feed and building the mobile experience with web tools, we decided to support desktop users as well, giving people more ways to experience the butterflies.

Nine different butterfly specimens of various shapes and sizes, with markings in blue, black, yellow, red, orange, grey, white, and brown, slowly flap their wings in unison in a virtual shadowbox. Each butterfly’s scientific name can be seen below the corresponding specimen.

Collected butterflies slowly flap their wings in a virtual shadowbox


As the first step in creating the prototype, two of our team members, Jonathan and Rianne, developed a proof of concept in p5.js, a JavaScript library for graphics and interaction, and determined it was possible to achieve the effect we wanted. The team hand-cut the images in Photoshop, replacing each background with transparency and dividing each butterfly into a left and a right wing image. We placed the left and right images on two different 3D planes and animated them through code so they appeared to be flapping slowly, a subtle movement that would catch the user’s attention.
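
For readers who want to tinker, a minimal p5.js sketch of that flapping effect might look something like the following; the file names, canvas size, and timing values are placeholders rather than our production code.

```javascript
// A rough sketch of the wing-flap effect: each wing image sits on its own
// plane and pivots around the butterfly's body line. Asset paths and timing
// values are hypothetical.
let leftWing, rightWing;

function preload() {
  // Hypothetical file names; both PNGs have transparent backgrounds.
  leftWing = loadImage('assets/monarch_left.png');
  rightWing = loadImage('assets/monarch_right.png');
}

function setup() {
  createCanvas(400, 400, WEBGL); // WEBGL mode gives us 3D planes
  noStroke();
}

function draw() {
  background(245);
  // A slow sine wave drives the flap angle for a subtle, lifelike motion.
  const flap = sin(frameCount * 0.05) * QUARTER_PI;

  // Left wing: pivot around its inner (body-side) edge.
  push();
  rotateY(flap);
  translate(-leftWing.width / 2, 0, 0);
  texture(leftWing);
  plane(leftWing.width, leftWing.height);
  pop();

  // Right wing: mirror the rotation so both wings beat together.
  push();
  rotateY(-flap);
  translate(rightWing.width / 2, 0, 0);
  texture(rightWing);
  plane(rightWing.width, rightWing.height);
  pop();
}
```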

We then looked at the metadata available for each butterfly and determined that we could organize them in an educational way. The most consistent piece of data available for each butterfly was their family and scientific name, so we created cards for each butterfly listing that information. We wanted users to be able to learn the families of butterflies by searching for their favorite ones to add to their collection. When building a collection, they can find a butterfly fastest by narrowing results to its family and selecting it from that page. Over time, through repeated collecting, they can begin to memorize these family names and associate them with the butterflies.

Later in the development process, we talked with the ASOA-provided accessibility specialists about how we could allow users to experience the variations in the butterflies through multiple senses. We decided to tie the colors of the butterflies to musical notes, reducing the colors of each butterfly to a simplified palette and matching each color to one note from the pentatonic scale. We chose the pentatonic scale because it minimizes discordant combinations of notes, meaning it would yield the most sonically pleasing combinations as people filled their soundscapes with multiple butterflies.

The colors of a monarch butterfly specimen are broken down into brown and orange, which correspond to the notes F4 and D4.

To engage more senses than just sight, the team designed a system of sounds to correspond with the colors of the butterflies, inspired by synesthesia.
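
In code, the mapping boils down to a small lookup from a reduced color to a note, plus something to play it. The sketch below uses the browser’s built-in Web Audio API; the palette names and most of the note assignments are illustrative, with only the brown-to-F4 and orange-to-D4 pairings taken from the monarch example above.

```javascript
// Illustrative lookup from a reduced color to a note in F major pentatonic.
// Only brown -> F4 and orange -> D4 come from the monarch example; the other
// pairings here are placeholders.
const NOTE_FOR_COLOR = {
  brown:  'F4',
  orange: 'D4',
  yellow: 'A4',
  black:  'G4',
  blue:   'C5',
};

// Approximate equal-temperament frequencies for the notes used above.
const FREQUENCY = { D4: 293.66, F4: 349.23, G4: 392.0, A4: 440.0, C5: 523.25 };

// Play a short, fading tone for one of a butterfly's reduced colors.
function playColor(colorName, audioCtx) {
  const note = NOTE_FOR_COLOR[colorName];
  if (!note) return;

  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.frequency.value = FREQUENCY[note];

  // Start at a gentle volume and fade out over one second.
  gain.gain.setValueAtTime(0.2, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 1);

  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 1);
}

// Example: playColor('orange', new AudioContext());
```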


To determine how to tie specific notes to colors, we decided to rely on a table of common links between notes and colors for people who experience synesthesia, a phenomenon of connections and overlaps between the senses. There was no color metadata for the butterfly images in the collection, so we wrote a small Python program that analyzes each butterfly image, reduces its colors, and selects the three most prominent.
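
Our actual utility was a small offline Python script, but the idea translates directly to the browser. A simplified JavaScript sketch, assuming the butterfly image is already loaded and its background is transparent, could bin pixels into coarse colors and keep the top three:

```javascript
// Sketch of the color-reduction step: quantize each pixel into a coarse RGB
// bin, count the bins, and return the centers of the most common ones.
// Bin size and the "top three" cutoff are the knobs to tune.
function prominentColors(img, bins = 8, topN = 3) {
  const canvas = document.createElement('canvas');
  canvas.width = img.width;
  canvas.height = img.height;
  const ctx = canvas.getContext('2d');
  ctx.drawImage(img, 0, 0);

  const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
  const counts = new Map();
  const step = 256 / bins;

  for (let i = 0; i < data.length; i += 4) {
    if (data[i + 3] < 128) continue; // skip the transparent background
    // Quantize each channel so near-identical shades fall into the same bin.
    const r = Math.floor(data[i] / step);
    const g = Math.floor(data[i + 1] / step);
    const b = Math.floor(data[i + 2] / step);
    const key = `${r},${g},${b}`;
    counts.set(key, (counts.get(key) || 0) + 1);
  }

  // Sort bins by pixel count and return the top few as [r, g, b] triples.
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, topN)
    .map(([key]) => key.split(',').map(v => Math.round((+v + 0.5) * step)));
}
```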

We used all open-source development tools for the project, which we do whenever possible because it allows us to create low-cost and non-proprietary solutions for cultural institutions. The web app is built with Vue.js, a JavaScript framework for building user interfaces, using the UI library Vuetify. Rianne used these tools to create a minimal interface to the Smithsonian Open Access API to pull in metadata about the butterflies while retaining the user’s focus on the 3D living butterfly experience.
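
As a rough illustration, the metadata pull is essentially a single search request against the Open Access API. The sketch below uses the public v1.0 search endpoint; the query string and the response fields we pick out are illustrative, so check the current API documentation and supply your own api.data.gov key.

```javascript
// Minimal sketch of pulling butterfly records from the Smithsonian Open
// Access search API. The endpoint reflects the public v1.0 API; the exact
// response fields may differ, so treat the mapping below as a starting point.
const API_KEY = 'YOUR_API_DATA_GOV_KEY';

async function fetchButterflies(query = 'butterfly') {
  const url = new URL('https://api.si.edu/openaccess/api/v1.0/search');
  url.searchParams.set('q', query);
  url.searchParams.set('rows', '20');
  url.searchParams.set('api_key', API_KEY);

  const res = await fetch(url);
  if (!res.ok) throw new Error(`Open Access request failed: ${res.status}`);
  const json = await res.json();

  // Keep only the fields the interface needs: an id, a title, and a link
  // back to the source record (field names are assumptions to verify).
  return json.response.rows.map((row) => ({
    id: row.id,
    title: row.title,
    recordLink: row.content?.descriptiveNonRepeating?.record_link,
  }));
}
```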

As we continue to work on the ButtARfly prototype, we are looking into adding the ability to sense a user’s nose in the camera view, so the butterflies can hover or land there for brief periods. We are also exploring ways to allow users to share their collections with other users, and to add a more sustained and fluid auditory experience for the tonality of the butterflies.

We are grateful for the opportunity to work with the Cooper Hewitt team. Rachel Ginsberg, Jaade Wills, Katherine Miller, Corey Timpson, and Sina Bahram were incredibly generous with their time and knowledge. We love creating prototypes in our CTDL (now virtual) lab and hope we can continue to work on this prototype until it’s robust enough to release into the world!


About the Center for Cultural Technology

New Mexico’s Center for Cultural Technology (CCT) is an educational, community engagement, and R&D partnership between the Department of Media Arts & Technology at New Mexico Highlands University and the New Mexico Department of Cultural Affairs. CCT’s mission is to cultivate a homegrown talent pool of multimedia specialists capable of working with cultural content and committed to serving their communities. Since 2005, we’ve placed over 200 cultural technology interns in museums, libraries, historic sites, and parks across New Mexico and the Southwest, creating well over 300 projects in video/audio, exhibits, graphic design, mobile apps, websites, and more. CCT’s main headquarters is in the Department of Media Arts & Technology, located in the McCaffrey Historic Trolley Building on the campus of New Mexico Highlands University in Las Vegas, NM. CCT’s Museum Classroom satellite facility is located at the New Mexico Museum of Natural History & Science in Albuquerque.

About Activating Smithsonian Open Access (ASOA)

Created by Cooper Hewitt’s Interaction Lab and made possible by Verizon 5G Labs, Activating Smithsonian Open Access fosters a new approach to activating museum collections by expanding access to deep engagement for people of many abilities and interests worldwide, and supporting creative technology teams in the process. Each team received $10,000 to build a functioning prototype of a new digital interaction that enables play and discovery with 2D and 3D digitized assets from the Smithsonian’s Open Access collections, and each team will retain ownership of all intellectual property developed through the program.

About Cooper Hewitt’s Interaction Lab

The Interaction Lab is an embedded R&D program driving the reimagining of Cooper Hewitt’s audience experience, across digital, physical, and human interactions. Since its Fall 2019 launch, the Lab has injected new ideas into the museum’s work through internal workshopping and strategy, a highly participatory public program series merging interactive design and museum practice, and a commissioning program that engages the design community as creative collaborators in creating the next wave of the Cooper Hewitt experience.
