Visualizing Sea Level Rise with Augmented Reality
Looking Glass is a prototype app that uses augmented reality to visualize sea level rise.
Invented by journalist Eli Kintisch as part of his two-year Knight Science Journalism Fellowship, the project involved Rhode Island School of Design’s Charlie Cannon, Pace University’s Will Pappenheimer, and Zack Brady, a developer at Suits and Sandals, among others.
Looking Glass runs on Layar. It’s not yet ready for the public to try out, though we hope to change that soon.
Supported by Knight Science Journalism at MIT, RISD, RI EPSCoR.
Looking Glass is an app that visualizes future environmental risk using augmented reality. The first iteration of Looking Glass was a prototype development project on sea level rise in a coastal town, Wickford, RI.
The spring 2013 prototype used Layar, a free augmented reality app for iOS.
The user first selects the climate scenario they want to visualize — one in which sea level rise continues at the current pace, or one in which it accelerates according to worst-case IPCC scenarios. Then they choose the future date they want to explore. Tide is the next parameter, and finally the user can elect to include storm surge from one of three historic hurricanes that have flooded Rhode Island. This gives users a sense of how a repeat of one of those storms would affect their town in a future with higher seas.
All the while, a sketch at the bottom of the app provides an interactive preview of the water level given these factors, allowing the user to try different combinations to better understand what governs flooding in the future.
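The parameter flow described above — scenario, year, tide, and optional storm surge combining into one projected water level — can be sketched roughly as follows. This is a hypothetical illustration, not the app's actual code: the rate and surge values are placeholders, not the project's real data.

```python
# Hypothetical sketch of how Looking Glass-style parameters might combine
# into a single projected water height. All numbers are illustrative only.

# Illustrative sea level rise rates in feet per year, keyed by scenario.
SLR_RATES = {
    "current_pace": 0.011,  # roughly the late-20th-century trend, illustrative
    "worst_case": 0.04,     # accelerated IPCC-style scenario, illustrative
}

# Illustrative storm surge heights (feet) for historic Rhode Island storms.
SURGES = {
    "none": 0.0,
    "hurricane_1938": 12.0,  # placeholder value
    "hurricane_carol": 10.0, # placeholder value
}

def water_height(scenario, year, tide_ft, storm="none", base_year=2013):
    """Total water height in feet above the base year's mean sea level:
    projected rise plus tide plus any selected storm surge."""
    rise = SLR_RATES[scenario] * (year - base_year)
    return rise + tide_ft + SURGES[storm]

# Example: worst-case scenario in 2100, 2 ft tide, repeat of the 1938 hurricane.
print(round(water_height("worst_case", 2100, tide_ft=2.0, storm="hurricane_1938"), 2))
```

In the app, a value like this would set the height of the rendered water plane and drive the preview sketch as the user changes each parameter.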
Accomplishments included researching and selecting an effective coastal site for the project, choosing an AR solution, translating scientific climate and flooding projections into a coherent narrative input screen, coding the app so that user choices determine the future sea level they experience, and designing a water intervention that was visually compelling and clear.
Technical challenges included making the water height appear constant relative to the ground; currently it shifts as the user raises or lowers the phone. Also, while being underwater looks clearly different from standing in several feet of water, the visual difference between, say, 3 feet and 4 feet of flooding is hard to discern.
Eli Kintisch created Looking Glass and led the effort with Charlie Cannon, an architect and designer at Rhode Island School of Design. Artist Will Pappenheimer of Pace University designed the water and other elements, and helped with troubleshooting. Zack Brady of Suits and Sandals was the developer. Yangyang Xu of RISD provided design assistance. Funding was provided by Knight Science Journalism at MIT and the Rhode Island Experimental Program to Stimulate Competitive Research (EPSCoR), a program of the National Science Foundation.
The prototype is not yet available for download, but we hope to change that soon. The project has been on hiatus since spring 2013, though Eli Kintisch and others are interested in developing it further.