Libra Live

Redefining Web Maps

Process satellite imagery in realtime using voice commands.

Landsat-8 image of Washington D.C. with land/water analysis


Showing the power of accessible data

Making data easy to access is the first step to making it easy to use. With Libra Live, we tested this idea by using voice commands to access satellite data.

Simply say something like “Use Libra to show me Washington DC with no clouds” and Libra Live will search for and display matching imagery. Ask it to show you land/water analysis or vegetation health, and Libra Live can make it happen.

“Use Libra to show me vegetation health in Washington, D.C.”

For data providers like NASA or ESA, Libra Live shows how data can be prepared in ways that are more consumable by mobile and web apps. For app creators, it provides inspiration for new ways to let users access data. These new access methods could be better suited for people who need data in emergency situations, or who are occupied with driving or medical tasks. Individuals with disabilities or low levels of literacy could also benefit from having more ways to access data.

Building applications for end users requires fast querying and processing of data. For data to be app-ready, it needs to be organized so that even a voice assistant like Alexa can get to it. Libra Live uses satellite imagery available on AWS S3 and processes it in realtime.
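As a rough illustration of what “app-ready” querying looks like, the sketch below builds a search request for low-cloud Landsat-8 scenes over a bounding box, in the style of a STAC search API like sat-api. The endpoint URL, parameter names, and collection ID are assumptions for illustration, not the app's actual code.

```python
from urllib.parse import urlencode

# Assumed endpoint; the real deployment may differ.
SAT_API_URL = "https://sat-api.example.com/stac/search"

def build_search_url(bbox, max_cloud_cover=5, limit=1):
    """Build a STAC-style search URL for low-cloud Landsat-8 scenes.

    bbox is (west, south, east, north) in degrees. The 'query' syntax
    shown here mimics STAC query extensions and is an assumption.
    """
    params = {
        "bbox": ",".join(str(c) for c in bbox),
        "collection": "landsat-8-l1",
        "query": '{"eo:cloud_cover":{"lt":%d}}' % max_cloud_cover,
        "limit": limit,
    }
    return f"{SAT_API_URL}?{urlencode(params)}"

# "Show me Washington DC with no clouds" -> bbox around Washington, D.C.
url = build_search_url((-77.12, 38.79, -76.91, 39.0))
```

A voice skill only has to translate a place name into a bounding box and a phrase like “no clouds” into a cloud-cover threshold; the rest is an ordinary HTTP query against pre-organized data.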


Realtime Processing

Libra Live is a web app built on the Alexa API. An Alexa skill queries sat-api to find the requested imagery, passes that information to an AWS Lambda function that performs realtime processing with rasterio, and returns the URL of the processed image to the user-facing web app.
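The processing step can be sketched with the vegetation-health example. In the deployed Lambda, the red and near-infrared bands would be read from Landsat-8 GeoTIFFs on S3 with rasterio; here synthetic NumPy arrays stand in so the sketch is self-contained, and the function names are ours, not the app's.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near 1 indicate healthy vegetation; values near 0 or below
    indicate bare ground, water, or clouds.
    """
    red = red.astype("float32")
    nir = nir.astype("float32")
    denom = nir + red
    # Guard against division by zero over nodata pixels.
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Tiny stand-in arrays; with rasterio these would come from
# dataset.read() on the band files for the requested scene.
red = np.array([[0.1, 0.2], [0.3, 0.0]])
nir = np.array([[0.5, 0.6], [0.3, 0.0]])
out = ndvi(red, nir)
```

The Lambda would then render an array like `out` to a PNG, store it where the web app can fetch it, and hand the URL back through the skill.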