EcoVR – The Virtual Reality Ecosystem Data Viewer
- See our latest demo video here: https://www.youtube.com/watch?v=R_8YHsvN9t4
- Read our abstract for the SIGGRAPH 2017 VR Village
New technology has provided ecology with a wealth of new data sources at spatial and temporal resolutions that were previously unattainable. However, our ability to collect data has rapidly outpaced our ability to work with, analyze and visualize these data in meaningful ways (see Hampton et al., 2013 for more background on this topic). While these high-density, high-complexity datasets are crucially important for improving our understanding of ecology and climate change and for parameterizing global climate models, much of their potential is lost due to the difficulty of working with such complex data.
Where ecology has struggled to keep up with technological change, the entertainment industry (computer gaming, high-budget film studios, etc.) has spent billions of dollars developing tools for rendering the world in very high resolution. The gaming industry in particular has been developing relatively easy-to-use tools for creating interactive 3-dimensional models of the world. Additionally, since 2012, virtual reality (VR) and augmented reality (AR) headsets have gone from non-existent to reasonably affordable consumer devices. In the same time frame, off-the-shelf UAV or “drone” technology and robust 3D reconstruction software have become widely available, allowing rapid imaging of outdoor environments that can be easily converted into a 3D model of the area surveyed.
The National Arboretum Virtual Reality Project is a proof-of-concept project to explore new methods of visualizing complex time-series environmental data on the landscape, using the Unreal gaming engine and the Oculus Rift VR headset. Working with a group of students from the ANU Computer Science Department TechLauncher project, we have developed a 3D interactive model of the National Arboretum in Canberra, Australia, where time-series environmental data is overlaid on a spatially accurate 3D model of the ANU research forest.
In 2014, the Borevitz Lab, in collaboration with researchers at the ANU Fenner School, received an ANU Major Equipment (MEC) grant to instrument the ANU research forest at the newly opened National Arboretum in Canberra, ACT. The National Arboretum is a unique 250-hectare site featuring more than 4,000 trees growing in 94 forests, most of them in single-species stands. Most forests are less than 5 years old, providing the opportunity to monitor and document tree growth from “birth” through maturity over the coming decades. The initial MEC funding allowed us to instrument the ANU forest site with a range of “NextGen” sensing equipment to create the Phenomic Environmental Sensing Array (PESA) at the National Arboretum (project page).
Map of the National Arboretum PESA project
Visualizing and analyzing complex data
Traditionally, data has been visualized and shared through graphs and figures. These tools organize data in ways that (sometimes) help trained scientists make sense of complex data streams. While they are well understood and useful, their shortcomings rapidly become obvious when trying to visualize spatially varied, high-temporal-resolution data. Historically, a dataset describing a forest might have included a few tree height and growth measurements at bi-yearly intervals and a weather station near the site. In contrast, at the PESA site, we currently measure sub-millimeter tree growth, temperature, humidity, sunlight, and below-ground soil moisture and temperature for twenty trees at 10-minute intervals. The site will be flown monthly with a UAV, providing a 3D model of the entire site with tree height, leaf density and leaf color. As full genome sequencing costs come down, we will be sequencing every tree in the forest, enabling us to couple precision environmental measurements with genomics data to look for any genetic variation that contributes to tree growth differences as the trees interact with the environment.
Likewise, it is very difficult to co-visualize huge, varied data types using conventional graphs, particularly if spatial and temporal variation between sensors plays a biologically important role in the data. Thus the challenge is to develop new ways to organize and visualize these new and exponentially more complex data types in ways that facilitate the human brain’s native ability to find patterns.
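To give a sense of the volume involved: twenty trees sampled every 10 minutes yield nearly 3,000 readings per day for each variable. A minimal sketch of rolling such a stream up into daily per-tree summaries (the tuple layout and field names here are illustrative assumptions, not the project's actual data format):

```python
from collections import defaultdict
from datetime import datetime

def daily_means(readings):
    """Collapse 10-minute sensor readings into daily per-tree means.

    readings: iterable of (iso_timestamp, tree_id, value) tuples,
              e.g. ("2017-03-01T00:10:00", "tree-07", 18.4).
    Returns {(tree_id, date): mean_value}.
    """
    sums = defaultdict(lambda: [0.0, 0])  # (tree, day) -> [total, count]
    for ts, tree, value in readings:
        day = datetime.fromisoformat(ts).date()
        acc = sums[(tree, day)]
        acc[0] += value
        acc[1] += 1
    return {key: total / n for key, (total, n) in sums.items()}
```

In practice this kind of aggregation is the first step before any of the summarized streams can be attached to objects in the 3D model.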
Merging the old and new: Natural History meets virtual reality
The human brain is highly tuned to notice patterns and to organize and synthesize highly complex visual data. In previous centuries, the majority of ecological research was observational: ecologists spent time on the land, watching and taking notes. Modern ecology added the scientific method, coupled with quantitative analysis, to provide statistically rigorous tools for quantifying and verifying our intuition. New technology, from microclimate mesh sensor networks to drones to NextGen LiDAR and gigapixel imaging, gives us tools to monitor the earth with unprecedented complexity. Virtual and augmented reality give us the ability to merge these two approaches directly, building immersive replicas of ecosystems where the physical objects in the landscape carry their own datastreams that researchers can access on demand.
We include the following data types in the model:
Table 1. Available data sources in the model

| Data Source | Vendor or Source | Data Types | Notes |
| --- | --- | --- | --- |
| Conventional aerial LiDAR | | Landscape Digital Elevation Map (DEM) | Poor input map resolution required extensive post-processing to smooth the data |
| UAV | ProUAV (Aeronavics quadcopter with 12 MP Canon camera) | Tree GPS position, tree RGB color | All quadcopter data post-processed in Pix4D to build a point cloud; point-cloud data processed with custom Python scripts to yield individual tree data |
| Mesh sensor network (20 nodes) | Enviro-net.com | Air temperature / air humidity; soil temperature / soil moisture (sensors buried 20 cm below the surface) | Data must be manually downloaded via login (no API), so real-time visualization is not available |
| Dendrometers (20) | Environmental Sensing Systems | µm-resolution tree growth | 10-minute intervals via SFTP to server |
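The per-tree processing step (Pix4D point cloud in, individual tree metrics out) could look roughly like the sketch below. This is a hedged illustration, not the project's actual scripts: the function name, the cylindrical crop around each surveyed stem position, and the min/max height estimate are all assumptions.

```python
import math

def per_tree_heights(points, tree_xy, radius=1.5):
    """Estimate each tree's height from a georeferenced point cloud.

    points  : iterable of (x, y, z) tuples, same CRS as tree_xy
    tree_xy : iterable of (x, y) surveyed stem positions
    radius  : crop radius in metres around each stem (assumed value)
    Returns a list of heights (None where no points fall in the crop).
    """
    heights = []
    for tx, ty in tree_xy:
        # Keep only points within `radius` of this tree's stem position.
        zs = sorted(z for x, y, z in points
                    if math.hypot(x - tx, y - ty) < radius)
        if not zs:
            heights.append(None)
            continue
        # Height = canopy top minus the lowest point in the crop,
        # used here as a crude local ground estimate.
        heights.append(zs[-1] - zs[0])
    return heights
```

A real pipeline would work on dense Pix4D output (millions of points), so a spatial index rather than a linear scan per tree would be the natural design choice; the crop-and-summarize logic stays the same.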