The Virtual Galapagos Project (VGP) began as a concerted effort by a small team: myself, two Colgate undergraduate students, a professor at Colgate University, and a member of Colgate’s instructional design team, funded by McGill University. Our primary objective is to design a way for children around the world to learn about science, using the Galapagos Islands as motivation. Our core philosophy is that the science should be digestible without being overly simplistic, and that students should be able to choose their own path, taking agency in their learning. We are currently developing a pilot of this program as a 3D application for computers and mobile devices, featuring virtual reality elements.
This project began in the summer of 2017, when I spent 10 weeks on the Colgate University campus as a Public Policy Fellow and Labs Without Borders Fellow on behalf of McGill University, building the pilot phase of what would become a much larger, long-term project. This included a 10-day trip to the Galapagos, where we collected thousands of videos and photos and interviewed locals, park guides, and scientists, aiming to truly immerse our viewers in the mind of a scientist.
My expertise came from my background in filmmaking, action photography, and video editing. The VGP was filmed using an array of tools, including two GoPro OMNI VR camera rigs, a Kolor Abyss underwater VR rig, a Sony A7R II, and a Canon 6D. My primary video editing software was Adobe Premiere Pro, with primary colour correction done on Blackmagic Design's DaVinci Resolve control surface. Through this experience, I gained great familiarity with 360° video and the nuances of stitching six separate shots into a single 360° shot using GoPro's VR editing suite of Autopano Video Pro and Autopano Giga. We also utilized Kolor's Panotour Pro 2.
Illustrations were drawn first by hand and then recorded with a screen recorder, primarily in Adobe Photoshop, as my skill set was built in that program; I often used Adobe Dynamic Link to move work into Adobe After Effects. The illustrations were produced on a set of 4K monitors in combination with a pair of Wacom Intuos and Cintiq tablets.
Our first VR modules made frequent use of two main types of 3D models: geography-focused and interface-focused models. Geographic models were based on GIS data obtained from the USGS, the US Navy, and other sources, pieced together in 3ds Max by my mentor Joe Eakin, director of the VisLab, and myself. The models for the UI/UX were designed by my colleague Desmond Tuiyot, who adapted features from his larger reconstruction of the ancient Mesoamerican city of Teotihuacán and helped me learn 3ds Max and Maya, building on my prior experience with the Autodesk suite.
Recently, I have been experimenting with Esri's ArcGIS StoryMaps, which provides a beautiful map-focused storytelling interface for creating interactive, web-based geotours, and with Google Earth Studio, an animation platform that lets educators, researchers, and journalists combine location data and KML overlays, petabytes of beautiful visual data from Google Earth, and 3D tracking data for After Effects to tell animated stories about the planet and its people.
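To give a concrete (and entirely illustrative) sense of the kind of KML overlay Earth Studio can ingest, here is a minimal sketch in Python using only the standard library; the site name and coordinates are approximate stand-ins, not drawn from our actual dataset:

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def make_placemark_kml(name, lon, lat, alt=0.0):
    """Build a minimal KML document containing a single placemark."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML orders coordinates longitude,latitude[,altitude]
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat},{alt}"
    return ET.tostring(kml, encoding="unicode")

# Approximate coordinates for Puerto Ayora, Santa Cruz Island
overlay = make_placemark_kml("Puerto Ayora", -90.3138, -0.7393)
```

A file like this, saved with a `.kml` extension, can be imported as an overlay and animated against Earth Studio's imagery.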
I took over audio design for a month in August 2017, after one of our colleagues regrettably felt he could not produce the results we needed. Over a weekend, I learned Adobe Audition and began cleaning and logging audio clips and interviews, later transitioning to Avid Pro Tools once construction on our audio editing bay finished and the editing surface was installed, prior to the Fall term.
While I was intrigued by the prospect of using binaural microphones, we ultimately decided against them for the Galapagos trip, as we were unclear on the applicability of true binaural tracks in a spatial audio space. In the end, our spatial immersion was created using Unity3D, with inspiration and assistance from the Facebook 360 developer community and Facebook's spatial audio editing suite, Facebook 360 Spatial Workstation.
Audio was recorded using a RØDE Stereo VideoMic, and a pair of Zoom H5 recorders. I gained valuable experience monitoring levels, which I hadn't had to do up until then.
In 2018, we entered into conversations with Lev Horodyskyj, a curriculum designer at Arizona State University. Horodyskyj and his team at ASU teach geology in remote locations using an ASU-developed system called SolarSPELL (Solar Powered Educational Learning Library). Designers load microSD cards full of educational media onto Raspberry Pi computers and make the library accessible via mobile phone. No Internet required.
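The serving model is simple enough to sketch. Assuming the Pi runs its own Wi-Fi access point and the microSD library is mounted as an ordinary directory (a rough approximation for illustration, not ASU's actual implementation), the core is just a static file server:

```python
import functools
import http.server
import socketserver

def make_library_server(media_root, port=8080):
    """Serve the mounted microSD library directory as static files.

    Any phone that joins the Pi's hotspot can then browse the library
    at http://<pi-address>:<port>/ with no Internet connection at all.
    """
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=media_root
    )
    # port=0 asks the OS for any free port, which is handy for testing
    return socketserver.TCPServer(("", port), handler)
```

On the device itself this might be started with `make_library_server("/media/sdcard/library").serve_forever()`, where the mount point is hypothetical.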
In 2019, to make the program fit onto a SolarSPELL, we decided to shift to a web-based platform, particularly because our design plan treated accessibility as a top priority.
At this point, the project has moved into a more accessible format, with one digital module currently available, though the page does not reflect our most recent version; due to restrictions beyond our control, the latest build is confined to internal testing. The live version of Colgate Virtual Galapagos is available at http://virtualgalapagos.colgate.edu. We encourage those interested to check back or contact us as we update the site throughout the summer and fall.
This project would not have been possible without the help of a number of partners, whose continued support has allowed us to hire more students, purchase state-of-the-art equipment, and travel to these remote areas to capture them for the world to see:
McGill University Lab Without Borders
McGill University Office of Science and Society (OSS)
Mr. and Mrs. Stephen and Jane Savidant
Colgate University Geology Department
Colgate University Natural Sciences and Mathematics (NASC) Department
Mr. Robert H.N. Ho, CM OBC
Arizona State University
All the scientists whose materials and/or interviews were used with their permissions