
Project Prometheus

Mapping the Underwater World with Light

Published on Mar 02, 2018
This Pub is a Supplement to
Project Prometheus Report

Developing and deploying a low-cost, high-precision underwater system to quickly and beautifully map caves, coral reefs, and sunken cities.

We are currently seeking a student from the MIT Media Lab to collaborate on the technology development, testing, and deployment.  A background in mechanical engineering and SCUBA certification would be helpful but not required. 


Most of the underwater world remains far off the map. For many of the most exciting exploration challenges – from Maya cenotes to urban aquifers to archaeological treasures to coral reefs – map-making remains largely pre-industrial and time-consuming. The difficulty and expense of mapping these spaces is thus a major barrier to storytelling for science, conservation, and stewardship. While many tools exist for open-ocean bathymetry (multibeam sonar, etc.), cost-effective diver-deployable tools for rapidly mapping complex and enclosed spaces are sorely lacking.

Our challenge is to create diver-deployable tools to map these invisible spaces that are orders of magnitude faster, more precise, and less expensive than current practice – to enable mapping and imaging of these underwater resources at a societal scale.

To this end we propose developing a low-cost, high-precision, diver-deployable underwater Light Detection and Ranging (LIDAR) system – a 3D scanning and navigation system with which to quickly, safely, and beautifully map caves, aquifers, coral reefs, sunken cities, and other large-scale underwater spaces. To satisfy scientific and storytelling needs, this device must be easy to use, have fine spatial resolution, map at swimming speed, produce data in industry-standard formats, and be completely open source at both hardware and software levels.
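To make "industry-standard formats" concrete: underwater point clouds are commonly exchanged as PLY or LAS files, which downstream tools (CloudCompare, MeshLab, GIS pipelines) read directly. A minimal sketch of an ASCII PLY writer — the function name, file name, and point layout here are illustrative assumptions, not the project's actual pipeline:

```python
def write_ply(path, points):
    """Write an (x, y, z) point cloud as an ASCII PLY file.

    PLY is one common interchange format for scanned point clouds;
    LAS/LAZ are typical alternatives for survey-grade data.
    """
    with open(path, "w") as f:
        f.write("ply\n")
        f.write("format ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# Hypothetical usage: two points recorded during a swim-through.
write_ply("scan.ply", [(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)])
```

A real deployment would likely emit binary PLY or LAS for size, and attach per-point attributes (intensity, timestamp), but the header-plus-vertex-list structure is the same.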


The core of our approach is a team with both the technical skills and the field experience required to design and build a robust, user-friendly tool with wide applicability. Our team members, all seasoned field explorers with varied skills, have been involved with projects applying techniques from archaeology, paleontology, ecosystem mapping, geology and geophysics, hydrology, and speleology. Their work has contributed not only to the academic literature, but to high-profile outreach and education efforts across multiple media platforms.

Concretely, our approach involves the design, fabrication, and validation of a hand-held underwater LIDAR system capable of recording point clouds in real time while the user swims through unmapped spaces.  Localization of each LIDAR-generated data point will depend on the fusion of multiple independent navigation signals including commodity inertial measurement chips, machine-vision cameras, and low-cost acoustic positioning tools as the situation warrants and allows.
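To illustrate the fusion idea in the paragraph above: positions integrated from a commodity IMU drift over time, while acoustic fixes are absolute but sparse, so a natural approach is to let occasional fixes pull the dead-reckoned track back into place. The sketch below is a minimal complementary-filter illustration under our own assumptions (function name, gain value, and data layout are hypothetical; a production system would more likely use a full Kalman or factor-graph formulation):

```python
import numpy as np

def fuse_tracks(imu_track, acoustic_fixes, gain=0.2):
    """Correct a drifting dead-reckoned track with sparse absolute fixes.

    imu_track:      (N, 3) positions integrated from the IMU (drifts).
    acoustic_fixes: dict mapping sample index -> (3,) absolute position.
    gain:           fraction of each fix's residual applied (0..1).

    At every acoustic fix, the current drift offset is nudged toward
    the residual between the fix and the estimate; the offset is then
    applied to all subsequent IMU samples.
    """
    imu_track = np.asarray(imu_track, dtype=float)
    fused = imu_track.copy()
    offset = np.zeros(3)
    for i in range(len(fused)):
        fused[i] = imu_track[i] + offset
        if i in acoustic_fixes:
            residual = np.asarray(acoustic_fixes[i]) - fused[i]
            offset += gain * residual
            fused[i] = imu_track[i] + offset
    return fused

# Hypothetical usage: a straight swim with one acoustic fix at sample 2
# revealing that the IMU track has drifted 1 m in y.
track = fuse_tracks(
    [[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]],
    {2: [2.0, 1.0, 0.0]},
)
```

Camera-based (visual odometry) constraints would enter the same way, as additional residual terms weighted by their expected accuracy.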

From an engineering perspective, our core strategy is to exploit commodity electronics and modern manufacturing to keep costs as low as possible. Instead of packing an expensive off-the-shelf system into a big housing, we will develop a custom system built around commodity electronics and optics. In place of accurate but expensive off-the-shelf positioning tools, we will combine a host of simple sensors (cameras, accelerometers, etc.) to measure positions at low cost. Limiting the device itself to recording data, and leaving processing to be done on the surface, dramatically reduces the complexity of the diver-deployed device.

Our testing and field deployments will take place in diverse underwater settings – from coral reefs to complex cave systems to large scale archaeological sites.  Integrated into the testing will be outreach and education activities led by the Ocean Exploration Trust and the National Geographic Society via their explorers on the team.  Collectively, these sites will demonstrate the breadth of potential applications and data quality under difficult conditions. 

Following this initial 9-month program, we would like to deploy these tools to map, explore, and document diverse urban aquifer systems and coastal geo-archaeological sites around the world. This would allow us to develop Augmented Reality apps with which to conduct extensive outreach with local stakeholders.


The core technologies of interest (LIDAR, optical and acoustic tracking, inertial navigation, etc.) are all well-understood.  The key unknowns are thus practical: how much will our final prototypes cost, how many parts will require custom fabrication, etc.  To this end, direct costs, including sensors, housings, and deployment, are reasonably well defined.  Less certain are questions of production scalability, unanticipated failure modes, and performance in turbid conditions.


Current team members have all the requisite skills for completing the project, from design to testing to deployment to public communication.  One key need is a dedicated student to take point on fabrication and testing of the initial prototype.    

Our team members will contribute the following resources to the project:

  • Electronics Designs (C. Jaskolski)

  • Field testing, Bahamas lodging (K. Broad)

  • Engineering, Fabrication (A. Adams)

  • Engineering Support (D. Rissolo)

  • Outreach, Ship time (N. Raineault)

  • Engineering, Visualization Support (M. Bove)

We will also provide logistical and field support for deployment first in The Bahamas, then subsequently (post-9 months) at additional sites:

  • Crystal Caves, The Bahamas (karst cave)

  • Seal Cove Cave, Channel Islands (submerged sea cave)

  • Hoyo Negro, Yucatan, Mexico (cenote)

  • Caesarea, Israel (archaeological site)

 What we are requesting through this proposal is support for:

  • 9 months of support for one MIT graduate student

  • Materials costs for design, fabrication, and testing of the LIDAR system

  • Travel to and expenses at field sites


April-June: Design and fabrication of the main circuit boards (initial designs already completed by C. Jaskolski, to be prototyped at MIT and tested at Virtual Wonders and MIT), testing & design revision.

April-July: Design and fabrication of the optics, mechanical system, and main housing at MIT, with support from UCSD and Virtual Wonders; design of the software and processing stream by Virtual Wonders and UCSD, with input from MIT.

July-September: Design revision & fabrication of complete prototype; testing in controlled settings and development of the workflow and data-processing pipeline.

September-December: Iterative testing and refinement in real-world settings, then fabrication of a device suitable for field deployment.

December-January: Field Deployment to the Bahamas and beyond.


Allan Adams, Project co-PI, MIT Future Ocean Lab
Corey Jaskolski, Project co-PI, NG Fellow, Virtual Wonders
Beverly Goodman, NG Explorer, University of Haifa
Dominique Rissolo, UCSD Cultural Heritage Engineering Initiative
Kenneth Broad, NG Explorer, Virtual Wonders, University of Miami
Michael Bove, Object-Based Media, MIT Media Lab
Nicole Raineault, Ocean Exploration Trust, E/V Nautilus

Jacob Bernstein:

This is a great marriage of scientific need and technological opportunity. Could you use the help of a recent ML alum? I completed my PhD in Ed Boyden’s lab in Sep. 2016, spent the last year at a medical device company, but I’m in Cambridge moving into embedded hardware design. I have a B.S. in physics from MIT, and am also SCUBA certified.
