Researchers at the University of California, San Diego built an artificially intelligent camera technology, powered by an Intel Edison module, that could lead to autonomous monitoring systems for tracking endangered species.
The world’s ocean ecosystems are in serious danger. Across the globe, warming ocean temperatures, ocean acidification, over-fishing, and habitat destruction are threatening countless marine species. Species such as the vaquita porpoise, the most critically endangered marine mammal in the world, are on the brink of extinction due to destructive fishing and poaching, and myriad other species are threatened by fishing pressures and habitat loss. Many populations of marine animals are little understood by biologists and conservationists, since in many cases the technology does not exist to effectively monitor a species that lives in the vast, harsh ocean.
A group of university students built a smart, submersible system they believe can turn the tide for endangered underwater species. “We have developed an autonomous underwater camera that’s acoustically triggered,” said Antonella Wilby, a PhD student studying computer science at the University of California, San Diego (UCSD), and the project’s lead researcher.
“The camera system started out as a way to provide biologists with a way to monitor the vaquita porpoise, which has never been photographed underwater before. After talking to many biologists and conservation researchers, we realized that the camera system has far-reaching applications beyond just the vaquita application.”
The UCSD researchers are hoping that the camera system will not only capture the first underwater images of the vaquita porpoise, but also prove useful to marine researchers studying a vast array of species.
The submersible system has the potential to produce data on species and specific behaviors that have never been observed in the wild before, all in a low-cost, data-efficient way.
Oceans on the Brink
Conservationists see the world’s oceans as over-fished and under-protected, with many ecosystems on the brink of collapse. Some species, such as the vaquita, are in danger of going extinct within the next few years. Others face increasing changes to their environments, and are therefore listed as threatened or endangered. Wilby said many species have been studied so little that scientists don’t have enough data to determine whether they are threatened.
The UCSD researchers are working to assist marine biologists and conservationists by providing an innovative tool for biological monitoring of these rarely observed species. They created the SphereCam, a device powered by an Intel Edison compute module, with a hydrophone that triggers video recording when it detects a vocalization from a marine animal.
The National Geographic Society signed on to the project early, awarding Wilby a Young Explorers Grant to support the initial field work in the Sea of Cortez, using the system to monitor the vaquita porpoise. Now the research team is expanding the SphereCam’s applications, using the system to investigate the various grouper species in the Cayman Islands and the kelp forests off San Diego.
Wilby said the SphereCam is unique because it can estimate how far an animal is from the camera, based on the strength of the signal detected by the hydrophone. This improves upon time-lapse systems, which spend most of their time recording images of empty water and little else. When the camera’s hydrophone “hears” acoustic signatures of interest within a certain range, it begins filming.
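The range estimate described above can be sketched with a standard acoustics calculation: under spherical spreading, transmission loss grows as 20·log10 of the distance, so a known (or assumed) source level plus the received level yields a rough range. The source level, trigger radius, and function names below are illustrative assumptions, not the team’s actual calibration:

```python
import math

def estimate_range_m(received_level_db, source_level_db=180.0):
    """Rough distance to a vocalizing animal from received level.

    Assumes spherical spreading (transmission loss = 20*log10(r) in dB)
    and a hypothetical source level for the animal's clicks.
    """
    transmission_loss = source_level_db - received_level_db
    return 10 ** (transmission_loss / 20.0)

def within_trigger_range(received_level_db, max_range_m=50.0):
    # Only trigger recording when the animal is estimated to be close
    # enough for the cameras to capture a usable image.
    return estimate_range_m(received_level_db) <= max_range_m
```

In practice, absorption at ultrasonic frequencies and hydrophone calibration would also factor in; this sketch only shows why a louder received click implies a nearer animal.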
The spherical SphereCam is about 1.5 times the size of a soccer ball. Wilby began developing it in 2014.
Finding the Right Underwater Tech
Dr. Ryan Kastner, a professor of computer science and engineering at UCSD who is overseeing the project, said the key to the SphereCam was finding technology that could withstand the harsh environment and still give the researchers the data they needed.
“The issue with the video is that it’s very power hungry and you get a lot of data,” said Dr. Kastner. He wanted technology that could not only run on low power, but also could learn to turn on and off to ensure the researchers only got the data and images they wanted.
The researchers chose the Intel Edison because it offered the flexibility, energy efficiency, and compute power to fit their needs. Not only could the module run on batteries for up to a week and fit inside the SphereCam’s waterproof housing, it could also be programmed to do exactly what the researchers wanted, such as processing sound.
Using acoustic and optical sensors, the camera system leverages the acoustic signatures of certain marine species to autonomously trigger video recording. It integrates an ultrasonic hydrophone (the trigger) and an onboard computer that samples audio, logs data, and processes signals. Six cameras allow the system to capture a full 360-degree view in high-quality 1080p video.
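The trigger flow the article describes, in which one hydrophone buffer at a time is checked for a signature and all six cameras are started together, can be sketched roughly as follows. The `Camera` class and `process_buffer` function are hypothetical stand-ins for illustration, not the team’s firmware:

```python
class Camera:
    """Stand-in for one of the SphereCam's six cameras (illustrative only)."""
    def __init__(self):
        self.recording = False

    def start(self):
        self.recording = True

    def stop(self):
        self.recording = False

def process_buffer(samples, cameras, detector):
    """Handle one block of hydrophone audio.

    If the detector flags an acoustic signature of interest, start
    recording on every camera for full 360-degree coverage; return
    whether the buffer triggered recording.
    """
    if detector(samples):
        for cam in cameras:
            cam.start()
        return True
    return False
```

An onboard loop would call `process_buffer` continuously on each new audio block, which is what lets the system sit idle at low power until something worth filming swims by.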
Programming the Edison to process the vaquita’s clicks, however, required a complex algorithm that translated ultrasonic frequencies and identified signals at 139 kilohertz, the frequency of the vaquita’s echolocation clicks. Eventually, Kastner said, the researchers want to use machine learning to ensure the SphereCam distinguishes the vaquita’s clicks from the concert of other ocean sounds.
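One standard way to check for energy at a single frequency such as 139 kHz is the Goertzel algorithm, which is much cheaper than a full FFT on a small embedded module. The sketch below is a generic illustration of that technique, not the team’s actual algorithm; the 500 kHz sample rate (comfortably above the Nyquist rate for 139 kHz) and the detection threshold are assumed values:

```python
import math

def goertzel_power(samples, target_hz, sample_rate_hz):
    """Relative signal power at target_hz, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_hz / sample_rate_hz)   # nearest frequency bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def looks_like_vaquita_click(samples, sample_rate_hz=500_000, threshold=1e3):
    # Vaquita echolocation clicks center near 139 kHz, so look for
    # energy in that band; threshold is an illustrative assumption.
    return goertzel_power(samples, 139_000, sample_rate_hz) > threshold
```

A real detector would also gate on click duration and repetition rate, which is where the machine-learning step Kastner describes would come in.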
“You don’t want to hear whale sounds,” explained Andrew Hostler, a student in the Electrical Engineering Department at California Polytechnic State University, San Luis Obispo. He, along with Ethan Slattery of the Computer Engineering Department at University of California, Santa Cruz, joined Wilby and Kastner to help implement the sound processing.
“You don’t want to hear side scan sonar. You just want ultrasonic, echolocation clicks. Then you want to amplify and digitize them,” said Hostler.
Although the device is an imperfect solution, the UCSD team claims it’s the first device to attempt to autonomously capture images of the vaquita. The SphereCam went into the Gulf of California in September, but had to be removed two weeks later due to illegal fishing activity in the refuge. The team plans to redeploy the system in the spring.
“It’s still a long shot,” Wilby said. “But we’ll keep trying. Extinction is threatening countless marine species, and I hope that this can eventually be another tool in biologists’ arsenals to understand the planet’s ecosystems and end extinction.”
Editor’s note: Dive deeper into the details in the research papers “Autonomous Acoustic Trigger for Distributed Underwater Visual Monitoring Systems” and “Design of a Low-Cost and Extensible Acoustically-Triggered Camera System for Marine Population Monitoring.”
Ken Kaplan contributed to this story.