[MARMAM] New Publication: "More than a whistle: Automated detection of marine sound sources with a convolutional neural network"

Ellen White elw1g13 at soton.ac.uk
Tue Oct 4 13:19:00 PDT 2022


Dear Colleagues,

We are excited to announce a new scientific publication from the University of Southampton: "More than a whistle: Automated detection of marine sound sources with a convolutional neural network" in the journal Frontiers in Marine Science (Special Issue: Ocean Observation). The article is open access and can be viewed at:
https://www.frontiersin.org/articles/10.3389/fmars.2022.879145


Authors: Ellen L White, Paul White, Jonathon Bull, Denise Risch, Suzanne Beck and Ewan Edwards.

Abstract
The effective analysis of Passive Acoustic Monitoring (PAM) data has the potential to determine spatial and temporal variations in ecosystem health and species presence, if automated detection and classification algorithms are capable of discriminating between marine species and the presence of anthropogenic and environmental noise. Extracting more than a single sound source or call type will enrich our understanding of the interaction between biological, anthropogenic and geophonic soundscape components in the marine environment. Advances in extracting ecologically valuable cues from the marine environment, embedded within the soundscape, are limited by the time required for manual analyses and the accuracy of existing algorithms when applied to large PAM datasets. In this work, a deep learning model is trained for multi-class marine sound source detection using cloud computing to explore its utility for extracting sound sources for use in marine mammal conservation and ecosystem monitoring. A training set is developed comprising existing datasets amalgamated across geographic, temporal and spatial scales, collected across a range of acoustic platforms. Transfer learning is used to fine-tune an open-source state-of-the-art 'small-scale' convolutional neural network (CNN) to detect odontocete tonal and broadband call types and vessel noise (from 0 to 48 kHz). The developed CNN architecture uses a custom image input to exploit the differences in temporal and frequency characteristics between each sound source. Each sound source is identified with high accuracy across various test conditions, including variable signal-to-noise ratio. We evaluate the effect of ambient noise on detector performance, outlining the importance of understanding the variability of the regional soundscape in which it will be deployed.
Our work provides a computationally low-cost, efficient framework for mining big marine acoustic data, for information on temporal scales relevant to the management of marine protected areas and the conservation of vulnerable species.
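For readers curious about the transfer-learning approach described in the abstract, the sketch below illustrates the general pattern of fine-tuning a small CNN on single-channel spectrogram inputs: freeze a pretrained convolutional backbone and retrain only a new classification head for the target sound classes. This is a minimal illustration under stated assumptions, not the authors' code; the class labels, input size and layer sizes are all hypothetical.

```python
# Minimal sketch (NOT the paper's implementation) of transfer-learning-style
# fine-tuning of a small CNN for multi-class sound classification from
# spectrograms. Class names and shapes are illustrative assumptions.
import torch
import torch.nn as nn

CLASSES = ["tonal", "broadband", "vessel", "ambient"]  # hypothetical labels

class SmallCNN(nn.Module):
    def __init__(self, n_classes: int):
        super().__init__()
        # Convolutional "backbone"; in real transfer learning this part
        # would carry weights pretrained on another (larger) dataset.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        # Task-specific head, re-initialised and trained on the new labels.
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        z = self.backbone(x).flatten(1)
        return self.head(z)

model = SmallCNN(len(CLASSES))

# Transfer learning step: freeze the backbone, train only the new head.
for p in model.backbone.parameters():
    p.requires_grad = False

# Dummy batch of 8 single-channel spectrogram "images" (128 x 128 bins).
x = torch.randn(8, 1, 128, 128)
logits = model(x)
print(logits.shape)  # one score per class, per example
```

Only `model.head.parameters()` would be passed to the optimiser during fine-tuning, which keeps the computational cost low relative to training the full network from scratch.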

Reference for the paper: White, E.L., White, P., Bull, J., Risch, D., Beck, S. and Edwards, E., 2022. More than a whistle: Automated detection of marine sound sources with a convolutional neural network. Frontiers in Marine Science, 9. DOI: 10.3389/fmars.2022.879145.

Please feel free to contact the lead author, Ellen White, on behalf of all authors if you have any questions.

Ellen White
Post-graduate Research Student
University of Southampton
School of Ocean and Earth Sciences
National Oceanography Centre Southampton SO14 3ZH, UK

Contact Information:
Email: elw1g13 at soton.ac.uk
Phone: 07715926069
Twitter: @OceansE11en

