Investigation on the interactions between ecological environment, wildlife animals, and human activities using soundscape information

Starting this August, I have been awarded a three-year project entitled “Investigation on the interactions between ecological environment, wildlife animals, and human activities using soundscape information”. The project is funded by the Ministry of Science and Technology, Taiwan. It will be conducted at the Research Center for Information Technology Innovation, Academia Sinica, in collaboration with Dr. Yu Tsao, an associate research fellow of our institute. The project also cooperates with the Taiwan Forestry Research Institute and with Dr. Yu-Huang Wang of Academia Sinica Grid Computing to maintain the Asian Soundscape.

If you are interested in this project, please contact me.

Project abstract

Investigating the impacts of environmental change and human development on biodiversity has become an important research issue worldwide. The recent development of autonomous recorders has facilitated long-term acoustic monitoring and the collection of big data relevant to soundscape ecology, such as geophony, biophony, and anthrophony. Soundscape information can be employed to characterize the landscape environment and to investigate animal and human activities, which is essential for studying the dynamics of ecological communities and the vocal interactions between animal species. However, it remains difficult to analyze the various types of sound in our environment when sufficient template data and analysis tools are not available. In this study, we integrate expertise in ecology and information science to conduct big data analysis of soundscape ecology using forest and underwater recordings collected from Taiwan, Japan, and Hong Kong. The temporal and spatial variations of the soundscape will be interpreted through data visualization. Machine learning techniques will also be applied to establish sound catalogs and to understand the complexity of the local soundscape. A noise-resistant acoustic diversity index will be developed and employed as a quantitative measure of temporal and spatial changes in biodiversity. To test the acoustic adaptation hypothesis and the acoustic niche hypothesis, information on the vocalizations of wildlife will be collected by a generalized sound detector. In addition, a predictive model of animal occurrence will be built, and the impacts of anthropogenic noise on the occurrence, behavior, and acoustic communication space of wildlife will be studied. The outcome of this research will help us understand the correlation between soundscape information and biodiversity.
It will also provide the ecological research community with an analysis toolbox of soundscape information for investigating the interactions between habitats and animal communities from long-duration recordings.
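To give a flavor of what an acoustic diversity index can look like, here is a minimal sketch (my own illustration, not the project's published method): the Shannon entropy of the energy distribution across frequency bins, normalized to [0, 1]. Higher values indicate energy spread over many bands, which is often associated with a more diverse soundscape.

```python
import numpy as np

def acoustic_diversity(spectrum, eps=1e-12):
    """Entropy-based diversity of a 1-D array of non-negative energy per frequency bin."""
    p = spectrum / (spectrum.sum() + eps)   # normalize energies to a probability distribution
    h = -np.sum(p * np.log(p + eps))        # Shannon entropy in nats
    return h / np.log(len(spectrum))        # divide by max entropy so the result is in [0, 1]

flat = np.ones(256)                         # energy in every band: diverse
tonal = np.zeros(256)
tonal[40] = 1.0                             # a single pure tone: not diverse
print(acoustic_diversity(flat))             # close to 1
print(acoustic_diversity(tonal))            # close to 0
```

The noise resistance mentioned in the abstract would require additional steps (e.g., separating biological sounds from background noise before computing the index), which this toy version omits.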

Research outputs

Please select a recording location you want to listen to on the Google Map at the bottom of the embedded page. At each location, you can check whether recording data are available, or click the link to the analysis result to visualize the long-term change of the soundscape. We will update this map as new data and analysis results become available, without further notice.

Within the page of an analysis result, you will see a figure similar to the one below. This is the visualization of soundscape change at each recording location. The x-axis represents the 24 hours of each day, and the y-axis represents the date. The value (color) in each grid cell represents the unsupervised label of the soundscape scene recognized by a machine learning algorithm (the number 0 indicates a missing recording).
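The figure layout described above can be sketched as a date-by-hour matrix of scene labels. The snippet below is an illustrative assumption about the data arrangement, not the project's actual code: hourly cluster labels are placed into a grid with one row per day and one column per hour, and 0 marks a missing recording.

```python
import numpy as np

def scene_grid(labels, n_days):
    """labels: dict mapping (day_index, hour) -> scene label (>= 1)."""
    grid = np.zeros((n_days, 24), dtype=int)   # 0 = missing recording
    for (day, hour), scene in labels.items():
        grid[day, hour] = scene
    return grid

# Two days of toy labels: scene 1 at night, scene 2 in the daytime,
# with hour 12 of day 1 missing.
toy = {(d, h): (2 if 6 <= h < 18 else 1) for d in range(2) for h in range(24)}
del toy[(1, 12)]
grid = scene_grid(toy, n_days=2)
# Rendering this grid with a heatmap (e.g., matplotlib's imshow), colored by
# scene label, produces a figure of the kind shown on the analysis pages.
```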

Currently, we do not have a sufficient database to recognize each sound source within each soundscape scene. Even so, it is still possible to recognize different soundscape scenes when the weather changes or different wild animals call, thanks to their unique combinations of acoustic features. We welcome everyone to find interesting soundscape scenes and select the associated recordings in our database to investigate the differences among them. We believe you will find many stories behind this information.

Journal Articles

  1. Tzu-Hao Lin, Shih-Hau Fang, and Yu Tsao (2017) Improving biodiversity assessment via unsupervised separation of biological sounds from long-duration recordings. Scientific Reports, 7: 4547.

Conference Presentations

  1. Tzu-Hao Lin, Yu-Huang Wang, Han-Wei Yen, Yu Tsao (2017) Listening to the ecosystem: the integration of machine learning and a long-term soundscape monitoring network. International Symposium on Grids & Clouds 2017 (March, Taipei, Taiwan)
  2. Tzu-Hao Lin, Chih-Kai Yang, Lien-Siang Chou, Shih-Hau Fang, and Yu Tsao (2016) Acoustic response of Indo-Pacific humpback dolphins to the variability of marine soundscape. 5th Joint Meeting of the Acoustical Society of America and Acoustical Society of Japan (November, Honolulu, USA)
  3. Tzu-Hao Lin, Lien-Siang Chou, and Yu-Huang Wang (2016) Investigation on the dynamics of soundscape by using unsupervised detection and classification algorithms. Ecoacoustics Congress 2016 (June, Michigan, USA)