Stefan Kahl, Ph.D.

How can computers learn to recognize birds from their sounds? As the Lead for BirdNET Technology within the K. Lisa Yang Center for Conservation Bioacoustics, I am trying to find an answer to this question. My research focuses on the detection and classification of avian sounds using machine learning. Automated observation of avian vocal activity and species diversity can be a transformative tool for ornithologists, conservation biologists, and birdwatchers, assisting in the long-term monitoring of critical environmental niches.

With a background in computer vision and deep learning, I focus on developing new methods to process large collections of environmental sound recordings. After completing my master’s degree in Applied Computer Science in 2014, I became a research assistant at Chemnitz University of Technology, Germany, where I was involved in research projects covering human-computer and human-robot interaction, multimodal media retrieval, and mobile application development.

I joined the Yang Center in 2019, continuing my work on a bird sound recognition system I call BirdNET. My goal is to assist experts and citizen scientists in their work of monitoring and protecting our birds by developing a wide range of applications such as smartphone apps, public demonstrators, web interfaces, and robust analysis frameworks.
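
For readers who would like to try BirdNET on their own recordings, the short sketch below shows one possible workflow. It assumes the community-maintained birdnetlib Python wrapper around the BirdNET-Analyzer models; the file name, coordinates, date, and confidence threshold are illustrative placeholders rather than recommended settings.

# Minimal sketch: analyze one soundscape recording with BirdNET via birdnetlib
# (pip install birdnetlib). All file paths and parameter values are placeholders.
from datetime import datetime

from birdnetlib import Recording
from birdnetlib.analyzer import Analyzer

# Load the bundled BirdNET model once; the analyzer can be reused for many files.
analyzer = Analyzer()

recording = Recording(
    analyzer,
    "soundscape.wav",           # local audio file to analyze
    lat=42.48, lon=-76.45,      # approximate recording location (here: Ithaca, NY)
    date=datetime(2024, 6, 1),  # recording date, used to narrow the candidate species list
    min_conf=0.25,              # discard detections below this confidence score
)
recording.analyze()

# Each detection is a dict with species names, start/end times, and a confidence score.
for det in recording.detections:
    print(det["common_name"], det["start_time"], det["end_time"], det["confidence"])

The same kind of batch analysis can also be run with the BirdNET-Analyzer tools linked from the project website below.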

Year Hired: 2019

Contact Information
K. Lisa Yang Center for Conservation Bioacoustics
Cornell Lab of Ornithology
159 Sapsucker Woods Road, Ithaca, NY 14850, USA.
Email: sk2487@cornell.edu
Website: https://birdnet.cornell.edu/

Social Media: Google Scholar, GitHub, Twitter: @kahst

Degree(s):
Ph.D. Deep learning for bioacoustics, Chemnitz University of Technology, Germany, in progress
M.Sc. Applied Computer Science, Chemnitz University of Technology, Germany, 2013
B.Sc. Applied Computer Science, Chemnitz University of Technology, Germany, 2011

Recent Publications

Wood, C.M. et al. (2024) ‘Real-time acoustic monitoring facilitates the proactive management of biological invasions’, Biological Invasions [Preprint]. Available at: https://doi.org/10.1007/s10530-024-03426-y.
Wood, C.M. and Kahl, S. (2024) ‘Guidelines for appropriate use of BirdNET scores and other detector outputs’, Journal of Ornithology [Preprint]. Available at: https://doi.org/10.1007/s10336-024-02144-5.
Wood, C.M. et al. (2024) ‘A scalable and transferable approach to combining emerging conservation technologies to identify biodiversity change after large disturbances’, Journal of Applied Ecology, 61(4), pp. 797–808. Available at: https://doi.org/10.1111/1365-2664.14579.
Ghani, B. et al. (2023) ‘Global birdsong embeddings enable superior transfer learning for bioacoustic classification’, Scientific Reports, 13(1), p. 22876. Available at: https://doi.org/10.1038/s41598-023-49989-z.
Sossover, D. et al. (2023) ‘Using the BirdNET algorithm to identify wolves, coyotes, and potentially their interactions in a large audio dataset’, Mammal Research [Preprint]. Available at: https://doi.org/10.1007/s13364-023-00725-y.
Kelly, K.G. et al. (2023) ‘Estimating population size for California spotted owls and barred owls across the Sierra Nevada ecosystem with bioacoustics’, Ecological Indicators, 154, p. 110851. Available at: https://doi.org/10.1016/j.ecolind.2023.110851.
Kahl, S. et al. (2023) ‘Overview of BirdCLEF 2023: Automated Bird Species Identification in Eastern Africa 4.0’, in Working Notes of CLEF.
McGinn, K. et al. (2023) ‘Feature embeddings from the BirdNET algorithm provide insights into avian ecology’, Ecological Informatics, 74, p. 101995. Available at: https://doi.org/10.1016/j.ecoinf.2023.101995.
Brunk, K.M. et al. (2023) ‘Quail on fire: changing fire regimes may benefit mountain quail in fire-adapted forests’, Fire Ecology, 19(1), p. 19. Available at: https://doi.org/10.1186/s42408-023-00180-9.
Joly, A. et al. (2023) ‘Overview of LifeCLEF 2023: Evaluation of AI Models for the Identification and Prediction of Birds, Plants, Snakes and Fungi’, in A. Arampatzis et al. (eds) Experimental IR Meets Multilinguality, Multimodality, and Interaction. Cham: Springer Nature Switzerland (Lecture Notes in Computer Science), pp. 416–439. Available at: https://doi.org/10.1007/978-3-031-42448-9_27.
Kahl, S. et al. (2022) ‘Overview of BirdCLEF 2022: Endangered bird species recognition in soundscape recordings’, in CLEF 2022 - Conference and Labs of the Evaluation Forum, p. 1929. Available at: https://hal.inrae.fr/hal-03791428 (Accessed: 16 December 2022).
Joly, A. et al. (2022) ‘Overview of LifeCLEF 2022: an evaluation of Machine-Learning based Species Identification and Species Distribution Prediction’, CLEF 2022 - Conference and Labs of the Evaluation Forum [Preprint].
Wood, C.M. et al. (2022) ‘The machine learning–powered BirdNET App reduces barriers to global bird research by enabling citizen science participation’, PLOS Biology, 20(6), p. e3001670. Available at: https://doi.org/10.1371/journal.pbio.3001670.
Kahl, S. et al. (2021) ‘Overview of BirdCLEF 2021: Bird call identification in soundscape recordings’, CLEF 2021 [Preprint].
Wood, C.M. et al. (2021) ‘Survey coverage, recording duration and community composition affect observed species richness in passive acoustic surveys’, Methods in Ecology and Evolution [Preprint]. Available at: https://doi.org/10.1111/2041-210X.13571.
Kahl, S. et al. (2021) ‘BirdNET: A deep learning solution for avian diversity monitoring’, Ecological Informatics [Preprint]. Available at: https://doi.org/10.1016/j.ecoinf.2021.101236.
Kahl, S. et al. (2018) ‘Recognizing Birds from Sound - The 2018 BirdCLEF Baseline System’, arXiv preprint. Available at: https://arxiv.org/abs/1804.07177.
Kahl, S. et al. (2017) ‘Large-Scale Bird Sound Classification using Convolutional Neural Networks’, LifeCLEF 2017, Dublin, Ireland, 12-13 September 2017.
McGinn, K. et al. (In Press) ‘Frequent, heterogeneous fire supports a forest owl assemblage’, Ecological Applications [Preprint].