Introduction

Preface

This content was prepared by the participating students as part of the course “Proximate sensing”, taught by Dr. Lea Heidrich. It examines sensor-based methods of insect recording and aims to capture and analyze how insects respond to color panels offered as attractants.

Protocol

The vast majority of animal species worldwide are insects (Høye et al. 2021). They fulfill important ecological functions, such as the pollination of plants and the maintenance of food webs as a food source for birds and other animals (Hallmann et al. 2017). Recent research shows that insect populations are declining significantly worldwide (Høye et al. 2021, Montgomery et al. 2021, Hallmann et al. 2017). This decline has caused a stir in science and politics and is increasingly gaining attention in the general public as well (Høye et al. 2021, Hallmann et al. 2017). Climate change, habitat loss and degradation, as well as fragmentation, are reasons for the drastic decline of insects on a global scale (Hallmann et al. 2017), the effects of which on future ecosystem services could be severe (Alcocer et al. 2022).

This situation highlights the need for effective monitoring to better track insect losses, assess future scenarios, and also to better evaluate the development of suitable conservation programs, for example, in agriculture and forestry (Alcocer et al. 2022, Zhao et al. 2022, Høye et al. 2021).

Traditional methods of insect monitoring are carried out using active or passive traps and are subsequently evaluated by experts (Bjerge et al. 2022). Commonly used trap types include Malaise traps (Malaise, 1937), where large tent-like nets guide insects into a common area (Montgomery et al. 2021). Light traps are used to capture nocturnal insects (ibid.). Pan traps (Moericke, 1951) are bowls filled with liquid, and pitfall traps (Hertz, 1927) capture ground-dwelling insects using containers placed near the ground (ibid.). This traditional monitoring poses a challenge because setting up, checking, and evaluating traps is often associated with high costs and time expenditure, and is also ethically questionable (Bjerge et al. 2022, Høye et al. 2021, Möglich et al. 2023). Another problem is the focus on a few rare species due to the high effort involved (Høye et al. 2021).

To reduce these costs and time expenditure, technological development in recent years has led to traditional monitoring methods being increasingly replaced by passive automated monitoring (Montgomery et al. 2021, Høye et al. 2021). In this approach, sensors such as audio devices or cameras record data effectively, continuously, and non-invasively throughout daily and seasonal cycles (Høye et al. 2021).

The use of camera sensors is proving particularly successful in identifying species, recording their abundance, and analyzing their behavior (Bjerge et al. 2022, Wittmann et al. 2024). In recent studies, passive, automated insect monitoring has already been successfully applied (ibid.). A study within the Natur 4.0 project by Möglich et al. (2023) demonstrated that it is possible to record the decline of moths using an automated moth trap made from commercially available hardware. Bjerge et al. (2022), with their real-time monitoring and classification, were able to successfully track individual living insects. Høye et al. (2021) analyzed the behavior of honeybees using time-lapse cameras.

Passive automated monitoring presents the challenge of analyzing and identifying large amounts of data (Montgomery et al. 2021). The decreasing number of experts for species identification further exacerbates this problem (Möglich et al. 2023). Deep learning is therefore gaining increasing relevance for identifying insects and estimating their abundance, biomass, and diversity (Høye et al. 2021). It also enables the analysis of phenotypic characteristics and insect behavior (ibid.).

The passive automated monitoring of insects is still in its early stages, meaning that numerous factors have not yet been sufficiently tested. Therefore, within the scope of this project, a camera sensor box was developed that attracts, detects, identifies, and tracks both diurnal and nocturnal flying insects. To achieve this, the hardware of two previously tested sensor boxes was combined: the automated moth trap from the Natur 4.0 project (Zeuss et al. 2024; Möglich et al. 2023) and the automated insect camera trap “Insect Detect” by Maximilian Sittinger (Sittinger 2024). Colored plates made of two different materials are intended to attract the insects, which will be released into a flight cage in the botanical garden in Marburg.
