AlertTrap: A Study on Object Detection in Remote Insect Trap Monitoring System Using on the Edge Deep Learning Platform

Authors

  • An Dinh Le Electrical and Computer Engineering Department, University of California San Diego, USA https://orcid.org/0009-0000-4684-715X
  • Duy Anh Pham Joint Lab for Artificial Intelligence and Data Science, Osnabrück University, Germany
  • Dong Thanh Pham Graduate School of Bioresource and Bioenvironmental Sciences, Kyushu University, Japan https://orcid.org/0009-0007-2380-3649
  • Hien Bich Vo Electrical and Computer Engineering Department, Vietnamese-German University, Vietnam https://orcid.org/0000-0002-0537-2861

DOI:

https://doi.org/10.47852/bonviewJCCE42023264

Keywords:

fruit fly, environmental data, smart IoT, edge computing, Tensor Processing Unit (TPU)

Abstract

Fruit flies are among the insect species most harmful to fruit yields. In AlertTrap, the Single-Shot Multibox Detector (SSD) architecture, implemented with state-of-the-art backbone feature extractors such as MobileNetV1 and MobileNetV2, appears to be a potential solution to the real-time detection problem. SSD-MobileNetV1 and SSD-MobileNetV2 perform well, achieving AP at 0.5 of 0.957 and 1.0, respectively. You Only Look Once (YOLO) v4-tiny outperforms the SSD family, reaching an AP at 0.5 of 1.0; however, its throughput is considerably lower, which makes the SSD models better candidates for real-time implementation. We also tested the models on synthetic test sets simulating expected environmental disturbances, and YOLOv4-tiny tolerated these disturbances better than the SSD models. The Raspberry Pi system successfully gathered environmental data and pest counts and sent them via email over 4G. However, running the full YOLO version in real time on the Raspberry Pi is not feasible, indicating the need for a lighter object detection algorithm in future research. Among the candidate models, YOLOv4-tiny generally performs best, with SSD-MobileNetV2 comparable and sometimes better, especially under synthetic disturbances. The SSD models excel in processing time, enabling real-time, high-accuracy detection. The TFLite versions of the SSD models also run faster than their inference graphs on Tensor Processing Unit (TPU) hardware, suggesting that real-time implementation on edge devices such as the Google Coral Dev Board is achievable. The results demonstrate the feasibility of deploying the fruit fly detection models on edge devices in real time with high performance. In addition, YOLOv4-tiny is the most probable candidate because it demonstrates robust testing performance for citrus fruit fly detection. Nevertheless, SSD-MobileNetV2 is the better model when inference time is considered.
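To illustrate the deployment claim above (TFLite SSD detectors running on a Coral Edge TPU), the following minimal sketch uses the public PyCoral detection API. It is not the authors' code: the model file, image file, and the 0.5 score threshold, chosen here to mirror the AP at 0.5 reporting, are placeholder assumptions.

```python
# Minimal sketch (not the authors' code): SSD-MobileNetV2 fruit fly detection
# on a Coral Edge TPU via the PyCoral API. File names below are placeholders.
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

MODEL_PATH = "ssd_mobilenet_v2_fruitfly_edgetpu.tflite"  # hypothetical compiled model
IMAGE_PATH = "trap_frame.jpg"                            # hypothetical trap camera frame

# Load the Edge-TPU-compiled TFLite model.
interpreter = make_interpreter(MODEL_PATH)
interpreter.allocate_tensors()

# Resize the trap image to the detector's expected input size.
image = Image.open(IMAGE_PATH).convert("RGB")
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))

interpreter.invoke()

# Keep detections at or above a 0.5 confidence score.
flies = detect.get_objects(interpreter, score_threshold=0.5, image_scale=scale)
print(f"Detected {len(flies)} fruit flies")
for obj in flies:
    print(obj.bbox, f"score={obj.score:.2f}")
```

On an edge device, a per-frame count like this could feed the email report over 4G described in the abstract.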

 

Received: 27 April 2024 | Revised: 6 June 2024 | Accepted: 17 June 2024

 

Conflicts of Interest

The authors declare that they have no conflicts of interest in this work.

 

Data Availability Statement

The datasets that support the findings of this study are openly available in the AlertTrap-Dataset repository on GitHub at https://github.com/a11to1n3/AlertTrap-Dataset?tab=readme-ov-file.

 

Author Contribution Statement

An Dinh Le: Conceptualization, Methodology, Software, Validation, Formal analysis, Investigation, Writing – original draft, Writing – review & editing, Visualization. Duy Anh Pham: Conceptualization, Methodology, Software, Validation, Formal analysis, Investigation, Writing – original draft, Visualization. Dong Thanh Pham: Conceptualization, Validation, Investigation, Resources, Data curation, Visualization. Hien Bich Vo: Supervision, Project administration, Funding acquisition.

 

 


Published

2024-06-24

Section

Research Articles

How to Cite

Le, A. D., Pham, D. A., Pham, D. T., & Vo, H. B. (2024). AlertTrap: A Study on Object Detection in Remote Insect Trap Monitoring System Using on the Edge Deep Learning Platform. Journal of Computational and Cognitive Engineering. https://doi.org/10.47852/bonviewJCCE42023264