Please use this identifier to cite or link to this item: https://repositorio.ufpe.br/handle/123456789/62424

Title: Enhancing mosquito egg counting accuracy through deep learning image restoration
Authors: AMARAL, Victoria Pantoja do
Keywords: mosquito surveillance; deep learning; image restoration; Aedes aegypti
Issue Date: 1-Apr-2025
Citation: AMARAL, Victoria Pantoja do. Enhancing mosquito egg counting accuracy through deep learning image restoration. 2025. Trabalho de Conclusão de Curso (Sistemas de Informação) - Universidade Federal de Pernambuco, Recife, 2025.
Abstract: Monitoring Aedes aegypti populations is crucial for dengue prevention, with egg counts collected from ovitraps serving as a primary method for tracking. This study addresses the limitations of smartphone-captured images, which may suffer from motion blur, defocus, and noise, factors that significantly impair automated counting accuracy. We evaluated three deep learning image restoration models (MPRNet, Real-ESRGAN, and Restormer) to enhance image quality prior to automated egg detection. Using a dataset of 82 ovitrap images, the models were trained and evaluated on both perceptual metrics (PSNR and NIQE) and their impact on automated egg counting relative to manual counts. Among the tested models, Real-ESRGAN performed best, improving counting accuracy (the ratio of automated to manual counts, where values above 100% indicate overcounting) from 78.4% to 106.5%. In contrast, MPRNet and Restormer performed poorly with the provided training data, reaching 331.7% and 1.5%, respectively. The results demonstrate that appropriate image enhancement techniques can improve the precision of mosquito egg counting under real-world conditions without requiring specialized equipment, potentially contributing to more efficient disease prevention strategies.
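The two quantitative measures the abstract leans on can be sketched in a few lines of Python. `psnr` follows the standard peak signal-to-noise ratio definition; `counting_accuracy` is an assumed automated-to-manual count ratio, the only reading under which values such as 331.7% make sense. Function names and example counts are illustrative, not taken from the thesis.

```python
import numpy as np

def psnr(reference: np.ndarray, restored: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio between reference and restored images, in dB."""
    mse = np.mean((reference.astype(np.float64) - restored.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images: PSNR is unbounded
    return 10.0 * np.log10((max_val ** 2) / mse)

def counting_accuracy(automated: int, manual: int) -> float:
    """Automated egg count as a percentage of the manual count.

    Values above 100% indicate overcounting by the automated detector.
    """
    return 100.0 * automated / manual

# Illustrative counts: 213 detected vs. 200 counted by hand -> 106.5%.
print(counting_accuracy(213, 200))
```

Higher PSNR indicates the restored image is closer to the reference, while a counting-accuracy ratio near 100% indicates agreement with manual counts.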
URI: https://repositorio.ufpe.br/handle/123456789/62424
Appears in Collections: (TCC) - Sistemas da Computação

Files in This Item:
File: TCC Victoria Pantoja do Amaral.pdf (9.77 MB, Adobe PDF)


This item is protected by original copyright.

This item is licensed under a Creative Commons License.