Abstract:
Maize is the main food crop grown in Sub-Saharan Africa and is consumed by people with varying food preferences. Maize production is continuously and severely affected by several threats, such as weeds, insects, bacteria, viruses, nematodes, fungi, drought, and climate change. In 2016, the fall armyworm (FAW) invaded crops in Africa, causing severe damage. The worm grows very fast and can destroy a maize crop within a short period. Different initiatives have been explored to raise farmer awareness, such as the introduction of mobile applications like the Food and Agriculture Organisation Fall Armyworm Monitoring and Early Warning System (FAO-FAMEWS) and PlantVillage-NURU. These smartphone applications directly assist small-scale farmers in identifying and becoming aware of the FAW. However, they lack Internet of Things (IoT) and machine learning (ML) components that would allow them to cover a larger region and collect more data in a shorter period of time while analyzing it in real time. In this study, we design a system that assists large-scale farmers in identifying the patterns caused by the FAW and sends the farmers a report highlighting the size and location of the infestation for prompt action. The proposed system integrates unmanned aerial vehicle (UAV) technology and ML algorithms to detect early indicators of FAW on large farms and swiftly analyze the acquired data. The UAV, outfitted with IoT sensors and ML algorithms, captures and analyzes field images and immediately notifies the farmer of infected areas for further action.

The study is divided into three stages. In the first stage, the captured UAV images are cropped to 150x150 pixels, and the Shi-Tomasi corner detection algorithm is applied to the cropped images. These images are then used to train convolutional neural network (CNN) based models, specifically VGG16, VGG19, InceptionV3, and MobileNetV2, to distinguish between healthy and infected images. The results show that this approach outperforms other CNN-based models in terms of accuracy, sensitivity, specificity, precision, and F1-score: both InceptionV3 and MobileNetV2 achieved an accuracy of 100%, a sensitivity of 1.00, a specificity of 1.00, a precision of 1.00, and an F1-score of 1.00. Despite achieving the highest precision and sensitivity in a real-time classification environment, the proposed algorithm can introduce noise to background images, leading to misclassification.
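To make the stage-one preprocessing concrete, the following is a minimal Python sketch, assuming OpenCV is available; the tiling helper and the corner-detector parameters (maxCorners, qualityLevel, minDistance) are illustrative choices, not the exact settings used in the study.

    import cv2

    TILE = 150  # crop size used in the study (150 x 150 pixels)

    def tile_image(frame, tile=TILE):
        # Split a UAV frame into non-overlapping tile x tile crops,
        # yielding each crop together with its top-left (y, x) coordinate.
        h, w = frame.shape[:2]
        for y in range(0, h - tile + 1, tile):
            for x in range(0, w - tile + 1, tile):
                yield (y, x), frame[y:y + tile, x:x + tile]

    def shi_tomasi_corners(crop, max_corners=50):
        # cv2.goodFeaturesToTrack computes the Shi-Tomasi corner response
        # by default (useHarrisDetector=False) on a grayscale image.
        gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
        return cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                       qualityLevel=0.01, minDistance=5)

Non-overlapping tiling keeps each 150x150 crop aligned with a unique field location, which the reporting stage described later relies on.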
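The training step itself can be sketched as standard transfer learning in Keras. This is a minimal sketch assuming TensorFlow 2.x; the frozen InceptionV3 backbone, the single sigmoid output for the healthy/infected decision, and the optimizer settings are illustrative rather than the thesis's exact configuration.

    import tensorflow as tf

    def build_classifier(input_shape=(150, 150, 3)):
        # Frozen ImageNet backbone with a small binary classification head
        # (healthy vs. infected) on top of the 150 x 150 crops.
        base = tf.keras.applications.InceptionV3(
            include_top=False, weights="imagenet", input_shape=input_shape)
        base.trainable = False
        model = tf.keras.Sequential([
            base,
            tf.keras.layers.GlobalAveragePooling2D(),
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam",
                      loss="binary_crossentropy",
                      metrics=["accuracy"])
        return model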
The second stage entails developing an autonomous system that classifies the UAV images into four groups, namely healthy, infected, weed, and redundant; counts the infected images; and sends a report to the farmer. The system is divided into three parts: data collection using the UAV; data processing and analysis with CNN-based models (hybrid CNN, VGG16, InceptionV3, ResNet50, and XceptionNet); and reporting to end users via the short message service (SMS). The proposed real-time monitoring system's accuracy and training time are further improved by the hybrid CNN (HCNN) model: the system achieved 96.98% accuracy and reduced training time by 16% to 44%, making it quicker at detecting infected maize plants.

The third stage entailed developing two algorithms that improve the system designed in the previous stage: a model that enhances the contrast of images captured on other farms so that they correlate with the images used to train the proposed model, improving accuracy; and an algorithm that counts infected images, pinpoints their locations, and estimates the infected area on each UAV image. The first algorithm reduces the percentage error for images captured on other farms from 17.1% to 0.68% for the healthy class and from -57.82% to 6.64% for the infected class, while the second algorithm improves the system by pinpointing infected images and displaying the size of the infection on each UAV image (a sketch of this counting step appears at the end of this abstract).

In conclusion, the proposed method improves FAW detection on large-scale farms, saving time and money. It uses UAV technology to quickly scan fields and upload the images to the cloud for analysis with CNN-based models. The system then reports the infected areas, allowing farmers to target only the affected areas when spraying pesticides.
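As an illustration of the stage-three counting step referenced above, the following hedged sketch aggregates per-tile predictions into the report contents: the number of infected tiles, their locations, and the estimated infected fraction of a UAV frame. The function names, the label strings, and the SMS formatting are hypothetical, not the thesis's implementation.

    TILE = 150  # same 150 x 150 tile size used for classification

    def summarize_infection(predictions, frame_h, frame_w):
        # predictions: iterable of ((y, x), label) pairs, one per tile,
        # where (y, x) is the tile's top-left pixel coordinate.
        infected = [(y, x) for (y, x), label in predictions
                    if label == "infected"]
        infected_px = len(infected) * TILE * TILE
        return {
            "infected_tiles": len(infected),
            "tile_locations": infected,
            "infected_area_pct": 100.0 * infected_px / (frame_h * frame_w),
        }

    def to_sms(summary, field_id="plot-7"):
        # Compose a short report string for an SMS gateway (hypothetical).
        return (f"FAW alert ({field_id}): {summary['infected_tiles']} infected "
                f"tiles, {summary['infected_area_pct']:.1f}% of frame area.")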