Enhanced Smart Crop Health Assessment using Interactive Masked Vision Transformer with Multi Network Attention Mechanism on UAV Imagery
International Journal of Electronics and Communication Engineering
© 2026 by SSRG - IJECE Journal
Volume 13 Issue 3
Year of Publication : 2026
| Authors : Sharmila G |
How to Cite?
Sharmila G, "Enhanced Smart Crop Health Assessment using Interactive Masked Vision Transformer with Multi Network Attention Mechanism on UAV Imagery," SSRG International Journal of Electronics and Communication Engineering, vol. 13, no. 3, pp. 312-323, 2026. Crossref, https://doi.org/10.14445/23488549/IJECE-V13I3P125
Abstract:
Soybean has become one of the most significant oilseed and food crops worldwide. However, soybean crops are susceptible to numerous stresses: losses due to pests, diseases, and other factors exceed 20 per cent of global production. Unmanned Aerial Vehicles (UAVs) flown over crop fields have proven to be a significant tool for identifying diseased patches, enabling professionals and agriculturalists to make better decisions. In this context, deep learning (DL) has driven important advances in Artificial Intelligence (AI), and several studies have employed DL to solve a wide range of problems. In the agricultural sector, DL has attracted significant interest for improving crop productivity. This study introduces an Unmanned Aerial Vehicle-Based Soybean Crop Health Monitoring Using Advanced Deep Learning Architectures (UAVSCHM-DLA) model. The aim is to present an intelligent system capable of monitoring and assessing soybean crop health using integrated UAV and leaf images. Initially, Histogram Equalisation (HE) and Bilateral Filtering (BF) are applied for image pre-processing. For effective feature extraction, a vision transformer with Interactive Mask self-attention (IMViT) is employed. Finally, a multiple-neural-network with attention mechanism (MNet-Attn) method is implemented for classification. In comparative experiments, the UAVSCHM-DLA technique achieved superior accuracies of 98.20% and 97.01% on the leaf and UAV datasets, respectively.
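The pre-processing stage named in the abstract pairs Histogram Equalisation (HE) with Bilateral Filtering (BF). As a rough illustration of those two standard operations only (a minimal NumPy sketch, not the paper's implementation; the patch size, filter diameter, sigma values, and the HE-then-BF order are illustrative assumptions):

```python
import numpy as np

def histogram_equalize(img: np.ndarray) -> np.ndarray:
    """HE: stretch a grayscale uint8 image's intensities over 0-255 via its CDF."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()                       # CDF of the darkest pixel present
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[img]                                    # remap every pixel through the LUT

def bilateral_filter(img: np.ndarray, diameter: int = 5,
                     sigma_color: float = 25.0, sigma_space: float = 2.0) -> np.ndarray:
    """BF: edge-preserving smoothing; weights combine spatial and intensity closeness."""
    r = diameter // 2
    pad = np.pad(img.astype(np.float64), r, mode="edge")
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_space**2))   # fixed spatial kernel
    out = np.empty(img.shape, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = pad[i:i + diameter, j:j + diameter]
            range_w = np.exp(-((patch - img[i, j])**2) / (2 * sigma_color**2))
            w = spatial * range_w                      # down-weight pixels across edges
            out[i, j] = (w * patch).sum() / w.sum()
    return np.clip(np.round(out), 0, 255).astype(np.uint8)

# Hypothetical low-contrast 32x32 leaf patch standing in for a UAV/leaf image crop.
rng = np.random.default_rng(0)
leaf = rng.integers(60, 180, size=(32, 32), dtype=np.uint8)
pre = bilateral_filter(histogram_equalize(leaf))       # HE to restore contrast, BF to denoise
```

In practice such steps are usually delegated to a library such as OpenCV (`cv2.equalizeHist`, `cv2.bilateralFilter`); the explicit loops above only make the weighting visible.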
Keywords:
Crop Monitoring, Crop Health Assessment, Unmanned Aerial Vehicles, Deep Learning, Interactive Mask Self Attention, Vision Transformer.
References:
[1] Shanxin Zhang et al., “Monitoring of Soybean Maturity using UAV Remote Sensing and Deep Learning,” Agriculture, vol. 13, no. 1, pp. 1-21, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[2] Everton Castelão Tetila et al., “Detection and Classification of Soybean Pests using Deep Learning with UAV Images,” Computers and Electronics in Agriculture, vol. 179, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[3] Everton Castelão Tetila et al., “Automatic Recognition of Soybean Leaf Diseases using UAV Images and Deep Convolutional Neural Networks,” IEEE Geoscience and Remote Sensing Letters, vol. 17, no. 5, pp. 903-907, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[4] Jayme Garcia Arnal Barbedo, “Deep Learning for Soybean Monitoring and Management,” Seeds, vol. 2, no. 3, pp. 340-356, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[5] Tej Bahadur Shahi et al., “Recent Advances in Crop Disease Detection using UAV and Deep Learning Techniques,” Remote Sensing, vol. 15, no. 9, pp. 1-29, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[6] Yu-Hyeon Park et al., “Detection of Soybean Insect Pest and a Forecasting Platform using Deep Learning with Unmanned Ground Vehicles,” Agronomy, vol. 13, no. 2, pp. 1-16, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[7] Pengting Ren et al., “Estimation of Soybean Yield by Combining Maturity Group Information and Unmanned Aerial Vehicle Multi-Sensor Data Using Machine Learning,” Remote Sensing, vol. 15, no. 17, pp. 1-23, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[8] Bo Zhang, and Dehao Zhao, “An Ensemble Learning Model for Detecting Soybean Seedling Emergence in UAV Imagery,” Sensors, vol. 23, no. 15, pp. 1-19, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[9] Abdelmalek Bouguettaya et al., “Deep Learning Techniques to Classify Agricultural Crops Through UAV Imagery: A Review,” Neural Computing and Applications, vol. 34, pp. 9511-9536, 2022.
[CrossRef] [Google Scholar] [Publisher Link]
[10] Maitiniyazi Maimaitijiang et al., “Crop Monitoring Using Satellite/UAV Data Fusion and Machine Learning,” Remote Sensing, vol. 12, no. 9, pp. 1-23, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[11] Xun Yu et al., “FEL-YoloV8: A New Algorithm for Accurate Monitoring Soybean Seedling Emergence Rates and Growth Uniformity,” IEEE Transactions on Geoscience and Remote Sensing, vol. 63, pp. 1-17, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[12] Muhammad Aqeel et al., “Real-Time Crop Health Monitoring Using AI-Based Drone Surveillance and YOLOv12,” Pakistan Journal of Scientific Research, vol. 5, no. 1, pp. 29-39, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[13] Xiaoming Li et al., “HSDT-TabNet: A Dual-Path Deep Learning Model for Severity Grading of Soybean Frogeye Leaf Spot,” Agronomy, vol. 15, no. 7, pp. 1-21, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[14] Mashrur Kabir et al., “Design and Implementation of a Crop Health Monitoring System,” Doctoral Thesis, Brac University, pp. 1-182, 2023.
[Google Scholar] [Publisher Link]
[15] Sean Wallinger et al., “Toward a Cost-Effective Smart Crop Health Monitoring System,” 2023 IEEE MetroCon, Hurst, TX, USA, pp. 1-3, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[16] Amelia Sarah Binti Abdul Rahman et al., “Multispectral Image Analysis for Crop Health Monitoring System,” 2022 IEEE 5th International Symposium in Robotics and Manufacturing Automation (ROMA), Malacca, Malaysia, pp. 1-6, 2022.
[CrossRef] [Google Scholar] [Publisher Link]
[17] Thangavel Murugan et al., “Research Advances in Maize Crop Disease Detection Using Machine Learning and Deep Learning Approaches,” Computers, vol. 15, no. 2, pp. 1-53, 2026.
[CrossRef] [Google Scholar] [Publisher Link]
[18] Juntao Tong et al., “ToT-Net: A Generalized and Real-Time Crop Disease Detection Framework via Task-Level Meta-Learning and Lightweight Multi-Scale Transformer,” Smart Agricultural Technology, vol. 12, pp. 1-17, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[19] Muhammad Nouman Noor et al., “An Effective Approach for Recognition of Crop Diseases Using Advanced Image Processing and YOLO v8,” Food Science & Nutrition, vol. 141, no. 2, pp. 1-26, 2026.
[CrossRef] [Google Scholar] [Publisher Link]
[20] V. Gopinath, and M. Sangeetha, “Spatial-Temporal Digital Twin for Crop Disease Prediction: A Hybrid ConvLSTM-Graph Neural Network Approach,” 2025 3rd International Conference on Intelligent Cyber Physical Systems and Internet of Things (ICoICI), Coimbatore, India, pp. 570-575, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[21] Hardeep Kaur, Bhanu Priya, and Kuldeep Singh, “CNNx: Optimizing Smart CNN Models for Efficient Banana Disease Detection and Severity Estimation,” Concurrency and Computation: Practice and Experience, vol. 38, no. 1, 2026.
[CrossRef] [Google Scholar] [Publisher Link]
[22] Vivek Parganiha, and Monika Verma, “An Efficient Disease Prediction in Smart Agriculture Using Advanced Deep Learning Methods for Improving Crop Productivity,” Journal of Phytopathology, vol. 173, no. 5, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[23] Anqi Kang et al., “A2Former: An Airborne Hyperspectral Crop Classification Framework Based on a Fully Attention-Based Mechanism,” Remote Sensing, vol. 18, no. 2, pp. 1-29, 2026.
[CrossRef] [Google Scholar] [Publisher Link]
[24] Zhen Du et al., “UniHSFormer X for Hyperspectral Crop Classification with Prototype-Routed Semantic Structuring,” Agriculture, vol. 15, no. 13, pp. 1-32, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[25] Gourav Mondal, Rajesh Kumar Dhanaraj, and Md. Shohel Sayeed, “UAV‐MCND: A Novel System for Multiclass Natural Disaster Classification Using FusionNet‐4 and Water Wheel‐Guided Walrus Optimization,” International Journal of Intelligent Systems, vol. 2025, no. 1, pp. 1-25, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[26] Qihao Chen et al., “IMViT: Adjacency Matrix-Based Lightweight Plain Vision Transformer,” IEEE Access, vol. 13, pp. 18355-18545, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[27] Xiaobin Wei et al., “Improved MNet-Atten Electric Vehicle Charging Load Forecasting Based on Composite Decomposition and Evolutionary Predator–Prey Strategy,” World Electric Vehicle Journal, vol. 16, no. 10, pp. 1-23, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[28] Sayali Shinde, and Vahida Attar, “An Indian UAV and Leaf Image Dataset for Integrated Crop Health Assessment of Soybean Crop,” Data in Brief, vol. 60, pp. 1-16, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[29] Jing Zhang et al., “ED-Swin Transformer: A Cassava Disease Classification Model Integrated with UAV Images,” Sensors, vol. 25, no. 8, pp. 1-16, 2025.
[CrossRef] [Google Scholar] [Publisher Link]
[30] Girma Tariku et al., “Advanced Image Pre-Processing and Integrated Modeling for UAV Plant Image Classification,” Drones, vol. 8, no. 11, pp. 1-18, 2024.
[CrossRef] [Google Scholar] [Publisher Link]
