Deep Learning Techniques on Very High Resolution Images for Detecting Trees and Their Health Conditions

Yaseen Al-Mulla*, Ahsan Ali, Krishna Parimi

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Very high-resolution remote sensing imagery and imagery from unmanned aerial vehicles have gained recognition and value in recent years for a variety of purposes, especially object detection. In parallel, deep learning (DL) has evolved as a tool for pattern recognition applications, often outperforming standard machine learning techniques. This study applied several DL techniques in the Sultanate of Oman to detect trees and examine their health status using very high-resolution satellite imagery. According to our results, the DL model efficiently distinguished date palm trees from other plants and other land uses. Beyond date palms, the model developed in this study can serve as a starting point for models that identify other types of diseased plants and trees.
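As an illustrative sketch only, not the authors' published pipeline, the following Python snippet shows one common way a pre-trained object detector could be adapted to distinguish date palms and flag their health condition on very high-resolution image tiles. It assumes PyTorch/torchvision; the three-class scheme (background, healthy palm, stressed palm) and the 0.5 confidence threshold are hypothetical choices, not values taken from the paper.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Hypothetical class set: 0 = background, 1 = healthy date palm, 2 = stressed palm.
NUM_CLASSES = 3

# Load a detector pre-trained on COCO and swap its classification head
# so it predicts our tree classes instead of the COCO categories.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

model.eval()
# A dummy 3-band VHR image tile (C x H x W, values in [0, 1]) standing in
# for a real satellite or drone image chip.
tile = torch.rand(3, 512, 512)
with torch.no_grad():
    predictions = model([tile])

# Each prediction holds bounding boxes, class labels, and confidence scores.
boxes = predictions[0]["boxes"]
labels = predictions[0]["labels"]
scores = predictions[0]["scores"]
keep = scores > 0.5  # hypothetical threshold: keep confident detections only
print(boxes[keep], labels[keep])
```

A model set up this way would still need fine-tuning on annotated palm-tree tiles before the swapped head produces meaningful detections; the snippet only demonstrates the transfer-learning pattern.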
Original language: English
Title of host publication: IGARSS 2023 - 2023 IEEE International Geoscience and Remote Sensing Symposium, Proceedings
Place of publication: Pasadena, CA, USA
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 6542-6544
Number of pages: 3
ISBN (Electronic): 979-8-3503-2010-7
ISBN (Print): 979-8-3503-3174-5
DOIs
Publication status: Published - Jul 16 2023
Event: 2023 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2023 - Pasadena, United States
Duration: Jul 16 2023 - Jul 21 2023

Publication series

Name: IGARSS 2023 - 2023 IEEE International Geoscience and Remote Sensing Symposium

Conference

Conference: 2023 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2023
Country/Territory: United States
City: Pasadena
Period: 7/16/23 - 7/21/23

Keywords

  • AVHR
  • Deep learning
  • Drone
  • Remote Sensing

ASJC Scopus subject areas

  • Computer Science Applications
  • General Earth and Planetary Sciences
