Transfer Deep Learning for Remote Sensing Datasets: A Comparison Study
Other documents by the same authors: Hernandez-Sequeira, Itza; Fernandez-Beltran, Ruben; Pla, Filiberto
Title
Transfer Deep Learning for Remote Sensing Datasets: A Comparison Study
Publication date
2022-07-17
Publisher
IEEE
Bibliographic citation
HERNANDEZ-SEQUEIRA, Itza; FERNANDEZ-BELTRAN, Ruben; PLA, Filiberto. Transfer Deep Learning for Remote Sensing Datasets: A Comparison Study. In: IGARSS 2022 - 2022 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2022, pp. 3207-3210.
Document type
info:eu-repo/semantics/conferenceObject
Version
info:eu-repo/semantics/submittedVersion
Keywords / Subjects
Abstract
Remote sensing is benefiting from the rapid development of deep learning algorithms for image analysis and classification tasks. In this paper, we evaluate the classification performance of a well-known Convolutional Neural Network (CNN) model, ResNet50, using a transfer learning approach. We compare the performance when using feature vectors obtained from general-purpose data, such as ImageNet [1], versus remote sensing data such as BigEarthNet [2], UCMerced [3], RESISC45 [4] and So2Sat [5]. The results show that the model pre-trained on RESISC45 data achieved the highest accuracy when classifying the EuroSAT [6] test dataset, followed by the models pre-trained on ImageNet with 95.94% and BigEarthNet with 95.93%. When presented with diverse remote sensing data, classification improved relative to using large quantities of general-purpose data. The experiments carried out also show that multimodal data (co-registered synthetic aperture radar and multispectral) did not increase the classification rate with respect to using multispectral data alone. The source code of this work is available for reproducible research at https://github.com/itzahs/CNN-RS.
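The transfer learning setup the abstract describes keeps the pre-trained CNN frozen and trains only a lightweight classifier on the extracted feature vectors (a "linear probe"). The sketch below illustrates that final step in plain NumPy on synthetic stand-in features; in practice the features would come from the penultimate layer of a pre-trained ResNet50, and the authors' actual implementation is in the linked repository. All names here are illustrative, not taken from the paper.

```python
import numpy as np

def train_linear_probe(features, labels, n_classes, lr=0.1, epochs=200):
    """Train a softmax classifier on frozen CNN features.

    features: (N, D) array, e.g. ResNet50 penultimate-layer activations.
    labels:   (N,) integer class ids.
    Returns (W, b) for the linear classifier.
    """
    n, d = features.shape
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.01, size=(d, n_classes))
    b = np.zeros(n_classes)
    onehot = np.eye(n_classes)[labels]
    for _ in range(epochs):
        logits = features @ W + b
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        probs = np.exp(logits)
        probs /= probs.sum(axis=1, keepdims=True)
        grad = (probs - onehot) / n                  # cross-entropy gradient
        W -= lr * features.T @ grad
        b -= lr * grad.sum(axis=0)
    return W, b

def predict(features, W, b):
    """Assign each feature vector to the highest-scoring class."""
    return np.argmax(features @ W + b, axis=1)

# Synthetic stand-in for extracted features: two well-separated clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(3, 1, (50, 8))])
y = np.array([0] * 50 + [1] * 50)
W, b = train_linear_probe(X, y, n_classes=2)
acc = (predict(X, W, b) == y).mean()
```

Because only `W` and `b` are learned, probes trained on features from different source datasets (ImageNet, BigEarthNet, RESISC45, ...) can be compared cheaply, which is the comparison the paper carries out.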
Description
Ponencia presentada en IGARSS 2022 - 2022 IEEE International Geoscience and Remote Sensing Symposium, 17-22 July 2022, Kuala Lumpur, Malaysia
Published in
IGARSS 2022 - 2022 IEEE International Geoscience and Remote Sensing Symposium
Funding entity
Ministerio de Ciencia, Innovación y Universidades (Spain) | Universitat Jaume I
Project or grant code
RTI2018-098651-B-C54 | PREDOC/2020/50