dc.contributor.author | Ibáñez Fernández, Damián | |
dc.contributor.author | Fernandez-Beltran, Ruben | |
dc.contributor.author | Pla, Filiberto | |
dc.contributor.author | Yokoya, Naoto | |
dc.date.accessioned | 2022-06-02T14:40:42Z | |
dc.date.available | 2022-06-02T14:40:42Z | |
dc.date.issued | 2022-03-22 | |
dc.identifier.citation | D. Ibañez, R. Fernandez-Beltran, F. Pla and N. Yokoya, "DAT-CNN: Dual Attention Temporal CNN for Time-Resolving Sentinel-3 Vegetation Indices," in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 15, pp. 2632-2643, 2022, doi: 10.1109/JSTARS.2022.3161190. | ca_CA |
dc.identifier.issn | 1939-1404 | |
dc.identifier.issn | 2151-1535 | |
dc.identifier.uri | http://hdl.handle.net/10234/197903 | |
dc.description.abstract | The synergies between Sentinel-3 (S3) and the forthcoming fluorescence explorer (FLEX) mission bring us the opportunity of using S3 vegetation indices (VIs) as proxies of the solar-induced chlorophyll fluorescence (SIF) that will be captured by FLEX. However, the highly dynamic nature of SIF demands very temporally accurate monitoring of S3 VIs for them to become reliable proxies. In this scenario, this article proposes a novel temporal reconstruction convolutional neural network (CNN), named dual attention temporal CNN (DAT-CNN), which has been specially designed for time-resolving S3 VIs using Sentinel-2 (S2) and S3 multitemporal observations. In contrast to other existing techniques, DAT-CNN implements two different branches for processing and fusing S2 and S3 multimodal data, while further exploiting intersensor synergies. In addition, DAT-CNN incorporates a new spatial–spectral and temporal attention module to suppress uninformative spatial–spectral features, while focusing on the most relevant temporal stamps for each particular prediction. The experimental comparison, including several temporal reconstruction methods and multiple operational Sentinel data products, demonstrates the competitive advantages of the proposed model with respect to the state of the art. The code of this article will be available at https://github.com/ibanezfd/DATCNN. | ca_CA |
dc.format.extent | 12 p. | ca_CA |
dc.format.mimetype | application/pdf | ca_CA |
dc.language.iso | eng | ca_CA |
dc.publisher | IEEE | ca_CA |
dc.relation | Productos avanzados L3 y L4 para la misión FLEX-S3 (FLEXL3L4) | ca_CA |
dc.relation.isPartOf | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 15 (2022) | ca_CA |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | ca_CA |
dc.subject | biophysical products | ca_CA |
dc.subject | fluorescence explorer (FLEX) | ca_CA |
dc.subject | Sentinel-2 (S2) | ca_CA |
dc.subject | Sentinel-3 (S3) | ca_CA |
dc.subject | temporal resolution | ca_CA |
dc.subject | vegetation mapping | ca_CA |
dc.subject | image reconstruction | ca_CA |
dc.subject | flexible printed circuits | ca_CA |
dc.subject | data models | ca_CA |
dc.subject | spatial resolution | ca_CA |
dc.subject | convolutional neural networks | ca_CA |
dc.subject | satellites | ca_CA |
dc.title | DAT-CNN: Dual Attention Temporal CNN for Time-Resolving Sentinel-3 Vegetation Indices | ca_CA |
dc.type | info:eu-repo/semantics/article | ca_CA |
dc.identifier.doi | 10.1109/JSTARS.2022.3161190 | |
dc.rights.accessRights | info:eu-repo/semantics/openAccess | ca_CA |
dc.type.version | info:eu-repo/semantics/publishedVersion | ca_CA |
project.funder.identifier | 10.13039/100014440 | ca_CA |
project.funder.name | Ministerio de Ciencia, Innovación y Universidades | ca_CA |
oaire.awardNumber | RTI2018-098651-B-C54 | ca_CA |