Robot depth estimation inspired by fixational movements
Title
Robot depth estimation inspired by fixational movements
Publication date
2020-09-18
Publisher
Institute of Electrical and Electronics Engineers
ISSN
2379-8920
Bibliographic citation
DURAN, Angel J.; DEL POBIL, Angel P. Robot depth estimation inspired by fixational movements. IEEE Transactions on Cognitive and Developmental Systems, 2020.
Document type
info:eu-repo/semantics/article
Publisher's version
https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7274989
Version
info:eu-repo/semantics/acceptedVersion
Keywords / Subjects
Abstract
Distance estimation is a challenge for robots, human beings and other animals in their adaptation to changing environments. Different approaches have been proposed to tackle this problem based on classical vision algorithms or, more recently, deep learning. We present a novel approach inspired by mechanisms involved in fixational movements to estimate a depth image with a monocular camera. An algorithm based on microsaccades and head movements during visual fixation is presented. It combines the images generated by these micromovements with the ego-motion signal to compute the depth map. Systematic experiments using the Baxter robot in the Gazebo/ROS simulator are described to test the approach in two different scenarios, and to evaluate the influence of its parameters and its robustness in the presence of noise.
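The abstract's core idea, recovering depth from the tiny image shifts produced by known micromovements, rests on motion parallax: a small lateral camera displacement moves the projection of a nearby point more than that of a distant one. The sketch below illustrates only that geometric principle, not the authors' actual algorithm; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def depth_from_parallax(disparity_px, baseline_m, focal_px):
    """Illustrative motion-parallax depth, NOT the paper's method.

    A lateral camera shift of `baseline_m` metres displaces the image of a
    point at depth Z by d = f * b / Z pixels (pinhole model, focal length
    f in pixels), so Z = f * b / d.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        # Zero disparity (points at infinity) maps to an infinite depth.
        return focal_px * baseline_m / disparity_px

# Example: with a 500 px focal length and a 1 cm shift, a point 2 m away
# shifts by 500 * 0.01 / 2 = 2.5 px, so a 2.5 px disparity implies 2 m.
depth = depth_from_parallax(2.5, baseline_m=0.01, focal_px=500.0)
```

In practice the paper combines many such micromovement-induced image pairs with the robot's ego-motion signal, which is what makes the per-pixel displacement measurable despite its sub-pixel scale.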
Research project
Ministerio de Economia y Competitividad/DPI2015-69041-R, UJI-B2018-74
Access rights
http://rightsstatements.org/vocab/CNE/1.0/
info:eu-repo/semantics/openAccess
Appears in collections
- ICC_Articles [419]