Robot depth estimation inspired by fixational movements
Metadata
Title
Robot depth estimation inspired by fixational movements
Publication date
2020-09-18
Publisher
Institute of Electrical and Electronics Engineers
ISSN
2379-8920
Bibliographic citation
DURAN, Angel J.; DEL POBIL, Angel P. Robot depth estimation inspired by fixational movements. IEEE Transactions on Cognitive and Developmental Systems, 2020.
Document type
info:eu-repo/semantics/article
Publisher's version
https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7274989
Version
info:eu-repo/semantics/acceptedVersion
Keywords / Subjects
Abstract
Distance estimation is a challenge for robots, human beings, and other animals in their adaptation to changing environments. Different approaches have been proposed to tackle this problem based on classical vision algorithms or, more recently, deep learning. We present a novel approach inspired by mechanisms involved in fixational movements to estimate a depth image with a monocular camera. An algorithm based on microsaccades and head movements during visual fixation is presented. It combines the images generated by these micromovements with the ego-motion signal to compute the depth map. Systematic experiments using the Baxter robot in the Gazebo/ROS simulator are described to test the approach in two different scenarios, and to evaluate the influence of its parameters and its robustness in the presence of noise.
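The geometric core of the idea summarized above — recovering depth by pairing images taken during tiny known camera motions with the ego-motion signal — reduces, for an idealized pinhole camera and a purely lateral shift, to small-baseline triangulation. The sketch below is not the paper's algorithm; it is a minimal illustration under those assumptions, with all function names and numbers hypothetical.

```python
# Minimal sketch (NOT the authors' method): small-baseline triangulation.
# Assumes a pinhole camera, a known lateral micro-movement (the ego-motion
# signal) of baseline b metres, and a measured pixel disparity d between
# the two images. Then depth Z = f * b / d for focal length f in pixels.

def disparity_for_point(focal_px, baseline_m, depth_m):
    """Forward model: pixel shift a micro-movement induces for a point."""
    return focal_px * baseline_m / depth_m

def depth_from_micromovement(focal_px, baseline_m, disparity_px):
    """Triangulate depth (metres) from the known micro-movement baseline."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 600 px focal length, 2 mm microsaccade-like shift, point at 1.5 m.
f, b = 600.0, 0.002
d = disparity_for_point(f, b, 1.5)
z = depth_from_micromovement(f, b, d)
print(round(d, 3), round(z, 3))  # 0.8 1.5
```

Note how small the disparity is (well under a pixel per millimetre of motion at this range), which is why approaches of this kind aggregate many micro-movements rather than relying on a single image pair.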
Research project
Ministerio de Economia y Competitividad/DPI2015-69041-R, UJI-B2018-74
Access rights
http://rightsstatements.org/vocab/CNE/1.0/
info:eu-repo/semantics/openAccess
Appears in collections
- ICC_Articles [417]