Show simple item record

dc.contributor.author: Szczurek, Krzysztof Adam
dc.contributor.author: Marin, Raul
dc.contributor.author: Matheson, Eloise
dc.contributor.author: Perier, Hugo
dc.contributor.author: Buonocore, Luca Rosario
dc.contributor.author: Di Castro, Mario
dc.date.accessioned: 2022-11-28T14:41:03Z
dc.date.available: 2022-11-28T14:41:03Z
dc.date.issued: 2021
dc.identifier.citation: Szczurek, K.; Prades, R.; Matheson, E.; Perier, H.; Buonocore, L. and Di Castro, M. (2021). From 2D to 3D Mixed Reality Human-Robot Interface in Hazardous Robotic Interventions with the Use of Redundant Mobile Manipulator. In Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics - ICINCO, ISBN 978-989-758-522-7; ISSN 2184-2809, pages 388-395. DOI: 10.5220/0010528503880395
dc.identifier.isbn: 9789897585227
dc.identifier.issn: 2184-2809
dc.identifier.uri: http://hdl.handle.net/10234/200952
dc.description: Part of the conference: ICINCO 2021: 18th International Conference on Informatics in Control, Automation and Robotics (July 2021)
dc.description.abstract: 3D Mixed Reality (MR) Human-Robot Interfaces (HRI) show promise for robotic operators to complete tasks more quickly, safely and with less training. The objective of this study is to assess the use of a 3D MR HRI environment, in comparison with a standard 2D Graphical User Interface (GUI), for controlling a redundant mobile manipulator. The experimental data was taken during operation with a 9 DOF manipulator mounted on a robotized train, the CERN Train Inspection Monorail (TIM), used for the Beam Loss Monitor robotic measurement task in a complex hazardous intervention scenario at CERN. The efficiency and workload of an operator were compared across both types of interface using the NASA TLX method. The use of heart rate and Galvanic Skin Response parameters for operator condition and stress monitoring was also tested. The results show that teleoperation with the 3D MR HRI mitigates cognitive fatigue and stress by improving the operator's understanding of both the robot's pose and the surrounding environment or scene.
dc.format.extent: 8 p.
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: SCITEPRESS – Science and Technology Publications, Lda.
dc.relation.isPartOf: Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics (ICINCO 2021)
dc.rights: © 2021 SciTePress, Science and Technology Publications, Lda - All rights reserved.
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Human-Robot Interface
dc.subject: robotics
dc.subject: teleoperation
dc.subject: virtual reality
dc.subject: mixed reality
dc.subject: operator workload
dc.subject: galvanic skin response
dc.title: From 2D to 3D Mixed Reality Human-Robot Interface in Hazardous Robotic Interventions with the Use of Redundant Mobile Manipulator
dc.type: info:eu-repo/semantics/conferenceObject
dc.identifier.doi: 10.5220/0010528503880395
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.relation.publisherVersion: https://www.scitepress.org/Link.aspx?doi=10.5220/0010528503880395
dc.type.version: info:eu-repo/semantics/publishedVersion


