Abstract
This study examines the feasibility of an integrated motion and vibrotactile system for controlling a humanoid robotic arm using natural, intuitive movements rather than a complicated control schema. We examine a head-mounted display integrated with a system of arm-worn motion sensors that control a humanoid robotic arm, to determine whether there is an observable difference between third-person and first-person perspectives when controlling the arm. We use vibration as a form of haptic feedback to relay the limitations of the robot arm back to the user. In an experiment, all 30 participants completed both gross and fine motor control tasks without failure, indicating that this type of sensor-based control system is intuitive and easy to use. The majority of participants found the method of control intuitive, the first-person perspective beneficial, and the vibration feedback either inconsequential or confusing.
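The abstract describes relaying the robot arm's limitations to the wearer through vibration. A minimal sketch of that idea, not taken from the paper itself: the function name and joint-limit values below are invented for illustration. A sensed arm angle is clamped to an assumed servo range, and the vibrotactile cue fires whenever the wearer's motion exceeds what the robot can reproduce.

```python
# Hypothetical sketch of the feedback loop described in the abstract:
# a wearable sensor angle is mapped to a servo command, and vibration
# fires when the robot's joint limit clips the wearer's motion.

SERVO_MIN_DEG = 0.0    # assumed joint limits for an elbow servo
SERVO_MAX_DEG = 135.0  # (illustrative values, not from the paper)

def map_arm_to_servo(sensed_angle_deg):
    """Clamp the wearer's elbow angle to the servo's range.

    Returns (command_deg, vibrate): vibrate is True when the motion
    exceeds what the robot arm can reproduce, which is the condition
    the vibration motors would relay back to the user.
    """
    command = min(max(sensed_angle_deg, SERVO_MIN_DEG), SERVO_MAX_DEG)
    vibrate = command != sensed_angle_deg
    return command, vibrate

# Within range: the command tracks the arm and no feedback fires.
print(map_arm_to_servo(90.0))   # -> (90.0, False)
# Beyond the robot's limit: the command is clamped and feedback fires.
print(map_arm_to_servo(170.0))  # -> (135.0, True)
```

In a real system the clamp would run per joint inside the control loop, with the boolean driving a vibration motor on the corresponding part of the wearer's arm.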
Copyright information
© 2018 Springer International Publishing AG
Cite this paper
Kilby, C., Whitehead, A. (2018). A Study of Viewpoint and Feedback in Wearable Systems for Controlling a Robot Arm. In: Ahram, T., Falcão, C. (eds) Advances in Human Factors in Wearable Technologies and Game Design. AHFE 2017. Advances in Intelligent Systems and Computing, vol 608. Springer, Cham. https://doi.org/10.1007/978-3-319-60639-2_14
Print ISBN: 978-3-319-60638-5
Online ISBN: 978-3-319-60639-2