
A Study of Viewpoint and Feedback in Wearable Systems for Controlling a Robot Arm

  • Conference paper
Advances in Human Factors in Wearable Technologies and Game Design (AHFE 2017)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 608)


Abstract

This study examines the feasibility of an integrated motion and vibrotactile system for controlling a humanoid robotic arm through natural, intuitive movements rather than a complicated control schema. We integrate a head-mounted display into a system of arm-based motion sensors that control a humanoid robotic arm, to determine whether there is an observable difference between third-person and first-person perspectives in controlling the arm. We also examine vibration as a form of haptic feedback for relaying the limitations of the robot arm back to the user. In an experiment, all 30 participants completed both gross and fine motor control tasks without fail, indicating that this type of sensor-based control system is intuitive and easy to use. The majority of participants found the method of control intuitive, the first-person perspective beneficial, and the vibration feedback either inconsequential or confusing.
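The control loop the abstract describes — natural arm motion mapped onto the robot's joints, with vibration relaying the arm's physical limits back to the wearer — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the joint-limit values, function name, and return convention are all assumptions made for the example.

```python
# Illustrative sketch: map a wearable sensor's measured joint angle to a
# robot servo command, clamping to the servo's range. Vibration feedback
# fires exactly when the user's motion exceeds what the robot can reproduce.

SERVO_MIN_DEG = 0.0    # assumed joint limits of the robot elbow servo
SERVO_MAX_DEG = 135.0  # (hypothetical values, not from the paper)

def map_to_servo(sensor_angle_deg: float) -> tuple[float, bool]:
    """Return (servo command in degrees, whether to vibrate)."""
    clamped = min(max(sensor_angle_deg, SERVO_MIN_DEG), SERVO_MAX_DEG)
    vibrate = clamped != sensor_angle_deg  # the command hit a joint limit
    return clamped, vibrate

# The user bends their arm past the robot's limit: the command saturates
# at 135 degrees and the vibration motor is activated.
cmd, vibrate = map_to_servo(150.0)
print(cmd, vibrate)  # 135.0 True
```

In a scheme like this, the haptic channel carries exactly one bit per joint — "the robot cannot follow you" — which is consistent with participants in the study finding the vibration feedback easy to ignore or confusing when its cause was not obvious.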



Author information


Correspondence to Anthony Whitehead.


Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Kilby, C., Whitehead, A. (2018). A Study of Viewpoint and Feedback in Wearable Systems for Controlling a Robot Arm. In: Ahram, T., Falcão, C. (eds) Advances in Human Factors in Wearable Technologies and Game Design. AHFE 2017. Advances in Intelligent Systems and Computing, vol 608. Springer, Cham. https://doi.org/10.1007/978-3-319-60639-2_14


  • DOI: https://doi.org/10.1007/978-3-319-60639-2_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60638-5

  • Online ISBN: 978-3-319-60639-2

  • eBook Packages: Engineering, Engineering (R0)
