An integrated approach for user-adaptive robotic grasping
This thesis presents an integrated approach for transferring grasping skills from a user to a robot arm using spoken commands. To date, grasping research has focused largely on mathematical models for choosing optimal grasp configurations. However, selecting a successful grasp with this approach is often difficult. Integrating human-robot interaction into the grasping process provides an alternative that enables robots to learn how users prefer to grasp and manipulate objects according to their own needs and preferences. In this case, the robot not only learns how to grasp from interactions with the user, but also adapts to the user's expectations during grasping sessions. Such an approach is needed to ensure the success of robots operating in human-friendly environments such as homes and offices. The methodology used in this thesis involved conducting two usability studies to observe user behavior, and building a learning system to predict user actions during a grasping task. The first usability study focuses on the type of natural language produced by 15 non-expert users during robotic grasping tasks, and also examines whether the backgrounds of these 15 users play a part in the grasping process. Grasping commands recorded during this first study are used by a learning system to predict users' intentions, and to explore whether knowledge gathered from earlier grasping experiences plays a role in subsequent grasping operations. Finally, a second usability study involving 8 users is presented. The focus of this second study is the accuracy of the learning system in predicting users' intentions, assessed by examining each user's performance. This thesis revealed several important findings.
Firstly, using a limited set of commands, users were able to direct a simple robotic arm equipped with a basic gripper to grasp five small, relatively difficult-to-grasp everyday objects. Users were also capable of developing their own sets of natural language instructions for directing the robot arm during a grasping operation. In addition, the user's background does appear to play a role in the grasping process, with less technically experienced users requiring more time and more commands to complete a grasping task. However, the user's background does not appear to affect their ability to successfully grasp the objects presented to them. Finally, the results suggest that users build on previous grasping experiences during subsequent grasping trials, and welcome a more proactive robot capable of adapting to their expectations during a grasping task.