Project

The major contribution of the project consists of laying the scientific and technological foundations for wearable haptics, a novel concept for the systematic exploration of haptics in advanced cognitive systems and robotics that will enable novel forms of communication and cooperation between humans and robots. Within the project we will investigate several complementary approaches able to interact with different parts of the human body through the sense of touch. The WEARHAP system is wearable, and its architectural concept is sketched in the figure, which shows devices interacting with the fingerpads, the elbow and the skin. The complexity of the whole wearable system is not fixed a priori: the inherently modular nature of the proposed solutions makes it possible to customize the WEARHAP system according to the given application.

[Figure: concept sketch of the WEARHAP wearable system, with devices interacting with the fingerpads, the elbow and the skin of the arm.]
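As a purely illustrative sketch of this modularity (the module names and fields below are hypothetical, not project specifications), an application-specific WEARHAP configuration could be described as a selection of interchangeable device modules:

from dataclasses import dataclass, field

@dataclass
class HapticModule:
    """One wearable device module (illustrative fields only)."""
    body_site: str   # e.g. "fingerpad", "elbow", "forearm skin"
    feedback: str    # e.g. "cutaneous", "vibrotactile"
    dofs: int        # actuated degrees of freedom of the module

@dataclass
class WearhapConfig:
    """A hypothetical application-specific selection of modules."""
    application: str
    modules: list[HapticModule] = field(default_factory=list)

# Example: a minimal configuration for a virtual-grasping scenario.
grasp_demo = WearhapConfig(
    application="virtual grasping",
    modules=[
        HapticModule("fingerpad (thumb)", "cutaneous", 3),
        HapticModule("fingerpad (index)", "cutaneous", 3),
        HapticModule("elbow", "vibrotactile", 1),
    ],
)
print(sum(m.dofs for m in grasp_demo.modules), "actuated DoFs in total")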

  1. Strengthening the neuroscientific, behavioral and physiological foundations for wearable haptic systems, and providing guidelines for the design of their future generations;
  2. Development of computational models of human touch, with special emphasis on two aspects: understanding the mechanical variables and low-level cognitive processes involved in tactile perception, and computing the force and deformation cues to be rendered to the user of a wearable haptic device, as part of a model-based cognitive control strategy (a rendering sketch follows this list);
  3. Advancement of the state of the art in vision-based 3D tracking of the arm and of the articulated motion of the hands, integrating visual and non-visual sensors and addressing the related sensory and computational challenges, so that multi-sensory hand tracking and force sensing become portable and wearable (a sensor-fusion sketch follows this list);
  4. Development of new wearable active systems exploiting the partial substitution of kinesthetic feedback with cutaneous feedback, and integrating complex multi-DoF systems. The number of DoFs is related to the number of contact points and to the contact model considered (approximately 1 to 3 DoFs per contact). Suitable projection-based algorithms are considered in case of underactuation and/or undersensing in the supporting kinematic structure (a projection sketch follows this list);
  5. Improvement of human intention and action recognition by observing human interaction forces remotely, in addition to visual observation of human motion and context knowledge, advancing robot-assisted guidance and/or cutaneous interaction with the environment;
  6. Development of novel frameworks for human-robot cooperation in different contexts, such as the acquisition of dynamic object models through cooperation with a WEARHAP-equipped human partner, and human-robot cooperative manipulation (an estimation sketch follows this list);
  7. Definition of new protocols for affective haptics in clinical scenarios, and integration of wearable haptic technology in entertainment applications.
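As a minimal, hypothetical illustration of the force-cue computation mentioned in objective 2, the snippet below renders a contact force with a simple penalty-based (spring-damper) contact model; the stiffness and damping values are placeholders, not project parameters.

import numpy as np

def contact_force_cue(fingertip_pos, surface_point, surface_normal,
                      fingertip_vel, stiffness=500.0, damping=5.0):
    """Penalty-based contact model: force cue from penetration depth and velocity.

    All vectors are 3D numpy arrays; stiffness [N/m] and damping [N s/m]
    are illustrative values only.
    """
    n = surface_normal / np.linalg.norm(surface_normal)
    penetration = np.dot(surface_point - fingertip_pos, n)  # > 0 when inside the surface
    if penetration <= 0.0:
        return np.zeros(3)                                   # no contact, no cue
    normal_vel = np.dot(fingertip_vel, n)
    magnitude = stiffness * penetration - damping * normal_vel
    return max(magnitude, 0.0) * n                           # push the finger out of the surface

# Example: fingertip 2 mm below a horizontal surface, still moving downward.
f = contact_force_cue(np.array([0.0, 0.0, -0.002]), np.zeros(3),
                      np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, -0.01]))
print(f)  # roughly [0, 0, 1.05] N along the surface normal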
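Objective 3 concerns integrating visual and non-visual sensors for hand tracking. A simple way to combine two position estimates is a complementary filter; the sketch below blends a drift-free but low-rate camera measurement with a smooth inertial prediction. The blending gain and data are arbitrary illustrative choices, not the project's actual fusion algorithm.

import numpy as np

def complementary_fusion(prev_estimate, imu_velocity, camera_position, dt, alpha=0.98):
    """Blend an inertial prediction with a vision measurement.

    prev_estimate, camera_position: 3D positions; imu_velocity: 3D velocity.
    alpha close to 1 trusts the smooth inertial prediction between camera frames.
    """
    inertial_prediction = prev_estimate + imu_velocity * dt
    return alpha * inertial_prediction + (1.0 - alpha) * camera_position

# Example: track a fingertip over a few fused steps.
estimate = np.array([0.10, 0.00, 0.30])
for _ in range(5):
    estimate = complementary_fusion(estimate,
                                    imu_velocity=np.array([0.05, 0.0, 0.0]),
                                    camera_position=np.array([0.11, 0.0, 0.30]),
                                    dt=0.01)
print(estimate)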
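Objective 4 mentions projection-based algorithms for underactuated devices. One standard way to realize this idea, shown here as an assumption-laden sketch, is to project the desired 3-DoF contact force onto the subspace spanned by the device's actuated directions via a least-squares pseudoinverse; the actuation matrix below is purely illustrative.

import numpy as np

def project_onto_actuated_subspace(desired_force, actuation_matrix):
    """Least-squares projection of a desired contact force onto what the
    underactuated device can actually render.

    actuation_matrix A maps actuator commands u to rendered force f = A @ u
    (columns = force directions of the available actuators).
    """
    u, *_ = np.linalg.lstsq(actuation_matrix, desired_force, rcond=None)
    rendered = actuation_matrix @ u          # best achievable approximation
    return u, rendered

# Example: a hypothetical fingertip device with only 2 actuated directions
# (normal indentation and one tangential shear) asked to render a 3-DoF force.
A = np.array([[0.0, 1.0],
              [0.0, 0.0],
              [1.0, 0.0]])                   # columns: normal (z), shear (x)
desired = np.array([0.5, 0.3, 1.0])          # the y-component cannot be rendered
commands, rendered = project_onto_actuated_subspace(desired, A)
print(commands)                              # actuator commands
print(rendered)                              # -> [0.5, 0.0, 1.0]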
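For the acquisition of dynamic object models mentioned in objective 6, a minimal example under strong simplifying assumptions is to estimate an object's mass by least squares from force and acceleration samples collected while the WEARHAP-equipped human and the robot move the object together; the data here are synthetic and the model ignores rotation and friction.

import numpy as np

def estimate_mass(forces_z, accels_z, g=9.81):
    """Fit m in f = m * (a + g) by least squares along the vertical axis.

    forces_z: measured net vertical forces applied to the object [N];
    accels_z: corresponding vertical accelerations of the object [m/s^2].
    """
    regressor = np.asarray(accels_z) + g      # (a + g) column
    forces = np.asarray(forces_z)
    return float(regressor @ forces / (regressor @ regressor))

# Synthetic cooperative-lifting data for a hypothetical 1.2 kg object, with noise.
rng = np.random.default_rng(0)
true_mass = 1.2
accels = rng.uniform(-1.0, 1.0, size=50)
forces = true_mass * (accels + 9.81) + rng.normal(0.0, 0.05, size=50)
print(round(estimate_mass(forces, accels), 3))   # close to 1.2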

Impact

The primary goal of this project is to develop a coherent set of scientifically well-founded theoretical tools, spanning neuroscience, physics and engineering, together with a methodology for designing the cognitive architectures underpinning the next generation of wearable haptic interfaces, able to remotely or virtually explore, grasp and manipulate the environment adaptively and robustly. Extracting the conceptual organization of haptic action and perception, and exploiting the relationships between skin mechanics and the human capabilities of learning and adaptation in manipulation and haptic perception, will lead to better-performing system architectures for artificial haptic rendering while fulfilling the wearability requirement. Wearability lets such systems be employed in a transparent and non-invasive way in classic virtual or remote manipulation tasks, such as sensorimotor rehabilitation or diagnostic applications. In terms of exploitation, the ambitious goal of this project is to open new perspectives in social interaction for education and entertainment, and above all in specific fields of health care in which robotics and haptics are still not fully exploited, e.g. non-invasive autonomous or remote guidance for blind people, or remote treatments for people in a vegetative state. The developed system will ultimately be used in new paradigms for human-robot cooperation, where the human can take advantage of the wearability of the device during the cooperative task.

The WEARHAP results may benefit medical equipment industries, companies producing wearable sensing devices (including gloves), clinicians, health care facilities and public institutions. Moreover, WEARHAP has a further high-impact direction in the digital art, entertainment and gaming industries: wearable devices enrich the interaction with 3D multimedia content by providing haptic feedback to the user in a comfortable way, available everywhere.