In the third year of the project, the Consortium continued its study of skin mechanics, focusing in particular on skin shearing and on its interaction with different surface materials, so that stereotypical references could be established and defined. It was also found that movement information available from cutaneous input in one hand binds with movement information available from the other three limbs. An additional finding of the Consortium concerns visuo-tactile binding in spatiotopic coordinates: in contrast with the binding phenomenon occurring between somatic sources of movement information, visuo-tactile binding disregards metric movement information such as speed or magnitude. Regarding the mechanical properties of the human finger, the Consortium has worked on the development of a novel constrained dynamics solver designed to perform well with highly nonlinear skin deformation constraints. This algorithm is 10x faster than previous constrained optimization solvers, which are typically designed for contact mechanics problems. Moreover, the Consortium has worked on the development of innovative test rig devices, presenting for the first time a contact-less characterization strategy that simultaneously records force, contact area and pressure distribution on the finger pad. A further development on this topic is the human fingertip FE model, which was used to investigate the role of tactile flow in the psychophysiology of touch. During the third year, the Consortium also focused part of its attention on improving the performance of the hand tracker, considering robustness and accuracy mandatory requirements for an additional public release, which took place in November 2015. In this context, a novel divide-and-conquer approach for tracking hand articulations from an egocentric viewpoint, facilitating the manipulation of virtual objects, has been presented.
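The constraint-based simulation approach mentioned above for skin deformation can be illustrated with a generic position-based dynamics loop that predicts particle positions and then iteratively projects constraints. This is a minimal sketch, not the Consortium's solver; the simple distance constraint stands in for the far more complex nonlinear skin constraints:

```python
import numpy as np

def project_distance(p, i, j, rest, stiffness=1.0):
    """Project one distance constraint C = |p_i - p_j| - rest onto positions p."""
    d = p[i] - p[j]
    dist = np.linalg.norm(d)
    if dist < 1e-12:
        return
    corr = stiffness * (dist - rest) * d / dist
    p[i] -= 0.5 * corr          # split the correction between both particles
    p[j] += 0.5 * corr

def step(p, v, constraints, dt=1e-2, iters=20,
         gravity=np.array([0.0, 0.0, -9.81])):
    """One time step: explicit prediction followed by iterative constraint projection."""
    p_pred = p + dt * v + dt * dt * gravity
    for _ in range(iters):                  # Gauss-Seidel-style sweeps
        for (i, j, rest) in constraints:
            project_distance(p_pred, i, j, rest)
    v[:] = (p_pred - p) / dt                # velocities from position change
    p[:] = p_pred
    return p, v
```

Production solvers for skin (including the one described in the report) handle far stiffer, nonlinear constraints than this toy loop, but the predict-then-project structure is the common skeleton.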
A novel method to estimate fingertip contact forces when grasping deformable objects with known shape and stiffness matrix was developed, using a sensing glove instrumented with inertial and magnetic sensors. Many intra-Consortium collaborations have enabled the exploitation of the hand tracker in many other fields of research. Particular examples of such processes are the integration of a-priori synergistic information on how humans commonly shape their hands, used to improve and complete reconstruction via FORTH techniques; the fusion of available proprioception-based and vision-based pose estimates for robotic hand tracking; and the adaptation of the hand tracking framework to human hands with attached devices, such as a robotic hand. Regarding the hardware, the third year has been a year of consolidation, since the designs and improvements of the preceding two years needed to be concluded and finalised. Within the WEARHAP project, the Consortium has produced a variety of wearable devices able to render surface orientation, contact/no-contact transitions, high-frequency tactile cues, skin stretch and softness at the fingertip, and tangential and normal skin stretch at the arm. Special attention has been devoted to the development of a device for the proximal finger phalanx, designed to improve wearability. Moreover, single-joint variable damping actuators have been finalized, together with a wearable elbow exoskeleton. Some of the presented devices were also applied to virtual and tele-operated scenarios, involving manipulation and surface exploration, and to the operation of a robotic hand prosthesis with cutaneous feedback of the exerted grasping force. Regarding sensing devices, the third version of the tactile-sensitive data glove has been developed.
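The glove-based estimation of fingertip contact forces on an object with known stiffness matrix, described at the beginning of this section, reduces in its simplest linear-elastic form to F = K·δ, with δ the stacked fingertip displacements on the object surface as measured by the glove. A minimal illustration; the function name and the purely linear model are assumptions, not the published method:

```python
import numpy as np

def estimate_contact_forces(K, rest_points, contact_points):
    """Estimate fingertip contact forces from the deformation of a grasped
    object with known stiffness matrix K (linear-elastic illustration).

    rest_points / contact_points: (n_fingers, 3) fingertip positions on the
    undeformed and deformed object surface, e.g. from a glove pose estimate.
    K: (3*n_fingers, 3*n_fingers) object stiffness matrix.
    """
    delta = (contact_points - rest_points).reshape(-1)   # stacked displacements
    f = K @ delta                                        # F = K * delta
    return f.reshape(-1, 3)                              # one 3D force per finger
```

With a diagonal stiffness of 100 N/m and a 1 cm normal indentation per finger, this yields 1 N per fingertip, which matches the expected spring behaviour.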
It has been integrated with the posture sensing glove, based on IMUs and flex sensors, and its fabric-based sensor technology was applied to cover the palm of the Shadow robot hand, allowing for additional sensing capabilities during manipulation. During the third year of the project, the cognitive control framework, which represents the connecting layer between the hardware WPs and the application-and-demonstrator WPs, has advanced considerably. The Consortium has worked toward its improvement on many sides. Physical knowledge has been incorporated into the framework for learning, recognition and prediction, and multi-sensory input data has been decomposed into displacement, squeeze and free-space motion. Building on this, a modelling approach with model training in high-dimensional space has been developed. Guidance through wearable haptic devices received strong attention from the Consortium during the period described in this report. The Consortium has been able to successfully: guide a human by providing trajectory information towards a final desired location in a large environment; provide navigation cues through lateral skin-stretch force feedback; and guide the wrist, and thus the hand, to a desired position and make it follow a predefined path/trajectory. Particular attention has been given to people with visual impairments: the Consortium has developed an assistive haptic technology for guiding blind skiers (as shown in the following figure).
Possible kinematic asymmetries when the WEARHAP system is used for bilateral telemanipulation have been investigated, and a novel approach able to deal with master and slave devices having different structures has been defined. Moreover, a novel bilateral algorithm has been developed that enables remote control of the Pisa/IIT SoftHand (SH) using, at the master station, the HEXOTRACK hand exoskeleton developed in WP4. In addition, as a last contribution from the Consortium to the improvement of the cognitive control framework, two novel shared control strategies, based on the formation and teleoperation concepts, were developed as part of the architecture for cognitive adjustable team control. They enable human-robot team interaction and are suitable for incorporating the WEARHAP devices in the loop. Many developments have been brought forward by the Consortium concerning human-robot interaction applications and the social and healthcare scenarios. Strategies have been investigated for estimating object kinematics as seen through the human grasp pose, i.e. the displacement and orientation of the human hand with respect to the robot end-effector, and for estimating object dynamics parameters such as the object's mass, center of mass and inertial parameters. Guidance mechanisms that suggest user motions exciting trajectories which help the convergence of these algorithms have also been under study. Shared formation control and shared-control teleoperation in cooperative manipulation task scenarios have been investigated, and their realization has started. During this study period, all the robotic platforms made available by TUM have been constantly maintained and updated. Moreover, their number increased during the third year of the project with the addition of one new bimanual manipulation platform with two robotic arms and one with two KUKA LBR4+ manipulators (as shown in the following figure).
Moreover, the Consortium implemented a novel and computationally efficient model of the arm endpoint stiffness behaviour. Real-time tracking of the human arm kinematics is achieved using an arm triangle monitored by three markers placed at the shoulder, elbow and wrist. In preparation for one of the final demonstrators of the project, efforts have focused on defining the foundation of a teleoperation framework that will serve as the main platform for the demonstrator scenarios. Concerning the social and healthcare scenario, the Consortium tested an affective touch display for conveying caress-like stimuli on the forearm of healthy subjects, with the aim of understanding which fabric, and which force and velocity levels, best stimulate the forearm. Once it was established that silk, with a low caress velocity, was perceived as the most positive fabric, these tests were extended to vegetative-state and minimally-conscious-state patients. A low force level, together with a high velocity level, produced the best patient responses to the stimuli.
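The arm-triangle tracking described above uses three markers at the shoulder, elbow and wrist; from these, the plane of the arm and the included elbow angle follow directly. A minimal sketch of that geometry (variable names are illustrative, and the actual stiffness model in the report builds on more than these two quantities):

```python
import numpy as np

def arm_triangle(shoulder, elbow, wrist):
    """Basic kinematic quantities from the shoulder-elbow-wrist marker triangle.

    Returns the unit normal of the arm plane and the included elbow angle
    (pi for a fully extended arm, pi/2 for a right-angle bend).
    """
    upper = elbow - shoulder                  # upper-arm vector
    fore = wrist - elbow                      # forearm vector
    n = np.cross(upper, fore)
    n /= np.linalg.norm(n)                    # normal of the arm plane
    cos_e = np.dot(-upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    elbow_angle = np.arccos(np.clip(cos_e, -1.0, 1.0))
    return n, elbow_angle
```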
In the second year of the project, the Consortium mainly focused on developing the preliminary ideas and models studied throughout the first year of work. Final guidelines for the development of the WEARHAP system have been defined. Haptic simulation has been investigated from a neuroscience and physiology point of view, stressing how the somatosensory system, and the neural architecture in general, can be exploited to design and build interaction devices that take the capabilities of real human beings into account.
From the presented experimental results, human abilities appear far from optimal, resulting in perceptual biases. Since posture recognition of a human hand and arm constitutes a very rich perceptual input that can be useful in wearable haptics research and application scenarios, 3D tracking of the articulated motion of hands has been studied in detail during the second year. Several technologies have been investigated, together with the integration of visual and non-visual sensors, to enhance the overall accuracy of hand tracking. Novel sampling procedures have been developed to improve the computational performance of the hand tracking methodology, bearing in mind that it will run on portable, and not extremely powerful, systems. For instance, postural synergies among fingers have been proposed to reduce the parameter space of the hand posture search. Significant advancements in the state of the art of vision-based 3D tracking of the arm and of the articulated motion of hands, in isolation (interaction with virtual objects) or in interaction with physical objects, have been obtained (Figure 2). An Ensemble of Collaborative Trackers (ECT) is used to overcome multi-object 3D tracking problems and thus track human hands interacting with several objects in complex scenes. Moreover, the Consortium started to explore the extension of this algorithm to estimating human upper-body 3D pose, with no restrictive assumptions on biometric characteristics or on the type of performed motions.
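The use of postural synergies to reduce the parameter space of the hand-posture search can be sketched with a standard PCA over recorded joint-angle vectors: the tracker then optimizes over a few synergy coordinates instead of all joint angles. This is an illustrative reduction, not the Consortium's exact formulation:

```python
import numpy as np

def learn_synergies(poses, k):
    """PCA over recorded hand poses (n_samples, n_joints): returns the mean
    pose and the first k synergy directions (rows of shape (k, n_joints))."""
    mean = poses.mean(axis=0)
    _, _, Vt = np.linalg.svd(poses - mean, full_matrices=False)
    return mean, Vt[:k]

def to_synergy_space(pose, mean, synergies):
    """Project a full joint-angle vector onto the low-dimensional synergies."""
    return synergies @ (pose - mean)

def from_synergy_space(coords, mean, synergies):
    """Reconstruct a full joint-angle vector from synergy coordinates."""
    return mean + synergies.T @ coords
```

A search over, say, 2-5 synergy coordinates instead of 20+ joint angles shrinks the posture space dramatically, at the cost of only representing poses near the span of the learned synergies.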
During the first year of the project, novel fingertip devices, based on different concepts and solutions, were proposed. The second year of the project has witnessed significant improvements of the prototypes, with the aim of ensuring versatility in providing haptic feedback and of meeting the requirements of different scenarios (Figure 3). Several actuation concepts have been proposed, such as forearm skin stimuli, softness displays, vibrotactile bracelets, and fingertip haptic interfaces. Moreover, a wearable underactuated hand exoskeleton has been proposed, with a design that allows integration with the fingertip devices, moving towards the integration required by the project. For measuring hand poses during human manipulation tasks through dedicated data gloves, two solutions have been investigated during the reference period: a fabric-based strain sensor and a sensing glove based on Inertial Measurement Unit (IMU) and Magnetic Angular Rate and Gravitational (MARG) sensors. We have also developed a tactile glove to measure normal contact forces when a human is interacting with the environment. The particular mechanical properties of the sensors guarantee limited obstruction by the sensing glove and a high degree of wearability and flexibility. Preliminary user studies measuring interaction forces during hand-shaking and caressing motions have already been carried out.
Results from WP1-5 allowed the Consortium to define and describe feasible demonstrator scenarios, given the up-to-date state of knowledge about human sensing and actuation capabilities as well as the specifications of the available hardware developed within WEARHAP. Such scenarios represent the possible application domains for WEARHAP-enhanced human-robot interaction:
• collaborative search and rescue (SAR) missions in the context of disaster relief;
• outdoor navigation for visually impaired people;
• human-robot interaction enhancement in manufacturing context.
For each application, technical requirements and human performance assessments were defined, taking into account the type of human-robot interaction planned. Finally, demonstrators related to the defined applications were detailed, including final application expectations. In the context of the social and health care scenarios envisaged in WEARHAP, the Consortium has improved the concept of affective caress-like stimuli provided by haptic devices developed within the project. Experimental results on the psychological characterization of this type of stimuli and on subjects' perceived stress were investigated. Results showed that the psychological state of the subjects changed significantly after the tactile stimulation, for both PANAS and STAI scores, resulting in a reduction of stress and anxiety due to the caress-like haptic stimuli.
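Pre- versus post-stimulation questionnaire scores such as PANAS and STAI are commonly compared with a paired t-test. A minimal sketch with hypothetical scores; the report does not specify the actual analysis pipeline:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t statistic for pre- vs post-stimulation questionnaire scores.

    A negative t (post < pre) indicates a reduction, e.g. lower STAI anxiety
    after the caress-like stimulation. Returns (t, degrees of freedom).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.fmean(diffs)
    sd_d = statistics.stdev(diffs)             # sample standard deviation
    return mean_d / (sd_d / math.sqrt(n)), n - 1
```

The resulting t statistic would then be compared against the Student-t distribution with n-1 degrees of freedom to obtain the significance level reported in the study.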
Moreover, the consortium has investigated the characterization of human hand pressure distribution while the arm is caressed or squeezed (Figure 4). Finally, a preliminary setup for the integration of wearable haptic technology in mixed reality applications was implemented. This setup will act as a basis for intuitive visuo-haptic interaction without workspace restrictions.
Regarding dissemination activities, the WEARHAP consortium put great effort into communicating its most innovative project outcomes to the widest possible audience through its website and social media channels, press releases in national and local newspapers and magazines, and TV and radio interviews. Moreover, gadgets with the project logo have been distributed during several exhibitions and social events.
In the first year of the project, the consortium focused principally on defining models and guidelines for the development of the WEARHAP system.
We started from neuroscience and physiology by critically reviewing basic facts regarding the somatosensory system: its anatomy, mechanics, tribology and neural architecture. We started to explore human perception and action using a novel haptic workbench that includes an intelligent object able to record the interaction forces of all five fingers placed in an unconstrained way on the object, while force perturbations can be exerted at the same time. We also started to investigate human integration mechanisms by recording the natural statistics of free interaction with the natural environment. These results will allow us to refine our models of human multisensory integration and will eventually help us design and construct interaction devices that are best adapted to human capabilities.
Concerning modelling, the main results obtained during the first year include the development of an advanced and efficient computational model of human fingertip deformation. This model captures the nonlinear deformations and forces of the finger skin upon contact, which will be used in future developments to command wearable haptic devices. The skin model describes efficiently the complex nonlinear behavior of skin mechanics, and eliminates the limitations of linear materials, while avoiding the computational cost of traditional hyper-elastic models. The current constraint-based skin model is interactive for models of moderate size, and allows haptic rendering of interactions with the finger, as shown in Fig. 1.
During this first year, we also improved the computational performance and scalability of the hand tracking methodology since we are interested in scenarios involving hand-object interaction, eventually running in portable systems. Additionally, we focused on input from moving cameras with application in wearable sensing systems.
The requirements for both prototypes of the WEARHAP system have been defined according to feasibility studies on the technologies to be developed and the demands of the envisaged applications. Novel fingertip devices, based on different concepts and solutions to display haptic interaction with bare fingers, have been proposed (Fig. 2). New actuation concepts based on variable damping, able to enhance the stability of force feedback in haptic interaction, and new devices to display distributed feedback to the user have also been proposed. The latter comprise vibrotactile devices for the wrist and torso, a skin strain interface for the elbow, and hand exoskeletons.
The consortium also developed techniques for human-robot cooperation via vibro-tactile feedback. In particular, a novel use of haptic feedback for human guidance has been proposed.
As far as augmented human-robot interaction is concerned, teleimpedance and intrinsically passive haptic feedback for interfacing and controlling the Pisa/IIT SoftHand (Fig. 3) have been investigated. Two tactile interfaces have been developed. The first estimates the interaction forces developed between the SoftHand and the grasped object and feeds them back to the user's upper arm via a mechanotactile haptic interface. The second employs vibrotactile feedback in the form of a bracelet, also worn on the upper arm.
Finally, in the context of the social and healthcare scenarios envisaged in the project, a novel wearable haptic system has been preliminarily designed and a first prototype realized; it is based on an elastic fabric moved forward and backward over the subject's forearm by motors, thus simulating a human caress. The system allows controlling both the velocity of the "caress-like" movement, by regulating motor velocity, and the "strength" of the caress, by regulating motor positions and hence the pressure of the partially wrapped fabric on the forearm. Along with the realization of the device, a suitable experimental protocol for a psycho-physiological assessment has been envisaged. This assessment is intended as a characterization of the device's capability to actually elicit emotional states in humans using different combinations of velocity and caress strength.
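The two control knobs of the caress device, motor velocity for caress speed and motor position for caress strength, suggest a simple command mapping. The sketch below is purely illustrative: the pulley radius, position limits and data structure are assumptions, not the actual device interface:

```python
from dataclasses import dataclass

@dataclass
class CaressCommand:
    """Motor set-points for one caress stroke (illustrative parameterization)."""
    motor_velocity: float   # rad/s -> caress speed over the forearm
    motor_position: float   # rad   -> fabric wrap, i.e. caress strength

def caress_setpoints(velocity_mm_s, strength,
                     r_pulley_mm=10.0, pos_min=0.0, pos_max=1.5):
    """Map a desired caress velocity (mm/s) and a normalized strength in [0, 1]
    to motor set-points. Pulley radius and position limits are assumed values."""
    omega = velocity_mm_s / r_pulley_mm                       # v = omega * r
    s = max(0.0, min(1.0, strength))                          # clamp strength
    pos = pos_min + s * (pos_max - pos_min)                   # tighter wrap = stronger caress
    return CaressCommand(motor_velocity=omega, motor_position=pos)
```

Sweeping such commands over the velocity/strength grid would reproduce the stimulus combinations that the psycho-physiological protocol is designed to assess.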