Brief use of a tool recalibrates multisensory representations of the user's body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception on this stationary arm was recalibrated, with a magnitude equal to that produced by physical use. Recalibration was not found following illusory passive tool holding, and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest that visual tool-use signals play a critical role in driving tool embodiment.
Scientific Reports, 2021
A tool can function as a body part yet not feel like one: putting down a fork after dinner does not feel like losing a hand. However, studies show that fake body parts can be embodied and experienced as parts of oneself. Typically, embodiment illusions have only been reported when the fake body part visually resembles the real one. Here we reveal that participants can experience an illusion that a mechanical grabber, which looks scarcely like a hand, is part of their body. We found changes in three signatures of embodiment: the real hand's perceived location, the feeling that the grabber belonged to the body, and autonomic responses to visible threats to the grabber. These findings show that artificial objects can become embodied even though they bear little visual resemblance to the hand.

Our body is the means through which we interact with the external world. Little would a brain achieve without a body to execute its commands and collect information about the environment through the sensory channels. Yet our body is not just any kind of input/output machine that executes actions and provides feedback. We have a "very special regard for just one body", such that each of us seems to "think of it as unique and perhaps more important than any other" 1. We are not simply aware of one body; we are aware of it as being our own body (i.e., we have a sense of bodily ownership) 2. Throughout evolution, interactions with the environment have become more and more complex and mediated by objects that humans built and used to overcome the limitations of their bodies. Tools expand motor capabilities and allow actions that would otherwise be dangerous or impossible. There is now little doubt that tools can be incorporated: many of their properties are processed in the same way as the properties of one's limbs. But bodily ownership is that and more 6. It requires experiencing tools as constitutive parts of one's own body. Though we manipulate dozens of tools during the day, could we actually feel that a fork, a toothbrush or a screwdriver belongs to us in the same way our hands do? Here we investigate whether a tool can be processed as a body part not only at the spatial level (localization), but also at the physiological level (response to threats) and at the phenomenological level (feeling of ownership). For each measure, we assess the additional impact of motor experience with the tool. Previous studies show that even ten minutes of tool use can deeply modify the representations of both the body and the space around it 7-15. For example, when using a long grabber tool to retrieve objects, the arm representation is updated to reflect the functional elongation of the effector. Similarly, when using pliers, digit representations change to take into account the new morphology. Tool use also modifies the visual properties of peripersonal space, recoding far space as nearer and enhancing the defensive monitoring of such space 18. However, while these previous studies showed that tool use affects sensorimotor and spatial representations, they did not address whether it affects body ownership, that is, whether using a tool makes it feel more like a part of one's own body. Although we seem to have little doubt about the boundaries of our own body, it has been shown that it is relatively easy to induce the illusion of owning external fake body parts. This line of research originated with the seminal paper by Botvinick and Cohen describing what is now known as the Rubber Hand Illusion (RHI).
Cognition, 2002
In a visual-tactile interference paradigm, subjects judged whether tactile vibrations arose on a finger or thumb (upper vs. lower locations), while ignoring distant visual distractor lights that also appeared in upper or lower locations. Incongruent visual distractors (e.g. a lower light combined with upper touch) disrupt such tactile judgements, particularly when appearing near the tactile stimulus (e.g. on the same side of space as the stimulated hand). Here we show that actively wielding tools can change this pattern of crossmodal interference. When such tools were held in crossed positions (connecting the left hand to the right visual field, and vice versa), the spatial constraints on crossmodal interference reversed, so that visual distractors in the opposite visual field now disrupted tactile judgements most for a given hand. This phenomenon depended on active tool use, developing with increased experience in using the tool. We relate these results to recent physiological and neuropsychological findings.
PLOS ONE, 2019
The rubber hand illusion describes a phenomenon in which participants experience a rubber hand as being part of their body through the synchronous application of visuotactile stimulation to the real and the artificial limb. In the recently introduced robotic hand illusion (RobHI), a robotic hand is incorporated into one's body representation due to the integration of synchronous visuomotor information. However, no setups so far combine visuotactile and visuomotor feedback, which is expected to unravel mechanisms that cannot be detected in experimental designs applying this information in isolation. We developed a robotic hand, controlled by a sensor glove and equipped with pressure sensors, and systematically and separately varied the synchrony of motor feedback (MF) and tactile feedback (TF). In Experiment 1, we implemented a ball-grasping task and assessed the perceived proprioceptive drift of one's own hand as a behavioral measure of the spatial calibration of body coordinates, as well as explicit embodiment experiences by questionnaire. Results revealed significant main effects of both MF and TF on proprioceptive drift, but only a main effect of MF on perceived embodiment. Furthermore, for the proprioceptive drift we found that synchronous feedback in one factor compensated for asynchronous feedback in the other. In Experiment 2, including a new sample of naïve participants, we further explored this finding by adding unimodal conditions in which we manipulated the presence or absence of MF and/or TF. These findings replicated the results from Experiment 1, and we further found evidence for a super-additive multisensory effect on spatial body representation caused by the presence of both factors. Results on conscious body perception were less consistent across the two experiments. The findings indicate that sensory and motor input contribute equally to the representation of spatial body coordinates, which in turn are subject to multisensory enhancement effects. The results outline the potential of
Journal of Vision, 2009
When integrating signals from vision and haptics the brain must solve a "correspondence problem" so that it only combines information referring to the same object. An invariant spatial rule could be used when grasping with the hand: here the two signals should only be integrated when the estimate of hand and object position coincide. Tools complicate this relationship, however, because visual information about the object, and the location of the hand, are separated spatially. We show that when a simple tool is used to estimate size, the brain integrates visual and haptic information in a near-optimal fashion, even with a large spatial offset between the signals. Moreover, we show that an offset between the tool-tip and the object results in similar reductions in cross-modal integration as when the felt and seen positions of an object are offset in normal grasping. This suggests that during tool use the haptic signal is treated as coming from the tool-tip, not the hand. The brain therefore appears to combine visual and haptic information, not based on the spatial proximity of sensory stimuli, but based on the proximity of the distal causes of stimuli, taking into account the dynamics and geometry of tools.
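For background, the "near-optimal" integration referred to here is usually formalized as maximum-likelihood (reliability-weighted) cue combination. The standard textbook formulation, sketched below under the assumption of independent Gaussian noise on each cue (as background, not as the authors' exact model), is:

```latex
% Reliability-weighted combination of visual and haptic size estimates
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_V = \frac{1/\sigma_V^{2}}{1/\sigma_V^{2} + 1/\sigma_H^{2}},
\qquad
w_H = 1 - w_V,
\qquad
\sigma_{VH}^{2} = \frac{\sigma_V^{2}\,\sigma_H^{2}}{\sigma_V^{2} + \sigma_H^{2}}
\le \min\!\left(\sigma_V^{2}, \sigma_H^{2}\right).
```

The predicted reduction of the combined variance below either unimodal variance is the benchmark against which "near-optimal" integration is typically assessed.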
The present study examined what participants perceive of their hand movements when using a tool. In the experiments, different gains applied to either the x-axis or the y-axis perturbed the relation between hand movements on a digitizer tablet and cursor movements on a display. As a consequence of the perturbation, participants drew circles on the display while their covered hand movements followed either vertical or horizontal ellipses on the digitizer tablet. When asked to evaluate their hand movements, participants were extremely uncertain about their trajectories. By varying the amount of visual feedback, we found that the low awareness of one's own movements originated mainly from the limited quality of the human tactile and proprioceptive system, or from an insufficient spatial reconstruction of this information in memory.
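As a rough illustration of how such a single-axis gain manipulation works (the gain value below is hypothetical, not the study's parameter): if the cursor position is the hand position scaled along one axis, then a circle drawn on the display implies an elliptical hand path on the tablet.

```python
import numpy as np

# Hypothetical x-axis gain (illustrative only): cursor_x = GAIN_X * hand_x, cursor_y = hand_y.
GAIN_X = 1.5

def hand_path_for_displayed_circle(radius, n=360, gain_x=GAIN_X):
    """Hand trajectory on the tablet required for the cursor to trace a
    circle of the given radius on the display under an x-axis gain."""
    theta = np.linspace(0.0, 2.0 * np.pi, n)
    cursor = np.column_stack((radius * np.cos(theta), radius * np.sin(theta)))
    hand = cursor.copy()
    hand[:, 0] /= gain_x  # invert the gain: the hand's x-amplitude shrinks
    return hand           # ellipse with semi-axes radius / gain_x and radius

hand = hand_path_for_displayed_circle(5.0)
print(hand[:, 0].max(), hand[:, 1].max())  # ~3.33 vs 5.0: a vertically elongated ellipse
```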
2012
Prior research has linked visual perception of tools with plausible motor strategies. Thus, observing a tool activates the putative action stream, including the left posterior parietal cortex. Observing a hand functionally grasping a tool involves the inferior frontal cortex. However, tool-use movements are performed in a context- and grasp-specific manner, rather than in relative isolation.
When using a tool, proximal action effects (e.g., the hand movement on a digitizer tablet) and distal action effects (e.g., the cursor movement on a display) often do not correspond to each other, or are even in conflict. In the experiments reported here, we examined the role of proximal and distal action effects in a closed-loop task of sensorimotor control. Different gain factors perturbed the relation between hand movements on the digitizer tablet and cursor movements on a display: in one condition the covert hand amplitude was held constant while the cursor amplitude on the display was shorter, equal, or longer, and in the other condition this relation was reversed. When participants were asked to replicate the hand movement without visual feedback, hand amplitudes varied in accordance with the displayed amplitudes. Adding a second transformation (Experiment 1: 90° rotation of visual feedback; Experiment 2: 180° rotation of visual feedback) reduced these aftereffects only when the discrepancy between hand movement and displayed movement was obvious. In conclusion, distal action effects assimilated proximal action effects when the proprioceptive/tactile feedback showed a feature overlap with the visual feedback on the display.
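A minimal sketch of the kind of feedback transformation described here, with illustrative gain and rotation values rather than the study's exact parameters:

```python
import numpy as np

def cursor_feedback(hand_disp, gain=1.3, rotation_deg=90.0):
    """Map a 2-D hand displacement on the tablet to the displayed cursor
    displacement: scale by a gain, then rotate the visual feedback.
    The gain and rotation values are illustrative, not the study's."""
    theta = np.radians(rotation_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return rot @ (gain * np.asarray(hand_disp, dtype=float))

# A 10-unit rightward hand movement appears as a ~13-unit upward cursor movement.
print(cursor_feedback([10.0, 0.0]))  # approximately [0., 13.]
```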
When using lever tools, subjects have to deal with two, not necessarily concordant, effects of their motor behavior: body-related proximal effects, such as tactile sensations from the moving hand, and/or more external distal effects, such as the moving effect points of the lever. As a consequence, spatial compatibility relationships between the stimulus (S; the location at which the effect points of the lever aim), the responding hand (R), and the effect point of the lever (E) play a critical role in response generation. In the present study we examined whether the occurrence of compatibility effects requires real tool movements or whether a similar response pattern can already be evoked by mere mental imagery of the tool effects. In general, response times and errors observed with real and imagined tool movements showed a similar pattern of results, but there were also differences. With incompatible relationships, and thus more difficult tasks, response times were shorter with imagined tool movements than with real tool movements. Conversely, with compatible relationships, and thus a high overlap between proximal and distal action effects, response times were longer with imagined tool movements. The results are only partly consistent with the ideomotor theory of motor control.
2014
The term affordance defines a property of objects that relates to the possible interactions an agent can carry out on that object. In monkeys, canonical neurons encode both the visual and the motor properties of objects with high specificity. However, it is not clear whether a similarly fine-grained description of these visuomotor transformations exists in humans. In particular, it has not yet been demonstrated that the processing of visual features related to specific affordances induces both specific and early visuomotor transformations, given that complete specificity has been reported to emerge quite late (300-450 ms). In this study, we applied an adaptation-stimulation paradigm to investigate early corticospinal facilitation and hand-movement synergies evoked by the observation of tools. We adapted, through passive observation of finger movements, neuronal populations coding either for precision-grip or power-grip actions. We then presented the picture of a tool affording one of the two grasp types and applied single-pulse transcranial magnetic stimulation (TMS) to the hand primary motor cortex 150 ms after image onset. Corticospinal excitability of the abductor digiti minimi and abductor pollicis brevis showed a detailed pattern of modulations matching the tools' affordances. Similarly, TMS-induced hand movements showed a pattern of grip-specific whole-hand synergies. These results offer direct evidence that an early visuomotor transformation emerges when tools are observed, one that retains the same degree of synergistic motor detail as the actions we can perform on them.
PLoS ONE, 2013
Understanding the interactions of visual and proprioceptive information in tool use is important because it is the basis for learning the tool's kinematic transformation and thus for skilled performance. This study investigated how the CNS combines seen cursor positions and felt hand positions under a visuomotor rotation paradigm. Young and older adult participants performed aiming movements on a digitizer while looking at rotated visual feedback on a monitor. After each movement, they judged either the proprioceptively sensed hand direction or the visually sensed cursor direction. We identified asymmetric mutual biases with a strong visual dominance. Furthermore, we found a number of differences between explicit and implicit judgments of hand direction. The explicit judgments had considerably larger variability than the implicit judgments. The bias toward the cursor direction was about twice as strong for the explicit judgments as for the implicit judgments. The individual biases of explicit and implicit judgments were uncorrelated, and the two kinds of judgment exhibited opposite sequential effects. Moreover, age-related changes also differed between them: with increasing age, judgment variability decreased and the bias toward the cursor direction increased, but only for the explicit judgments. These results indicate distinct explicit and implicit neural representations of hand direction, similar to the notion of distinct visual systems.