Attention, Perception, & Psychophysics, 2010
This study explored whether functional properties of the hand and tools influence the allocation of spatial attention. In four experiments that used a visual-orienting paradigm with predictable lateral cues, hands or tools were placed near potential target locations. Results showed that targets appearing in the hand’s grasping space (i.e., near the palm) and the rake’s raking space (i.e., near the prongs) produced faster responses than did targets appearing to the back of the hand, to the back of the rake, or near the forearm. Validity effects were found regardless of condition in all experiments, but they did not interact with the target-in-grasping/raking-space bias. Thus, the topology of the facilitated space around the hand is, in part, defined by the hand’s grasping function and can be flexibly extended by functional experience using a tool. These findings are consistent with the operation of bimodal neurons, and this embodied component is incorporated into a neurally based model of spatial attention.
Journal of Experimental Psychology: Human Perception and Performance, 2006
This study explored whether hand location affected spatial attention. The authors used a visual covert-orienting paradigm to examine whether spatial attention mechanisms (location prioritization and the shifting of attention) were supported by bimodal, hand-centered representations of space. With one hand placed next to a potential target location, participants detected visual targets following highly predictive visual cues. There was no a priori reason for the hand to influence task performance unless hand presence influenced attention. Results showed that target detection near the hand was facilitated relative to detection away from the hand, regardless of cue validity. Similar facilitation was found with only proprioceptive or only visual hand-location information, but not with arbitrary visual anchors or distant targets. Hand presence affected the attentional prioritization of space, not the shifting of attention.
Frontiers in Psychology, 2013
Changes in visual processing near the hands may assist observers in evaluating items that are candidates for actions. If altered vision near the hands reflects adaptations linked to effective action production, then positioning the hands for different types of actions could lead to different visual biases. I examined the influence of hand posture on attentional prioritization to test this hypothesis. Participants placed one of their hands on a visual display and detected targets appearing either near or far from the hand. Replicating previous findings, detection near the hand was facilitated when participants positioned their hand on the display in a standard open palm posture affording a power grasp (Experiments 1 and 3). However, when participants instead positioned their hand in a pincer grasp posture with the thumb and forefinger resting on the display, they were no faster to detect targets appearing near their hand than targets appearing away from their hand (Experiments 2 and 3). These results demonstrate that changes in visual processing near the hands rely on the hands' posture. Although hands positioned to afford power grasps facilitate rapid onset detection, a pincer grasp posture that affords more precise action does not.
Attention, Perception, & Psychophysics, 2013
Spatial attention can be biased to locations near the hand. Some studies have found facilitated processing of targets appearing within hand-grasping space. In this study, we investigated how changing top-down task priorities alters hand bias during visual processing. In Experiment 1, we used a covert orienting paradigm with nonpredictive cues and emphasized the location of the hand relative to the target. Hands or visual anchors (boards) were placed next to potential target locations, and responses were made with the contralateral hand. Results indicated a hand-specific processing bias: Hand location, but not board location, speeded responses to targets near the hand. This pattern of results replicated previous studies using covert orienting paradigms with highly predictive cues. In Experiment 2, we used the same basic paradigm but emphasized the location of the response hand. Results now showed speeded responses to targets near response locations. Together these experiments demonstrated that top-down instructional sets (i.e., what is considered to be most relevant to task performance) can change the processing priority of hand location by influencing the strength of top-down, as compared with bottom-up, inputs competing for attention resources.
Cortex, 2007
The active and skilful use of tools has been claimed to lead to the 'extension' of the visual receptive fields of single neurons representing peripersonal space (the visual space immediately surrounding one's body parts). While this hypothesis provides an attractive and potentially powerful explanation for one neural basis of tool-use behaviours in human and nonhuman primates, a number of competing hypotheses for the reported behavioural effects of tool-use have not yet been subjected to empirical test. Here, we report five behavioural experiments in healthy human participants (N = 120) involving the effects of tool-use on visual-tactile interactions in peripersonal space. Specifically, we address the possibility that the use of only a single tool, which is typical of many neuropsychological studies of tool-use, induces a spatial allocation of attention towards the side where the tool is held. Participants' tactile discrimination responses were more strongly affected by visual stimuli presented on the right side when they held a single tool on the right, compared to visual stimuli presented on the left. When two tools were held, one in each hand, this spatial effect disappeared. Our results are incompatible with the hypothesis that tool-use extends peripersonal space, and suggest instead that tools result in an automatic multisensory shift of spatial attention to the side of space where the tip of the tool is actively held. These results have implications for many of the cognitive neuroscientific studies of tool-use published to date.
2012
Prior research has linked visual perception of tools with plausible motor strategies. Thus, observing a tool activates the putative action stream, including the left posterior parietal cortex. Observing a hand functionally grasping a tool additionally involves the inferior frontal cortex. However, tool-use movements are performed in a context- and grasp-specific manner, rather than in relative isolation.
Plos One, 2008
Background: Tool use in humans requires that multisensory information is integrated across different locations, from objects seen to be distant from the hand, but felt indirectly at the hand via the tool. We tested the hypothesis that using a simple tool to perceive vibrotactile stimuli results in the enhanced processing of visual stimuli presented at the distal, functional part of the tool. Such a finding would be consistent with a shift of spatial attention to the location where the tool is used.
Neuropsychologia, 2011
In recent years, there has been growing excitement within cognitive neuroscience about the concept of embodiment: How do the capabilities and limitations of our physical bodies affect neural representations in the brain? Neuropsychological and neurophysiological studies show clear evidence that short-term visuomotor experience can influence the encoding of the space around the body in parietal cortex. For example, tool-use may expand the neural representation of peripersonal space. But how is this initial spatial representation influenced by a lifetime of object-related interactions? To examine this question we used functional magnetic resonance imaging (fMRI) to investigate the neural effects of an individual's hand preferences for acting within peripersonal space. Left- and right-handed participants viewed real-world objects at different locations accessible by either the left hand, right hand, or neither hand. The superior parieto-occipital cortex (SPOC), an area most often implicated in reaching actions, showed enhanced visual responses for objects located within the range of space in which each group typically acts. Specifically, in right-handers, who strongly prefer grasping with the right hand, SPOC showed strongest activation for objects located within the range of space for the right hand only. In contrast, in left-handers, who use their two hands comparably often in visuomotor tasks, SPOC showed strongest activation for objects located within the range of space of either hand. These findings show that, even in the absence of overt responses, real 3D objects located in the individual's typical workspace for hand actions automatically invoke enhanced responses in associated visuomotor areas of the brain.
Consciousness and cognition, 2018
Observers show biases in attention when viewing objects within versus outside of their hands' grasping space. While the hands' proximity to stimuli plays a key role in these effects, recent evidence suggests an observer's affordances for grasping actions also shape visual processing near the hands. The current study examined the relative contributions of proximity and affordances in introducing attentional biases in peripersonal space. Participants placed a single hand on a visual display and detected targets appearing near or far from the hand. Across conditions, the hand was either free, creating an affordance for a grasping action, or immobilized using an orthosis, interfering with the potential to grasp. Replicating previous findings, participants detected targets appearing near the hand more quickly than targets appearing far from the hand. Immobilizing the hands did not disrupt this effect, suggesting that proximity alone is sufficient to facilitate target detection.
Journal of …, 2012
Closely overlapping responses to tools and hands in left lateral occipitotemporal cortex. The perception of object-directed actions performed by either hands or tools recruits regions in left fronto-parietal cortex. Here, using functional MRI (fMRI), we tested whether the common role of hands and tools in object manipulation is also reflected in the distribution of response patterns to these categories in visual cortex. In two experiments we found that static pictures of hands and tools activated closely overlapping regions in left lateral occipitotemporal cortex (LOTC). Left LOTC responses to tools selectively overlapped with responses to hands but not with responses to whole bodies, nonhand body parts, other objects, or visual motion. Multivoxel pattern analysis in left LOTC indicated a high degree of similarity between response patterns to hands and tools but not between hands or tools and other body parts. Finally, functional connectivity analysis showed that the left LOTC hand/tool region was selectively connected, relative to neighboring body-, motion-, and object-responsive regions, with regions in left intraparietal sulcus and left premotor cortex that have previously been implicated in hand/tool action-related processing. Taken together, these results suggest that action-related object properties shared by hands and tools are reflected in the organization of high-order visual cortex. We propose that the functional organization of high-order visual cortex partly reflects the organization of downstream functional networks, such as the fronto-parietal action network, due to differences within visual cortex in the connectivity to these networks.
"An exciting new line of research that investigates the impact of one’s own hands on visual perception and attention has flourished in the past several years. Specifically, several studies have demonstrated that the nearness of one’s hands can modulate visual perception, visual attention, and even visual memory. These studies together shed new light on how the brain prioritizes certain information to be processed first. This review first outlines the recent progress that has been made to uncover various characteristics of the nearby-hand effect, including how they may be transferred to a familiar tool. We then summarize the findings into four specific characteristics of the nearby-hand effect, and conclude with a possible neural mechanism that may account for all the findings."
Lecture Notes in Computer Science, 2009
Classically, visual attention is assumed to be influenced by visual properties of objects, e.g. as assessed in visual search tasks. However, recent experimental evidence suggests that visual attention is also guided by action-related properties of objects ("affordances"; Gibson, 1966, 1979); e.g. the handle of a cup affords grasping the cup, and therefore attention is drawn towards the handle. In a first step towards modelling this interaction between attention and action, we implemented the Selective Attention for Action model (SAAM). The design of SAAM is based on the Selective Attention for Identification model (SAIM; Heinke & Humphreys, 2003). For instance, we also followed a soft-constraint satisfaction approach in a connectionist framework. However, SAAM's selection process is guided by locations within objects suitable for grasping them, whereas SAIM selects objects based on their visual properties. In order to implement SAAM's selection mechanism, two sets of constraints were implemented. The first set of constraints took into account the anatomy of the hand, e.g. maximal possible distances between fingers. The second set of constraints (geometrical constraints) considered suitable contact points on objects by using simple edge detectors. First, we demonstrate that SAAM can successfully mimic human behaviour by comparing simulated contact points with experimental data. Second, we show that SAAM simulates affordance-guided attentional behaviour, as it successfully generates contact points for only one object in two-object images. Earlier work had shown that a primed hand posture (precision or power grip) can influence subsequent categorisation of objects: participants had to categorise objects as either artefacts or natural objects, and, unknown to the participants, the objects could be manipulated with either a precision or a power grasp. Categorisation was faster when the hand postures were congruent with the grasp than when they were incongruent with it. Hence, the participants' behaviour was influenced by action-related properties of objects irrelevant to the experimental task. This experiment, together with earlier, similar studies, can be interpreted as evidence for an automatic detection of affordances.
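The SAAM abstract above describes selection as soft-constraint satisfaction: anatomical constraints (plausible finger separations) and geometrical constraints (strong edges as contact points) jointly determine where attention settles. The toy sketch below illustrates that idea in a drastically simplified 1-D setting; the function names, edge-strength array, and span limits are all hypothetical, and this is not the authors' connectionist implementation:

```python
# Toy soft-constraint grasp-point selection, loosely inspired by SAAM.
# Hypothetical simplification: a 1-D "object" is an array of edge strengths,
# and thumb/finger contact points are chosen by minimising an energy that
# combines a geometric term (prefer strong edges) with an anatomical term
# (a soft penalty on implausible finger spans).

def grasp_energy(edges, i, j, min_span=2, max_span=6, penalty=5.0):
    """Energy of placing thumb at i and finger at j (lower is better)."""
    geometric = -(edges[i] + edges[j])              # favour strong edges
    span = abs(i - j)
    anatomical = 0.0
    if span < min_span:
        anatomical += penalty * (min_span - span)   # fingers too close together
    if span > max_span:
        anatomical += penalty * (span - max_span)   # hand cannot open that far
    return geometric + anatomical

def select_contact_points(edges, **kw):
    """Exhaustive minimisation over all contact-point pairs."""
    pairs = [(i, j) for i in range(len(edges)) for j in range(i + 1, len(edges))]
    return min(pairs, key=lambda p: grasp_energy(edges, p[0], p[1], **kw))

edges = [0.1, 0.9, 0.2, 0.1, 0.3, 0.8, 0.1, 0.7]
print(select_contact_points(edges))  # strong edges at positions 1 and 5 win
```

Because the anatomical term is a graded penalty rather than a hard filter, a sufficiently strong edge pair can still be selected slightly outside the preferred span — the "soft" in soft-constraint satisfaction.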
Cognition, 2002
In a visual-tactile interference paradigm, subjects judged whether tactile vibrations arose on a finger or thumb (upper vs. lower locations), while ignoring distant visual distractor lights that also appeared in upper or lower locations. Incongruent visual distractors (e.g. a lower light combined with upper touch) disrupt such tactile judgements, particularly when appearing near the tactile stimulus (e.g. on the same side of space as the stimulated hand). Here we show that actively wielding tools can change this pattern of crossmodal interference. When such tools were held in crossed positions (connecting the left hand to the right visual field, and vice-versa), the spatial constraints on crossmodal interference reversed, so that visual distractors in the other visual field now disrupted tactile judgements most for a particular hand. This phenomenon depended on active tool-use, developing with increased experience in using the tool. We relate these results to recent physiological and neuropsychological findings.
Attention, Perception, & Psychophysics, 2013
Previous research on the interaction between manual action and visual perception has focused on discrete movements or static postures and discovered better performance near the hands (the near-hand effect). However, in everyday behaviors, the hands are usually moving continuously between possible targets. Therefore, the current study explored the effects of continuous hand motion on the allocation of visual attention. Eleven healthy adults performed a visual discrimination task during cyclical concealed hand movements underneath a display. Both the current hand position and its movement direction systematically contributed to participants' visual sensitivity. Discrimination performance increased substantially when the hand was distant from but moving toward the visual probe location (a far-hand effect). Implications of this novel observation are discussed.
Cortex, 2007
In human and non-human primates, evidence has been reported supporting the idea that near peripersonal space is represented through integrated multisensory processing. In humans, the interaction between near peripersonal space representation and action execution can be revealed in brain-damaged patients through the use of tools that, by extending the reachable space, modify the strength of visual-tactile extinction, thus showing that tool-mediated actions modify the multisensory coding of near peripersonal space. For example, following the use of a rake to retrieve distant, otherwise unreachable objects, the peri-hand multisensory area has been documented to extend to include the distal part of the rake. The re-sizing of peri-hand space seems to be selective for tool-use, as directional motor activity alone (i.e., pointing without the tool) and visual/proprioceptive experience alone (protracted passive exposure to the tool) do not vary the extent of the visual-tactile peri-hand space. Moreover, the amount of dynamic re-sizing varies with the length of the tool used, and is specifically centred on the functionally relevant part of the tool. Here, besides reviewing and discussing these results, we report new evidence, based on a single-case study, supporting the idea that dynamic re-sizing of peri-hand space consists of a real spatial extension of the visual-tactile integrative area along the tool axis.
F1000posters, 2011
Neuroscience, 2018
The ability to recognize a tool's affordances (how a spoon should be appropriately grasped and used) is vital for daily life. Prior research has identified parietofrontal circuits, including mirror neurons, as critical for understanding affordances. However, parietofrontal action-encoding regions receive extensive visual input and are adjacent to parietofrontal attention-control networks. It is unclear how eye movements and attention modulate parietofrontal encoding of affordances. To address this issue, scenes depicting tools in different use-contexts and grasp-postures were presented to healthy subjects across two experiments, with stimulus durations of 100 ms or 500 ms. The 100-ms experiment automatically restricted saccades and required covert attention, while the 500-ms experiment allowed overt attention. The two experiments elicited similar behavioral decisions on tool-use correctness and isolated the influence of attention on parietofrontal activity. Parietofrontal ERPs (P600) distinguishing tool-use contexts (e.g., spoon-yogurt vs. spoon-ball) were similar in both experiments. Conversely, parietofrontal ERPs distinguishing tool-grasps were characterized by posterior-to-frontal N130-N200 ERPs in the 100-ms experiment, and by saccade-perturbed N130-N200 ERPs, a frontal N400, and a parietal P500 in the 500-ms experiment. In particular, only overt gaze toward the hand-tool interaction engaged mirror neurons (frontal N400) when discerning grasps that manipulate, but do not functionally use, a tool (grasping the bowl rather than the stem of a spoon). These results detail the first human electrophysiological evidence of how attention selectively modulates multiple parietofrontal grasp-perception circuits, especially the mirror neuron system, while leaving parietofrontal encoding of tool-use contexts unaffected. These results are pertinent to neurophysiological models of affordances, which typically neglect the role of attention in action perception.
Frontiers in Psychology, 2013
The current study explored effects of continuous hand motion on the allocation of visual attention. A concurrent paradigm was used to combine visually concealed continuous hand movements with an attentionally demanding letter discrimination task. The letter probe appeared contingent upon the moving right hand passing through one of six positions. Discrimination responses were then collected via a keyboard press with the static left hand. Both the right hand's position and its movement direction systematically contributed to participants' visual sensitivity. Discrimination performance increased substantially when the right hand was distant from, but moving toward, the visual probe location (replicating the far-hand effect). However, this effect disappeared when the probe appeared close to the static left hand, supporting the view that static and dynamic features of both hands combine in modulating pragmatic maps of attention.
Psychological science, 2017
Observers experience affordance-specific biases in visual processing for objects within the hands' grasping space, but the mechanism that tunes visual cognition to facilitate action remains unknown. I investigated the hypothesis that altered vision near the hands is a result of experience-driven plasticity. Participants performed motion-detection and form-perception tasks, with their hands either near the display, in atypical grasping postures, or positioned in their laps, both before and after learning novel grasp affordances. Participants showed enhanced temporal sensitivity for stimuli viewed near the backs of the hands after training to execute a power grasp using the backs of their hands (Experiment 1), but showed enhanced spatial sensitivity for stimuli viewed near the tips of their little fingers after training to use their little fingers to execute a precision grasp (Experiment 2). These results show that visual biases near the hands are plastic, facilitating processing in accordance with newly learned grasp affordances.
Experimental Brain Research, 2009
An event-related potential (ERP) experiment was conducted in order to investigate the nature of any cross-modal links in spatial attention during tool use. Tactile stimuli were delivered from the tip of two sticks, held in either a crossed or an uncrossed tools posture, while visual stimuli were presented along the length of each tool. Participants had to detect tactile deviant stimuli at the end of one stick while trying to ignore all other stimuli. Reliable ERP spatial attention effects to tactile stimuli were observed at early (160-180 ms) and later time epochs (>350 ms) when the tools were uncrossed. Reliable ERP attention effects to visual stimuli presented close to the tip of the tool and close to the hand were also observed in the uncrossed tools condition (time epoch 140-180 ms). These results are consistent with the claim that tool-use results in a shift of visuospatial attention toward the tip of the tool and also to attention being focused by the hand where the touch is felt.