Psychologists at Washington University in St. Louis, led by Richard A. Abrams, Ph.D., professor of psychology in Arts & Sciences, have shown that to see objects better, you should take the matter into your own hands.
Abrams’ study demonstrates that humans inspect objects more thoroughly when their hands are near the object rather than farther away from it. This reflexive, non-conscious difference in information processing exists, the researchers posit, because humans need to be able to analyze objects near their hands, either to figure out how to handle them or to protect against them.
Recognizing that the location of your hands influences what you see is a new insight into the wiring of the brain, one that could lead to rethinking current rehabilitative therapy techniques and prosthetic design.
For a stroke victim trying to regain use of a paralyzed hand, just placing the good hand next to the desired object could help the injured hand grasp it.
Likewise, prosthetics could be redesigned to include additional information flow from the hand to the brain, rather than having the brain merely control the prosthetic’s position, as with today’s artificial limb technology.
The findings also may lend scientific support for recently enacted California legislation barring the use of hand-held cell phones while driving.
“Being able to have both hands on the wheel might enhance a driver’s perception of the wheel and the nearby instruments,” Abrams suggests. “If the car is perceived to be a type of extension of the wheel, then having both hands on the wheel might enhance the driver’s perception of the car’s location and of objects near to the car. So it is quite possible that there could be an unexpected benefit of having both hands on the wheel.”
Participants in the study were asked to search for the letters S or H among groups of letters displayed on a computer monitor. When they found the letter, the subjects responded by pressing one of two buttons, located either on the sides of the monitor or on their laps. The subjects’ search rate was slower when their hands were on the sides of the monitor than when they were on their laps, meaning they inspected each item more thoroughly when their hands were near the display. “This is the first experiment to investigate the effect of hand position on response time for a visual search task,” said Abrams. “In all previous visual search experiments, subjects viewed stimuli on a display and responded by pressing buttons on a table, where their hands were far from the stimuli. In our experiment, the subjects responded using buttons attached to the display so that their hands were next to the stimuli.”
Response times from the hands-on monitor experiment were compared with those from a typical experiment where the subjects responded by pushing buttons that were far from the display, he added.
Results were published in the June issue of Cognition.
Abrams compares this new mode of information processing to the robotic arm on a space vehicle. The camera on the end of the arm sends the operator an image of its surroundings, allowing the operator to guide the arm into position.
“The engineers who designed the arm knew that positioning it would be easier if they had the camera right in hand,” he said. “What we didn’t know until now was that humans have a mechanism for doing this, too.”