Most participants preferred the manual mode, which requires them to think several steps ahead and either physically type in instructions or verbally direct the arm with a series of precise commands. They favoured the manual mode even though they did not perform tasks as well with it.
"We focused so much on getting the technology right," said Assistant Professor Aman Behal. "We didn't expect this."
John Bricout, Behal's collaborator, said the study demonstrates how people want to be engaged – but not overwhelmed – by technology. The psychological theory of flow describes this need for a balance between challenge and capacity.
"If we're too challenged, we get angry and frustrated. But if we aren't challenged enough, we get bored," said Bricout, who has conducted extensive research on adapting technology for users with disabilities. "We all experience that. People with disabilities are no different."
The computer program is modeled on how the human eye sees. A touch screen, computer mouse, joystick or voice command sends the arm into action. Sensors mounted on the arm then detect an object, gather information and relay it to the computer, which performs the calculations needed to move the arm and retrieve the object.
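The command-sense-compute-move loop described above can be sketched in a few lines of code. This is a minimal illustration under stated assumptions, not the actual UCF software: the `ArmState` class, the `sense` stand-in, and the proportional control gain are all hypothetical.

```python
# Illustrative sketch of a sense-compute-act loop: a user command starts
# the cycle, sensors report the object's position, and the computer
# iteratively moves the arm toward it. Names and logic are assumptions,
# not the study's implementation.

from dataclasses import dataclass


@dataclass
class ArmState:
    """Hypothetical 2-D arm end-effector position."""
    x: float = 0.0
    y: float = 0.0


def sense(true_object_pos):
    """Stand-in for the arm-mounted sensors (camera, laser, IR):
    returns the object's observed position."""
    return true_object_pos


def retrieve(arm, object_pos, gain=0.5, tol=0.01, max_iter=100):
    """Close the loop: repeatedly sense the object, compute the offset,
    and step the arm a fraction of the way toward it."""
    for _ in range(max_iter):
        ox, oy = sense(object_pos)
        dx, dy = ox - arm.x, oy - arm.y
        if (dx * dx + dy * dy) ** 0.5 < tol:
            return True  # arm is within tolerance of the object
        arm.x += gain * dx  # proportional step toward the target
        arm.y += gain * dy
    return False  # failed to converge within max_iter steps


arm = ArmState()
reached = retrieve(arm, (1.0, 1.0))
```

With a gain of 0.5 the remaining distance halves on each cycle, so the arm converges to within the tolerance in a handful of iterations; a "hybrid" mode like the one Behal envisions could keep the user in this loop by letting their commands adjust the target as the arm moves.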
Behal is seeking grants to translate the study's findings into a smoother "hybrid" mode that is more interactive and challenging for users and features a more accurate robotic arm. Laser, ultrasound and infrared technology coupled with an adaptive interface will help him achieve his goals.
The key is to design technology that can be individualized with ease, Behal said. Some patients will have more mobility than others, and they may prefer a design closer to the manual mode. Though the automatic mode wasn't popular in the pilot study, it may be the best option for patients with more advanced disease and less mobility.
COMPAMED.de; Source: University of Central Florida