This is technically known as a direct neural interface or brain-machine interface.
For example, down the road at Duke University, lab monkeys in North Carolina have controlled robots in Japan. The goal? Exoskeletons that would allow paralyzed people to walk.
In simple English: "A great portion of the mammalian brain is devoted to sampling and processing sensory information generated by the animal's active exploration of its surrounding environment. These complex and vital tasks are accomplished by the cooperative action of large ensembles of neurons distributed across multiple intermediary levels of the parallel sensory pathways that connect peripheral receptors to the cortical mantle. Dr. Nicolelis' laboratory is particularly interested in understanding the general computational principles underlying the dynamic interactions between populations of cortical and subcortical neurons that mediate tactile perception. To pursue this goal, Dr. Nicolelis and his colleagues have developed new electrophysiological techniques for carrying out long-term simultaneous recordings of the extracellular activity of up to 128 single neurons distributed across multiple levels of somatosensory and motor pathways in behaving animals. This experimental paradigm is used in combination with multivariate statistical techniques, computer graphics, and neural network models to analyze the spatiotemporal structure of neuronal ensemble activity and its correlation with different aspects of exploratory tactile behaviors.
Heretofore, this approach has been used to describe the spatiotemporal structure of normal sensory responses and functional plastic rearrangements within the somatosensory thalamus. Currently, the lab is investigating the dynamic interactions between populations of sensory and motor neurons during active tactile exploration of novel objects. The rat whisker system is the main model used in this investigation. Using the same experimental approach, the lab will also study haptic discrimination in behaving non-human primates. The overall goal is to verify whether patterns of neuronal ensemble activity across the sensory system predict object attributes such as shape and texture. Another main project in the lab focuses on the role of early postnatal motor activity in shaping the spatiotemporal structure of sensory responses across the rat somatosensory system."
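The quoted passage boils down to a decoding problem: can the firing patterns of a large neuronal ensemble predict object attributes like texture? As a purely hypothetical illustration (this is my toy simulation, not the Nicolelis lab's actual pipeline), here is a sketch in Python: simulate Poisson spike counts from 128 neurons under two texture conditions, then classify held-out trials with a nearest-centroid decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: 128 neurons, two stimulus classes ("smooth" vs. "rough" texture).
# Each class drives the ensemble with a slightly different mean-rate profile.
n_neurons, n_trials = 128, 200
rates_smooth = rng.uniform(2, 10, n_neurons)
rates_rough = rates_smooth + rng.normal(0, 2, n_neurons)  # overlapping tuning
rates_rough = np.clip(rates_rough, 0.1, None)             # rates must be positive

# Simulated spike counts per trial (Poisson firing).
X_smooth = rng.poisson(rates_smooth, (n_trials, n_neurons))
X_rough = rng.poisson(rates_rough, (n_trials, n_neurons))

# Nearest-centroid decoder: assign a held-out trial to the closer class mean.
train = n_trials // 2
c_smooth = X_smooth[:train].mean(axis=0)
c_rough = X_rough[:train].mean(axis=0)

def decode(trial):
    """Return 'smooth' or 'rough' by Euclidean distance to class centroids."""
    d_s = np.linalg.norm(trial - c_smooth)
    d_r = np.linalg.norm(trial - c_rough)
    return "smooth" if d_s < d_r else "rough"

test_trials = np.vstack([X_smooth[train:], X_rough[train:]])
labels = ["smooth"] * (n_trials - train) + ["rough"] * (n_trials - train)
accuracy = np.mean([decode(t) == lab for t, lab in zip(test_trials, labels)])
print(f"decoding accuracy: {accuracy:.2f}")
```

Even this crude decoder does far better than chance, which is the basic point of recording many neurons at once: weak, noisy single-neuron signals become reliable at the population level.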
Piece of cake.
In the future, we will see products that permit speech through "mind reading" rather than vocalization, vision through implants, and control of artificial limbs. I predict that within 50 to 100 years the iPod of the future will be a direct implant allowing instant access to global knowledge. Probably for 99 cents per download. The implant would probably be subdermal at first, but technological progress would ultimately permit the microscopic biomachine to be inserted directly into the brain. Eventually, this will be a routine procedure conducted at birth. It might decay on a scheduled basis, allowing for regular upgrades. Our brains would have subscriptions.
Whoever figures out how to commercialize this technology will make Bill Gates look like a pauper.
Then again, it might look like this: The White Mountains, the first book of John Christopher's young adult trilogy about what would have happened had H. G. Wells's Martians conquered the world... brain capping for everyone on their 13th birthdays.
In 3001: The Final Odyssey, Arthur C. Clarke predicts BrainCaps for everyone, allowing direct Internet access to the brain. Plus, dinosaur pets...