Humans are pretty creative. We adapt the world around us to fit our needs and develop tools to help with almost any task imaginable. Other animals use tools as well, but not to the same extent, and not with the same flexibility and ease.
Tools make us more efficient, and more intelligent (for the most part); they help optimize the relationship between our bodies, our minds, and the world we live in. Some researchers (Clark and Chalmers, 1998; Hutchins, 1995) even argue that tools, in some ways, extend our minds and our bodies. And, in fact, there’s some evidence from cognitive neuroscience that the brain considers tools – at least some of the time – part of the body.
The Body Schema
To better understand what this means, it’s important to understand how the brain thinks about the body. Findings from clinical neurology and neuroscience have revealed that the brain forms “maps” of the body throughout particular regions of the cortex, with different circuits corresponding to different parts of the body (see the figure below).
These maps are sometimes called “homunculi”. There’s a sensory homunculus, which represents tactile stimulation of different parts of the body, and there’s a motor homunculus, which governs the motor actions of different parts of the body.
The brain also uses other self-monitoring systems, such as the proprioceptive and vestibular systems, to track the body’s posture, position, and motion. All this information, along with information from the more “typical” sensory modalities – vision, hearing, etc. – is integrated into what’s called a body schema.
Put simply, the body schema is the sense of self. This representation changes constantly, as new sensory information is processed and integrated with each action the organism takes. The body schema plays an important role in constraining our future decisions and actions, addressing implicit questions such as: how fast am I walking, and how long will it take me to get to the other side of the room? Most importantly, the body schema is a dynamic representation of how we, as embodied beings, are situated in the world around us.
One particularly fascinating way in which the brain integrates sensory modalities is in bimodal neurons. Researchers discovered neurons in the intraparietal cortex of macaque monkeys that encode both a visual receptive field (vRF) and a somatosensory receptive field (sRF). In other words, the same neuron would fire when the monkey’s hand was touched, and also in response to visual stimuli on and around the hand (Maravita & Iriki, 2004).
This discovery suggests a direct neural mechanism for encoding the “body schema”. Bimodal neurons represent both a body part (e.g. a hand) and the accessible space around it – sometimes called the “effector space”. Thus, our cognitive system has a quick way of determining the range of available motor actions (and their effects on the immediate environment) for any given body part.
What’s cool about bimodal neurons is that they demonstrate how the brain integrates information about the body and the environment in a single representation. Drawing on both streams of information at once, the brain can match the body’s capabilities to the opportunities the environment offers.
Tools in the Brain
So what happens when we teach a monkey to use a tool, such as a rake?
If bimodal neurons are solely responsible for representing the monkey’s body parts, then their firing patterns shouldn’t change. But if they represent the affordances and actions available to the monkey – i.e. the effector space – then their firing patterns should change to accommodate the augmented range of possibilities the rake affords.
And in fact, the latter is exactly what happens.
Before the monkey learned to use the rake, the bimodal neuron represented the area directly around its hand. But after the monkey learned the new skill, the bimodal neuron changed its pattern of activity to also represent the area around the rake, suggesting a fundamental change to the monkey’s body schema. In some ways, the brain’s “definition” of the body expanded to encompass tools in the environment.
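Maravita and Iriki’s finding can be caricatured with a toy sketch: imagine a “neuron” that fires either for touch on the hand or for visual stimuli within some reachable radius around it, and whose radius grows once a tool is learned. This is a minimal illustration in Python, not a model of real neural dynamics – the class name, the numbers, and the idea of a single scalar “reach” are all invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class BimodalNeuron:
    """Toy caricature of a bimodal neuron (purely illustrative)."""

    # Radius (in metres) of the "effector space" around the hand
    # within which visual stimuli drive the neuron.
    reach: float = 0.3

    def fires(self, touched: bool, visual_distance: float) -> bool:
        # Fires for touch on the hand, OR for a visual stimulus
        # that falls inside the current effector space.
        return touched or visual_distance <= self.reach

    def learn_tool(self, tool_length: float) -> None:
        # After tool training, the visual receptive field expands
        # to cover the space the tool makes reachable.
        self.reach += tool_length


neuron = BimodalNeuron()
print(neuron.fires(touched=False, visual_distance=0.8))  # False: beyond the hand's reach
neuron.learn_tool(tool_length=0.6)                       # the monkey learns the rake
print(neuron.fires(touched=False, visual_distance=0.8))  # True: now inside effector space
```

The point of the sketch is only the before/after contrast: the same stimulus at the same location goes from outside to inside the neuron’s receptive field once the tool is incorporated.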
What does this mean for us? Why should we care about the firing patterns of neurons in the intraparietal cortex of a macaque monkey?
Tools are obviously a central part of human intelligence. We live in a world designed by humans, for humans; tools range from rakes and shovels to cell phones, computers, and the Internet. Technology is a way of augmenting our natural abilities, and in some cases, offloading the computations we normally have to perform in our head onto the environment (Kirsh, 2008).
Despite their importance to our daily lives, many people are resistant to the idea that the tools we use are in some ways part of “us”. There’s a general idea that the boundaries of the “self” end at the tips of our fingers and our toes, but it’s possible we ought to think of the “self” as a more flexible concept. Certainly, our fingers and our toes are a part of this concept, but sometimes, maybe our “self” expands to encompass the tool we’re using in that moment.
Recently, Professor David Kirsh (of the UCSD Cognitive Science department) gave a lecture on the Maravita and Iriki paper, in which he pointed out the implications of these findings. Tools and technology have an incredible impact on human life. As we learn to use tools – and humans are uniquely fast and adept at learning to use new tools – our brain adapts to represent the actions the new tool affords us. This opens up new possibilities, new solutions to previously unsolvable problems, and new ways of looking out at the world.
Hutchins, E. (1995). How a cockpit remembers its speeds. Cognitive Science, 19(3), 265–288. https://doi.org/10.1016/0364-0213(95)90020-9
Clark, A., & Chalmers, D. J. (1998). The extended mind. Analysis, 58(1), 7–19.
Maravita, A., & Iriki, A. (2004). Tools for the body (schema). Trends in Cognitive Sciences, 8(2), 79–86. https://doi.org/10.1016/j.tics.2003.12.008
Spivey, M., Richardson, D., & Fitneva, S. (2004). Thinking outside the brain: Spatial indices to visual and linguistic information. The Interface of Language, Vision, and Action, 161–190. https://doi.org/10.4324/9780203488430
Shapiro, L. (2007). The Embodied Cognition Research Programme. Philosophy Compass, 2(2), 338–346. https://doi.org/10.1111/j.1747-9991.2007.00064.x
Conner, J. M., Culberson, A., Packowski, C., Chiba, A. A., & Tuszynski, M. H. (2003). Lesions of the basal forebrain cholinergic system impair task acquisition and abolish cortical plasticity associated with motor skill learning. Neuron, 38(5), 819–829. https://doi.org/10.1016/S0896-6273(03)00288-5
Kirsh, D. (1995). The intelligent use of space. Artificial Intelligence, 73(1–2), 31–68.
This perspective is sometimes called the extended mind, and is discussed in more detail elsewhere (Clark and Chalmers, 1998; Spivey et al., 2004). It’s related to the idea of embodied cognition (Shapiro, 2007) in that it argues for redefining the boundaries of “the mind” to extend beyond the skull. Unlike embodied cognition, however – which argues that the body, not just the brain, is part of our mind – the extended mind thesis suggests that our view of “the mind” should include particular tools that we rely on for either cognitive or perceptual enhancement.
Interestingly, certain parts of the body are “over-represented” in these homunculi. For example, our hands are quite sensitive to touch, and take up a disproportionately large part of the sensory homunculus. Even more interestingly, the amount of cortex devoted to representing a particular part of the body can change over time, depending on the demands of the environment and the experience of the individual (Conner et al., 2003).