Summary: Constraints on hand movements affect object-meaning processing, a finding that supports embodied cognition theory.

Source: Osaka Metropolitan University

How do we understand words? Scientists don’t fully understand what happens when a word pops into your mind. A research team led by Professor Shogo Makioka at Osaka Metropolitan University’s Graduate School of Sustainable Systems Science wanted to test the concept of embodied cognition.

Embodied cognition proposes that people understand the meanings of words for objects through their interactions with those objects, so the researchers designed an experiment to observe word comprehension when participants’ ways of interacting with objects were limited.

Words are defined in relation to other words; a “cup,” for example, could be “a container made of glass, used for drinking.” But you can only understand what a cup really is if you know that to drink from it you must pick it up and bring it to your mouth, or that if you drop it, it will shatter on the floor.

Without this kind of understanding, it would be difficult to create a robot that can handle a real cup. In artificial intelligence research, this issue is known as the symbol grounding problem: how to connect symbols to the real-world things they represent.

How do humans achieve symbol grounding? Cognitive psychology and cognitive science propose the concept of embodied cognition, in which objects acquire meaning through interactions between the body and the environment.

To test embodied cognition, the researchers conducted an experiment to see how participants’ brains responded to words describing objects that can be manipulated by hand, when the participants’ hands could move freely compared with when they were restrained.

“Developing a method to measure and analyze brain activity was very difficult. The first author, Ms. Sai Onishi, worked diligently to develop a method to measure brain activity with sufficient accuracy,” explained Prof. Makioka.

In the experiment, two words such as “cup” and “broom” were presented to the participants on the screen. They were asked to compare the relative sizes of the objects the words represented and verbally answer which was larger—in this case, “broom.”

When asked which word represented the larger object, participants responded faster when their hands were free (left) than when their hands were restrained (right). Constraining the hands reduced brain activity during word processing in left-brain areas associated with tools. Credit: Makioka, Osaka Metropolitan University

Comparisons were made between two types of words to see how each was processed: words describing hand-manipulable objects, such as “cup” or “broom,” and words describing non-manipulable objects, such as “building” or “candlestick.”

During the tests, the participants placed their hands on a desk, where they were either free to move or restrained by a transparent acrylic plate. When the two words were presented on the screen, participants had to think of both objects and compare their sizes, working out the meaning of each word, in order to answer which of the two represented the larger object.

Brain activity was measured with functional near-infrared spectroscopy (fNIRS), which has the advantage of taking measurements without imposing further physical constraints on the participants.

The measurements focused on the left intraparietal sulcus and the left inferior parietal lobule (supramarginal gyrus and angular gyrus), regions associated with tool-related semantic processing.

Verbal response speed was measured to determine how quickly the participant responded after the words appeared on the screen.

The results showed that activity in these left-brain regions in response to words for hand-manipulable objects was significantly reduced when the hands were restrained. Verbal responses were also affected by hand restraint.

These results indicate that constraining hand movement affects the processing of object meaning, which supports the idea of embodied cognition. They also suggest that embodied cognition could be effective for artificial intelligence in learning the meanings of objects.

About this cognition research news

Author: Yoshiko Tani
Source: Osaka Metropolitan University
Contact: Yoshiko Tani – Osaka Metropolitan University
Image: The image is credited to Makioka, Osaka Metropolitan University


Original Research: Open access.
“Hand constraint reduces brain activity and affects the speed of verbal responses on semantic tasks” by Sai Onishi et al. Scientific Reports


Abstract

Hand constraint reduces brain activity and affects the speed of verbal responses on semantic tasks

According to the theory of embodied cognition, semantic processing is closely coupled with body movements. For example, constraining hand movement inhibits memory for objects that can be manipulated with the hands. However, it has not been shown whether body constraint reduces brain activity related to semantic processing.

Using functional near-infrared spectroscopy (fNIRS), we measured the effects of hand constraint on semantic processing in the parietal lobe.

Pairs of words representing hand-manipulable (e.g., cup or pencil) or non-manipulable (e.g., windmill or fountain) objects were presented, and participants were asked to identify which object was larger.

Reaction time (RT) and activation of the left intraparietal sulcus (LIPS) and the left inferior parietal lobule (LIPL), including the supramarginal gyrus and angular gyrus, during the judgment task were analyzed. We found that constraining hand movement inhibited activation in the LIPS toward hand-manipulable objects and affected RT in the size judgment task.

These results suggest that body constraint reduces the activity of brain regions involved in semantic processing. Hand constraint may inhibit motor simulation, which, in turn, may inhibit body-related semantic processing.
