Our brains take rhythmic snapshots of the world as we walk – and we never knew

For decades, psychology departments around the world have studied human behaviour in darkened laboratories that restrict natural movement.

Our new study, published today in Nature Communications, challenges the wisdom of this approach. With the help of virtual reality (VR), we have revealed previously hidden aspects of perception that happen during a simple everyday action – walking.

We found the rhythmic movement of walking changes how sensitive we are to the surrounding environment. With every step we take, our perception cycles through “good” and “bad” phases.

This means your smooth, continuous experience of an afternoon stroll is deceptive. Instead, it’s as if your brain takes rhythmic snapshots of the world – and they are synchronised with the rhythm of your footfall.

The next step in studies of human perception

In psychology, the study of visual perception examines how our brains use information from our eyes to create our experience of the world.

Typical psychology experiments that investigate visual perception involve darkened laboratory rooms where participants are asked to sit motionless in front of a computer screen.

Often, their heads will be fixed in position with a chin rest, and they will be asked to respond to any changes they might see on the screen.

This approach has been invaluable in building our knowledge of human perception, and the foundations of how our brains make sense of the world. But these scenarios are a far cry from how we experience the world every day.

This means we might not be able to generalise the results we discover in these highly restricted settings to the real world. It would be a bit like trying to understand fish behaviour by only ever studying fish in an aquarium.

Instead, we went out on a limb. Motivated by the fact that our brains evolved to support action, we set out to test vision during walking – one of our most frequent everyday behaviours.

Doing tests in a lab isn’t quite the same as seeing and interacting with things in the real world. sirtravelalot/Shutterstock

A walk in a (virtual) forest

Our key innovation was to use a wireless VR environment to test vision continuously while walking.

Several previous studies have examined the effects of light exercise on perception, but they used treadmills or exercise bikes. While these methods are better than sitting still, they don’t match how we naturally move through the world.

Instead, we simulated an open forest. Our participants were free to roam, yet, unknown to them, we were carefully tracking their head movement with every step they took.

Participants walked in a virtual forest while trying to detect brief visual ‘flashes’ in the moving white circle.

We tracked head movement because, as you walk, your head bobs up and down. Your head is lowest when both feet are on the ground, and highest while one leg swings through in between steps. We used these changes in head height to mark the phases of each participant’s “step-cycle”.
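
For the technically minded, here is a minimal Python sketch of how a head-height trace could be segmented into step cycles. The sampling rate, function names and thresholds are illustrative assumptions for this article, not our actual analysis pipeline.

# A minimal sketch of segmenting head height into step cycles.
# All names and parameters are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

def step_cycle_phase(head_height, fs=90.0, min_step_interval=0.3):
    """Assign a phase (0 to 2*pi) within the step cycle to every sample.

    head_height: 1-D array of vertical head position (metres), e.g. from
                 a VR headset sampled at an assumed rate of `fs` Hz.
    """
    # Troughs in head height roughly mark double support (both feet down),
    # so we detect minima by finding peaks in the inverted signal.
    troughs, _ = find_peaks(-head_height,
                            distance=int(min_step_interval * fs))

    phase = np.full(head_height.shape, np.nan)
    for start, end in zip(troughs[:-1], troughs[1:]):
        # Linearly interpolate phase from 0 at one trough to 2*pi at the next.
        phase[start:end] = np.linspace(0.0, 2 * np.pi,
                                       end - start, endpoint=False)
    return phase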

Participants also completed a visual task while they walked: they had to look out for brief visual “flashes” and respond to them as quickly as possible.

By aligning performance on our visual task to the phases of the step-cycle, we found visual perception was not consistent.

Instead, it oscillated like ripples on a pond, cycling through good and bad periods with every step. Depending on the phase of their step-cycle, participants were more likely to sense changes in their environment, had faster reaction times, and were more likely to make a decision about what they saw.
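
To give a concrete sense of what this alignment could look like, the sketch below bins each flash by the walker’s step-cycle phase at flash onset and computes a hit rate per bin. Again, the function names and bin count are hypothetical, not our published analysis.

# A hypothetical sketch: align task performance to the step cycle by
# binning flashes by phase (e.g. from step_cycle_phase above, sampled
# at each flash onset) and computing detection rate per bin.
import numpy as np

def hit_rate_by_phase(flash_phases, detected, n_bins=18):
    """flash_phases: step-cycle phase (0..2*pi) at each flash onset.
    detected: boolean array, True if the flash was detected."""
    bins = np.linspace(0.0, 2 * np.pi, n_bins + 1)
    idx = np.digitize(flash_phases, bins) - 1  # bin index per flash
    rate = np.array([detected[idx == b].mean() if np.any(idx == b)
                     else np.nan for b in range(n_bins)])
    centres = (bins[:-1] + bins[1:]) / 2
    return centres, rate  # performance as a function of step-cycle phase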

Oscillations in nature, oscillations in vision

Oscillations in vision have been shown before, but this is the first time they have been linked to walking.

Our key new finding is that these oscillations slowed down or sped up to match the rhythm of a person’s step-cycle. On average, perception was best during the swing between steps, but the exact timing of these rhythms varied between participants. This new link between body and mind offers clues as to how our brains coordinate perception and action during everyday behaviour.
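
As a rough illustration of how one could estimate a participant’s “best” phase, the sketch below fits a single-cycle sinusoid to the phase-binned hit rates and reads off the phase of peak sensitivity. This is an assumed approach for illustration, not the published method.

# A sketch of estimating a participant's preferred phase: least-squares
# fit of rate ~ a*cos(phase) + b*sin(phase) + c, i.e. one oscillation
# per step cycle, then take the phase where the fit peaks.
import numpy as np

def preferred_phase(centres, rate):
    """centres, rate: output of hit_rate_by_phase() for one participant."""
    ok = ~np.isnan(rate)
    X = np.column_stack([np.cos(centres[ok]), np.sin(centres[ok]),
                         np.ones(ok.sum())])
    a, b, c = np.linalg.lstsq(X, rate[ok], rcond=None)[0]
    amplitude = np.hypot(a, b)             # depth of the oscillation
    best = np.arctan2(b, a) % (2 * np.pi)  # phase of peak sensitivity
    return best, amplitude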

Next, we want to investigate how these rhythms affect different populations. For example, certain psychiatric disorders are associated with abnormalities in gait.

There are further questions we want to answer: are slips and falls more common for those with stronger oscillations in vision? Do similar oscillations occur for our perception of sound? What is the optimal timing for presenting information and responding to it when a person is moving?

Our findings also hint at broader questions about the nature of perception itself. How does the brain stitch together these rhythms in perception to give us our seamless experience of an evening stroll?

These questions were once the domain of philosophers, but we may be able to answer them as we combine technology with action to better understand natural behaviour.

Matthew Davidson, Postdoctoral research fellow, lecturer, University of Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.