
Loudness affects motion: asymmetric volume of auditory feedback results in asymmetric gait in healthy young adults



The potential of auditory feedback for motor learning in the rehabilitation of various diseases has become apparent in recent years. However, the volume of auditory feedback has so far played a minor role and its influence has hardly been considered. We therefore investigate the effect of the volume of auditory feedback on gait pattern and gait direction, and its interaction with pitch.


Thirty-two healthy young participants were randomly divided into two groups: Group 1 (n = 16) received a high-pitch (150-250 Hz) auditory feedback; group 2 (n = 16) received a lower-pitch (95-112 Hz) auditory feedback. The feedback consisted of a real-time sonification of the right and left foot ground contact. After an initial condition (no auditory feedback and full vision), both groups completed a 30-minute habituation period followed by a 30-minute asymmetry period. In each condition, the participants were asked to walk blindfolded and with auditory feedback towards a target at 15 m distance and were stopped 5 m before the target. Three different volume conditions were applied in random order during the habituation period: loud, normal, and quiet. In the subsequent asymmetry period, the three volume conditions baseline, right quiet, and left quiet were applied in random order.


In the habituation phase, the step width showed a significant volume*pitch interaction from the loud to the quiet condition, with a decrease at high pitch (group 1) and an increase at lower pitch (group 2) (group 1: loud 1.02 ± 0.310, quiet 0.98 ± 0.301; group 2: loud 0.95 ± 0.229, quiet 1.11 ± 0.298). In the asymmetry period, a significantly increased ground contact time was found on the side with reduced volume (right quiet: left foot 0.988 ± 0.033, right foot 1.003 ± 0.040; left quiet: left foot 1.004 ± 0.036, right foot 1.002 ± 0.033).


Our results suggest that modifying the volume of auditory feedback can be an effective way to improve gait symmetry. This could facilitate gait therapy and rehabilitation of hemiparetic and arthroplasty patients, in particular if gait improvement based on verbal corrections and conscious motor control is limited.



The ability to perceive noise and sound is of great importance for our everyday interaction with the environment. For example, auditory perception helps us to recognize and determine distances, speeds, obstacles, materials, and our own position in space [1,2,3,4]. In sports, acoustic signals, sounds, verbal agreements, and music are often used to synchronize and modulate movements. Sounds are produced by movement, e.g. when bouncing off spring floors, hitting balls, or when arms and legs hit the water, or are consciously generated, e.g. by the starting shot or by shouting in team sports. The volume of sounds is often causally related to the intensity of movement: greater energy means increased power and stronger acceleration or deceleration, which results in increased volume.

Possibly due to this physical correlation between movement and sound, neurophysiological findings suggest a close relationship between the movement system and auditory brain areas. Several imaging studies have shown that noises or sounds produced by a known movement induce neuronal activation in the human brain that resembles the neuronal activation during execution of the action. This simulation can be observed especially in the mirror neuron system and has become known in recent years under the term “action-listening” [5,6,7,8]. Furthermore, Chen et al. [9] showed in two fMRI experiments that rhythmic sounds generally cause an activation of the motor cortex in humans. The participants of experiment 1 knew the task of tapping on a right mouse button in synchrony to different rhythms given by a computer and via headphones from an exercise session on the day before the fMRI measurements. In contrast, participants of experiment 2 did not know that they were supposed to tap to the rhythms during the course of the fMRI measurement. Since no practice session was conducted on the previous day, they only learned about the tapping task after they had passively listened to the rhythms once. Under both conditions, listening with action anticipation and passive listening, the supplementary motor area, mid-premotor cortex (mid-PMC), and the cerebellum were activated.

It also became clear that people are better able to recognize the sound pattern generated by their own actions than a sound pattern generated by other persons' actions, and to assign it to themselves [10,11,12,13]. For auditory perception, therefore, a close perception-action link can be assumed in humans. Due to the intrinsic connection between sound and movement in space and time [14,15,16,17] and the neural connectivity described above, it seems reasonable to use auditory information to provide targeted and effective feedback for sports training and motor (re-)learning.

In the research on motor behavior, there exist many different approaches regarding the artificial generation of augmented auditory feedback (AAF). The following AAF methods were mainly considered: natural movement sounds [18,19,20], error feedback [21,22,23], rhythmic auditory stimulation [24,25,26,27], sonification [28,29,30,31,32,33,34] and musical movement feedback [35,36,37]. It has been shown that AAF is effective in a wide variety of application areas. There is evidence of efficacy in sports, e.g. rowing [38, 39], skiing [40], golf [41], cycling [42], and swimming [43], and also in movement rehabilitation, particularly in Parkinson’s disease [44,45,46] and stroke patients [47, 48].

So far, the choice of one of the aforementioned AAF methods and the mapping of acoustic parameters to specific movements seems to be based primarily on the assessment of the movement or disease under investigation. For example, for gait rehabilitation in Parkinson’s patients [49], rhythmic-auditory stimulation was investigated above all, since walking is an intrinsically rhythmic and repetitive movement. For movements with more degrees of freedom, such as attack-and-release actions (e.g. grasping), studies were conducted more frequently using real-time movement sonification or musical sonification [43, 45].

Movement sonification (MS) means the transformation of kinematic human motion data into sound, resulting in multidimensional motion acoustics. So far, research on gait sonification has mainly considered timbre and pitch [50,51,52,53], rhythm [54,55,56,57,58], and tempo [59, 60]. Although correlations of volume and distance [61], volume and object size [2, 62, 63], volume and direction and speed of movement [64, 65], as well as volume and articulatory kinematics [66] are known from other research areas, volume has, as far as we know, hardly been included in gait sonification. However, due to these known correlations, volume could be an easy-to-use parameter, for example, to specifically treat rehabilitation patients with asymmetrical gait (stroke patients, unilateral arthroplasty) with the help of well-shaped auditory feedback.

In a recent review paper, Schaffert et al. [67] point out that the question of “what auditory components and amount of information are most relevant for motor training and rehabilitation” has not yet been sufficiently investigated. Among other things, it is unclear what effect individual parameters of sound (e.g., pitch, volume, timbre, tempo, rhythm) have on the execution of movement and motor control (cf. also [68]). However, knowledge of the concrete impact of the various sound parameters in AAF considering different target groups would make the use of auditory feedback more purposeful and efficient in the future. This work aims to contribute to the clarification of the sound-parameter-motion relationship in AAF. For this purpose, we consider the parameters volume and pitch and their possible influence on the gait pattern of healthy young persons. These two parameters are taken into account since pitch and loudness perception are correlated due to the perceptual range of the human auditory system: We hear sounds loudest at frequencies between 2000 and 4000 Hz, and sounds below or above are perceived more quietly at the same sound pressure level [69]. Furthermore, correlations between pitch and range and direction of motion [16, 70,71,72,73] are well known and clearly described in the literature. A higher pitch is usually accompanied by an increase in height and velocity which also indicates a similarity to volume perception.

This study intends to investigate the influence of different volume levels of real-time ground contact sonification, and their interaction with pitch, on the gait pattern of healthy persons.

First, the overall volume was varied by 6 dB in three steps (loud 0 dB, normal − 6 dB, quiet − 12 dB) to determine its influence on participants’ gait pattern (stride width, stride length, gait speed). Second, we hypothesized that the asymmetric loudness of sonification influences the gait symmetry of the participants. In this regard, the volume difference was varied between the right and left channel of the headphone used. Furthermore, to investigate whether pitch interacts with volume, the volume changes were applied to two groups (G1 n = 16, G2 n = 16) with different sonification pitches: G1 received a sound with a base frequency of 150-250 Hz and G2 received a sound with a base frequency of 95-112 Hz.



A total of 32 young, healthy volunteers participated in the study. Each participant was informed about the general course of the study and the handling of the data collected before the start of the measurement. Written informed consent was obtained from each participant. The study was conducted in accordance with the guidelines stated in the Declaration of Helsinki and the regulations of the Ethical Committee of the Leibniz University Hannover (EV LUH 15/2019). Volunteers aged 18-35 years with normal physiological walking and hearing ability were included in the study. Acute injuries or pain of the lower extremities and diseases affecting hearing, vision or balance were defined as exclusion criteria. The criteria were checked by means of a questionnaire, which was completed by the participants before the start of the measurements. In addition, each participant completed a hearing test (HTTS hearing test software, Version 2.10, SAX GmbH, Berlin, Germany) to ensure sufficient hearing ability and well-balanced hearing in both ears.

Participants were randomly divided into two groups. G1 (n = 16, gender: 8 m/8f, age: 23.6 ± 3.4 years, height: 178.3 ± 9.7 cm, weight: 71.3 ± 15.6 kg, weekly sport activity: 6.4 ± 3.8 h) received a high pitch sonification and G2 (n = 16, gender: 9 m/7f, age: 25.2 ± 3.3 years, height: 180.1 ± 7.1 cm, weight: 73.3 ± 10.0 kg, weekly sport activity: 6.2 ± 2.9 h) received a lower pitch sonification. T-tests for independent samples of the baseline characteristics of both groups revealed no significant differences between G1 and G2 (age p = 0.202, height p = 0.552, weight p = 0.669, weekly sport activity p = 0.836). The proportion of right- and left-handed and right- and left-footed participants was approximately balanced in G1 and G2 (G1: 14 right-handed, 2 left-handed; 7 support leg right, 9 support leg left; G2: 13 right-handed, 2 left-handed, 1 ambidextrous; 5 support leg right, 11 support leg left).

To capture whether the different pitch of sonification evoked different emotional responses in participants, mental state was assessed with the validated German questionnaire Bf-SR [74]. The questionnaire was filled out by the participants once before the start of the gait measurements and once after the gait measurements.

Experimental design

The measurements took place in a quiet gym of the Leibniz University Hannover. Each participant completed one 90-minute measurement session. A randomized single-blinded design was chosen: unlike the supervisor of the experiment, the participants were not informed in advance about their group allocation or the different volume conditions. Each participant went through all of the conditions presented below in random order.

The measurements began with an initial condition: the participants walked four times straight from a start mark towards a target at a distance of 15 m with full vision and without sonification. The further course of the experiment was divided into two periods: a habituation period and an asymmetry period. In both periods the participants received sonification via headphones while walking. The sonification of the right ground contact was played only on the right speaker of the headphone and the sonification of the left ground contact on the left speaker of the headphone. In detail, the sonification mappings are described in section Ground contact sonification. Both periods consisted of three blocks each. During the habituation period, the volume was varied symmetrically on both sides: (1) loud, (2) normal, (3) quiet. During the asymmetry period, the volume was varied asymmetrically: (1) right quiet (RQ), (2) left quiet (LQ), (3) right and left equal (baseline). In the habituation period, one block consisted of a five-minute walking phase in which the participants walked back and forth between start and target with full vision and sonification (loud, normal, quiet). This was followed by four times walking blindfolded from the start towards the target under the same volume condition as during the five-minute gait phase. In the asymmetry period, one block consisted of four blindfolded walks from the start to the target with wave noise. This was followed by four blindfolded walks from the start to the target with sonification (RQ, LQ, baseline). The course of the experiment is shown in Fig. 1.

Fig. 1

Experimental design. The experiment starts with an initial condition, followed by the habituation period (top) and the asymmetry period (bottom), each consisting of three repetitions (blue diamond). The three repetitions include three different volume conditions in the habituation period (loud, normal, and quiet) and in the asymmetry period (baseline, right quiet, left quiet), each run once in randomized order

Gait analysis

To ensure consistent walking conditions, the participants were provided with anti-slip socks in which they could walk safely in the gym. A start marker was attached to the floor and a red target point was attached to a box (70 × 50 × 40 cm) to clearly delimit the walking area (Fig. 2). The markings indicated a distance of 15 m. Furthermore, a white line drawn in an arc on the ground marked a distance of 10 m from the starting point. The participants were fitted with the wireless motion analysis system MVN Awinda (XSens Technologies B.V., Enschede, the Netherlands). Seven inertial measurement units (IMUs) were attached to the sacrum (1 IMU), lateral side of both femurs (2 IMUs), medial surface of tibias (2 IMUs), and middle arches of the feet (2 IMUs) using velcro straps. The data acquisition was carried out using the software MVN Studio BIOMECH (Version 4.1, XSens Technologies B.V., Enschede, the Netherlands), which stores the data at a frequency of 60 Hz. Before each gait recording, the motion analysis system was calibrated directly at the marked starting point to ensure the highest possible measurement and sonification accuracy.

Fig. 2

a Experimental setup for the gait measurements. The start-calibration mark is on the bottom right. At the top left is the target marking and the 10 m distance is marked by an arc line. b In the initial condition and habituation, participants walk with full vision. Right side: In the conditions loud, quiet, normal, baseline setting, RQ setting, LQ setting, and wave noise, participants walk blindfolded

The measurements started with an initial condition without visual restriction and without sonification. The participants approached the target four times at a self-selected average speed and stopped about 5 cm before the target mark. The kinematics of the total distance of 15 m were recorded. This was followed by the habituation period. For the five-minute walking phases without visual restriction, the participants were instructed to put on the wireless headphones after calibration and to walk back and forth between the start and finish markings for 5 min each.

All other conditions (loud, normal, quiet, wave noise, baseline, RQ and LQ) were performed blindfolded. Before each condition, the participants were instructed to first concentrate visually on the target point, second to put on the headphones, third to put on the sleeping mask, and fourth to start walking within 5 s. In all blindfolded conditions, the participants were stopped at the 10 m line by a touch on their back to achieve a standardized walking distance. Data acquisition was also stopped at this point. The headphones were removed from the participants' heads, but not the sleeping mask, in order to avoid the possibility of conscious directional correction during subsequent attempts. The participant was guided back to the starting point via a touch on the back, where the sleeping mask could be taken off again.

Ground contact sonification

For sonification, the kinematic data was streamed in real time from the MVN Biomech software to a self-developed Python program (developed in Spyder, Version 3.3.1, The Scientific Python Development Environment, Spyder Developer Community). Latency from touch-down to sound onset was less than 100 ms. An algorithm determined the gait events touch-down (TD) and toe-off (TO) from the acceleration data of the feet. The sonification of the ground contact time (from TD to TO) was performed by an implemented Csound module (Csound 6, LGPL). One channel was used for each foot, so that only the ground contacts of the left foot could be heard on the left ear and only those of the right foot on the right ear. The pitch was the same on both sides. G1 received sonification of ground contact times with a base frequency of 150-250 Hz; the sound resembles the noise produced when walking through snow, but has more characteristics of a tone. G2 received sonification of ground contact times with a base frequency of 95-112 Hz; due to the narrower frequency setting, the sound appeared deeper and softer, and its frequency spectrum was more clearly delineated from the first one. Both sounds are visually contrasted in a melodic range spectrogram in Fig. 3.
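The study's event detection code is not published. As a minimal sketch of the general idea, assuming a simple threshold on one foot's vertical acceleration signal (the function name and the threshold value are illustrative assumptions, not the study's algorithm), TD and TO could be read off as rising and falling threshold crossings:

```python
import numpy as np

def detect_gait_events(acc_vertical, fs=60.0, threshold=1.5):
    """Detect touch-down (TD) and toe-off (TO) events from one foot's
    vertical acceleration (in g), sampled at fs Hz.
    Illustrative threshold-crossing sketch only; the study's actual
    algorithm is not published and the threshold is an assumption."""
    above = acc_vertical > threshold
    edges = np.diff(above.astype(int))
    td_times = (np.where(edges == 1)[0] + 1) / fs   # rising edges -> TD
    to_times = (np.where(edges == -1)[0] + 1) / fs  # falling edges -> TO
    return td_times, to_times
```

At the study's 60 Hz sampling rate, such an event detector resolves events to about 17 ms, comfortably within the reported latency of less than 100 ms from touch-down to sound onset.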

Fig. 3

a Melodic range spectrogram of the sound used for the sonification of the ground contact for G1 (base frequency of 150-250 Hz) and (b) melodic range spectrogram of the sound used for the sonification of the ground contact for G2 (base frequency of 95-112 Hz). Only one channel is shown at a time. The spectrograms were generated using Sonic Visualiser (Release 4.3, Centre for Digital Music at Queen Mary, London, GB)

The loud volume level (59.0 dB) was determined in pilot measurements in which five young healthy participants were asked whether they perceived the sound clearly when walking 10 m. If the sound was perceived as too loud, they were immediately stopped, and the volume was reduced gradually via the headphones. The steps of volume decrease were chosen such that differences were not clearly noticeable, to avoid participants responding with a deliberate change in gait pattern. After the measurements, participants were asked whether they had perceived differences in gait sonification, which was the case for only three of the 32 participants (G1: 1; G2: 2).

The volume change of the ground contact sonification was implemented by a decibel change in Csound. This change was based on the inverse square law, which according to Blauert [61] states that the sound pressure level decreases by approximately 6 dB when the distance is doubled. The loud setting was defined in Csound as 0/0 dB (sonification 1/1 = 100%), the normal setting as − 6/− 6 dB (sonification 0/0 = 50%), and the quiet setting as − 12/− 12 dB (sonification − 1/− 1 = 25%). Accordingly, the RQ setting was defined as − 12/− 6 dB and the LQ setting as − 6/− 12 dB. This resulted in actual mean sound pressure levels of 52.0 dB (quiet), 55.5 dB (normal), and 59.0 dB (loud). The volume settings of the headphones and the laptop used were kept the same throughout the experiment.
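The dB values map onto linear amplitude factors via the standard relation amplitude = 10^(dB/20); the following minimal check (function and variable names are ours, not the study's code) reproduces the percentages given above:

```python
def db_to_amplitude(db):
    # Standard conversion of a level change in dB to a linear amplitude factor.
    return 10 ** (db / 20.0)

loud   = db_to_amplitude(0)    # 1.000 -> 100 % (loud setting,   0 dB)
normal = db_to_amplitude(-6)   # ~0.501 -> ~50 % (normal setting, -6 dB)
quiet  = db_to_amplitude(-12)  # ~0.251 -> ~25 % (quiet setting, -12 dB)
```

A step of −6 dB thus roughly halves the amplitude, which is exactly the level drop per doubling of distance described by the inverse square law cited above.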

Data processing

Six middle steps of each gait recording were cut in MVN Studio BIOMECH and included in the evaluation in order to exclude any falsification by accelerating and stopping at the beginning or end of the walk. The gait events TD and TO were determined using a self-developed algorithm in MATLAB (R2016a, The MathWorks Inc., Natick, MA, USA), and the gait parameters stride duration, percentage step duration in relation to stride duration, percentage ground contact time in relation to stride duration, stride speed, cadence, stride length, step length, and step width were analyzed. We defined one stride as the range from the TD of one foot to the following TD of the same foot. One step was defined as the range from the TD of one foot to the following TD of the other foot, and the ground contact time was the time between TD and TO of the same foot. The percentage step duration and the percentage ground contact time were considered in relation to the stride duration, i.e. the stride duration was defined as 100%. The step width is the distance between both feet orthogonal to the direction of gait, and the cadence is defined as the number of steps per minute.
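The timing definitions above can be expressed compactly. The following sketch (our own illustration, not the study's MATLAB code) computes the stride duration and the percentage values for one left stride from TD/TO time stamps:

```python
def gait_parameters(td_left, to_left, td_right):
    """Timing parameters for one left stride, following the definitions
    in the text. Illustrative sketch; argument names are ours.

    td_left  : (t1, t2) two successive left-foot touch-downs, seconds
    to_left  : the left-foot toe-off between them, seconds
    td_right : the right-foot touch-down between them, seconds
    """
    stride_duration = td_left[1] - td_left[0]       # TD left -> next TD left
    step_duration = td_right - td_left[0]           # TD left -> TD right
    ground_contact_time = to_left - td_left[0]      # TD left -> TO left
    return {
        "stride_duration": stride_duration,
        # percentages are taken relative to the stride duration (= 100 %)
        "step_duration_pct": 100.0 * step_duration / stride_duration,
        "ground_contact_pct": 100.0 * ground_contact_time / stride_duration,
    }
```

For a perfectly symmetric gait the percentage step duration would be 50%, so deviations from 50% directly quantify temporal asymmetry.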

For the evaluation of the gait direction the recordings were not cut. The target position is the position of the participants’ feet in the initial condition, which was measured for each participant at the beginning. The stop position is the final foot position of the participants in the habituation period and asymmetry period, when walking blindfolded. The direction of gait was determined in MATLAB by establishing a line equation based on the start position and target position of the feet (Eq. 1). The amount of the angle between the two vectors target position and stop position was determined by Eq. 2.

$$\Delta y={y}_{stop}-\left(\frac{y_{target}-{y}_{start}}{x_{target}-{x}_{start}}\cdot \left({x}_{stop}-{x}_{start}\right)+{y}_{start}\right)$$
$${\alpha}_{dev}={\cos}^{-1}\frac{\overrightarrow{\Delta s}\cdot \overrightarrow{\Delta t}}{\left|\overrightarrow{\Delta s}\right|\cdot \left|\overrightarrow{\Delta t}\right|}$$

In Eq. 1, ∆y is the difference between the y-coordinates of the stop vector and the target vector at the same x-coordinate, ystop is the y-coordinate of the stop vector, xstop is the x-coordinate of the stop vector, ytarget is the y-coordinate of the target vector, and xtarget is the x-coordinate of the target vector. A ∆y > 0 was defined as a directional deviation to the left, a ∆y < 0 as a directional deviation to the right.

In Eq. 2, αdev is the magnitude of the directional deviation, \(\overrightarrow{\Delta s}\) is the stop vector, and \(\overrightarrow{\Delta t}\) is the target vector.
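Eq. 2 is simply the angle between the stop vector and the target vector. As a concrete sketch (function name and the example coordinates are illustrative), taking both vectors from the start position:

```python
import numpy as np

def deviation_angle(start, target, stop):
    """Angle in degrees between the target vector and the stop vector,
    both taken from the start position, as in Eq. 2. Positions are
    (x, y) pairs; illustrative sketch only."""
    t = np.asarray(target, dtype=float) - np.asarray(start, dtype=float)  # target vector
    s = np.asarray(stop, dtype=float) - np.asarray(start, dtype=float)    # stop vector
    cos_a = np.dot(s, t) / (np.linalg.norm(s) * np.linalg.norm(t))
    # clip guards against rounding slightly outside [-1, 1] before arccos
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```

For example, a participant starting at (0, 0) aiming at a target at (0, 10) m but stopping at (1, 10) m would show a deviation of about 5.7 degrees.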

To determine the ratios, the data of the conditions RQ and LQ were each divided by the baseline condition (asymmetry period), and the data of the conditions loud and quiet were each divided by the normal condition (habituation period). For the statistical analysis, the ratios of stride duration, step duration, ground contact time, percentage step duration, percentage ground contact time, stride speed, cadence, stride length, step length, and step width were considered. For the gait direction, the angles of the conditions RQ and LQ were subtracted from the angles of the baseline condition. The differences were used for the statistical analysis.
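This normalization amounts to an element-wise division by the reference condition, so a ratio above 1 marks an increase relative to baseline (or normal). A minimal sketch, with illustrative parameter names and numbers:

```python
def condition_ratios(condition_means, reference_means):
    """Divide each gait parameter of a condition (e.g. RQ, LQ, loud,
    quiet) by the corresponding reference condition (baseline or
    normal). A ratio > 1 indicates an increase relative to the
    reference. Names and values are illustrative."""
    return {name: value / reference_means[name]
            for name, value in condition_means.items()}
```

For example, a ground contact time of 0.55 s under LQ against a 0.50 s baseline yields a ratio of 1.10, i.e. a 10% increase.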

Statistical analysis

The results of the parameters are presented as mean values and standard deviations (mean ± SD). Only the deviation of the gait direction in Fig. 5 is given as mean values and standard error (mean + SE). A mixed ANOVA was applied to the temporal, spatial and directional parameters. The mental state (Bf-SR score) was analyzed using a sign test.

Fig. 4

Step width of G1 and G2 at loud and quiet settings during the habituation period. Values are mean ± standard deviation. Significant interactions are marked with * (p < 0.05)

The data were checked with a Shapiro-Wilk test for normal distribution. Levene's test was used to check for homogeneity of variances. The analyses were performed using SPSS (IBM SPSS Statistics, Version 26, Chicago, IL) and the level of significance was set at α = 0.05. If a significant interaction effect was observed, post-hoc t-tests with Bonferroni correction were performed in MATLAB to identify detailed differences between conditions.
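The Bonferroni step amounts to multiplying each raw post-hoc p-value by the number of comparisons (equivalently, testing against α divided by that number). A minimal sketch, with illustrative p-values that are not from the study:

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni correction: multiply each raw p-value by the number
    of comparisons m (capped at 1.0) and test against alpha.
    Illustrative sketch; the input p-values are made up."""
    m = len(p_values)
    adjusted = [min(1.0, p * m) for p in p_values]
    significant = [p < alpha for p in adjusted]
    return adjusted, significant
```

With three post-hoc comparisons, a raw p-value of 0.02 becomes 0.06 after correction and is no longer significant at α = 0.05, which is why an uncorrected "significant" difference can vanish post hoc.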


Habituation period

Considering the temporal parameters of the habituation period, no significant effects were found for step duration, ground contact time, cadence, and stride speed (Table 1).

Table 1 Results of the habituation period

For the spatial parameters, no significant effects were found for stride length and step length. Regarding step width, no main effect of volume was found. However, an interaction effect of volume*pitch was found (F(1,30) = 4.39, p = 0.045, f = 0.38). This effect can be explained by a decrease in step width for G1 (high pitch) and an increase in step width for G2 (low pitch) from loud to quiet (Fig. 4). However, post hoc tests show no significant differences between the respective conditions.

Asymmetry period

In the asymmetry period, there were no main effects of volume or side on the temporal parameters stride duration, step duration, ground contact time, stride speed, and cadence (Table 2). However, an interaction effect of volume*side (F(1,30) = 5.027, p = 0.033, f = 0.41) was found for ground contact time. Post hoc tests show a significantly higher ground contact time of the left leg of G1 (p = 0.046) for the LQ (1.004 ± 0.045) condition compared to RQ (0.978 ± 0.026). A similar trend can be seen for G2 (LQ: 1.004 ± 0.024, RQ: 0.997 ± 0.037), but here no significant difference can be found post hoc. The described effect is shown in Fig. 5 (left).

Table 2 Results of the asymmetry period
Fig. 5

a Ground contact time of G1 and G2 at right quiet (RQ) and left quiet (LQ) settings during the asymmetry period. Values are mean ± standard deviation. Significant interactions are marked with * (p < 0.05). b Directional deviation of G1 and G2 at right quiet (RQ) and left quiet (LQ) settings during the asymmetry period. A positive value is defined as deviation to the left, a negative value as deviation to the right. No significant differences could be found. Values are mean + standard error

For the spatial parameters stride length, step length, and step width, neither main nor interaction effects appeared in the asymmetry period.

Also, no significant main and interaction effects could be found for gait direction. Purely descriptively, however, a tendency of the study participants to walk in the direction to which the louder ground contact sound was heard can be detected (Fig. 5, right).

Assessment of mental state

There was a significant decrease in the Bf-SR score (pre: 12.44 ± 7.28, post: 11.19 ± 7.29) from before measurements to after measurements (p = 0.045) indicating an improvement in mental state.


The present study intended to investigate the influence of the volume of real-time gait sonification on the gait pattern and gait direction of healthy young persons. The results show that an asymmetric volume of ground contact sonification directly influences the ground contact time unilaterally, resulting in a temporal gait asymmetry: the ground contact time of the quiet foot is increased. However, no effects of the asymmetrical volume on spatial parameters of the gait, such as step length, or on walking direction when walking blindfolded were found. Considering the overall volume during the habituation phase, an effect on the step width was revealed, which seems to interact with the pitch of the gait sonification: for G1 (high pitch) a positive relationship between volume and step width becomes apparent, but for G2 (low pitch) a negative one. In addition, the Bf-SR survey showed that the mental state of the study participants improved from the beginning to the end of the measurements. This development is clearly not due to the sound of the sonification, as no differences between the groups can be detected in it. Presumably, the improvement in mood is rather due to the task itself or to its accomplishment.

Previous studies on volume indicated an influence of this parameter on spatio-temporal perception [2, 61,62,63,64,65] and, to a limited extent, on human kinematics [66]. However, we are currently not aware of any studies investigating the influence of volume in gait sonification. In order to make a first step towards a better general understanding of the influence of volume on the effectiveness of MS, explorative hypotheses were tested.

In a first consideration of the results, it seems surprising that volume modification in the asymmetry period did not affect spatial parameters, although volume is predominantly associated with spatial distances, directions, and velocities [64, 65, 75]. The reason why the volume affected the gait pattern of the participants only in the habituation period might be due to a high degree of automation of the gait, which prevented an adjustment to a possibly less economical gait pattern. Also, the unilateral modification of the auditory stimulus in the asymmetry period might have been too small to affect spatial parameters and/or might have been overlaid by proprioceptive, tactile, and vestibular afferences.

We tried to make the volume difference between the two sides as large as possible but still not noticeable, to avoid participants' intentional motion adaptation. Only three of the 32 participants reported having detected a volume difference after the measurements. Several questions follow in this regard. First, does knowledge or recognition of asymmetric volume interfere with (unconscious) motor adaptation? And if so, to what extent could verbal instruction (e.g., “Do not consciously adjust your movement to the sonification.”) counteract this? Second, the question of the optimal volume difference arises. It is possible that the observed effect on ground contact time correlates with the volume difference, similar to reaction time tasks in which lower reaction times can be observed with louder acoustic stimuli [76,77,78], although a comparison regarding the application of the sound and the motor response is not obvious here. If the effect size of increasing volume differences on gait symmetry can be analyzed, this correlation could be a crucial factor in making the use of gait sonification efficient in rehabilitation. However, it should be noted in this context that elderly patients in particular, who could benefit from gait sonification e.g. after stroke, Parkinson’s disease, or arthroplasty, often suffer from hearing loss. If this hearing loss is more pronounced on one side, the volume difference must be adjusted accordingly, or even overcompensated to counteract habituation effects. Finally, based on the results presented here, it can be assumed that the gait pattern of patients with unilateral hearing loss might suffer from the hearing impairment. Although no studies on laterality are currently known, preliminary evidence suggests that hearing impairment leads to an increased risk of falls in the elderly [79, 80].
Again, the use of gait sonification with volume settings adapted to the user could potentially counteract deterioration of gait due to hearing impairment.

With regard to the impact of volume on step width, which was opposite for the two pitches, the influence of pitch on movement and a possible interaction between pitch and volume should also be considered. Early work by Wood [81] showed that pitch and volume interact in human perception in a way that can affect movement reactions. In that experiment, reaction times were measured after hearing a simple syllable that varied in pitch and volume. One-dimensional changes in pitch or volume yielded shorter reaction times than orthogonal changes in both dimensions. Similar psychophysical interactions between pitch and volume have also been found for non-speech sounds [82, 83]. This interdependency of pitch and volume might explain the divergent step width change at low vs. high pitch under increased volume.

Gomez-Andres et al. [50] also showed that the overall pitch of acoustic gait feedback influences the gait symmetry of stroke patients: a high pitch of amplified footstep sounds increased the asymmetry of the patients’ ground contact times, while a low pitch reduced it. Although Gomez-Andres et al. used a different method of sound generation and amplification and a different participant group, the current results show similarities regarding the effect of pitch on gait symmetry.

Furthermore, the results of the asymmetry period in the present study show a clear effect on the temporal parameter ground contact time. Since only the ground contact time was presented acoustically, it can be assumed that the sound-motion relationship was clearly recognizable to the participants and that the sonification had a direct influence on the gait pattern. The mechanism underlying this influence has been investigated and discussed in previous studies. It is hypothesized that the mapping of sound to movement leads to audio-motor coactivation in the CNS; this coactivation occurs, probably unconsciously, because the acoustic stimuli are generated directly by the user’s movement [5, 30, 84]. Due to this close audio-motor coupling, continuous sensorimotor adaptation may take place and, as explained by the forward model, movement adaptation occurs [50]. Regarding the observed effect on ground contact time, it should additionally be considered that the human auditory system perceives rhythmic information and temporal structures particularly clearly [85,86,87], which might have produced a stronger effect on motor timing than on range or direction of motion. This may have favored the increase in ground contact time under reduced volume.
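The mapping described here, in which only the ground contact phase of each step is made audible, can be sketched as a simple event-to-tone translation. The names and data layout below are hypothetical and do not reproduce the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class StepTone:
    side: str          # "left" or "right"
    onset_s: float     # tone starts at touch down
    duration_s: float  # tone lasts exactly as long as ground contact

def sonify_ground_contact(events):
    """Turn (side, touch_down_time, toe_off_time) triples into tones whose
    duration equals the ground contact time, so that only the contact phase
    of each step is audible and the sound-motion relationship stays direct."""
    tones = []
    for side, touch_down, toe_off in events:
        if toe_off <= touch_down:
            continue  # skip malformed events
        tones.append(StepTone(side, touch_down, toe_off - touch_down))
    return tones
```

A longer ground contact then directly produces a longer tone on that side, which is the one-to-one temporal coupling the discussion above relies on.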

In the present study, it can also be assumed that a comparison of the actually perceived sensory information (afferent input) with the expected sensory information (efference copy) revealed a discrepancy, which the participants attempted to compensate for by changing their ground contact time. Since the participants were not informed about the volume modification, the processes described were presumably mainly unconscious. The forward model could therefore explain the observed effects for a repetitive and automated movement such as human gait, especially since visual information was reduced during walking in the present study and participants relied heavily on the sonification as auditory information to maintain automated processes [88].

A limitation of this study is that, in the absence of force or pressure measurements, it cannot be assessed whether the changed ground contact time resulted from altered ground reaction forces. Possibly a stronger heel strike or a more intensive push-off extended the ground contact time during the quiet sound condition; the participants could have (unconsciously) tried to produce a louder sound by applying more force. The additional use of force or pressure plates should clarify this question in the future. Furthermore, it might be useful to replicate the results with a larger sample, which could also clarify whether volume has a statistically significant effect on gait direction when walking blindfolded.


The present study showed that the volume of gait sonification directly affects the gait pattern of healthy young persons. With asymmetrical volume, a unilateral increase in ground contact time was observed on the side with reduced volume. An interaction of pitch and volume was also observed, mainly with an overall change in volume; this can be explained in terms of psychophysical perception and should be considered when using volume for gait sonification. We thus provide first clues for an appropriate sound-motion mapping and a targeted use of volume. Based on the present results, we recommend for gait sonification that temporally asymmetric parameters be presented acoustically on both sides, and that the side on which the movement is shortened be presented more quietly than the other. The user would then respond by amplifying the movement, i.e., increasing its duration, which would improve temporal movement symmetry. Whether the effect of volume modification lasts must be investigated in future intervention studies, and different patient groups should be considered in this context. The present findings can help improve the effect of gait sonification in patients with an asymmetrical gait pattern and thus support a quicker and easier return to a physiological gait.
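The recommended mapping, attenuating the side with the shorter ground contact time in proportion to the asymmetry, could be sketched as follows. The function name, the 6 dB attenuation cap, and the linear scaling are illustrative assumptions, not part of the study:

```python
def side_gains(gct_left, gct_right, max_attenuation_db=6.0):
    """Map a ground-contact-time (GCT) asymmetry to per-side playback gains.

    The side with the shorter GCT is attenuated, scaled by the relative
    asymmetry; the other side stays at full gain. Returns linear amplitude
    factors (left, right), where 1.0 means unchanged volume.
    """
    if gct_left <= 0 or gct_right <= 0:
        raise ValueError("ground contact times must be positive")
    # relative asymmetry: 0 for perfect symmetry, larger for bigger differences
    asym = abs(gct_left - gct_right) / ((gct_left + gct_right) / 2)
    attenuation_db = min(1.0, asym) * max_attenuation_db
    quiet = 10 ** (-attenuation_db / 20)  # dB -> linear amplitude factor
    if gct_left < gct_right:              # left contacts shorter -> left quieter
        return quiet, 1.0
    if gct_right < gct_left:              # right contacts shorter -> right quieter
        return 1.0, quiet
    return 1.0, 1.0                       # symmetric gait -> equal volume
```

With symmetric contact times both gains stay at 1.0; for, say, 0.58 s left vs. 0.62 s right the left channel is attenuated slightly, nudging the user to lengthen the shortened side.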

Availability of data and materials

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

Group 1 (150-250 Hz)
Group 2 (95-112 Hz)
Right quiet
Left quiet
Touch down
Toe off
Movement Sonification
Inertial measurement unit
Mental state scale [74]


  1. Giordano BL, McDonnell J, McAdams S. Hearing living symbols and nonliving icons: category specificities in the cognitive processing of environmental sounds. Brain Cogn. 2010;73:7–19.

  2. Grassi M, Pastore M, Lemaitre G. Looking at the world with your ears: how do we get the size of an object from its sound? Acta Psychol. 2013;143:96–104.

  3. Houben MMJ, Kohlrausch A, Hermes DJ. Perception of the size and speed of rolling balls by sound. Speech Comm. 2004;43:331–45.

  4. Houix O, Lemaitre G, Misdariis N, Susini P, Urdapilleta I. A lexical analysis of environmental sound categories. J Exp Psychol Appl. 2012;18:52–80.

  5. Bangert M, Peschel T, Schlaug G, Rotte M, Drescher D, Hinrichs H, et al. Shared networks for auditory and motor processing in professional pianists: evidence from fMRI conjunction. Neuroimage. 2006;30:917–26.

  6. Haslinger B, Erhard P, Altenmüller E, Schroeder U, Boecker H, Ceballos-Baumann AO. Transmodal sensorimotor networks during action observation in professional pianists. J Cogn Neurosci. 2005;17:282–93.

  7. Lahav A, Saltzman E, Schlaug G. Action representation of sound: audiomotor recognition network while listening to newly acquired actions. J Neurosci. 2007;27:308–14.

  8. Pizzamiglio L, Aprile T, Spitoni G, Pitzalis S, Bates E, D'Amico S, et al. Separate neural systems for processing action- or non-action-related sounds. Neuroimage. 2005;24:852–61.

  9. Chen JL, Penhune VB, Zatorre RJ. Listening to musical rhythms recruits motor regions of the brain. Cereb Cortex. 2008;18:2844–54.

  10. Justen C, Herbert C, Werner K, Raab M. Self vs. other: neural correlates underlying agent identification based on unimodal auditory information as revealed by electrotomography (sLORETA). Neuroscience. 2014;259:25–34.

  11. Kennel C, Hohmann T, Raab M. Action perception via auditory information: agent identification and discrimination with complex movement sounds. J Cogn Psychol. 2014;26:157–65.

  12. Murgia M, Hohmann T, Galmonte A, Raab M, Agostini T. Recognising one's own motor actions through sound: the role of temporal factors. Perception. 2012;41:976–87.

  13. Sevdalis V, Keller PE. Know thy sound: perceiving self and others in musical contexts. Acta Psychol. 2014;152:67–74.

  14. Maes P-J, Leman M, Palmer C, Wanderley MM. Action-based effects on music perception. Front Psychol. 2014;4:1008.

  15. Parise CV, Knorre K, Ernst MO. Natural auditory scene statistics shapes human spatial hearing. Proc Natl Acad Sci U S A. 2014;111:6104–8.

  16. Rusconi E, Kwan B, Giordano BL, Umiltà C, Butterworth B. Spatial representation of pitch height: the SMARC effect. Cognition. 2006;99:113–29.

  17. Sievers B, Polansky L, Casey M, Wheatley T. Music and movement share a dynamic structure that supports universal expressions of emotion. Proc Natl Acad Sci U S A. 2013;110:70–5.

  18. Kennel C, Pizzera A, Hohmann T, Schubotz RI, Murgia M, Agostini T, et al. The perception of natural and modulated movement sounds. Perception. 2014;43:796–804.

  19. Murgia M, Santoro I, Tamburini G, Prpic V, Sors F, Galmonte A, et al. Ecological sounds affect breath duration more than artificial sounds. Psychol Res. 2016;80:76–81.

  20. Pizzera A, Hohmann T, Streese L, Habbig A, Raab M. Long-term effects of acoustic reafference training (ART). Eur J Sport Sci. 2017;17:1279–88.

  21. Ferrigno C, Stoller IS, Shakoor N, Thorp LE, Wimmer MA. The feasibility of using augmented auditory feedback from a pressure detecting insole to reduce the knee adduction moment: a proof of concept study. J Biomech Eng. 2016;138:21014.

  22. He J, Lippmann K, Shakoor N, Ferrigno C, Wimmer MA. Unsupervised gait retraining using a wireless pressure-detecting shoe insole. Gait Posture. 2019;70:408–13.

  23. Wentink EC, Talsma-Kerkdijk EJ, Rietman HS, Veltink P. Feasibility of error-based electrotactile and auditive feedback in prosthetic walking. Prosthetics Orthot Int. 2015;39:255–9.

  24. Song G-B, Ryu HJ. Effects of gait training with rhythmic auditory stimulation on gait ability in stroke patients. J Phys Ther Sci. 2016;28:1403–6.

  25. Thaut MH, McIntosh KW, McIntosh GC, Hoemberg V. Auditory rhythmicity enhances movement and speech motor control in patients with Parkinson's disease; 2001.

  26. Willems AM, Nieuwboer A, Chavret F, Desloovere K, Dom R, Rochester L, et al. The use of rhythmic auditory cues to influence gait in patients with Parkinson's disease, the differential effect for freezers and non-freezers, an explorative study. Disabil Rehabil. 2006;28:721–8.

  27. Wittwer JE, Webster KE, Hill K. Rhythmic auditory cueing to improve walking in patients with neurological conditions other than Parkinson's disease - what is the evidence? Disabil Rehabil. 2013;35:164–76.

  28. Dyer JF, Stapleton P, Rodger MWM. Advantages of melodic over rhythmic movement sonification in bimanual motor skill learning. Exp Brain Res. 2017;235:3129–40.

  29. Dyer JF, Stapleton P, Rodger M. Mapping sonification for perception and action in motor skill learning. Front Neurosci. 2017;11:463.

  30. Effenberg AO, Fehse U, Schmitz G, Krueger B, Mechling H. Movement sonification: effects on motor learning beyond rhythmic adjustments. Front Neurosci. 2016;10:219.

  31. Effenberg AO, Schmitz G. Acceleration and deceleration at constant speed: systematic modulation of motion perception by kinematic sonification. Ann N Y Acad Sci. 2018;1425:52–69.

  32. Reh J, Hwang T-H, Schmitz G, Effenberg AO. Dual mode gait sonification for rehabilitation after unilateral hip arthroplasty. Brain Sci. 2019.

  33. Reh J, Schmitz G, Hwang T-H, Effenberg AO. Acoustic feedback in gait rehabilitation—pre-post effects in patients with unilateral hip arthroplasty. Front Sports Act Living. 2021.

  34. Schmitz G, Effenberg AO. Perceptual effects of auditory information about own and other movements; 2012.

  35. Nikmaram N, Scholz DS, Großbach M, Schmidt SB, Spogis J, Belardinelli P, et al. Musical sonification of arm movements in stroke rehabilitation yields limited benefits. Front Neurosci. 2019;13:1378.

  36. Scholz DS, Rhode S, Großbach M, Rollnik J, Altenmüller E. Moving with music for stroke rehabilitation: a sonification feasibility study. Ann N Y Acad Sci. 2015;1337:69–76.

  37. van Vugt FT, Kafczyk T, Kuhn W, Rollnik JD, Tillmann B, Altenmüller E. The role of auditory feedback in music-supported stroke rehabilitation: a single-blinded randomised controlled intervention. Restor Neurol Neurosci. 2016;34:297–311.

  38. Schaffert N, Mattes K, Effenberg AO. Listen to the boat motion: acoustic information for elite rowers. In: Bresin R, Hermann T, Hunt A, editors; 2010. p. 31–8.

  39. Schaffert N, Mattes K. Designing an acoustic feedback system for on-water rowing training. Int J Comput Sci Sport. 2011;10(2):71–6.

  40. Hasegawa S, Ishijima S, Kato F, Mitake H, Sato M. Realtime sonification of the center of gravity for skiing. Megève, New York: Association for Computing Machinery; 2012. p. 1–4.

  41. O'Brien B, Juhas B, Bieńkiewicz M, Buloup F, Bringoux L, Bourdin C. Sonification of golf putting gesture reduces swing movement variability in novices. Res Q Exerc Sport. 2021;92:301–10.

  42. Schaffert N, Godbout A, Schlueter S, Mattes K. Towards an application of interactive sonification for the forces applied on the pedals during cycling on the Wattbike ergometer. Displays. 2017;50:41–8.

  43. Cesarini D, Hermann T, Ungerechts B. A real-time auditory biofeedback system for sports swimming. In: Stockmann T, Metatla O, Macdonald D, editors. New York: International Community for Auditory Display (ICAD); 2014.

  44. Mezzarobba S, Grassi M, Pellegrini L, Catalan M, Kruger B, Furlanis G, et al. Action observation plus sonification. A novel therapeutic protocol for Parkinson's patient with freezing of gait. Front Neurol. 2017;8:723.

  45. Scholz DS, Rohde S, Nikmaram N, Brückner H-P, Großbach M, Rollnik JD, et al. Sonification of arm movements in stroke rehabilitation - a novel approach in neurologic music therapy. Front Neurol. 2016;7:106.

  46. Thaut MH, Rice RR, Braun Janzen T, Hurt-Thaut CP, McIntosh GC. Rhythmic auditory stimulation for reduction of falls in Parkinson's disease: a randomized controlled study. Clin Rehabil. 2019;33:34–43.

  47. Chen JL, Fujii S, Schlaug G. The use of augmented auditory feedback to improve arm reaching in stroke: a case series. Disabil Rehabil. 2016;38:1115–24.

  48. Schmitz G, Bergmann J, Effenberg AO, Krewer C, Hwang T-H, Müller F. Movement sonification in stroke rehabilitation. Front Neurol. 2018;9:389.

  49. Ghai S, Ghai I, Schmitz G, Effenberg AO. Effect of rhythmic auditory cueing on parkinsonian gait: a systematic review and meta-analysis. Sci Rep. 2018;8:506.

  50. Gomez-Andres A, Grau-Sánchez J, Duarte E, Rodriguez-Fornells A, Tajadura-Jiménez A. Enriching footsteps sounds in gait rehabilitation in chronic stroke patients: a pilot study. Ann N Y Acad Sci. 2020;1467:48–59.

  51. Horsak B, Dlapka R, Iber M, Gorgas A-M, Kiselka A, Gradl C, et al. SONIGait: a wireless instrumented insole device for real-time sonification of gait. J Multimodal User Interfaces. 2016;10:195–206.

  52. Tajadura-Jiménez A, Basia M, Deroy O, Fairhurst M, Marquardt N, Bianchi-Berthouze N. As light as your footsteps: altering walking sounds to change perceived body weight, emotional state and gait. In: Lee MH, Cha S, Nam TJ, editors; 2015. p. 2943–52.

  53. Young WR, Shreve L, Quinn EJ, Craig C, Bronte-Stewart H. Auditory cueing in Parkinson's patients with freezing of gait. What matters most: action-relevance or cue-continuity? Neuropsychologia. 2016;87:54–62.

  54. Brodie MAD, Dean RT, Beijer TR, Canning CG, Smith ST, Menant JC, et al. Symmetry matched auditory cues improve gait steadiness in most people with Parkinson's disease but not in healthy older people. J Parkinsons Dis. 2015;5:105–16.

  55. Dotov DG, Bayard S, Cochen de Cock V, Geny C, Driss V, Garrigue G, et al. Biologically-variable rhythmic auditory cues are superior to isochronous cues in fostering natural gait variability in Parkinson's disease. Gait Posture. 2017;51:64–9.

  56. Ghai S, Ghai I, Effenberg AO. Effect of rhythmic auditory cueing on aging gait: a systematic review and meta-analysis. Aging Dis. 2018;9:901–23.

  57. Rodger MWM, Young WR, Craig CM. Synthesis of walking sounds for alleviating gait disturbances in Parkinson's disease. IEEE Trans Neural Syst Rehabil Eng. 2014;22:543–8.

  58. Wright RL, Elliott MT. Stepping to phase-perturbed metronome cues: multisensory advantage in movement synchrony but not correction. Front Hum Neurosci. 2014;8:724.

  59. Ford MP, Malone LA, Nyikos I, Yelisetty R, Bickel CS. Gait training with progressive external auditory cueing in persons with Parkinson's disease. Arch Phys Med Rehabil. 2010;91:1255–61.

  60. Roerdink M, Bank PJM, Peper CLE, Beek PJ. Walking to the beat of different drums: practical implications for the use of acoustic rhythms in gait rehabilitation. Gait Posture. 2011;33:690–4.

  61. Blauert J. Spatial hearing: the psychophysics of human sound localization. 6th ed. Cambridge: MIT Press; 1996.

  62. Grassi M. Do we hear size or sound? Balls dropped on plates. Percept Psychophys. 2005;67:274–84.

  63. Lipscomb SD, Kim EM. Perceived match between visual parameters and auditory correlates: an experimental multimedia investigation. In: Society for Music Perception and Cognition. Evanston: Proceedings of the 8th International Conference on Music Perception and Cognition; 2004. p. 72–5.

  64. Eitan Z, Schupak A, Marks LE. Louder is higher: cross-modal interaction of loudness change and vertical motion in speeded classification. In: Miyazaki K, Hiraga Y, Adachi M, Nakajima Y, Tsuzaki M, editors. Adelaide: Causal Productions; 2008.

  65. Eitan Z. How pitch and loudness shape musical space and motion. In: Tan S-L, Cohen AJ, Lipscomb SD, Kendall RA, editors. The psychology of music in multimedia. Oxford: Oxford Univ. Press; 2013. p. 165–91.

  66. Darling M, Huber JE. Changes to articulatory kinematics in response to loudness cues in individuals with Parkinson’s disease. J Speech Lang Hear Res. 2011;54:1247–59.

  67. Schaffert N, Janzen TB, Mattes K, Thaut MH. A review on the relationship between sound and movement in sports and rehabilitation. Front Psychol. 2019;10:244.

  68. Vinken PM, Kröger D, Fehse U, Schmitz G, Brock H, Effenberg AO. Auditory coding of human movement kinematics. Multisens Res. 2013;26:533–52.

  69. Robinson DW, Dadson RS. A re-determination of the equal-loudness relations for pure tones. Br J Appl Phys. 1956;7:166–81.

  70. Cabrera D, Tilley S. Parameters for auditory display of height and size. In: International Conference on Auditory Display. Boston: Georgia Institute of Technology; International Community on Auditory Display; 2003.

  71. Eitan Z, Granot RY. How music moves: musical parameters and listeners images of motion. Music Percept. 2006;23:221–48.

  72. Kohn D, Eitan Z. Seeing sound moving: congruence of pitch and loudness with human movement and visual shape. In: Cambouropoulos E, Tsougras C, Mavromatis P, Pastiadis K, editors. ICMPC-ESCOM 2012 joint conference: proceedings. Thessaloniki: Aristotle University of Thessaloniki; 2012.

  73. Singhal P, Agarwala A, Srivastava P. Do pitch and space share common code? Role of feedback in SPARC effect. In: Rogers TM, Rau M, Zhu J, Kalish C, editors. Austin: CogSci; 2018.

  74. Zerssen DV, Petermann F. Bf-SR-Die Befindlichkeits-Skala-Revidierte Fassung. 1. Auflage. Göttingen: Hogrefe; 2011.

  75. Küssner MB, Tidhar D, Prior HM, Leech-Wilkinson D. Musicians are more consistent: gestural cross-modal mappings of pitch, loudness and tempo in real-time. Front Psychol. 2014;5:789.

  76. Brown AM, Kenwell ZR, Maraj BKV, Collins DF. “Go” signal intensity influences the sprint start. Med Sci Sports Exerc. 2008;40:1142–8.

  77. Marshall L, Brandt JF. The relationship between loudness and reaction time in normal hearing listeners. Acta Otolaryngol. 1980;90:244–9.

  78. Sors F, Prpic V, Santoro I, Galmonte A, Agostini T, Murgia M. Loudness, but not shot power, influences simple reaction times to soccer penalty sounds. Psihologija. 2018;51:127–41.

  79. Criter RE, Gustavson M. Subjective hearing difficulty and fall risk. Am J Audiol. 2020;29:384–90.

  80. Xu D, Newell MD, Francis AL. Fall-related injuries mediate the relationship between self-reported hearing loss and mortality in middle-aged and older adults. J Gerontol A Biol Sci Med Sci. 2021;76:e213–20.

  81. Wood CC. Levels of processing in speech perception: neurophysiological and information-processing analyses [unpublished doctoral dissertation]. Yale University; 1973.

  82. Neuhoff JG, Kramer G, Wayand J. Sonification and the interaction of perceptual dimensions: can the data get lost in the map? Atlanta: Georgia Institute of Technology; International Community for Auditory Display; 2000.

  83. Neuhoff JG, Wayand J, Kramer G. Pitch and loudness interact in auditory displays: can the data get lost in the map? J Exp Psychol Appl. 2002;8:17–25.

  84. Bangert M, Altenmüller EO. Mapping perception to action in piano practice: a longitudinal DC-EEG study. BMC Neurosci. 2003;4:26.

  85. Barton B, Venezia JH, Saberi K, Hickok G, Brewer AA. Orthogonal acoustic dimensions define auditory field maps in human cortex. Proc Natl Acad Sci U S A. 2012;109:20738–43.

  86. Boemio A, Fromm S, Braun A, Poeppel D. Hierarchical and asymmetric temporal sensitivity in human auditory cortices. Nat Neurosci. 2005;8:389–95.

  87. Giraud AL, Lorenzi C, Ashburner J, Wable J, Johnsrude I, Frackowiak R, et al. Representation of the temporal envelope of sounds in the human brain. J Neurophysiol. 2000;84:1588–98.

  88. Clark DJ. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies. Front Hum Neurosci. 2015;9:246.



We would like to thank our student co-worker Sven Birger Niehaus for his excellent assistance. We would also like to thank two anonymous reviewers for their helpful comments on an earlier version of this manuscript.


Open Access funding enabled and organized by Projekt DEAL. The publication of this article was funded by the Open Access Fund of the Leibniz Universität Hannover. Neither this funder nor any other was involved in the design of the study; the collection, analysis, or interpretation of data; or the writing of the manuscript.

Author information

Authors and Affiliations



JR drafted the manuscript. AE, GS, and T-HH revised it critically for important intellectual content. AE developed the sonification. AE, GS, and JR developed the framework for gait sonification and the experimental design. T-HH and JR contributed to the software application and sound synthesis and were involved in data analysis. AE, GS, and JR conceived and designed the study. JR conducted the measurements and participant acquisition. All authors read and approved the submitted version of the manuscript.

Corresponding authors

Correspondence to Julia Reh or Alfred O. Effenberg.

Ethics declarations

Ethics approval and consent to participate

The study involving human participants was reviewed and approved by the Ethical Committee of the Leibniz University Hannover. The participants provided their written informed consent to participate in this study.

Consent for publication

Not applicable.

Competing interests

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and Permissions

About this article


Cite this article

Reh, J., Schmitz, G., Hwang, TH. et al. Loudness affects motion: asymmetric volume of auditory feedback results in asymmetric gait in healthy young adults. BMC Musculoskelet Disord 23, 586 (2022).



  • Sonification
  • Asymmetric volume
  • Gait pattern
  • Auditory feedback
  • Ground contact time