This study examined the effect of route previewing strategies on climbing fluency and on exploratory limb movements, in order to understand whether previewing helps people to perceive and realize affordances. Eight inexperienced and ten experienced climbers previewed a 10 m high route of 5b difficulty on the French scale, then climbed it with a top-rope as fluently as possible. Gaze behavior was recorded with an eye-tracking system during the preview, allowing us to determine the number of times the climbers scanned the route and which of four route previewing strategies (fragmentary, ascending, zigzagging, and sequence-of-blocks) they used. Five inertial measurement units (IMUs) (3D accelerometer, 3D gyroscope, 3D magnetometer) were attached to the hip, both feet, and both forearms to analyze the vertical acceleration and direction of each limb and the hip during the ascent. Segmentation and classification processes allowed us to detect movement and immobility phases for each IMU. Depending on whether the limbs and/or hip were moving, five behavioral states were detected: immobility, postural regulation, hold exploration, hold change, and hold traction. Using cluster analysis, we identified four clusters of gaze behavior during route previewing, based on route preview duration, number of scan paths, fixation duration, and use of the ascending, zigzagging, and sequence-of-blocks strategies. The number of scan paths was positively correlated with the relative duration of exploration and negatively correlated with the relative duration of hold changes during the ascent. Additionally, high relative durations of the sequence-of-blocks and zigzagging strategies were associated with a high relative duration of immobility during the ascent. Route previewing might help climbers pick up functional information about reachable, graspable, and usable holds, in order to chain movements together and find the route.
In other words, route previewing might contribute to perceiving and realizing nested affordances.
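The segmentation of each IMU signal into movement and immobility phases could be sketched as follows. This is a minimal illustration, not the study's actual pipeline: it labels samples by thresholding the standard deviation of vertical acceleration in a sliding window, and the sampling rate, window length, and threshold are hypothetical values chosen for the synthetic example. In the study, such per-IMU labels would then be combined across limbs and hip to derive the five behavioral states.

```python
import numpy as np

def detect_moving(acc_z, fs=100, win=0.5, thresh=0.05):
    """Label each sample as moving (True) or immobile (False)
    from the standard deviation of vertical acceleration in a
    centered sliding window.

    fs: sampling rate in Hz; win: window length in seconds;
    thresh: std threshold in the signal's units (all illustrative).
    """
    n = int(win * fs)
    # Pad so every sample gets a full, roughly centered window.
    pad = np.pad(acc_z, (n // 2, n - n // 2 - 1), mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(pad, n)
    return windows.std(axis=1) > thresh

# Synthetic vertical acceleration: 2 s immobile, 2 s moving, 2 s immobile.
rng = np.random.default_rng(0)
fs = 100
still = rng.normal(0.0, 0.01, 2 * fs)   # low-variance noise: immobility
move = rng.normal(0.0, 0.5, 2 * fs)     # high-variance noise: movement
acc_z = np.concatenate([still, move, still])

labels = detect_moving(acc_z, fs=fs)    # one boolean per sample
```

A real implementation would likely also filter the raw signal and merge very short phases, but the window-statistic-plus-threshold idea above is the core of this kind of segmentation.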
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.