1. Nakahashi W. Relationship between trackmakers of the Laetoli footprints from gait synchronization. Evolutionary Human Sciences 2025; 7:e13. PMID: 40297739; PMCID: PMC12034493; DOI: 10.1017/ehs.2025.10.
Abstract
The parallel trails of footprints at Laetoli site G are important fossils for studying the characteristics of Australopithecus afarensis. However, the relationship between the trackmakers, i.e. whether they were an adult male-female pair or a parent-offspring pair, remains unclear. The footprints show that the two individuals walked side by side with a narrow and constant distance between them and synchronized their leg movements and step lengths (gait synchronization), despite a large height difference. In this study, live camera videos were collected to obtain data on gait synchronization in Homo sapiens, the closest extant species to A. afarensis. The data showed that when two humans with a large height difference walked alongside each other with (at least) one of the pair having an arm around the other's shoulder or back, adult male-female pairs (couples) frequently synchronized their gait but parent-offspring pairs did not, whereas both couples and parent-offspring pairs seldom synchronized when they walked side by side without physical connection or with handholding. Two individuals maintained a narrow and constant distance like that between the Laetoli footprints only when they walked with an arm-around connection. Therefore, assuming that A. afarensis had the same gait synchronization tendency as H. sapiens, the trackmakers were more likely an adult male-female pair than a parent-offspring pair.
2. Gashri C, Talmon R, Peleg N, Moshe Y, Agoston D, Gavras S, Fischer AG, Horowitz-Kraus T. Multimodal analysis of mother-child interaction using hyperscanning and diffusion maps. Sci Rep 2025; 15:5431. PMID: 39948429; PMCID: PMC11825838; DOI: 10.1038/s41598-025-90310-x.
Abstract
The current work aims to reveal mother-child synchronization patterns across several interaction modalities and to combine them using the diffusion maps method. Twenty-two Hebrew-speaking toddlers (ages 33 ± 5.38 months, 17 males) and their mothers (ages 35 ± 5.79 years) participated in two interaction conditions while data were collected from several modalities: EEG, joint attention (measured through video coding of looking behavior), and motion analysis. Dimension reduction and data fusion of these modalities were performed using diffusion maps to enable a comprehensive assessment of mother-child synchronization dynamics. This multimodal approach allows better characterization of mother-child interaction and examination of the associations between interaction patterns and maternal parenting style, and of their importance to the child's long-term language abilities.
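The abstract names diffusion maps as the dimension-reduction and fusion tool. The core embedding step can be sketched as follows; this is a generic implementation of the method under illustrative choices (Gaussian kernel, bandwidth `epsilon`, function name), not the authors' code:

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2):
    """Embed samples X (n x d) into a low-dimensional diffusion space.

    A Gaussian kernel builds an affinity matrix, which is row-normalized
    into a Markov transition matrix; the leading nontrivial eigenvectors,
    scaled by their eigenvalues, give the diffusion coordinates.
    """
    # Pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    K = np.exp(-d2 / epsilon)              # Gaussian affinity
    P = K / K.sum(axis=1, keepdims=True)   # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)          # non-symmetric: complex dtype
    order = np.argsort(-vals.real)
    idx = order[1:n_components + 1]        # skip the trivial constant eigenvector
    return vecs[:, idx].real * vals[idx].real
```

Distinct clusters in the input separate along the first diffusion coordinate, which is what makes the method suitable for fusing heterogeneous synchronization signals into a common low-dimensional description.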
Affiliation(s)
- C Gashri: Faculty of Biomedical Engineering, Technion, Haifa, Israel
- R Talmon: Faculty of Biomedical Engineering, Technion, Haifa, Israel
- N Peleg: Faculty of Electrical and Computer Engineering, Technion, Haifa, Israel
- Y Moshe: Faculty of Biomedical Engineering, Technion, Haifa, Israel
- D Agoston: Faculty of Biomedical Engineering, Technion, Haifa, Israel; Department of Mechatronics, Optics and Mechanical Engineering Informatics, Faculty of Mechanical Engineering, Budapest University of Technology and Economics, Budapest, Hungary
- S Gavras: Faculty of Biomedical Engineering, Technion, Haifa, Israel
- A G Fischer: Faculty of Biomedical Engineering, Technion, Haifa, Israel
- T Horowitz-Kraus: Faculty of Biomedical Engineering, Technion, Haifa, Israel; Educational Neuroimaging Group, Faculty of Education in Science and Technology, Technion, Haifa, Israel; Department of Neuropsychology, Center for Neurodevelopmental and Imaging Research (CNIR), Kennedy Krieger Institute, Baltimore, MD, USA; Department of Psychology and Behavioral Sciences, Johns Hopkins University School of Medicine, Baltimore, MD, USA
3. Tomaru T, Nishiyama Y, Feliciani C, Murakami H. Robust spatial self-organization in crowds of asynchronous pedestrians. J R Soc Interface 2024; 21:20240112. PMID: 38807528; PMCID: PMC11338568; DOI: 10.1098/rsif.2024.0112.
Abstract
Human crowds display various self-organized collective behaviours, such as the spontaneous formation of unidirectional lanes in bidirectional pedestrian flows. In addition, a portion of pedestrians' footsteps is known to synchronize spontaneously in one-dimensional, single-file crowds. However, footstep synchronization in crowds with more freedom of movement remains unclear. We conducted experiments on bidirectional pedestrian flows (24 pedestrians in each group) and examined the relationship between collective footsteps and self-organized lane formation. Unlike in previous studies, pedestrians did not spontaneously synchronize their footsteps unless they followed external auditory cues. Moreover, footstep synchronization generated by external cues reduced the flexibility of pedestrians' lateral movements and increased the structural instability of the spatial organization. These results imply that, without external cues, pedestrians marching out of step with each other can efficiently self-organize into robust structures. Understanding how asynchronous individuals contribute to ordered collective behaviour might bring innovative perspectives to research fields concerned with self-organizing systems.
Affiliation(s)
- Takenori Tomaru: Faculty of Information and Human Science, Kyoto Institute of Technology, Kyoto, Japan
- Yuta Nishiyama: Information and Management Systems Engineering, Nagaoka University of Technology, Niigata, Japan
- Claudio Feliciani: Department of Aeronautics and Astronautics, School of Engineering, The University of Tokyo, Tokyo, Japan; Research Center for Advanced Science and Technology, The University of Tokyo, Tokyo, Japan
- Hisashi Murakami: Faculty of Information and Human Science, Kyoto Institute of Technology, Kyoto, Japan
4. Taghavi M, Russello H, Ouweltjes W, Kamphuis C, Adriaens I. Cow key point detection in indoor housing conditions with a deep learning model. J Dairy Sci 2024; 107:2374-2389. PMID: 37863288; DOI: 10.3168/jds.2023-23680.
Abstract
Lameness in dairy cattle is a costly and highly prevalent problem that affects all aspects of sustainable dairy production, including animal welfare. Automation of gait assessment would allow monitoring of locomotion in which the cows' walking patterns can be evaluated frequently and with limited labor. With the right interpretation algorithms, this could result in more timely detection of locomotion problems, which in turn would facilitate timely intervention and early treatment, crucial to reducing the effect of abnormal behavior and pain on animal welfare. Gait features of dairy cows can potentially be derived from key points that locate crucial anatomical points on a cow's body. The aim of this study was twofold: (1) to demonstrate automated detection of dairy cows' key points in a practical indoor setting with natural occlusions from gates and races, and (2) to propose the steps needed to postprocess these key points to make them suitable for subsequent gait feature calculations. Both the automated detection of key points and their postprocessing are crucial prerequisites for camera-based automated locomotion monitoring in a real farm environment. Side-view video footage of 34 Holstein-Friesian dairy cows, captured as they exited the milking parlor, was used for model development. From these videos, 758 samples of 2 successive frames were extracted. A previously developed deep learning model called T-LEAP was trained to detect 17 key points on cows in our indoor farm environment with natural occlusions. To this end, the dataset of 758 samples was randomly split into training (22 cows; 388 samples), validation (7 cows; 108 samples), and test (15 cows; 262 samples) sets. The performance of T-LEAP in automatically assigning key points in our indoor setting was assessed using the percentage of correctly detected key points within a threshold of 0.2 of the head length (PCKh0.2). The model achieved a good result on the test set, with a PCKh0.2 of 89% over all 17 key points together; detection of the key points on the cow's back (n = 3) performed poorest, with a PCKh0.2 of 59%. In addition to this overall performance, a more detailed study of the detection performance was conducted to formulate the postprocessing steps needed to use these key points for gait feature calculations and subsequent automated locomotion monitoring. This detailed study evaluated the detection performance in multiple directions and revealed that the key points on the cow's back performed poorest in the horizontal direction. Based on this in-depth study, we recommend postprocessing techniques that address the following issues: (1) correcting camera distortion, (2) rectifying erroneous key point detections, and (3) translating hoof key points into gait features.
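The PCKh0.2 score reported above counts a detected key point as correct when it falls within 0.2 head lengths of its ground-truth location. A minimal sketch of that computation (the function name and array shapes are illustrative, not the authors' code):

```python
import numpy as np

def pckh(pred, gt, head_lengths, alpha=0.2):
    """Percentage of Correctly detected Keypoints, head-normalized (PCKh).

    pred, gt: arrays of shape (n_samples, n_keypoints, 2) holding (x, y)
    pixel coordinates. head_lengths: per-sample head length in pixels,
    used as the normalizing scale. A key point counts as correct when its
    Euclidean error is at most alpha * head length.
    """
    err = np.linalg.norm(pred - gt, axis=-1)            # (n, k) pixel errors
    thresh = alpha * np.asarray(head_lengths)[:, None]  # per-sample threshold
    return float(np.mean(err <= thresh))
```

Normalizing by head length makes the score invariant to how large the animal appears in the frame, which matters when cows pass the camera at varying distances.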
Affiliation(s)
- M Taghavi: Wageningen Livestock Research, Wageningen University and Research, 6708 WD Wageningen, the Netherlands
- H Russello: Agricultural Biosystems Engineering, Wageningen University and Research, 6700 AA Wageningen, the Netherlands
- W Ouweltjes: Wageningen Livestock Research, Wageningen University and Research, 6708 WD Wageningen, the Netherlands
- C Kamphuis: Wageningen Livestock Research, Wageningen University and Research, 6708 WD Wageningen, the Netherlands
- I Adriaens: Wageningen Livestock Research, Wageningen University and Research, 6708 WD Wageningen, the Netherlands; Department of Biosystems Engineering, Livestock Technology, KU Leuven, 3001 Leuven, Belgium; Department of Mathematical Modelling and Data Analysis, BioVisM, Ghent University, B-9000 Ghent, Belgium
5. Demos AP, Palmer C. Social and nonlinear dynamics unite: musical group synchrony. Trends Cogn Sci 2023; 27:1008-1018. PMID: 37277276; DOI: 10.1016/j.tics.2023.05.005.
Abstract
Synchronization, the human tendency to align behaviors in time with others, is necessary for many survival skills. The ability to synchronize actions with rhythmic (predictable) sound patterns is especially well developed in music making. Recent models of synchrony in musical ensembles rely on pairwise comparisons between group members. This pairwise approach to synchrony has hampered theory development, given current findings from social dynamics indicating shifts in members' influence within larger groups. We draw on social theory and nonlinear dynamics to argue that emergent properties and novel roles arise in musical group synchrony that differ from individual or pairwise behaviors. This transformational shift in defining synchrony sheds light on successful outcomes as well as on disruptions that cause negative behavioral outcomes.
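The group-level view argued for above can be illustrated with the classic Kuramoto model, in which every oscillator couples to the mean field of the whole ensemble rather than to one partner at a time. This is a generic sketch of that idea, not the authors' model; all names and parameter values are illustrative:

```python
import numpy as np

def kuramoto_order(theta):
    """Order parameter r in [0, 1]: 1 means perfect group synchrony."""
    return np.abs(np.mean(np.exp(1j * theta)))

def simulate_kuramoto(omegas, K=2.0, dt=0.01, steps=5000, seed=0):
    """Simulate N globally coupled phase oscillators (Euler integration).

    omegas: natural frequencies (rad/s); K: coupling strength.
    """
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=len(omegas))
    for _ in range(steps):
        # Each oscillator is pulled toward the mean field of the whole
        # group, not toward a single partner - the non-pairwise coupling.
        mean_field = np.mean(np.sin(theta[None, :] - theta[:, None]), axis=1)
        theta = theta + dt * (omegas + K * mean_field)
    return theta
```

The order parameter r is an emergent, ensemble-level quantity: it is defined over the whole group at once and is not reducible to any single pairwise comparison, which is exactly the distinction the review draws.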
Affiliation(s)
- Alexander P Demos: Department of Psychology, University of Illinois Chicago, 1007 W Harrison St., Chicago, IL 60607, USA
- Caroline Palmer: Department of Psychology, McGill University, 1205 Dr Penfield Ave., Montreal, QC H3A 1B1, Canada
6. Szorkovszky A, Veenstra F, Glette K. Central pattern generators evolved for real-time adaptation to rhythmic stimuli. Bioinspiration & Biomimetics 2023; 18:046020. PMID: 37339660; DOI: 10.1088/1748-3190/ace017.
Abstract
For a robot to be both autonomous and collaborative requires the ability to adapt its movement to a variety of external stimuli, whether these come from humans or other robots. Typically, legged robots have oscillation periods explicitly defined as a control parameter, limiting the adaptability of walking gaits. Here we demonstrate a virtual quadruped robot employing a bio-inspired central pattern generator (CPG) that can spontaneously synchronize its movement to a range of rhythmic stimuli. Multi-objective evolutionary algorithms were used to optimize the variation of movement speed and direction as a function of the brain stem drive and the centre of mass control respectively. This was followed by optimization of an additional layer of neurons that filters fluctuating inputs. As a result, a range of CPGs were able to adjust their gait pattern and/or frequency to match the input period. We show how this can be used to facilitate coordinated movement despite differences in morphology, as well as to learn new movement patterns.
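The spontaneous synchronization to rhythmic stimuli described above can be reduced, in its simplest abstraction, to a forced phase oscillator that locks onto an input when the frequency mismatch is within the coupling strength. The sketch below shows only that generic mechanism; the paper's evolved CPG networks are considerably richer:

```python
import numpy as np

def phase_locks(omega, omega_stim, K, dt=0.001, steps=20000):
    """Forced phase oscillator, a minimal CPG abstraction.

    The phase difference psi between oscillator and stimulus obeys
        dpsi/dt = (omega - omega_stim) - K * sin(psi),
    which has a stable fixed point (phase locking) exactly when
    |omega - omega_stim| <= K. Returns True if psi has settled.
    """
    psi = 0.5
    for _ in range(steps):
        psi += dt * ((omega - omega_stim) - K * np.sin(psi))
    drift = (omega - omega_stim) - K * np.sin(psi)
    return bool(abs(drift) < 1e-3)  # negligible drift means locked
```

The locking condition |omega - omega_stim| <= K is the classic Arnold-tongue boundary for this one-dimensional model: a small detuning is absorbed as a constant phase lag, while a large one causes the phases to drift apart indefinitely.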
Affiliation(s)
- Alex Szorkovszky: RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Oslo, Norway; Department of Informatics, University of Oslo, Oslo, Norway
- Frank Veenstra: Department of Informatics, University of Oslo, Oslo, Norway
- Kyrre Glette: RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Oslo, Norway; Department of Informatics, University of Oslo, Oslo, Norway
7. Horowitz-Kraus T, Gashri C. Multimodal Approach for Characterizing the Quality of Parent-Child Interaction: A Single Synchronization Source May Not Tell the Whole Story. Biology 2023; 12:241. PMID: 36829518; PMCID: PMC9952901; DOI: 10.3390/biology12020241.
Abstract
The interaction between parent and child is essential for the child's cognitive and emotional development and sets the path for future well-being. These interactions, starting from birth, are necessary for providing the sensory stimulation the child needs in the critical time window of brain development. The characterization of parent-child interactions is traditionally performed by human decoding, which is considered the leading and most accurate way of characterizing the quality of these interactions. However, the development of computational tools, and especially the concept of parent-child synchronization, has opened up an additional source of data that characterizes these interactions objectively and with less human labor. Such sources include brain-to-brain, voice/speech, eye-contact, motor, and heart-rate synchronization. However, can a single-source synchronization dataset accurately represent parent-child interaction? Should attending to the same stimulation, which often results in higher brain-to-brain synchronization, be considered an interactive condition? In this perspective, we propose the concept of a child-parent interaction synchronization (CHIPS) matrix, which includes the different sources of signals generated during an interaction. Such a model may assist in explaining the source of interaction alterations in the case of child or parent developmental, emotional, or sensory deficits, and may open up new ways of assessing interventions and changes in parent-child interactions along development. We discuss this interaction during storytelling, one of the parent-child joint activities providing opportunities for interaction.
Affiliation(s)
- Tzipi Horowitz-Kraus: Educational Neuroimaging Group, Faculty of Education in Science and Technology, Technion, Haifa 3200003, Israel; Faculty of Biomedical Engineering, Technion, Haifa 3200003, Israel; Neuropsychology Department, Kennedy Krieger Institute, Baltimore, MD 21205, USA; Department of Psychiatry and Behavioral Sciences, Johns Hopkins School of Medicine, Baltimore, MD 21205, USA (correspondence; Tel.: +972-522-989298)
- Carmel Gashri: Faculty of Biomedical Engineering, Technion, Haifa 3200003, Israel
8. Hajnal A, Durgin FH. How frequent is the spontaneous occurrence of synchronized walking in daily life? Exp Brain Res 2023; 241:469-478. PMID: 36576509; DOI: 10.1007/s00221-022-06536-y.
Abstract
Experimental work has suggested that individuals walking side by side may frequently synchronize their steps. The present study created video records of pedestrian activity on pedestrian pathways in order to estimate the frequency of continuous synchronization among pairs of walkers going about their daily lives. About 6% of 498 coded pairs were continuously synchronized. Analysis and modeling of the distributions of frequency differences suggested that while different walkers will tend to have different preferred frequencies for a given speed (i.e., a preferred ratio of step length to step frequency, or walk ratio), they may tend to adjust their walk ratios slightly toward one another's even when they are not synchronizing their steps.
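The walk ratio defined above follows directly from walking speed and step frequency, since speed equals step length times step frequency. A small worked example with illustrative values:

```python
def walk_ratio(speed_m_s, step_freq_hz):
    """Walk ratio: preferred step length divided by step frequency.

    Speed = step length * step frequency, so the step length is first
    recovered from speed and frequency, then divided by the frequency.
    """
    step_length_m = speed_m_s / step_freq_hz
    return step_length_m / step_freq_hz

# e.g. walking at 1.4 m/s with 2.0 steps/s gives a step length of 0.7 m
# and a walk ratio of 0.35 m per (steps/s).
```

Two walkers at the same speed but with different walk ratios necessarily have different step frequencies, which is why mutual adjustment of walk ratios shows up as a narrowing of frequency differences even without full synchronization.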
Affiliation(s)
- Alen Hajnal: School of Psychology, University of Southern Mississippi, 118 College Dr, #5025, Hattiesburg, MS 39406, USA
- Frank H Durgin: Department of Psychology, Swarthmore College, Swarthmore, USA
9. Furukawa H, Kudo K, Kubo K, Ding J, Saito A. Auditory interaction between runners: Does footstep sound affect step frequency of neighboring runners? PLoS One 2023; 18:e0280147. PMID: 36608023; PMCID: PMC9821460; DOI: 10.1371/journal.pone.0280147.
Abstract
This study aimed to investigate the effect of the footsteps of a neighboring runner (NR) on the main runner's step frequency (SF), heart rate (HR), and rating of perceived exertion (RPE). The participants were male long-distance runners belonging to a university track and field team. Two experiments were conducted in which the main runner (participant) and the NR (examiner) ran at the same speed on two adjacent treadmills separated by a thin wall. The participants were told that the experimental purpose was to investigate HR when running with others versus running alone. In Experiment 1, the NR performed three trials in which the footstep tempo was 5 bpm (beats per minute) faster (+5bpmFS), 5 bpm slower (-5bpmFS), or absent (no footsteps, NF). The results showed that the footstep condition affected the variability of the SF but not the mean SF. Experiment 2 expanded the range of tempo conditions: the NR performed seven trials with the footstep tempo changed by ±3 bpm, ±5 bpm, or ±10 bpm, or with no footsteps. The results showed that the footstep condition affected the mean SF, which decreased at -10bpmFS compared with NF. There were no differences in HR or RPE between conditions. These results indicate that the NR's footsteps can influence the SF, although it remains unclear whether footsteps are involved in synchronization between runners. Overall, our findings emphasize the environmental factors that influence running behavior, including the NR's footsteps.
Affiliation(s)
- Hiroaki Furukawa: Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
- Kazutoshi Kudo: Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan; Graduate School of Interdisciplinary Information Studies, The University of Tokyo, Tokyo, Japan
- Kota Kubo: Faculty of Occupational Therapy, Department of Rehabilitation, Kyushu Nutrition Welfare University, Kitakyushu, Fukuoka, Japan
- Jingwei Ding: Graduate School of Human-Environment Studies, Kyushu University, Fukuoka, Japan
- Atsushi Saito: Faculty of Human-Environment Studies, Kyushu University, Fukuoka, Japan
10. Spontaneous gait phase synchronization of human to a wheeled mobile robot with replicating gait-induced upper body oscillating motion. Sci Rep 2022; 12:16275. PMID: 36175591; PMCID: PMC9523056; DOI: 10.1038/s41598-022-20481-4.
Abstract
Synchronization between humans is often observed in daily life, for example in breathing, in hand clapping in crowds, and in walking. It has been reported that pedestrian gait synchronization maximizes walking flow efficiency. As increasingly more mobile robots are developed for practical use, it is important to consider how robots may impact pedestrian flows. While there is research on synchronization phenomena between humans and robots, gait synchronization between humans and robots has yet to be studied, particularly synchronization with wheeled humanoid robots in motion. In this paper, we investigated gait phase synchronization between humans and a wheeled mobile humanoid robot that moved its upper body in three distinct motion patterns: (1) no motion, (2) arm-swinging (as is common for typical mobile humanoids), and (3) arm-swinging combined with a periodic vertical oscillation similar to the movement of the human upper body while walking. A Rayleigh test was performed on the distribution of the obtained gait phase differences under each condition, and a significant distributional bias was confirmed when participants walked with the robot that performed both arm-swinging and vertical oscillation of the upper body. These results suggest that humans can spontaneously synchronize their gaits with wheeled robots that have an oscillating upper body. These findings may be important for the design of robot-integrated urban transportation hubs, such as train stations and airports, where both humans and robots are mobile and a highly efficient flow is required.
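The Rayleigh test used here asks whether the gait phase differences cluster around a preferred value rather than being uniform on the circle. A minimal sketch, using the common large-sample p-value approximation (Zar's formula); the authors' exact implementation is not given in the abstract:

```python
import numpy as np

def rayleigh_test(angles):
    """Rayleigh test for non-uniformity of circular data, e.g. gait
    phase differences in radians.

    Returns (R, p): the mean resultant length and an approximate p-value.
    A small p indicates a biased (clustered) phase distribution.
    """
    angles = np.asarray(angles, dtype=float)
    n = len(angles)
    R = np.abs(np.mean(np.exp(1j * angles)))  # mean resultant length
    z = n * R**2
    # Large-sample approximation for the p-value (Zar, Biostatistical Analysis)
    p = np.exp(-z) * (1.0 + (2.0*z - z**2) / (4.0*n)
                      - (24.0*z - 132.0*z**2 + 76.0*z**3 - 9.0*z**4) / (288.0*n**2))
    return float(R), float(min(max(p, 0.0), 1.0))
```

Under the null hypothesis of a uniform phase distribution, z = nR² is approximately exponentially distributed for large n, so tightly clustered phase differences (as reported for the arm-swinging plus vertical-oscillation condition) drive the p-value toward zero.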
11. Vandevoorde K, Vollenkemper L, Schwan C, Kohlhase M, Schenck W. Using Artificial Intelligence for Assistance Systems to Bring Motor Learning Principles into Real World Motor Tasks. Sensors (Basel) 2022; 22:2481. PMID: 35408094; PMCID: PMC9002555; DOI: 10.3390/s22072481.
Abstract
Humans learn movements naturally, but it takes a lot of time and training to achieve expert performance in motor skills. In this review, we show how modern technologies can support people in learning new motor skills. First, we introduce important concepts in motor control, motor learning, and motor skill learning. We also give an overview of the rapid expansion of machine learning algorithms and sensor technologies for human motion analysis. The integration of motor learning principles, machine learning algorithms, and recent sensor technologies has the potential to yield AI-guided assistance systems for motor skill training. We give our perspective on this integration of different fields, with the aim of transitioning from motor learning research in laboratory settings to real-world environments and real-world motor tasks, and propose a stepwise approach to facilitate this transition.
Affiliation(s)
- Koenraad Vandevoorde: Center for Applied Data Science (CfADS), Faculty of Engineering and Mathematics, Bielefeld University of Applied Sciences, 33619 Bielefeld, Germany (affiliation shared with L.V., C.S., and M.K.)
- Wolfram Schenck: Center for Applied Data Science (CfADS), Faculty of Engineering and Mathematics, Bielefeld University of Applied Sciences, 33619 Bielefeld, Germany
12. Cornman HL, Stenum J, Roemmich RT. Video-based quantification of human movement frequency using pose estimation: A pilot study. PLoS One 2021; 16:e0261450. PMID: 34929012; PMCID: PMC8687570; DOI: 10.1371/journal.pone.0261450.
Abstract
Assessment of repetitive movements (e.g., finger tapping) is a hallmark of motor examinations in several neurologic populations. These assessments are traditionally performed by a human rater via visual inspection; however, advances in computer vision offer potential for remote, quantitative assessment using simple video recordings. Here, we evaluated a pose estimation approach for measurement of human movement frequency from smartphone videos. Ten healthy young participants provided videos of themselves performing five repetitive movement tasks (finger tapping, hand open/close, hand pronation/supination, toe tapping, leg agility) at four target frequencies (1–4 Hz). We assessed the ability of a workflow that incorporated OpenPose (a freely available whole-body pose estimation algorithm) to estimate movement frequencies by comparing against manual frame-by-frame (i.e., ground-truth) measurements for all tasks and target frequencies using repeated measures ANOVA, Pearson’s correlations, and intraclass correlations. Our workflow produced largely accurate estimates of movement frequencies; only the hand open/close task showed a significant difference in the frequencies estimated by pose estimation and manual measurement (while statistically significant, these differences were small in magnitude). All other tasks and frequencies showed no significant differences between pose estimation and manual measurement. Pose estimation-based detections of individual events (e.g., finger taps, hand closures) showed strong correlations (all r>0.99) with manual detections for all tasks and frequencies. In summary, our pose estimation-based workflow accurately tracked repetitive movements in healthy adults across a range of tasks and movement frequencies. Future work will test this approach as a fast, quantitative, video-based approach to assessment of repetitive movements in clinical populations.
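The abstract does not state how the frequencies were computed from the pose estimates; one common approach is to pick the dominant FFT peak of a keypoint trajectory. The sketch below shows that generic approach under illustrative assumptions, not the paper's validated workflow (which was compared against manual frame-by-frame event counts):

```python
import numpy as np

def movement_frequency(signal, fps):
    """Estimate the dominant movement frequency (Hz) of a keypoint
    trajectory (e.g. a fingertip y-coordinate from a pose estimator)
    as the peak of its FFT magnitude spectrum, ignoring DC.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                         # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
```

The frequency resolution is fps / n_samples, so at 30 frames/s a 10-second clip resolves frequencies to 0.1 Hz, comfortably finer than the 1-4 Hz targets used in the study.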
Affiliation(s)
- Hannah L. Cornman: Center for Movement Studies, Kennedy Krieger Institute, Baltimore, MD, USA; Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, MD, USA; University of Maryland School of Medicine, Baltimore, MD, USA
- Jan Stenum: Center for Movement Studies, Kennedy Krieger Institute, Baltimore, MD, USA; Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Ryan T. Roemmich: Center for Movement Studies, Kennedy Krieger Institute, Baltimore, MD, USA; Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, MD, USA
13. Stenum J, Cherry-Allen KM, Pyles CO, Reetzke RD, Vignos MF, Roemmich RT. Applications of Pose Estimation in Human Health and Performance across the Lifespan. Sensors (Basel) 2021; 21:7315. PMID: 34770620; PMCID: PMC8588262; DOI: 10.3390/s21217315.
Abstract
The emergence of pose estimation algorithms represents a potential paradigm shift in the study and assessment of human movement. Human pose estimation algorithms leverage advances in computer vision to track human movement automatically from simple videos recorded using common household devices with relatively low-cost cameras (e.g., smartphones, tablets, laptop computers). In our view, these technologies offer clear and exciting potential to make measurement of human movement substantially more accessible; for example, a clinician could perform a quantitative motor assessment directly in a patient's home, a researcher without access to expensive motion capture equipment could analyze movement kinematics using a smartphone video, and a coach could evaluate player performance with video recordings directly from the field. In this review, we combine expertise and perspectives from physical therapy, speech-language pathology, movement science, and engineering to provide insight into applications of pose estimation in human health and performance. We focus specifically on applications in areas of human development, performance optimization, injury prevention, and motor assessment of persons with neurologic damage or disease. We review relevant literature, share interdisciplinary viewpoints on future applications of these technologies to improve human health and performance, and discuss perceived limitations.
Affiliation(s)
- Jan Stenum: Center for Movement Studies, Kennedy Krieger Institute, Baltimore, MD 21205, USA; Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA
- Kendra M. Cherry-Allen: Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA
- Connor O. Pyles: Johns Hopkins Applied Physics Laboratory, Laurel, MD 20723, USA
- Rachel D. Reetzke: Center for Autism and Related Disorders, Kennedy Krieger Institute, Baltimore, MD 21211, USA; Department of Psychiatry and Behavioral Sciences, Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA
- Michael F. Vignos: Johns Hopkins Applied Physics Laboratory, Laurel, MD 20723, USA
- Ryan T. Roemmich: Center for Movement Studies, Kennedy Krieger Institute, Baltimore, MD 21205, USA; Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA
|
14
|
Peterson SM, Singh SH, Wang NXR, Rao RPN, Brunton BW. Behavioral and Neural Variability of Naturalistic Arm Movements. eNeuro 2021; 8:ENEURO.0007-21.2021. [PMID: 34031100 PMCID: PMC8225404 DOI: 10.1523/eneuro.0007-21.2021] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2021] [Revised: 03/27/2021] [Accepted: 05/04/2021] [Indexed: 11/21/2022] Open
Abstract
Motor behaviors are central to many functions and dysfunctions of the brain, and understanding their neural basis has consequently been a major focus in neuroscience. However, most studies of motor behaviors have been restricted to artificial, repetitive paradigms, far removed from natural movements performed "in the wild." Here, we leveraged recent advances in machine learning and computer vision to analyze intracranial recordings from 12 human subjects during thousands of spontaneous, unstructured arm reach movements, observed over several days for each subject. These naturalistic movements elicited cortical spectral power patterns consistent with findings from controlled paradigms, but with considerable neural variability across subjects and events. We modeled interevent variability using 10 behavioral and environmental features; the most important features explaining this variability were reach angle and day of recording. Our work is among the first studies connecting behavioral and neural variability across cortex in humans during unstructured movements and contributes to our understanding of long-term naturalistic behavior.
Affiliation(s)
- Steven M Peterson
- Department of Biology, University of Washington, Seattle, Washington 98195
- eScience Institute, University of Washington, Seattle, Washington 98195
- Satpreet H Singh
- Department of Electrical & Computer Engineering, University of Washington, Seattle, Washington 98195
- Nancy X R Wang
- IBM Research, San Jose, California 95120
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, Washington 98195
- Rajesh P N Rao
- Department of Electrical & Computer Engineering, University of Washington, Seattle, Washington 98195
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, Washington 98195
- Center for Neurotechnology, University of Washington, Seattle, Washington 98195
- Bingni W Brunton
- Department of Biology, University of Washington, Seattle, Washington 98195
- eScience Institute, University of Washington, Seattle, Washington 98195
|
15
|
Stenum J, Rossi C, Roemmich RT. Two-dimensional video-based analysis of human gait using pose estimation. PLoS Comput Biol 2021; 17:e1008935. [PMID: 33891585 PMCID: PMC8099131 DOI: 10.1371/journal.pcbi.1008935] [Citation(s) in RCA: 107] [Impact Index Per Article: 26.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2020] [Revised: 05/05/2021] [Accepted: 04/01/2021] [Indexed: 12/29/2022] Open
Abstract
Human gait analysis is often conducted in clinical and basic research, but many common approaches (e.g., three-dimensional motion capture, wearables) are expensive, immobile, data-limited, and require expertise. Recent advances in video-based pose estimation suggest potential for gait analysis using two-dimensional video collected from readily accessible devices (e.g., smartphones). To date, several studies have extracted features of human gait using markerless pose estimation. However, we currently lack evaluation of video-based approaches using a dataset of human gait for a wide range of gait parameters on a stride-by-stride basis and a workflow for performing gait analysis from video. Here, we compared spatiotemporal and sagittal kinematic gait parameters measured with OpenPose (open-source video-based human pose estimation) against simultaneously recorded three-dimensional motion capture from overground walking of healthy adults. When assessing all individual steps in the walking bouts, we observed mean absolute errors between motion capture and OpenPose of 0.02 s for temporal gait parameters (i.e., step time, stance time, swing time and double support time) and 0.049 m for step lengths. Accuracy improved when spatiotemporal gait parameters were calculated as individual participant mean values: mean absolute error was 0.01 s for temporal gait parameters and 0.018 m for step lengths. The greatest difference in gait speed between motion capture and OpenPose was less than 0.10 m s⁻¹. Mean absolute errors of sagittal-plane hip, knee and ankle angles between motion capture and OpenPose were 4.0°, 5.6° and 7.4°, respectively. Our analysis workflow is freely available, involves minimal user input, and does not require prior gait analysis expertise. Finally, we offer suggestions and considerations for future applications of pose estimation for human gait analysis.
There is a growing interest among clinicians and researchers to use novel pose estimation algorithms that automatically track human movement to analyze human gait. Gait analysis is routinely conducted in designated laboratories with specialized equipment. On the other hand, pose estimation relies on digital videos that can be recorded from household devices such as a smartphone. As a result, these new techniques make it possible to move beyond the laboratory and perform gait analysis in other settings such as the home or clinic. Before such techniques are adopted, we identify a critical need to compare outcome parameters against three-dimensional motion capture and to evaluate how camera viewpoint affects outcome parameters. We used simultaneous motion capture and left- and right-side video recordings of healthy human gait and calculated spatiotemporal gait parameters and lower-limb joint angles. We find that our provided workflow estimates spatiotemporal gait parameters together with hip and knee angles with the accuracy and precision needed to detect changes in the gait pattern. We demonstrate that the position of the participant relative to the camera affects spatial measures such as step length, and we discuss the limitations posed by the current approach.
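To illustrate how spatiotemporal gait parameters of the kind validated above can be derived from 2-D keypoints such as those OpenPose outputs, the sketch below estimates step times and step lengths from sagittal-view ankle trajectories. The heel-strike heuristic (peak horizontal ankle separation), the function name, and the pixel-to-metre scaling are illustrative assumptions, not the paper's published workflow:

```python
import numpy as np

def step_events(ankle_x_left, ankle_x_right, fps, m_per_px):
    """Estimate step times (s) and step lengths (m) from sagittal-view
    ankle x-coordinates (pixels). Heel strikes are approximated as local
    maxima of horizontal ankle separation -- a common heuristic, not the
    event-detection method used in the paper itself."""
    sep = np.abs(np.asarray(ankle_x_left) - np.asarray(ankle_x_right))
    # Indices where separation is a strict local maximum (candidate heel strikes)
    peaks = np.where((sep[1:-1] > sep[:-2]) & (sep[1:-1] > sep[2:]))[0] + 1
    step_times = np.diff(peaks) / fps        # time between successive strikes
    step_lengths = sep[peaks] * m_per_px     # ankle separation at each strike
    return step_times, step_lengths
```

On real keypoint data the separation signal should first be low-pass filtered, since frame-to-frame jitter in pose estimates produces spurious local maxima.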
Affiliation(s)
- Jan Stenum
- Center for Movement Studies, Kennedy Krieger Institute, Baltimore, Maryland, United States of America
- Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, Maryland, United States of America
- Cristina Rossi
- Center for Movement Studies, Kennedy Krieger Institute, Baltimore, Maryland, United States of America
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, United States of America
- Ryan T. Roemmich
- Center for Movement Studies, Kennedy Krieger Institute, Baltimore, Maryland, United States of America
- Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, Maryland, United States of America
|
16
|
Spontaneous synchronization of motion in pedestrian crowds of different densities. Nat Hum Behav 2021; 5:447-457. [PMID: 33398140 DOI: 10.1038/s41562-020-00997-3] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2020] [Accepted: 10/12/2020] [Indexed: 01/28/2023]
Abstract
Interacting pedestrians in a crowd spontaneously adjust their footsteps and align their respective stepping phases. This self-organization phenomenon is known as synchronization. However, it is unclear why and how synchronization forms spontaneously under different density conditions, or what functional benefit synchronization offers for the collective motion of humans. Here, we conducted a single-file crowd motion experiment that directly tracked the alternating movement of both legs of interacting pedestrians. We show that synchronization is most likely to be triggered at the same density at which the flow rate of pedestrians reaches a maximum value. We demonstrate that synchronization is established in response to an insufficient safety distance between pedestrians, and that it enables pedestrians to realize efficient collective stepping motion without the occurrence of inter-person collisions. These findings provide insights into the collective motion behaviour of humans and may have implications for understanding pedestrian synchronization-induced wobbling, for example, of bridges.
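The stepping-phase alignment described above can be quantified in several ways; one standard measure is the phase-locking value (PLV) between two oscillatory gait signals. The sketch below is an illustration of that general idea, not the synchronization measure used in this particular study:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(sig_a, sig_b):
    """Phase-locking value between two oscillatory signals (e.g. vertical
    foot displacement of two pedestrians). PLV = 1 means a constant phase
    difference (perfect synchronization); values near 0 mean no stable
    phase relation."""
    # Instantaneous phase via the analytic signal (Hilbert transform)
    phase_a = np.angle(hilbert(np.asarray(sig_a) - np.mean(sig_a)))
    phase_b = np.angle(hilbert(np.asarray(sig_b) - np.mean(sig_b)))
    # Magnitude of the mean unit phasor of the phase difference
    return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))
```

In practice the input signals should cover many step cycles, since the Hilbert phase estimate is unreliable near the edges of short recordings.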
|
17
|
Felsberg DT, Rhea CK. Spontaneous Interpersonal Synchronization of Gait: A Systematic Review. Arch Rehabil Res Clin Transl 2021; 3:100097. [PMID: 33778472 PMCID: PMC7984988 DOI: 10.1016/j.arrct.2020.100097] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
Abstract
Objective: To systematically review the existing evidence of spontaneous synchronization in human gait. Data Sources: EBSCO, PubMed, Google Scholar, and PsycINFO were searched from inception to July 2020 using all possible combinations of (1) “spontaneous interpersonal synchronization” or “spontaneous interpersonal coordination” or “unintentional interpersonal synchronization” or “unintentional interpersonal coordination” and (2) “human movement” or “movement” or “walking” or “ambulation” or “gait.” Study Selection: Studies had to focus on spontaneous synchronization in human gait, be published in a peer-reviewed journal, present original data (no review articles were included), and be written in English. The search yielded 137 results, and 16 studies met the inclusion criteria. Data Extraction: Participant demographics, study purpose, setup, procedure, biomechanical measurement, coordination analytical technique, and findings were extracted. Our synthesis focused on the context in which this phenomenon has been studied, the role of sensory information in the emergence of spontaneous interpersonal synchronization in human gait, and the metrics used to quantify this behavior. Data Synthesis: The 16 included articles, published between 2007 and 2019, used healthy, primarily young subjects to investigate the role of spontaneous interpersonal synchronization in gait behavior, with the majority using a side-by-side walking/running paradigm. All articles reported data supporting spontaneous interpersonal synchronization, with the strength of the synchronization depending on the sensory information available to the participants. Conclusions: Walking alongside a person with an intact locomotor system may provide an effective and biologically variable attractor signal for rehabilitation of gait behavior. Future research should focus on the utility of spontaneous interpersonal synchronization in clinical populations as a noninvasive method to enhance gait rehabilitation.
Affiliation(s)
- Danielle T Felsberg
- Department of Kinesiology, University of North Carolina at Greensboro, Greensboro, North Carolina
- Christopher K Rhea
- Department of Kinesiology, University of North Carolina at Greensboro, Greensboro, North Carolina
|
19
|
Chambers C, Seethapathi N, Saluja R, Loeb H, Pierce SR, Bogen DK, Prosser L, Johnson MJ, Kording KP. Computer Vision to Automatically Assess Infant Neuromotor Risk. IEEE Trans Neural Syst Rehabil Eng 2020; 28:2431-2442. [PMID: 33021933 PMCID: PMC8011647 DOI: 10.1109/tnsre.2020.3029121] [Citation(s) in RCA: 44] [Impact Index Per Article: 8.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
Abstract
An infant's risk of developing neuromotor impairment is primarily assessed through visual examination by specialized clinicians. Therefore, many infants at risk for impairment go undetected, particularly in under-resourced environments. There is thus a need to develop automated, clinical assessments based on quantitative measures from widely-available sources, such as videos recorded on a mobile device. Here, we automatically extract body poses and movement kinematics from the videos of at-risk infants (N = 19). For each infant, we calculate how much they deviate from a group of healthy infants (N = 85 online videos) using a Naïve Gaussian Bayesian Surprise metric. After pre-registering our Bayesian Surprise calculations, we find that infants who are at high risk for impairments deviate considerably from the healthy group. Our simple method, provided as an open-source toolkit, thus shows promise as the basis for an automated and low-cost assessment of risk based on video recordings.
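The "Naïve Gaussian Bayesian Surprise" idea — scoring how far an infant's movement features fall from a healthy reference group under independent Gaussian fits — can be sketched as below. Function and variable names are illustrative; the authors' open-source toolkit defines its own interface:

```python
import numpy as np

def gaussian_surprise(healthy, infant):
    """Surprise of one infant's feature vector under independent Gaussians
    fit to a healthy reference group (rows = infants, columns = kinematic
    features). Higher values indicate larger deviation from the healthy
    group. A sketch of the naive-Bayes idea, not the authors' exact code."""
    healthy = np.asarray(healthy, dtype=float)
    mu = healthy.mean(axis=0)
    sigma = healthy.std(axis=0) + 1e-12   # avoid division by zero
    # Negative log-likelihood summed over features (independence assumption)
    nll = 0.5 * np.log(2 * np.pi * sigma**2) + (infant - mu)**2 / (2 * sigma**2)
    return float(np.sum(nll))
```

Because the per-feature Gaussians are fit independently, correlated kinematic features inflate the score; the metric is best read as a relative ranking of deviation rather than a calibrated probability.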
|
20
|
Soczawa-Stronczyk AA, Bocian M. Gait coordination in overground walking with a virtual reality avatar. ROYAL SOCIETY OPEN SCIENCE 2020; 7:200622. [PMID: 32874653 PMCID: PMC7428218 DOI: 10.1098/rsos.200622] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/15/2020] [Accepted: 06/19/2020] [Indexed: 06/11/2023]
Abstract
Little information is currently available on interpersonal gait synchronization in overground walking. This is caused by difficulties in continuous gait monitoring over many steps while ensuring repeatability of experimental conditions. These challenges could be overcome by using immersive virtual reality (VR), assuming it offers ecological validity. To this end, this study provides some of the first evidence of gait coordination patterns for overground walking dyads in VR. Six subjects covered a total distance of 27 km while walking with a pacer. The pacer was either a real human subject or their anatomically and biomechanically representative VR avatar driven by an artificial intelligence algorithm. Side-by-side and front-to-back arrangements were tested with and without the instruction to synchronize steps. Little evidence of spontaneous gait coordination was found in either visual condition, but persistent gait coordination patterns were found in the case of intentional synchronization. In the latter case, the front-to-back arrangement consistently yielded a higher mean synchronization strength index than the side-by-side arrangement. Although the mean magnitude of the synchronization strength index was overall comparable in both visual conditions when walking under the instruction to synchronize steps, quantitative and qualitative differences were found that might be associated with common limitations of VR solutions.
Affiliation(s)
- Mateusz Bocian
- School of Engineering, University of Leicester, Leicester, UK
- Biomechanics and Immersive Technology Laboratory, University of Leicester, Leicester, UK
|
21
|
Stupacher J, Witek MAG, Vuoskoski JK, Vuust P. Cultural Familiarity and Individual Musical Taste Differently Affect Social Bonding when Moving to Music. Sci Rep 2020; 10:10015. [PMID: 32572038 PMCID: PMC7308378 DOI: 10.1038/s41598-020-66529-1] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2020] [Accepted: 05/07/2020] [Indexed: 11/17/2022] Open
Abstract
Social bonds are essential for our health and well-being. Music provides a unique and implicit context for social bonding by introducing temporal and affective frameworks, which facilitate movement synchronization and increase affiliation. How these frameworks are modulated by cultural familiarity and individual musical preferences remain open questions. In three experiments, we operationalized the affective aspects of social interactions as ratings of interpersonal closeness between two walking stick-figures in a video. These figures represented a virtual self and a virtual other person. The temporal aspects of social interactions were manipulated by movement synchrony: while the virtual self always moved in time with the beat of instrumental music, the virtual other moved either synchronously or asynchronously. When the context-providing music was more enjoyed, social closeness increased strongly with a synchronized virtual other, but only weakly with an asynchronized virtual other. When the music was more familiar, social closeness was higher independent of movement synchrony. We conclude that the social context provided by music can strengthen interpersonal closeness by increasing temporal and affective self-other overlaps. Individual musical preferences might be more relevant for the influence of movement synchrony on social bonding than musical familiarity.
Affiliation(s)
- Jan Stupacher
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
- Institute of Psychology, University of Graz, Graz, Austria
- Maria A G Witek
- Department of Music, University of Birmingham, Birmingham, United Kingdom
- Jonna K Vuoskoski
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Movement, Department of Musicology & Department of Psychology, University of Oslo, Oslo, Norway
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
|
22
|
Mathis MW, Mathis A. Deep learning tools for the measurement of animal behavior in neuroscience. Curr Opin Neurobiol 2020; 60:1-11. [DOI: 10.1016/j.conb.2019.10.008] [Citation(s) in RCA: 186] [Impact Index Per Article: 37.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/05/2019] [Revised: 10/24/2019] [Accepted: 10/29/2019] [Indexed: 01/21/2023]
|