1. Lagerstedt E, Thill S. Multiple Roles of Multimodality Among Interacting Agents. ACM Transactions on Human-Robot Interaction 2022. DOI: 10.1145/3549955. [Indexed: 10/17/2022]
Abstract
The term ‘multimodality’ has come to take on several somewhat different meanings depending on the underlying theoretical paradigms and traditions, and on the purpose and context of use. The term is closely related to embodiment, which in turn is also used in several different ways. In this paper, we elaborate on this connection and propose that a pragmatic and pluralistic stance on multimodality is appropriate. We further propose a distinction between first- and second-order effects of multimodality: what is achieved by multiple modalities in isolation, and the opportunities that emerge when several modalities are entangled. This highlights questions regarding ways to cluster or interchange different modalities, for example through redundancy or degeneracy. Apart from discussing multimodality with respect to an individual agent, we further look to more distributed agents and situations where social aspects become relevant.
In robotics, understanding the various uses and interpretations of these terms can prevent miscommunication when designing robots, as well as increase awareness of the underlying theoretical concepts. Given the complexity of the different ways in which multimodality is relevant in social robotics, this can provide the basis for negotiating appropriate meanings of the term on a case-by-case basis.
Affiliation(s)
- Serge Thill, Donders Institute for Brain, Cognition, and Behaviour, Radboud University, The Netherlands
2. Requirements for Robotic Interpretation of Social Signals “in the Wild”: Insights from Diagnostic Criteria of Autism Spectrum Disorder. Information 2020. DOI: 10.3390/info11020081. [Indexed: 12/27/2022] Open Access
Abstract
The last few decades have seen widespread advances in technological means to characterise observable aspects of human behaviour such as gaze or posture. Among others, these developments have also led to significant advances in social robotics. At the same time, however, social robots are still largely evaluated in idealised or laboratory conditions, and it remains unclear whether the technological progress is sufficient to let such robots move “into the wild”. In this paper, we characterise the problems that a social robot in the real world may face, and review the technological state of the art in terms of addressing these. We do this by considering what it would entail to automate the diagnosis of Autism Spectrum Disorder (ASD). Just as for social robotics, ASD diagnosis fundamentally requires the ability to characterise human behaviour from observable aspects. However, therapists provide clear criteria regarding what to look for. As such, ASD diagnosis is a situation that is both relevant to real-world social robotics and comes with clear metrics. Overall, we demonstrate that even with relatively clear therapist-provided criteria and current technological progress, the need to interpret covert behaviour cannot yet be fully addressed. Our discussion has clear implications for ASD diagnosis, but also for social robotics more generally. For ASD diagnosis, we provide a classification of criteria based on whether or not they depend on covert information, and highlight present-day possibilities for supporting therapists in diagnosis through technological means. For social robotics, we highlight the fundamental role of covert behaviour, show that the current state of the art is unable to characterise it, and emphasise that future research should tackle this explicitly in realistic settings.