1
López-Mochales S, Aparicio-Terrés R, Díaz-Andreu M, Escera C. Acoustic perception and emotion evocation by rock art soundscapes of Altai (Russia). Front Psychol 2023; 14:1188567. PMID: 37794915; PMCID: PMC10546042; DOI: 10.3389/fpsyg.2023.1188567.
Abstract
The major goal of psychoarchaeoacoustics is to understand the motivations and emotions of past communities when they selected particular acoustic environments for activities involving the production of paintings and carvings. Within this framework, the present study explores whether a group of archaeological rock art sites in Altai (Siberia, Russia) is distinguished by particular acoustic imprints that elicit distinct perceptual and emotional reactions in listeners. Sixty participants were presented with a series of natural sounds convolved with six impulse responses from Altai, three recorded in front of rock art panels and three in comparable locations without any trace of rock art. Participants were asked about their subjective perception of the sounds using 10 psychoacoustic and emotional scales. Mixed ANOVA analyses revealed that the feelings of "presence," "closeness," and "tension" evoked by all sounds were significantly influenced by location. These effects were attributed to differences in reverberation between the locations with and without rock art. Although the results are not consistent across all the studied rock art sites, and several limitations must be acknowledged, the study highlights the value of its methodology and stresses the importance of letting these limitations shape future research.
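The auralization step described in this abstract (convolving dry recordings with measured impulse responses) can be sketched in a few lines. The signals below are synthetic stand-ins, not the study's stimuli or the Altai impulse responses; this only illustrates the general convolution technique.

```python
import numpy as np
from scipy.signal import fftconvolve

def auralize(dry: np.ndarray, ir: np.ndarray) -> np.ndarray:
    """Convolve a dry recording with a measured impulse response
    and renormalize the result to avoid clipping."""
    wet = fftconvolve(dry, ir, mode="full")
    return wet / np.max(np.abs(wet))

# Toy signals standing in for a natural sound and a measured IR.
rng = np.random.default_rng(0)
dry = rng.standard_normal(48_000)          # 1 s of "sound" at 48 kHz
ir = np.exp(-np.linspace(0, 8, 24_000))    # decaying 0.5 s "room" response
wet = auralize(dry, ir)
print(wet.shape)  # full convolution: len(dry) + len(ir) - 1 samples
```

Convolving the same dry sound with IRs from different locations yields stimuli that differ only in their acoustic imprint, which is what makes the between-location comparison possible.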
Affiliation(s)
- Samantha López-Mochales
- Brainlab – Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Universitat de Barcelona (UB), Barcelona, Spain
- Institut de Neurociències, Universitat de Barcelona (UB), Barcelona, Spain
- Raquel Aparicio-Terrés
- Brainlab – Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Universitat de Barcelona (UB), Barcelona, Spain
- Institut de Neurociències, Universitat de Barcelona (UB), Barcelona, Spain
- Margarita Díaz-Andreu
- Departament d’Història i Arqueologia, Universitat de Barcelona (UB), Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Institut d’Arqueologia de la Universitat de Barcelona (IUAB), Universitat de Barcelona (UB), Barcelona, Spain
- Carles Escera
- Brainlab – Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Universitat de Barcelona (UB), Barcelona, Spain
- Institut de Neurociències, Universitat de Barcelona (UB), Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Institut de Recerca Sant Joan de Déu (IRSJD), Esplugues de Llobregat, Spain
2
Hauck P, Hecht H. Emotionally congruent music and text increase immersion and appraisal. PLoS One 2023; 18:e0280019. PMID: 36634102; PMCID: PMC9836297; DOI: 10.1371/journal.pone.0280019.
Abstract
Numerous studies indicate that listening to music and reading are processes that interact in multiple ways. However, these interactions have rarely been explored with regard to the role of emotional mood. In this study, we first conducted two pilot experiments to assess the emotional mood conveyed by four classical music pieces and by four narrative text excerpts. In the main experiment, participants read the texts while listening to the music and rated their emotional state in terms of valence, arousal, and dominance. Subsequently, they rated the text and music of the multisensory event in terms of perceived mood, liking, immersion, and music-text fit. We found a mutual carry-over effect of happy and sad moods from music to text and vice versa. Against our expectations, this effect was not mediated by the valence, arousal, or dominance experienced by the subject. Moreover, there was a significant interaction between music mood and text mood: texts were liked better, rated as of higher quality, and experienced as more immersive when text mood and music mood corresponded. The role of mood congruence when listening to music while reading should not be ignored and deserves further exploration.
Affiliation(s)
- Pia Hauck
- Department of General Experimental Psychology, Johannes Gutenberg-Universität Mainz, Mainz, Germany
- Institute for Research on Reading and Media, Stiftung Lesen, Mainz, Germany
- Heiko Hecht
- Department of General Experimental Psychology, Johannes Gutenberg-Universität Mainz, Mainz, Germany
3
López-Mochales S, Jiménez-Pasalodos R, Valenzuela J, Gutiérrez-Cajaraville C, Díaz-Andreu M, Escera C. Experimental Enhancement of Feelings of Transcendence, Tenderness, and Expressiveness by Music in Christian Liturgical Spaces. Front Psychol 2022; 13:844029. PMID: 35360627; PMCID: PMC8960987; DOI: 10.3389/fpsyg.2022.844029.
Abstract
In western cultures, when it comes to places of worship and liturgies, music, acoustics, and architecture go hand in hand. In the present study, we investigated whether the emotions evoked by music are enhanced by the acoustics of the space the music was composed to be performed in. We explored whether the emotional responses of naïve western listeners to two Renaissance vocal pieces, one liturgical and one secular, convolved with the impulse responses of four Christian temples from the United Kingdom, were modulated by the appropriate piece/space matching. In an alternative forced-choice task in which participants indicated their preference for the original recording of a piece (not convolved with any temple-like acoustics) versus the convolved one, no significant differences were found. However, in the tasks where participants rated their emotional response to each piece and acoustic condition, factorial ANCOVA analyses revealed significant effects. Across pieces and spaces, participants rated the temple-like acoustics as more transcendent than the acoustics of the original versions. In addition, they rated the secular piece as more tender and the liturgical piece as more expressive in their original versions than in the convolved ones. We conclude that the acoustic signature of the four Christian temples heightens certain emotions in listeners, although this effect is not tied to one particular musical piece.
Affiliation(s)
- Samantha López-Mochales
- Brainlab - Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Faculty of Psychology, University of Barcelona, Barcelona, Spain
- Institute of Neurosciences, University of Barcelona, Barcelona, Spain
- Raquel Jiménez-Pasalodos
- Departament d'Història i Arqueologia, Universitat de Barcelona, Barcelona, Spain
- Sección Departamental de Historia y Ciencias de la Música, Universidad de Valladolid, Valladolid, Spain
- Jose Valenzuela
- Brainlab - Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Faculty of Psychology, University of Barcelona, Barcelona, Spain
- Institute of Neurosciences, University of Barcelona, Barcelona, Spain
- Margarita Díaz-Andreu
- Departament d'Història i Arqueologia, Universitat de Barcelona, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Institut d'Arqueologia de la Universitat de Barcelona (IAUB), Barcelona, Spain
- Carles Escera
- Brainlab - Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Faculty of Psychology, University of Barcelona, Barcelona, Spain
- Institute of Neurosciences, University of Barcelona, Barcelona, Spain
- Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Institut de Recerca Sant Joan de Déu (IRSJD), Esplugues de Llobregat, Spain
4
He JX, Zhou L, Liu ZT, Hu XY. Digital Empirical Research of Influencing Factors of Musical Emotion Classification Based on Pleasure-Arousal Musical Emotion Fuzzy Model. Journal of Advanced Computational Intelligence and Intelligent Informatics 2020. DOI: 10.20965/jaciii.2020.p0872.
Abstract
In recent years, with further breakthroughs in artificial intelligence theory and technology and the continuing growth of the Internet, recognizing human emotions and satisfying human psychological needs, in addition to accomplishing physical tasks, have become prominent goals for future artificial intelligence. Musical emotion classification is an important research topic in this field, and its key premise is a musical emotion model that matches the characteristics of music emotion recognition. Three types of music emotion classification models are currently available: discrete category models, continuous dimensional models, and music emotion-specific models. The pleasure-arousal music emotion fuzzy model, which covers a wider range of emotions than the other models, is selected as the classification system in this study to investigate the factors influencing musical emotion classification. Two representative emotional attributes, speed and strength, are used as variables. Based on test experiments involving music and non-music majors combined with questionnaire results, the relationship between musical properties and emotional changes under the pleasure-arousal model is revealed quantitatively.
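To make the idea of a two-dimensional pleasure-arousal plane concrete, the toy function below places an excerpt in such a plane from its speed (tempo) and strength (intensity). The anchor ranges and weights are invented for this illustration, and a crisp linear mapping stands in for the paper's fuzzy model, which is not reproduced here.

```python
def pleasure_arousal(tempo_bpm: float, intensity_db: float) -> tuple[float, float]:
    """Map tempo and intensity onto a [-1, 1] pleasure-arousal plane.
    The anchor points (60-180 BPM, 40-90 dB) are illustrative only."""
    def scale(x: float, lo: float, hi: float) -> float:
        # Linearly rescale x from [lo, hi] to [-1, 1], clipping at the ends.
        return max(-1.0, min(1.0, 2 * (x - lo) / (hi - lo) - 1))

    # Faster and louder excerpts count as more arousing (weights assumed).
    arousal = 0.6 * scale(tempo_bpm, 60, 180) + 0.4 * scale(intensity_db, 40, 90)
    # In this toy mapping, moderate tempo and intensity read as most pleasant.
    pleasure = 1 - 0.5 * abs(scale(tempo_bpm, 60, 180)) \
                 - 0.5 * abs(scale(intensity_db, 40, 90))
    return pleasure, arousal

print(pleasure_arousal(120, 65))   # a mid-tempo, mid-intensity excerpt
print(pleasure_arousal(180, 90))   # fast and loud: maximal arousal here
```

A fuzzy version would replace `scale` with overlapping membership functions (e.g. "slow", "moderate", "fast") and aggregate rule outputs instead of taking a weighted sum.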
5
Valenzuela J, Díaz-Andreu M, Escera C. Psychology Meets Archaeology: Psychoarchaeoacoustics for Understanding Ancient Minds and Their Relationship to the Sacred. Front Psychol 2020; 11:550794. PMID: 33391069; PMCID: PMC7775382; DOI: 10.3389/fpsyg.2020.550794.
Abstract
How important is the influence of spatial acoustics on our mental processes related to sound perception and cognition? There is a large body of research in fields encompassing architecture, musicology, and psychology that analyzes human response, both subjective and objective, to different soundscapes. But what if we want to understand how acoustic environments influenced the human experience of sound in sacred ritual practices in premodern societies? Archaeoacoustics is the research field that investigates sound in the past. One of its branches delves into how sound was used in specific landscapes and at sites with rock art, and why past societies endowed a special significance to places with specific acoustical properties. Taking advantage of the advances made in sound recording and reproduction technologies, researchers are now exploring how ancient social and sacred ceremonies and practices related to the acoustic properties of their sound environment. Here, we advocate for the emergence of a new and innovative discipline, experimental psychoarchaeoacoustics. We also review underlying methodological approaches and discuss the limitations, challenges, and future directions for this new field.
Affiliation(s)
- Jose Valenzuela
- Brainlab ‐ Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Faculty of Psychology, University of Barcelona, Barcelona, Spain
- Institute of Neurosciences, University of Barcelona, Barcelona, Spain
- Margarita Díaz-Andreu
- Catalan Institution for Research and Advanced Studies (ICREA), Barcelona, Spain
- Department of History and Geography, University of Barcelona, Barcelona, Spain
- Carles Escera
- Brainlab ‐ Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, Faculty of Psychology, University of Barcelona, Barcelona, Spain
- Institute of Neurosciences, University of Barcelona, Barcelona, Spain
- Catalan Institution for Research and Advanced Studies (ICREA), Barcelona, Spain
- Sant Joan de Déu Research Institute (IRSJD), Esplugues de Llobregat, Spain
6
Cuadrado F, Lopez-Cobo I, Mateos-Blanco T, Tajadura-Jiménez A. Arousing the Sound: A Field Study on the Emotional Impact on Children of Arousing Sound Design and 3D Audio Spatialization in an Audio Story. Front Psychol 2020; 11:737. PMID: 32435215; PMCID: PMC7219267; DOI: 10.3389/fpsyg.2020.00737.
Abstract
Sound in media increases the audience's immersion in the story, adding credibility to the narration and generating emotions in the spectator. A field study with children aged 9-13 years (N = 253), using an audio story, investigated the emotional impact of arousing vs. neutral sound design and of 3D vs. stereo spatialization. Emotional impact was measured by combining three measures: physiological (electrodermal activity), self-report (pre-post exposure), and the richness of mental images elicited by the story (assessed with a think-aloud technique after exposure). Results showed a higher emotional impact for the arousing and 3D audio conditions, with different patterns according to the age of the participants and distinctive types of interaction when both variables were combined.
Affiliation(s)
- Francisco Cuadrado
- Communication and Education, Universidad Loyola Andalucía, Seville, Spain
- Isabel Lopez-Cobo
- Communication and Education, Universidad Loyola Andalucía, Seville, Spain
- Tania Mateos-Blanco
- Department of Theory and History of Education, and Social Pedagogy, Universidad de Sevilla, Seville, Spain
- Ana Tajadura-Jiménez
- DEI Interactive Systems Group, Department of Computer Science and Engineering, Universidad Carlos III de Madrid, Madrid, Spain
7
Viaud-Delmon I, Warusfel O, Seguelas A, Rio E, Jouvent R. High sensitivity to multisensory conflicts in agoraphobia exhibited by virtual reality. Eur Psychiatry 2006; 21:501-8. PMID: 17055951; DOI: 10.1016/j.eurpsy.2004.10.004.
Abstract
The primary aim of this study was to evaluate the effect of auditory feedback in a VR system planned for clinical use and to identify the factors that should be taken into account when building a bimodal virtual environment (VE). We conducted an experiment assessing spatial performance in agoraphobic patients and normal subjects across two kinds of VEs, visual alone (Vis) and auditory-visual (AVis), during separate sessions. Subjects were equipped with a head-mounted display coupled with an electromagnetic sensor system and immersed in a virtual town; their task was to locate different landmarks and become familiar with the town. In the AVis condition, subjects additionally wore headphones delivering a soundscape updated in real time according to their movement in the virtual town. While general performance remained comparable across conditions, the reported feeling of immersion was more compelling in the AVis environment. However, patients exhibited more cybersickness symptoms in this condition. The results point to a multisensory integration deficit in agoraphobic patients and underline the need for further research on multimodal VR systems for clinical use.
Affiliation(s)
- Isabelle Viaud-Delmon
- CNRS - UPMC UMR 7593, Pavillon Clérambault, Hôpital de la Salpêtrière, 47, boulevard de l'Hôpital, 75013 Paris, France.
8
Pätynen J, Lokki T. Concert halls with strong and lateral sound increase the emotional impact of orchestra music. J Acoust Soc Am 2016; 139:1214-1224. PMID: 27036257; DOI: 10.1121/1.4944038.
Abstract
An audience's auditory experience during a thrilling and emotive live symphony concert is an intertwined combination of the music and the acoustic response of the concert hall. Music in itself is known to elicit emotional pleasure, and at best, listening to music may evoke concrete psychophysiological responses. Certain concert halls have gained a reputation for superior acoustics, but despite the continuous research by a multitude of objective and subjective studies on room acoustics, the fundamental reason for the appreciation of some concert halls remains elusive. This study demonstrates that room acoustic effects contribute to the overall emotional experience of a musical performance. In two listening tests, the subjects listen to identical orchestra performances rendered in the acoustics of several concert halls. The emotional excitation during listening is measured in the first experiment, and in the second test, the subjects assess the experienced subjective impact by paired comparisons. The results showed that the sound of some traditional rectangular halls provides greater psychophysiological responses and subjective impact. These findings provide a quintessential explanation for these halls' success and reveal the overall significance of room acoustics for emotional experience in music performance.
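Room-acoustic qualities like the "strong and lateral sound" above are characterized by objective parameters computed from measured impulse responses. As one standard example (Schroeder backward integration, used in ISO 3382-style analyses, not the authors' specific pipeline), reverberation time can be estimated from the energy decay curve of an IR; the IR below is synthetic, built to decay 60 dB in about 1.5 s.

```python
import numpy as np

def t60_from_ir(ir: np.ndarray, fs: int) -> float:
    """Estimate T60 via Schroeder backward integration,
    fitting the decay between -5 and -25 dB (a T20-style estimate)."""
    edc = np.cumsum(ir[::-1] ** 2)[::-1]          # energy decay curve
    edc_db = 10 * np.log10(edc / edc[0])          # normalize to 0 dB at t = 0
    t = np.arange(len(ir)) / fs
    mask = (edc_db <= -5) & (edc_db >= -25)       # linear-fit region
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)  # decay rate in dB/s
    return -60.0 / slope                          # time to decay by 60 dB

fs = 48_000
t = np.arange(fs * 2) / fs
ir = np.exp(-t * np.log(1e3) / 1.5)  # amplitude falls 60 dB over 1.5 s
print(round(t60_from_ir(ir, fs), 2))  # ≈ 1.5
```

Lateral-energy measures work similarly but compare a figure-of-eight microphone response against the omnidirectional one over the early part of the IR.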
Affiliation(s)
- Jukka Pätynen
- Department of Computer Science, Aalto University School of Science, FI-00076 Aalto, Finland
- Tapio Lokki
- Department of Computer Science, Aalto University School of Science, FI-00076 Aalto, Finland
9
Quarto T, Blasi G, Pallesen KJ, Bertolino A, Brattico E. Implicit processing of visual emotions is affected by sound-induced affective states and individual affective traits. PLoS One 2014; 9:e103278. PMID: 25072162; PMCID: PMC4114563; DOI: 10.1371/journal.pone.0103278.
Abstract
The ability to recognize emotions in facial expressions is affected by both affective traits and affective states and varies widely between individuals. While affective traits are stable over time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects the implicit processing of facial expressions, and whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence, or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire, and emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other conditions, and faster responses to happy faces during MusiCure compared with angry faces during noise. Moreover, individuals with higher trait anxiety performed the implicit emotion processing task faster during MusiCure than during silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.
Affiliation(s)
- Tiziana Quarto
- Cognitive Brain Research Unit, Institute of Behavioral Sciences, University of Helsinki, Helsinki, Finland & Finnish Centre for Interdisciplinary Music Research, University of Helsinki, Helsinki, Finland
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari, Bari, Italy
- Giuseppe Blasi
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari, Bari, Italy
- Karen Johanne Pallesen
- The Research Clinic for Functional Disorders and Psychosomatics, Aarhus University Hospital & Interacting Minds Centre, Aarhus University, Aarhus, Denmark
- Alessandro Bertolino
- Psychiatric Neuroscience Group, Department of Basic Medical Sciences, Neuroscience and Sense Organs, University of Bari, Bari, Italy
- pRED, Neuroscience DTA, Hoffman-La Roche, Ltd., Basel, Switzerland
- Elvira Brattico
- Cognitive Brain Research Unit, Institute of Behavioral Sciences, University of Helsinki, Helsinki, Finland & Finnish Centre for Interdisciplinary Music Research, University of Helsinki, Helsinki, Finland
- Brain & Mind Laboratory, Department of Biomedical Engineering and Computational Science, Aalto University School of Science, Helsinki, Finland
10
Suied C, Drettakis G, Warusfel O, Viaud-Delmon I. Auditory-visual virtual reality as a diagnostic and therapeutic tool for cynophobia. Cyberpsychol Behav Soc Netw 2013; 16:145-52. PMID: 23425570; DOI: 10.1089/cyber.2012.1568.
Abstract
Traditionally, virtual reality (VR) exposure-based treatment concentrates primarily on presenting a high-fidelity visual experience. However, adequately combining the visual and auditory experience provides a powerful tool to enhance sensory processing and modulate attention. We present the design and usability testing of an auditory-visual interactive environment for investigating VR exposure-based treatment for cynophobia. A distinctive feature of our application is 3D sound, which allows the presentation and spatial manipulation of a fearful stimulus in both the auditory and the visual modality. We conducted an evaluation with 10 participants who fear dogs to assess the capacity of our auditory-visual virtual environment (VE) to generate fear reactions. The perceptual characteristics of the dog model implemented in the VE were highly arousing, suggesting that VR is a promising tool to treat cynophobia.
Affiliation(s)
- Clara Suied
- UMR STMS IRCAM-CNRS-UPMC, 1 place Igor Stravinsky, Paris, France.
12
Guetin S, Charras K, Berard A, Arbus C, Berthelon P, Blanc F, Blayac JP, Bonte F, Bouceffa JP, Clement S, Ducourneau G, Gzil F, Laeng N, Lecourt E, Ledoux S, Platel H, Thomas-Anterion C, Touchon J, Vrait FX, Leger JM. An overview of the use of music therapy in the context of Alzheimer's disease: a report of a French expert group. Dementia 2012; 12:619-34. PMID: 24337333; DOI: 10.1177/1471301212438290.
Abstract
OBJECTIVES: The aim of this overview is to present the development of music therapy in France, its techniques, mechanisms, and principal indications, mainly in the context of Alzheimer's disease.
METHODS: An international review of the literature on music therapy applied to Alzheimer's disease was conducted using the principal scientific search engines. A work group of experts in music therapy and psychosocial techniques then considered and discussed the points highlighted by the review.
RESULTS AND DISCUSSION: Clinical and neurophysiological studies have highlighted positive benefits of music in supporting people with Alzheimer's disease or related disorders. Music therapy acts mainly through emotional and psycho-physiological pathways and comprises a series of techniques that can address targeted therapeutic objectives. Some studies have shown that music therapy reduces anxiety and alleviates periods of depression and aggressive behaviour, thus significantly improving patients' mood, communication, and autonomy.
CONCLUSION: Psychosocial interventions such as music therapy can help maintain or rehabilitate functional cognitive and sensory abilities as well as emotional and social skills, and can reduce the severity of some behavioural disorders.
Affiliation(s)
- Stéphane Guetin
- 1AMARC-Association de Musicothérapie Applications et Recherches Cliniques, France
13
Tajadura-Jiménez A, Väljamäe A, Västfjäll D. Self-representation in mediated environments: the experience of emotions modulated by auditory-vibrotactile heartbeat. Cyberpsychol Behav 2008; 11:33-8. PMID: 18275310; DOI: 10.1089/cpb.2007.0002.
Abstract
In 1890, William James hypothesized that emotions are our perception of physiological changes. Many theories of emotion have emerged since then, but it has been demonstrated that a specifically induced physiological state can influence an individual's emotional responses to stimuli. In the present study, auditory and/or vibrotactile heartbeat stimuli were presented to participants (N = 24), and their effect on participants' physiological state and subsequent emotional attitude toward affective pictures was measured. In particular, we aimed to investigate the effect of the perceived distance to the stimuli on emotional experience. Distant versus close sound reproduction conditions (loudspeakers vs. headphones) were used to identify whether an "embodied" experience can occur in which participants associate the external heartbeat sound with their own. Vibrotactile stimulation of an experimental chair and footrest was added to magnify the experience. Participants' peripheral heartbeat signals, self-reported valence (pleasantness) and arousal (activation) ratings for the pictures, and memory performance scores were collected. Heartbeat sounds significantly affected participants' heartbeat, their emotional judgments of the pictures, and their recall. The effect of distance was observed in the significant interaction between the spatial location of the heartbeat sound and the vibrotactile stimulation, driven mainly by the auditory-vibrotactile interaction in the loudspeaker condition. This interaction might suggest that vibrations transform the far-sound condition (sound via loudspeakers) into a close-stimulation condition, supporting the hypothesis that close sounds are more affective than distant ones. These findings have implications for the design and evaluation of mediated environments.
Affiliation(s)
- Ana Tajadura-Jiménez
- Division of Applied Acoustics, Chalmers University of Technology, Gothenburg, Sweden.
14
Roy M, Peretz I, Rainville P. Emotional valence contributes to music-induced analgesia. Pain 2007; 134:140-7. PMID: 17532141; DOI: 10.1016/j.pain.2007.04.003.
Abstract
The capacity of music to soothe pain has been used in many traditional forms of medicine, yet the mechanisms underlying these effects have not been demonstrated. Here, we examine the possibility that the modulatory effect of music on pain is mediated by the valence (pleasant-unpleasant dimension) of the emotions induced. We report the effects of listening to pleasant and unpleasant music on thermal pain in healthy human volunteers. Eighteen participants evaluated the warmth or pain induced by 40.0, 45.5, 47.0, and 48.5 °C thermal stimulations applied to the skin of their forearm while listening to pleasant and unpleasant musical excerpts matched for their high level of arousal (relaxing-stimulating dimension). Compared to a silent control condition, only the pleasant excerpts produced highly significant reductions in both pain intensity and unpleasantness, demonstrating the effect of music-induced positive emotions on pain (pairwise contrasts with silence: p's < 0.001). Correlation analyses in the pleasant music condition further indicated that pain decreased significantly (p's < 0.05) as self-reported music pleasantness increased. In contrast, the unpleasant excerpts did not modulate pain significantly, and warmth perception was not affected by the presence of pleasant or unpleasant music. These results support the hypothesis that positive emotional valence contributes to music-induced analgesia and call for the integration of music into current methods of pain control.
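The pairwise-contrast logic in this abstract (each music condition tested against silence within subjects) can be illustrated with a paired t-test. The data below are simulated with an assumed effect size and are not the study's data; only the analysis shape is the point.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n = 18  # same N as the study; ratings and effect size are invented

# Simulated pain-intensity ratings (0-100) in the silent control condition.
silence = rng.normal(60, 8, n)
# Simulated pleasant-music condition: an assumed ~10-point analgesic effect.
pleasant = silence - rng.normal(10, 4, n)

# Within-subject contrast of pleasant music vs. silence.
t, p = ttest_rel(pleasant, silence)
print(f"t = {t:.2f}, p = {p:.2g}")
```

With a real dataset, one such contrast per music condition (plus a multiple-comparison correction) reproduces the "pairwise contrasts with silence" reported above.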
Affiliation(s)
- Mathieu Roy
- Department of Psychology, University of Montreal, C.P. 6128, Succ. Centre-ville, Montreal, Que, Canada H3C 3J7
15
Västfjäll D. The subjective sense of presence, emotion recognition, and experienced emotions in auditory virtual environments. Cyberpsychol Behav 2003; 6:181-8. PMID: 12804030; DOI: 10.1089/109493103321640374.
Abstract
Realistic aural rendering of events in mediated environments is becoming an increasingly important aspect of many multimodal applications. In a between-group experiment with 45 participants, we studied how ratings of presence (a sense of being in the mediated environment), emotional reactions to the auditory environment, and emotion recognition vary as a function of the number of audio channels (mono, stereo, and six-channel reproduction). Stereo and six-channel reproduction produced significantly stronger changes in emotional reactions than mono reproduction, and six-channel reproduction received the highest ratings of presence and emotional realism. Taken together, the results suggest that both emotional reactions and ratings of presence increase with spatialized sound; moreover, emotional reactions and presence were highly correlated. The results are discussed in relation to theories of mediated presence and emotional reactions, in an attempt to further delineate the concept of presence.
Affiliation(s)
- Daniel Västfjäll
- Chalmers Room Acoustics Group, Department of Applied Acoustics, Chalmers University of Technology, Göteborg, Sweden.