Multimodal generative installations and the creation of new art form based on interactivity narratives

Keywords

Multimodal
Installation
Interactive
Music
Creativity

How to Cite

MANZOLLI, Jônatas. Multimodal generative installations and the creation of new art form based on interactivity narratives. NICS Reports, Campinas, SP, v. 8, n. 22, p. 32–45, 2021. Available at: https://econtents.bc.unicamp.br/pas/index.php/nicsreports/article/view/315. Accessed: 3 July 2024.

Abstract

We present a recent study on multimodal generative installations, which can be described as immersive and interactive infrastructures in which it is possible to generate, interact with, analyse, and store multimodal information (audio, video, images, human movement, and bio-signals). The article starts from a theoretical viewpoint based on the notion of Presence, followed by a description of the computer environment implemented in our study. The aim is to discuss a unified experience in which data and users are merged in space and evolve coherently in time. Our approach is based on two kinds of interaction, implicit and explicit, and on expanding the local virtual experience into a ubiquitous one using the Internet. Briefly, we discuss how implicit and explicit interactions, together with local and remote agencies, are used to digitally synthesize images and sounds in real time and to support the human-machine interplay within immersive environments. Finally, four multimodal installations are presented to exemplify the interactive and generative design used in these artistic works.
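
The abstract describes implicit and explicit interactions being blended to drive real-time sound and image synthesis. As a rough illustration only, and not taken from the article, the following Python sketch shows one hypothetical way such a blend could be expressed: the function name, the choice of motion energy and a slider as the two channels, and the amplitude/pitch mapping are all assumptions.

```python
# Hypothetical sketch: blending an implicit interaction signal with an
# explicit control to obtain real-time synthesis parameters. This is an
# illustrative assumption, not the installations' actual implementation.

def blend_interactions(motion_energy: float, slider: float,
                       implicit_weight: float = 0.5) -> dict:
    """Map an implicit signal (e.g., visitors' motion energy, 0..1) and an
    explicit control (e.g., a touch slider, 0..1) to example parameters."""
    # Weighted mix of the two interaction channels.
    drive = implicit_weight * motion_energy + (1 - implicit_weight) * slider
    return {
        "amplitude": min(1.0, drive),       # louder with more activity
        "pitch_hz": 110.0 + 770.0 * drive,  # sweep roughly 110 to 880 Hz
    }

if __name__ == "__main__":
    # Example: calm room (low implicit signal), strong explicit gesture.
    print(blend_interactions(motion_energy=0.1, slider=0.9))
```

In an actual installation the implicit channel would come from sensing (cameras, bio-signals) and the explicit one from deliberate user input, with the blended values sent to audio and video engines each frame.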

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2024 NICS Reports
