

3-3-9 (2023-10-09) 25th ACM International Conference on Multimodal Interaction (ICMI 2023), Paris, France
  

25th ACM International Conference on Multimodal Interaction (ICMI 2023)

9-13 October 2023, Paris, France

 

The 25th International Conference on Multimodal Interaction (ICMI 2023) will be held in Paris, France. ICMI is the premier international forum that brings together research on multimodal artificial intelligence (AI) and social interaction. Multimodal AI encompasses technical challenges in machine learning and computational modeling such as representations, fusion, data and systems. The study of social interactions covers both human-human and human-computer interactions. A unique aspect of ICMI is its multidisciplinary nature, which values both scientific discoveries and technical modeling achievements, with an eye towards impactful applications for the good of people and society.

 

ICMI 2023 will feature a single-track main conference that includes keynote speakers, technical full and short papers (with oral and poster presentations), demonstrations, exhibits, a doctoral consortium, and late-breaking papers. The conference will also feature tutorials, workshops and grand challenges. The proceedings of all ICMI 2023 papers, including long and short papers, will be published by ACM in its International Conference Proceedings Series and the ACM Digital Library; the adjunct proceedings will feature the workshop papers.

 

Novelty will be assessed along two dimensions: scientific novelty and technical novelty. Papers accepted at ICMI 2023 must be novel along at least one of these two dimensions:

  • Scientific Novelty: Papers should bring new scientific knowledge about human social interactions, including human-computer interactions. Examples include discovering new behavioral markers that are predictive of mental health, or showing how new behavioral patterns relate to children’s interactions during learning. It is the responsibility of the authors to perform a proper literature review and clearly discuss the novelty of the scientific discoveries made in their paper.
  • Technical Novelty: Papers should propose novelty in their computational approach for recognizing, generating or modeling multimodal data. Examples include: novelty in the learning and prediction algorithms, in the neural architecture, or in the data representation. Novelty can also be associated with new usages of an existing approach.

 

Please see the Submission Guidelines for Authors (https://icmi.acm.org/) for detailed submission instructions. Commitment to ethical conduct is required, and submissions must adhere to ethical standards, in particular when human-derived data are employed. Authors are encouraged to read the ACM Code of Ethics and Professional Conduct (https://ethics.acm.org/).

 

ICMI 2023 conference theme: The theme for this year’s conference is “Science of Multimodal Interactions”. As the community grows, it is important to understand the main scientific pillars involved in a deep understanding of multimodal social interactions. As a first step, we want to acknowledge the key discoveries and contributions that the ICMI community has enabled over the past 20+ years. As a second step, we reflect on the core principles, foundational methodologies and scientific knowledge involved in studying and modeling multimodal interactions. This will help establish a distinctive research identity for the ICMI community while embracing its multidisciplinary, collaborative nature. This research identity and long-term agenda will enable the community to develop future technologies and applications while maintaining its commitment to world-class scientific research.

Additional topics of interest include but are not limited to:

  • Affective computing and interaction
  • Cognitive modeling and multimodal interaction
  • Gesture, touch and haptics
  • Healthcare, assistive technologies
  • Human communication dynamics
  • Human-robot/agent multimodal interaction
  • Human-centered AI and ethics
  • Interaction with smart environments
  • Machine learning for multimodal interaction
  • Mobile multimodal systems
  • Multimodal behaviour generation
  • Multimodal datasets and validation
  • Multimodal dialogue modeling
  • Multimodal fusion and representation
  • Multimodal interactive applications
  • Novel multimodal datasets
  • Speech behaviours in social interaction
  • System components and multimodal platforms
  • Visual behaviours in social interaction
  • Virtual/augmented reality and multimodal interaction

 

Important Dates

Paper Submission: May 1, 2023 

Rebuttal period: June 26-29, 2023

Paper notification: July 21, 2023

Camera-ready paper: August 14, 2023

Main conference: October 9-13, 2023

 



