This topic is of increasing concern to a number of people on xmca. FYI.
mike
---------- Forwarded message ----------
From: Morana Alac <moranaalac@gmail.com>
Date: Aug 4, 2006 3:18 PM
Subject: [lchc] Fwd: 1st CfP for the Special issue on "Multimodal Corpora"
of the Journal of Language Resources and Evaluation
To: lchc@weber.ucsd.edu
---------- Forwarded message ----------
From: Jean-Claude MARTIN <martin@limsi.fr>
Date: Aug 4, 2006 12:02 PM
Subject: 1st CfP for the Special issue on "Multimodal Corpora" of the
Journal of Language Resources and Evaluation
To: undisclosed-recipients
---------------------------------------------------
1st Call For Papers
MULTIMODAL CORPORA
FOR MODELLING HUMAN MULTIMODAL BEHAVIOR
Special issue of the
International Journal of Language Resources and Evaluation
Deadline for paper submission: 15th November 2006
www.springer.com/journal/10579/
---------------------------------------------------
Guest editors:
J.-C. Martin, P. Paggio, P. Kuehnlein, R. Stiefelhagen, F. Pianesi
This special issue is concerned with behavioural models built from
multimodal corpora.
'Multimodal corpora' are recordings and annotations of several
communication modalities, such as speech, hand gesture, facial expression,
and body posture.
There is increasing interest in multimodal communication and multimodal
corpora, as evidenced by recently launched European Networks of Excellence
and integrated projects such as HUMAINE, SIMILAR, CHIL and AMI,
and by similar efforts in the USA and in Asia.
Furthermore, the success of recent conferences related to multimodal
communication (ICMI'2005, IVA'2005, Interacting Bodies'2005,
Nordic Symposium on Multimodal Communication 2005),
as well as a series of dedicated workshops at LREC'2000, 2002, 2004 and 2006,
also testifies to the growing interest in this area
and to the general need for data on multimodal behaviours.
The focus of this special issue is on multimodal corpora
and their use for representing and modelling human behaviour.
This includes non-verbal communication studies
and their contribution to the definition of collection protocols, coding
schemes, and reliable models of multimodal human
behaviour that can be built from corpora and
compared with results reported in the literature.
Topics to be addressed include, but are not limited to:
- Studies of multimodal behaviour
- Multimodal interaction in groups and meetings
- Building models of behaviour from multiple sources of knowledge:
manual annotation, image processing, motion capture, literature studies
- Coding schemes for the annotation of multimodal video corpora
- Validation of multimodal annotations
- Exploitation of multimodal corpora in different types of applications
(meeting transcription, Embodied Conversational Agents, multimodal
interfaces, communication and clinical studies, edutainment)
- Methods, tools, and best practices for the acquisition, creation,
management, access, distribution, and use of multimedia and multimodal
corpora
- Metadata descriptions of multimodal corpora
- Benchmarking of systems and products; use of multimodal corpora for
the evaluation of real systems
- Automated multimodal fusion and/or generation
(e.g., coordinated speech, gaze, gesture, facial expressions)
Submitted papers may address human-computer interaction provided they
concern human modalities
(e.g. 3D conversational gestures rather than 2D pen-based interaction).
---------------------------------------------------
IMPORTANT DATES
- Deadline for paper submission: 15th November 2006
- Notification of acceptance: 15th February 2007
- Camera-ready version of accepted paper: 15th April 2007
- Target publication date: September 2007
---------------------------------------------------
INSTRUCTIONS FOR AUTHORS
Submissions should be no more than 20 pages long, must be in English, and
must follow the submission guidelines at
http://www.springer.com/cda/content/document/cda_downloaddocument/instr_print_10579.060421.pdf?SGWID=0-0-45-126854-p35554703
Extended and revised versions of papers accepted at previous LREC
workshops on Multimodal Corpora are encouraged.
Papers in .pdf format should be submitted via email to MARTIN@LIMSI.FR
Each received submission will be acknowledged.
Authors are encouraged to send a brief email to MARTIN@LIMSI.FR
as soon as possible indicating their intention to participate,
including their contact information and the topic they intend to address
in their submission.
---------------------------------------------------
--
-------------------------------------------------------------------
Jean-Claude MARTIN
Assistant Professor in Computer Science
Web: http://www.limsi.fr/Individu/martin/
Mobile: (+33).6.84.21.62.05 (answering machine)
LIMSI-CNRS, Batiment 508, BP 133, 91403 Orsay Cedex, FRANCE
Phone: (+33).1.69.85.81.04
Fax: (+33).1.69.85.80.88
and
IUT Montreuil, LINC, Univ. Paris 8, 140, rue Nouvelle France, 93100 Montreuil, FRANCE
-------------------------------------------------------------------