Affect-based Indexing - Mining for Affect in Multimedia
Description:
In this work we use affective measures to index and characterize the experience of multimedia content, including audio, music and video. The affective evaluation can subsequently be used to provide new ways of multimedia indexing, retrieval and search. We have started by annotating a standard sound-effects database used for audio mining. The subjective ratings from these annotations will also be validated against findings from psycho-physiological measurements and more user-friendly measures such as facial expressions. This is part of a common effort at T-Labs to build auditory interfaces. In joint efforts with the Multimedia Content Retrieval project, we plan to build automatic processing systems that assess multimedia clips and determine their affective values. Some of the key questions addressed here are: How can the parts of a multimedia clip that cause strong emotions be highlighted automatically? Which parts of a clip should go into a preview? What features do clips with similar affective impact have in common? How does this relate to the implicit emotional reaction of the user? Which clips with a similar affective impact might a user like?
The scope of the project is
summarized below.
- Sophisticated annotation of real-world audio & video.
- Analysis of auditory icons, ringtones, audio, video and music clips.
- Relation of affective impact to signal features.
- Automatic methods to estimate affective impact.
- Affective evaluation in usability tests.
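To illustrate how affect-based retrieval of this kind might work, here is a minimal sketch in Python. All clip names and rating values are hypothetical: clips are assumed to carry annotated valence/arousal ratings (e.g. from the subjective annotations described above), and similar-affect clips are retrieved by nearest distance in that valence-arousal space.

```python
import math

# Hypothetical affective annotations: clip name -> (valence, arousal),
# both on a -1..1 scale, as might come from subjective ratings.
annotations = {
    "thunder.wav":  (-0.6, 0.8),
    "birdsong.wav": (0.7, 0.2),
    "applause.wav": (0.8, 0.6),
    "alarm.wav":    (-0.4, 0.9),
}

def affective_distance(a, b):
    """Euclidean distance between two points in valence-arousal space."""
    return math.dist(a, b)

def similar_clips(query, k=2):
    """Return the k clips whose annotated affect is closest to the query point."""
    return sorted(annotations,
                  key=lambda clip: affective_distance(annotations[clip], query))[:k]

# Retrieve clips with affect similar to "negative, highly arousing"
# (e.g. startling sounds such as thunder or alarms).
print(similar_clips((-0.5, 0.85)))
```

A real system would replace the hand-written annotations with values estimated automatically from signal features, which is exactly the mapping the project aims to learn.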
Team:
- Shiva Sundaram
- Robert Schleicher
- Sebastian Möller
- Julia Seebode
Duration: April 2010 - March 2012.