Proceeding Downloads
Predicting group satisfaction in meeting discussions
We address the task of automatically predicting group satisfaction in meetings using acoustic, lexical, and turn-taking features. Participant satisfaction is measured using post-meeting ratings from the AMI corpus. We focus on predicting three aspects ...
Multimodal approach to engagement and disengagement detection with highly imbalanced in-the-wild data
Engagement/disengagement detection is a challenging task emerging in a range of human-human and human-computer interaction problems. Although important, the problem is still far from solved, and a number of studies involving in-the-wild data have ...
Workload-driven modulation of mixed-reality robot-human communication
In this work we explore how Augmented Reality annotations can be used as a form of Mixed Reality gesture, how neurophysiological measurements can inform the decision as to whether or not to use such gestures, and whether and how to adapt language when ...
Symptoms of cognitive load in interactions with a dialogue system
Humans adapt their behaviour to the perceived cognitive load of their dialogue partner, for example by delaying non-essential information. We propose that spoken dialogue systems should do the same, particularly in high-stakes scenarios, such as emergency ...
Histogram of oriented velocities for eye movement detection
Research in various fields, including psychology, cognition, and medical science, deals with eye-tracking data to extract information about the intention and cognitive state of a subject. For the extraction of this information, the detection of eye ...
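As a rough illustration of the general idea named in the title (a directional histogram of gaze velocities, analogous to HOG features for images), the sketch below bins sample-to-sample velocities by orientation, weighted by speed. The function name, bin count, and normalisation are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def histogram_of_oriented_velocities(x, y, t, n_bins=8):
    """Bin gaze-sample velocities by direction, weighted by speed.

    x, y: gaze coordinates (1-D arrays); t: timestamps in seconds.
    Returns an n_bins-length histogram over velocity orientations.
    """
    dt = np.diff(t)
    vx = np.diff(x) / dt          # horizontal velocity per sample
    vy = np.diff(y) / dt          # vertical velocity per sample
    speed = np.hypot(vx, vy)      # velocity magnitude
    angle = np.arctan2(vy, vx)    # orientation in [-pi, pi]
    hist, _ = np.histogram(angle, bins=n_bins, range=(-np.pi, np.pi),
                           weights=speed)
    return hist / (hist.sum() + 1e-9)  # normalise so windows are comparable
```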
Estimating mental load in passive and active tasks from pupil and gaze changes using Bayesian surprise
Eye-based monitoring has been suggested as a means to measure mental load in a non-intrusive way. In most cases, the experiments have been conducted in a setting where the user has been mainly passive. This constraint does not reflect applications where ...
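For reference, Bayesian surprise is commonly defined as the KL divergence between an observer's posterior and prior beliefs after an observation. The sketch below computes it for a Gaussian belief over, e.g., pupil diameter; the Gaussian model and conjugate update are illustrative assumptions, not the formulation used in the paper.

```python
import numpy as np

def gaussian_kl(mu_q, var_q, mu_p, var_p):
    """KL(q || p) between two univariate Gaussians."""
    return 0.5 * (np.log(var_p / var_q)
                  + (var_q + (mu_q - mu_p) ** 2) / var_p
                  - 1.0)

def bayesian_surprise(prior_mu, prior_var, obs, obs_var):
    """Surprise of one observation: KL(posterior || prior) for a Gaussian
    prior over a signal (e.g. pupil diameter) with known noise variance."""
    # Standard conjugate update of a Gaussian mean under Gaussian noise.
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mu = post_var * (prior_mu / prior_var + obs / obs_var)
    return gaussian_kl(post_mu, post_var, prior_mu, prior_var)
```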
Investigating static and sequential models for intervention-free selection using multimodal data of EEG and eye tracking
Multimodal data is increasingly used in cognitive prediction models to better analyze and predict different user cognitive processes. Classifiers based on such data, however, have different performance characteristics. We discuss in this paper an ...
Overlooking: the nature of gaze behavior and anomaly detection in expert dentists
- Nora Castner,
- Solveig Klepper,
- Lena Kopnarski,
- Fabian Hüttig,
- Constanze Keutel,
- Katharina Scheiter,
- Juliane Richter,
- Thérése Eder,
- Enkelejda Kasneci
The cognitive processes that underlie expert decision making in medical image interpretation are crucial to the understanding of what constitutes optimal performance. Often, if an anomaly goes undetected, the exact nature of the false negative is not ...
Rule-based learning for eye movement type detection
Eye movements hold information about human perception, intention, and cognitive state. Various algorithms have been proposed to identify and distinguish eye movements, particularly fixations, saccades, and smooth pursuits. A major drawback of existing ...
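As a point of reference only, the simplest rule of this kind is a velocity threshold (I-VT): samples faster than a threshold are labelled saccades, the rest fixations. The sketch below implements that generic baseline, not the rules learned in the paper; the 30 deg/s threshold and degree-unit coordinates are assumed, typical values.

```python
import numpy as np

def ivt_classify(x, y, t, saccade_threshold=30.0):
    """Label gaze samples 'fixation' or 'saccade' with the classic
    velocity-threshold (I-VT) rule. Coordinates are assumed to be in
    degrees of visual angle, timestamps in seconds."""
    dt = np.diff(t)
    vx = np.diff(x) / dt
    vy = np.diff(y) / dt
    speed = np.hypot(vx, vy)                       # deg/s between samples
    labels = np.where(speed > saccade_threshold, "saccade", "fixation")
    return np.append(labels, labels[-1])           # pad to original length
```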
Integrating non-invasive neuroimaging and computer log data to improve understanding of cognitive processes
As non-invasive neuroimaging techniques become less expensive and more portable, we have the capability to monitor brain activity during various computer activities. This provides an opportunity to integrate brain data with computer log data to develop ...
Multimer: validating multimodal, cognitive data in the city: towards a model of how the urban environment influences streetscape users
Multimer is a new technology that aims to provide a data-driven understanding of how humans cognitively and physically experience spatial environments. By multimodally measuring biosensor data to model how the built environment and its uses influence ...
The role of emotion in problem solving: first results from observing chess
In this paper we present results from recent experiments that suggest that chess players associate emotions with game situations and reactively use these associations to guide search for planning and problem solving. We report on a pilot experiment with ...
Discovering digital representations for remembered episodes from lifelog data
- Bernd Dudzik,
- Joost Broekens,
- Mark Neerincx,
- Jeffrey Olenick,
- Chu-Hsiang Chang,
- Steve W. J. Kozlowski,
- Hayley Hung
Combining self-reports in which individuals reflect on their thoughts and feelings (Experience Samples) with sensor data collected via ubiquitous monitoring can provide researchers and applications with detailed insights about human behavior and ...
Multimodal approach for cognitive task performance prediction from body postures, facial expressions and EEG signal
Recent developments in computer vision and the emergence of wearable sensors have opened opportunities for advanced and sophisticated techniques that enable multi-modal user assessment and personalized training, which is important in ...



