BIRAFFE2 dataset

This is our second Study in Bio-Reactions and Faces for Emotion-based Personalization for AI Systems (BIRAFFE2). It is a dataset consisting of electrocardiogram (ECG), galvanic skin response (GSR), and facial expression signals, as well as hand movements (captured by the gamepad's accelerometer and gyroscope), recorded during affect elicitation by means of audio-visual stimuli (from the IADS and IAPS databases) and our proof-of-concept three-level emotion-evoking game. All the signals were captured using portable and low-cost equipment: the BITalino (r)evolution kit for ECG and GSR, and a Creative Live! web camera for face photos (further analyzed with the MS Face API).

Besides the signals, the dataset also includes the subjects' self-assessments of their affective state after each stimulus (in the valence and arousal dimensions), a “Big Five” personality traits assessment (using the NEO-FFI inventory), and game involvement-related metrics (using the GEQ questionnaire).
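
To illustrate how such recordings and self-assessments might be combined, the minimal Python sketch below aligns a subject's signal stream with the per-stimulus valence-arousal ratings. The file names, column names, and CSV layout are assumptions made for illustration only; they may differ from the actual file structure published on Zenodo.

  # Illustrative sketch only: file and column names are assumed,
  # not taken from the actual BIRAFFE2 file layout.
  import pandas as pd

  # Continuous physiological recordings for one subject (assumed CSV)
  signals = pd.read_csv("SUB001_signals.csv")   # e.g. timestamp, ecg, gsr
  # Per-stimulus self-assessments in the valence-arousal space (assumed CSV)
  ratings = pd.read_csv("SUB001_ratings.csv")   # e.g. stimulus_id, onset, offset, valence, arousal

  # Cut out the signal fragment recorded during each stimulus presentation
  for _, row in ratings.iterrows():
      fragment = signals[(signals["timestamp"] >= row["onset"]) &
                         (signals["timestamp"] <= row["offset"])]
      print(row["stimulus_id"], row["valence"], row["arousal"], len(fragment))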

The whole dataset is available under the CC BY-NC-ND 4.0 licence at Zenodo.

All documents and papers that report on research that uses the BIRAFFE2 dataset should acknowledge this by citing the paper: Kutt, K., Drążyk, D., Szelążek, M., Bobek, S., & Nalepa, G. J. (2020). The BIRAFFE2 Experiment. Study in Bio-Reactions and Faces for Emotion-based Personalization for AI Systems. HAIW2020: Human-AI Interaction Workshop. Submitted, in review.
(The reference will be updated when the paper is finally published.)
A draft of the paper was uploaded with the dataset as _BIRAFFE2-HAIW2020-paper-draft.pdf.

BIRAFFE1 dataset

We present BIRAFFE: Bio-Reactions and Faces for Emotion-based Personalization.

It is a dataset consisting of electrocardiogram (ECG), galvanic skin response (GSR), and facial expression signals recorded during affect elicitation by means of audio-visual stimuli (from the IADS and IAPS databases) and our two proof-of-concept affective games ("Affective SpaceShooter 2" and "Freud Me Out 2"). All the signals were captured using portable and low-cost equipment: the BITalino (r)evolution kit for ECG and GSR, and a Creative Live! web camera for face photos (further analyzed with the MS Face API).

Besides the signals, the dataset also includes the subjects' self-assessments of their affective state after each stimulus (collected with two widgets: the first with 5 emoticons and the second with the valence and arousal dimensions) and a “Big Five” personality traits assessment (using the NEO-FFI inventory).

The whole dataset is available under the CC BY-NC-ND 4.0 licence at Zenodo.
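
Both records can also be accessed programmatically through the Zenodo REST API. The sketch below lists the files of a record; RECORD_ID is a placeholder and must be replaced with the actual Zenodo record number of the dataset.

  # Illustrative sketch: RECORD_ID is a placeholder, not the real record number.
  import requests

  RECORD_ID = "XXXXXXX"  # replace with the Zenodo record number
  record = requests.get(f"https://zenodo.org/api/records/{RECORD_ID}").json()

  # Each file entry exposes its name, size, and a direct download link
  for f in record.get("files", []):
      print(f["key"], f["size"], f["links"]["self"])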

All documents and papers that report on research that uses the BIRAFFE dataset should acknowledge this by citing the paper:
Kutt, K., Drążyk, D., Jemioło, P., Bobek, S., Giżycka, B., Rodriguez-Fernandez, V., & Nalepa, G. J. (2020). BIRAFFE: Bio-Reactions and Faces for Emotion-based Personalization. In G. J. Nalepa, J. M. Ferrandez, J. Palma, & V. Julian (Eds.), Proceedings of the 3rd Workshop on Affective Computing and Context Awareness in Ambient Intelligence (AfCAI 2019) (CEUR-WS Vol. 2609).
The final version of the paper was uploaded with the dataset as BIRAFFE-AfCAI2019-paper.pdf.
