Arturas KAKLAUSKAS, Renaldas GUDAUSKAS, Matas KOZLOVAS,
Lina PECIURE, Natalija LEPKOVA, Justas CERKAUSKAS, Audrius BANAITIS
Vilnius Gediminas Technical University,
Sauletekio al. 11, Vilnius, LT-10223, Lithuania
firstname.lastname@example.org (Corresponding author)
Abstract: People watching a video can almost always suppress their speech, but they cannot suppress their body language or control their physiological and behavioral parameters. Affects/emotions, sensory processing, actions/motor behavior and motivation are linked to the limbic system, which is responsible for instinctive and instantaneous human reactions to the environment or to other people. Limbic reactions are immediate, certain, time-tested and occur in all people. Such reactions are highly spontaneous and reflect the video viewer’s real feelings and desires rather than deliberately calculated ones. The limbic system is also linked to emotions, usually conveyed by facial expressions and by movements of the legs, arms and/or other body parts. All physiological and behavioral parameters must therefore be considered to determine a video viewer’s emotions and wishes. For this reason, the Affect-based multimodal video recommendation system (ARTIST), developed by the authors of this article, is well suited to the task. ARTIST was developed and fine-tuned during the TEMPUS project “Reformation of the Curricula on Built Environment in the Eastern Neighbouring Area”. ARTIST analyzes the facial expressions and physiological parameters of a viewer while the viewer is watching a video. This analysis allows better control over alternative sequences of film clips within a video and can even prompt ending the video if nothing suitable for the viewer is available in the database. The system considers a viewer’s emotions (happy, sad, angry, surprised, scared, disgusted and neutral) and chooses rational video clips in real time. The analysis of a viewer’s facial expressions and physiological parameters can also indicate which video clips viewers would prefer at that moment.
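The selection logic summarized in the abstract — detect the viewer's current emotion, pick the best-matching next clip, or end the video when nothing suitable exists — can be illustrated with a minimal sketch. This is not the authors' implementation: the clip database structure, the per-emotion suitability scores and the `min_score` threshold are illustrative assumptions; only the seven emotion labels come from the abstract.

```python
# Hypothetical sketch of emotion-driven clip selection (not the ARTIST code).
# Emotion labels are the seven categories named in the abstract; the clip
# database, scoring scheme and threshold are assumptions for illustration.

EMOTIONS = {"happy", "sad", "angry", "surprised", "scared", "disgusted", "neutral"}

def select_next_clip(detected_emotion, clip_db, min_score=0.5):
    """Return the clip whose suitability score for the detected emotion is
    highest, or None to signal that playback should end (no suitable clip)."""
    if detected_emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion: {detected_emotion}")
    # clip_db maps clip name -> {emotion: suitability score in [0, 1]}
    candidates = [
        (clip, scores.get(detected_emotion, 0.0))
        for clip, scores in clip_db.items()
    ]
    best_clip, best_score = max(candidates, key=lambda pair: pair[1])
    return best_clip if best_score >= min_score else None

# Example with a hypothetical two-clip database:
clip_db = {
    "calm_nature.mp4": {"sad": 0.9, "angry": 0.8},
    "comedy_sketch.mp4": {"happy": 0.7, "neutral": 0.6},
}
print(select_next_clip("sad", clip_db))        # a clip is offered
print(select_next_clip("surprised", clip_db))  # None: end the video
```

In a real-time setting, this selection step would run each time the facial-expression and physiological analysis updates the detected emotion, replaying the decision for every clip boundary.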
Keywords: facial expressions; physiological video retrieval; affect-based multimodal video recommendation system; TEMPUS CENEAST project.
CITE THIS PAPER AS: Arturas KAKLAUSKAS, Renaldas GUDAUSKAS, Matas KOZLOVAS, Lina PECIURE, Natalija LEPKOVA, Justas CERKAUSKAS, Audrius BANAITIS, An Affect-Based Multimodal Video Recommendation System, Studies in Informatics and Control, ISSN 1220-1766, vol. 25(1), pp. 5-14, 2016. https://doi.org/10.24846/v25i1y201601