Adaptive Facial Expression Observation with Music Playback Presentation

Agam Pamungkas Lumintan (1*), Liliana Liliana (2), Henry Novianus Palit (3)


(1) Informatics Engineering Study Program
(2) Informatics Engineering Study Program
(3) Informatics Engineering Study Program
(*) Corresponding Author

Abstract


The high demands of modern life can make an individual's emotional state change easily. Music is one of many media used to stabilize a person's emotional condition. However, the wrong choice of music can worsen a person's emotional condition, as often happens in today's society.

To address this, an application was developed that detects the user's facial expression and plays music to stabilize the user's emotional state. The expression is determined from the positions of facial features: the eyebrows, eyes, mouth, and wrinkles. The positions of the detected features are compared with those of a previously calibrated neutral face, and the differences between feature positions are fed as input to a trained neural network that classifies the expression on the detected face.
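As an illustration of this pipeline, the following is a minimal sketch in Python, assuming facial landmark coordinates are already available from a detector; the feature layout, network sizes, and class labels are assumptions for illustration, not the authors' implementation.

    import numpy as np

    EXPRESSIONS = ["neutral", "happy", "sad", "angry"]   # illustrative class labels

    def feature_differences(landmarks, neutral_landmarks):
        # Difference between the detected feature positions (eyebrows, eyes,
        # mouth, wrinkles) and the previously calibrated neutral face.
        return (np.asarray(landmarks, float) - np.asarray(neutral_landmarks, float)).ravel()

    # Weights of a small feedforward network; in the real system they would be
    # learned beforehand with backpropagation on labeled expression data.
    rng = np.random.default_rng(0)
    W_hidden = rng.normal(scale=0.1, size=(16, 20))   # 20 = length of the difference vector
    W_output = rng.normal(scale=0.1, size=(4, 16))    # 4 expression classes

    def classify_expression(diff):
        hidden = np.tanh(W_hidden @ diff)             # hidden-layer activation
        scores = W_output @ hidden                    # one score per expression class
        return EXPRESSIONS[int(np.argmax(scores))]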

Experiments were carried out to measure the accuracy of facial feature detection and facial expression recognition. The detected expression triggers the playback of music intended to stabilize the user's emotional condition. A limitation of the application is that the music library must be set up manually by the user.
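A similarly minimal sketch of how a detected expression might trigger playback from a user-maintained music library; the folder layout, mood mapping, and external player command are assumptions for illustration only.

    import random
    import subprocess
    from pathlib import Path

    # User-maintained library: one folder of songs per target mood (set up manually).
    MUSIC_LIBRARY = {
        "sad":   Path("music/uplifting"),
        "angry": Path("music/calming"),
        "happy": Path("music/upbeat"),
    }

    def play_for_expression(expression):
        folder = MUSIC_LIBRARY.get(expression)
        if folder is None:
            return                                     # neutral/unknown: leave playback alone
        songs = list(folder.glob("*.mp3"))
        if songs:
            # Hand the chosen track to an external player; 'ffplay' is only an example.
            subprocess.Popen(["ffplay", "-nodisp", "-autoexit", str(random.choice(songs))])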


Keywords


Facial expression; psychology; emotion detection; music player; backpropagation
