Micro Expressions Training Videos (METV) Research Test:

  • suitable for research on facial expression recognition
  • coded using the Facial Action Coding System (FACS)
  • takes participants about 10 minutes to complete
  • consists of 20 videos displayed in random order
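The presentation protocol above (20 clips shown to each participant in random order) could be sketched as follows. This is purely an illustrative assumption about how such a session might be randomized, not the actual METV test software:

```python
import random

def presentation_order(n_videos=20, seed=None):
    """Return a random presentation order for the test clips.

    Hypothetical sketch: the METV Research Test shows 20 videos in
    random order, so we shuffle the clip indices once per session.
    A seed can be supplied to make a session reproducible.
    """
    rng = random.Random(seed)
    order = list(range(1, n_videos + 1))
    rng.shuffle(order)
    return order

order = presentation_order(seed=42)
```

Shuffling once per session (rather than sampling with replacement) guarantees every clip is shown exactly once.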

Apply for access to the METV Research Test

See the video examples


This is a clear contempt expression. The lip corner puller (AU 12) is strong (intensity D/E) but very applicable here, and the timing is satisfactory.

In this clip we see a happiness expression immediately change into an expression of disgust. The disgust expression uses a moderate display of AU 10 (Upper Lip Raiser), which reliably signifies disgust at any intensity when it appears as the only AU present, as it does in this case.

In this clip we see two AUs combine to create an expression around the mouth. The tight-lipped expression uses the lip presser (AU 24, intensity B), a reliable action in genuine anger. We also see the lip pucker (AU 18, intensity C); while not listed as a reliable action for anger, it does not diminish the reliability of the expression and further emphasizes the reduced mouth size seen in anger. The timing and intensities of these AUs are satisfactory for a genuine anger expression.
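The clip descriptions above use standard FACS notation: an Action Unit number followed by an intensity letter from A (trace) to E (maximum), e.g. "24B" or "12D". A small parser for such codes might look like this (an illustrative sketch, not part of the METV software; the intensity labels follow the commonly cited FACS A–E scale):

```python
import re

# FACS intensity letters, A (weakest) through E (strongest).
INTENSITIES = {"A": "trace", "B": "slight", "C": "marked",
               "D": "severe", "E": "maximum"}

def parse_au(code):
    """Parse a code such as '24B' into (AU number, intensity name).

    Raises ValueError for strings that are not an AU number
    followed by a single intensity letter A-E.
    """
    m = re.fullmatch(r"(\d+)([A-E])", code.strip())
    if m is None:
        raise ValueError(f"not a valid AU code: {code!r}")
    return int(m.group(1)), INTENSITIES[m.group(2)]
```

For example, `parse_au("24B")` yields `(24, "slight")`, matching the lip presser described above.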

METV Research Test: A new advancement in micro-expression recognition training

Recently, a new generation of micro-expression recognition training has emerged with the Micro Expressions Training Videos (METV) program. The METV program is based on the Facial Action Coding System rules developed by Paul Ekman and Wallace V. Friesen (1978). However, unlike other micro-expression recognition programs, METV does not use pictures of faces (static stimuli) but videos of facial expressions, stimuli that are more natural for humans. The program plays videos showing how expressions evolve in a real interaction (i.e., videos of people having a conversation, speaking, or listening). Thanks to this change, the METV program gives participants facial recognition tasks that are much closer to real-life situations than those in programs based on photo recognition. For training purposes, the METV videos can be viewed in slow motion or accelerated mode and can be paused on particular frames to enhance learning and clarify how emotions unfold on the face. Such learning is not possible with the photos used in other training tools. The program also includes a neutral facial expression for reference to support learning.

Background

Charles Darwin (1872) was the first to claim that facial expressions are universal across cultures and species. He postulated that powerful emotions cannot be inhibited completely or fabricated accurately because of the involuntary nature of emotional expressions. Nearly a century later, Tomkins (1962, 1963) suggested that emotion was the basis of human motivation and that the seat of emotion was in the face. Building on intercultural studies (Ekman, Sorenson, & Friesen, 1969) and a study in remote Papua New Guinea (Ekman & Friesen, 1971), Matsumoto (2001) provided strong evidence that facial expressions conveying emotions such as happiness, disgust, sadness, fear, anger, and surprise are recognized universally. Matsumoto and his colleagues concluded that facial expressions are signs of a complex response system that is common to all humans (Matsumoto, Keltner, Shiota, Frank, & O’Sullivan, 2008). Even people who are blind from birth exhibit these universally human emotional facial expressions (Cole, Jenkins, & Shott, 1989).

Haggard and Isaacs (1966) and Ekman and colleagues (Ekman, 1985/2001, 2006; Frank & Ekman, 1997) adopted the term micro-expressions to describe extremely brief flashes of an individual’s true emotions that appear uncontrollably on the face. These are brief but salient facial expressions associated with emotional expression. The expressions are spontaneous, produced by various brain activities, and associated with emotional reactions such as anger, compassion, and laughter. The brain processes emotional impulses that trigger involuntary muscle contractions; the resulting micro-expressions may last for up to half a second (Yan et al., 2013).

Sometimes individuals may want to suppress expressions of their true emotions or exhibit a false facial expression. In some situations, social norms compel people to conceal or alter their true feelings (Ekman, 1972). However, micro-expressions are consistent with the inhibition hypothesis (Ekman, 1988, 1991), which posits that emotional facial expressions cannot be fully suppressed, so traces of concealed emotions leak onto the face. Unlike regular facial expressions, micro-expressions are difficult to falsify and perhaps impossible to conceal. Therefore, they can convey a great deal of reliable information. Since suppressed emotions are expressed unconsciously in the form of micro-expressions, a skilled observer can use them to discern concealed feelings (Ekman & Friesen, 1969; Ekman, 2009). Ekman et al. (1992) described micro-expressions as a reflection of a person’s real intent, especially one of a hostile nature.

Since micro-expressions are universal and difficult to falsify, they play a critical role in lie detection and criminal investigations (Metzinger, 2006; Schubert, 2006; Weinberger, 2010). Micro-expression recognition can be employed to detect dangerous demeanors (Metzinger, 2006; Schubert, 2006; Weinberger, 2010; Matsumoto, 2012). Micro-expressions can also be useful for communication and for understanding others’ intentions in a variety of fields, including business, medicine, law, and national security (Hurley, 2011). Micro-expression detection has broad potential applications, ranging from interpreting customer reactions to various situations and marketing campaigns (Wezowski & Wezowski, 2012) to intercepting terrorist attacks (Weinberger, 2010). Detection can even be used to predict how likely a person is to abuse their children (Asla, de Paul, & Perez-Albeniz, 2011). Studies using METT have also shown that the ability to recognize micro-expressions depends on context (Zhang, Fu, Chen, & Fu, 2014) and that emotions can be read not only from the face but also from a person’s whole body, their ‘emotional body language’ (Proverbio, Calbi, Manfredi, & Zani, 2014).

Several methods to improve micro-expression recognition have been developed (Ekman, 2003; Matsumoto & Hwang, 2011; Hurley, 2011; Endres & Laidlaw, 2009). These methods use synthesized micro-expressions: artificially created sequences in which an emotional expression is inserted between two neutral expressions. Methods using synthesized micro-expressions are popular both in micro-expression recognition research and in training materials (Endres & Laidlaw, 2009). The best-known tools include the Micro-expression Training Tool (METT) and the Subtle Expression Training Tool (SETT), both developed by Paul Ekman, and the MiX and SubX programs available from Humintell. The METT program, based on images of facial expressions, aims to improve people’s ability to recognize expressions (Ekman, 2002), and it has also been shown to improve micro-expression detection (Hurley, 2011). However, Endres and Laidlaw (2009) demonstrated that people’s ability to learn from the METT is highly variable.

The METT program relies on the presentation of still images of faces expressing seven basic emotions: happiness, fear, anger, disgust, sadness, surprise, and contempt (Ekman, 2003). It is based on the Facial Action Coding System, which rests on the assumption that facial muscle contractions alter facial appearance; each movement is therefore categorized into specific Action Units (AUs), representing the contraction or relaxation of one or more muscles (Ekman & Rosenberg, 1997).
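As a rough illustration of how FACS Action Units relate to the seven basic emotions, one can compare an observed set of AUs against prototype combinations. The specific AU sets below follow commonly cited EMFACS-style prototypes and are assumptions for illustration only; they are not the METV or METT scoring key, and real coding weighs intensity, timing, and asymmetry as well:

```python
# Commonly cited prototype AU combinations for basic emotions
# (EMFACS-style; illustrative assumption, not the METV scoring key).
PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "fear":      {1, 2, 4, 5, 7, 20, 26},
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
    "disgust":   {9, 15, 16},    # nose wrinkler + lip depressors
}

def best_match(observed_aus):
    """Return the prototype emotion whose AU set overlaps most
    with the observed AUs (a naive, illustrative matcher)."""
    observed = set(observed_aus)
    return max(PROTOTYPES, key=lambda emo: len(PROTOTYPES[emo] & observed))
```

With this sketch, observing AUs 6 and 12 matches "happiness", while AUs 4, 5, 7, and 23 match "anger"; a real coder would of course consider far more than set overlap.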

Moreover, synthesized micro-expressions such as those in METT are used in comparative recognition research (Hurley, 2011; Shen et al., 2012). Shen et al. (2012) indicated that recognition accuracy increased gradually with presentation duration, up to about 200 ms. In addition, it was found that micro-expressions may be embedded not only in neutral expressions but also in other facial expressions, such as happiness and sadness (Stewart, Waller, & Schubert, 2009). The current context can also influence recognition of facial expressions (Barrett & Kensinger, 2010): negative facial expressions are recognized more quickly and accurately in a negative context than in a positive one (Righart & de Gelder, 2008).

Emotional valence information can also influence facial expression recognition when it appears before the facial expression (Tanaka-Matsumi, Attivissimo, Nelson, & D’Urso, 1995; Carroll & Russell, 1996). Affective priming studies have found that primes play different roles in the recognition of different types of facial expressions (Ito, Masuda, & Hioki, 2012; Stenberg, Wiking, & Dahl, 1998). For example, anger tends to be recognized more quickly and accurately when the prime is an angry face than when it is a happy face (Werheid, Alpay, Jentzsch, & Sommer, 2005), and happy faces are recognized more accurately after positive primes than after negative ones, whereas sad expressions are recognized more accurately after negative primes (Hietanen & Astikainen, 2012). Numerous cognitive neuroscience studies have also confirmed the effects of emotional context on facial expression recognition (see Righart & de Gelder, 2008; Morel, Beaucousin, Perrin, & George, 2012). Other studies have shown that emotional information influences attention (Vuilleumier, 2005), consistent with emotion regulation theory (Goldin, McRae, Ramel, & Gross, 2008).

Bibliography

  • Asla NN, de Paul JJ, Perez-Albeniz AA (2011) Emotion recognition in fathers and mothers at high-risk for child physical abuse. Child Abuse & Neglect 35(9): 712–721.
  • Barrett LF, Kensinger EA (2010) Context is routinely encoded during emotion perception. Psychol Sci 21: 595–599. doi: 10.1177/0956797610363547
  • Carroll JM, Russell JA (1996) Do facial expressions signal specific emotions? Judging emotion from the face in context. J Pers Soc Psychol 70: 205. doi: 10.1037/0022-3514.70.2.205
  • Cole PM, Jenkins PA, Shott CT (1989) Spontaneous expressive control in blind and sighted children. Child Development 60(3): 683–688.
  • Darwin C (1872/1998) The expression of emotion in man and animals. New York: Oxford University Press. 435 p.
  • Ekman P (1985/2001) Telling lies: Clues to deceit in the marketplace, politics, and marriage. New York: Norton. 368 p.
  • Ekman P (1992) An argument for basic emotions. Cogn. and Emot. 6(3/4): 169–200.
  • Ekman P (2003) Emotions revealed: Recognizing faces and feelings to improve communication and emotional life. New York: Times Books. 304 p.
  • Ekman P, Friesen WV (1978) Manual for the Facial Action Coding System. Palo Alto, CA: Consulting Psychologists Press. 530 p.
  • Ekman P, O’Sullivan M, Friesen WV, Scherer KR (1991) Face, voice, and body in detecting deceit. Journal of Nonverbal Behavior 15(2): 125–135.
  • Ekman P (1972) Universals and cultural differences in facial expressions of emotion. In: Cole J, editor. Nebraska Symposium on Motivation. Vol. 19. Lincoln, NE: University of Nebraska Press. pp. 207–283.
  • Ekman P (2002) MicroExpression training tool (METT). San Francisco: University of California.
  • Ekman P, Friesen WV (1971) Constants across culture in the face and emotion. Journal of Personality and Social Psychology 17: 124–129.
  • Ekman P, Rosenberg EL, editors (1997) What the face reveals: Basic and applied studies of spontaneous expression using the Facial Action Coding System (FACS). New York: Oxford University Press. 425 p.
  • Ekman P, Sorenson ER, Friesen WV (1969) Pancultural elements in facial displays of emotion. Science 164(3875): 86–88.
  • Endres J, Laidlaw A (2009) Microexpression recognition training in medical students: A pilot study. BMC Medical Education 9(47): 1–6.
  • Goldin PR, McRae K, Ramel W, Gross JJ (2008) The neural bases of emotion regulation: reappraisal and suppression of negative emotion. Biol Psychiat 63: 577–586. doi: 10.1016/j.biopsych.2007.05.031
  • Haggard EA, Isaacs KS (1966) Micromomentary facial expressions as indicators of ego mechanisms in psychotherapy. In: Gottschalk LA, Auerbach AH, editors. Methods of research in psychotherapy. New York: Appleton Century Crofts. pp. 154–165.
  • Hietanen JK, Astikainen P (2012) N170 response to facial expressions is modulated by the affective congruency between the emotional expression and preceding affective picture. Biol Psychol 92: 114–124. doi: 10.1016/j.biopsycho.2012.10.005
  • Hurley CM (2011) The effects of motivation and training format on the ability to detect hidden emotions. PhD Thesis, State University of New York at Buffalo, New York, USA. 71: 38–53.
  • Hurley CM (2012) Do you see what I see? Learning to detect micro expressions of emotion. Motivation and Emotion 26: 371–381.
  • Ito K, Masuda T, Hioki K (2012) Affective information in context and judgment of facial expression cultural similarities and variations in context effects between North Americans and East Asians. J Cross Cult Psychol 43: 429–445. doi: 10.1177/0022022110395139
  • Marsh PJ, Green MJ, Russell TA, McGuire J, Harris A, Coltheart M (2010) Remediation of facial emotion recognition in schizophrenia: Functional predictors, generalizability, and durability. American Journal of Psychiatric Rehabilitation 13: 143–170.
  • Matsumoto D, Hwang H (2011) Evidence for training the ability to read microexpressions of emotion. Motivation and Emotion 35: 181–191.
  • Matsumoto D, Keltner D, Shiota MN, Frank MG, O’Sullivan M (2008) What’s in a face? Facial expressions as signals of discrete emotions. In: Lewis M, Haviland JM, Feldman Barrett L, editors. Handbook of emotions. New York: Guilford Press. pp. 211–234.
  • Metzinger T (2006) Exposing lies. Scientific American Mind, 17(5): 32–37. doi:10.1038/scientificamericanmind1006-32.
  • Morel S, Beaucousin V, Perrin M, George N (2012) Very early modulation of brain responses to neutral faces by a single prior association with an emotional context: evidence from MEG. Neuroimage 61: 1461–1470. doi: 10.1016/j.neuroimage.2012.04.016
  • Proverbio AM, Calbi M, Manfredi M, Zani A (2014) Comprehending body language and mimics: An ERP and neuroimaging study on Italian actors and viewers. PLoS ONE 9(3): e91294. doi:10.1371/journal.pone.0091294
  • Righart R, Gelder BD (2008) Rapid influence of emotional scenes on encoding of facial expressions: an ERP study. Soc Cogn Affect Neur 3: 270–278. doi: 10.1093/scan/nsn021
  • Righart R, Gelder BD (2008) Recognition of facial expressions is influenced by emotional scene gist. Cogn Affect Behav Neur 8: 264–272. doi: 10.3758/cabn.8.3.264
  • Schubert S (2006) A look tells all. Scientific American Mind, 17(5): 26–31. doi:10.1038/scientificamericanmind1006-26.
  • Shen XB, Wu Q, Fu XL (2012) Effects of the duration of expressions on the recognition of microexpressions. J Zhejiang Univ-Sci B 13: 221–230. doi: 10.1631/jzus.b1100063
  • Stenberg G, Wiking S, Dahl M (1998) Judging words at face value: Interference in a word processing task reveals automatic processing of affective facial expressions. Cogn Emot 12: 755–782. doi: 10.1080/026999398379420
  • Stewart PA, Waller BM, Schubert JN (2009) Presidential speechmaking style: Emotional response to micro-expressions of facial affect. Motiv Emot 33: 125–135. doi: 10.1007/s11031-009-9129-1
  • Tanaka-Matsumi J, Attivissimo D, Nelson S, D’Urso T (1995) Context effects on the judgment of basic emotions in the face. Motiv Emot 19: 139–155. doi: 10.1007/bf02250567
  • Tomkins SS (1962) Affect, imagery, and consciousness (Vol. 1: The positive affects). New York: Springer.
  • Tomkins SS (1963) Affect, imagery, and consciousness (Vol. 2: The negative affects). New York: Springer.
  • Weinberger S (2010) Airport security: intent to deceive? Nature, 465(7297): 412–415. doi:10.1038/465412a.
  • Werheid K, Alpay G, Jentzsch I, Sommer W (2005) Priming emotional facial expressions as evidenced by event-related brain potentials. Int J Psychophysiol 55: 209–219. doi: 10.1016/j.ijpsycho.2004.07.006
  • Wezowski K, Wezowski P (2012). The micro expressions book for business. Antwerp: New Vision. 127 p.
  • Vuilleumier P (2005) How brains beware: neural mechanisms of emotional attention. Trends Cogn Sci 9: 585–594. doi: 10.1016/j.tics.2005.10.011
  • Yan W, Wu Q, Chen Y, Fu X (2013) How fast are the leaked facial expressions: The duration of microexpressions. Journal of Nonverbal Behavior. 37: 217–230. doi:10.1007/s10919-013-0159-8
  • Zhang M, Fu Q, Chen Y-H, Fu X (2014) Emotional Context Influences Micro-Expression Recognition. PLoS ONE 9(4): e95018. doi:10.1371/journal.pone.0095018

Apply for access to the METV Research Test:

If you would like to use the METV Research Test in your study, please contact us using the form below:

Your Name *

Your Surname *

Your Email *

Your Institution *

Purpose of applying for the test *

Name of the project *

Institution conducting the research *

Brief summary of the research *

Number of participants *