CASME Database
     Micro-expressions are fleeting facial expressions that reveal genuine emotions people try to conceal. They are important clues for detecting lies and dangerous behaviors, and therefore have potential applications in fields such as clinical practice and national security. We created a database of spontaneous micro-expressions elicited from neutralized faces. Based on previous psychological studies, we designed an effective laboratory procedure to elicit spontaneous micro-expressions and analyzed the video data with care to offer valid and reliable coding. From 1500 elicited facial movements recorded at 60 fps, 195 micro-expressions were selected. These samples were coded so that the first (onset), peak (apex) and last (offset) frames were tagged. Action units (AUs) were marked to give an objective and accurate description of the facial movements. Emotions were labeled based on psychological studies and participants' self-reports to enhance validity.
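Since each sample is tagged with its first, peak and last frames, a movement's duration follows directly from the frame indices and the 60 fps recording rate. A minimal sketch (the frame indices and function name are illustrative, not part of the database's coding format):

```python
# Sketch: computing a coded movement's duration from its tagged frames.
# The frame indices below are made up for illustration; the database's
# actual coding files use their own format.

FPS = 60  # CASME recording rate

def duration_ms(onset_frame: int, offset_frame: int, fps: int = FPS) -> float:
    """Duration of a coded facial movement in milliseconds."""
    return (offset_frame - onset_frame) / fps * 1000.0

# A movement tagged from frame 12 (first) to frame 36 (last):
print(duration_ms(12, 36))  # 400.0 -> within the typical micro-expression range
```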
     More information can be found in our published paper. Micro-expressions are less intense than posed prototypical facial expressions; the movement is obvious in video examples but not in a static picture sequence.
    To get the CASME database, please fill out the application form and upload a scanned copy of the signed license agreement.

CASME II Database

     A robust automatic micro-expression recognition system would have broad applications in national safety, police interrogation, and clinical diagnosis. Developing such a system requires high-quality databases with sufficient training samples, which are currently not available. We reviewed previously developed micro-expression databases and built an improved one (CASME II), with higher temporal resolution (200 fps) and spatial resolution (about 280×340 pixels on the facial area). We elicited participants' facial expressions in a well-controlled laboratory environment with proper experimental design and illumination. Among nearly 3000 facial movements, 247 micro-expressions were selected for the database, with action units (AUs) labeled. For baseline evaluation, LBP-TOP and SVM were employed for feature extraction and classification, respectively, with leave-one-subject-out cross-validation. The best performance was 63.41% for 5-class classification.
     The CASME II database has the following characteristics:
  • The samples are spontaneous and dynamic micro-expressions. Baseline (usually neutral) frames are kept before and after each micro-expression, making it possible to evaluate different detection algorithms.
  • The recordings have high temporal resolution (200 fps) and relatively high facial resolution (about 280×340 pixels).
  • Micro-expression labeling is based on the FACS investigator's guide and the findings of Yan et al. (2013), which differs from the traditional six categories used for ordinary facial expressions.
  • The recordings have proper illumination without lighting flickers and with reduced highlight regions of the face.
  • Some types of facial expressions are difficult to elicit in laboratory situations, so the samples are unequally distributed across categories, e.g., there are 60 disgust samples but only 7 sadness samples. In CASME II, we provide five classes of micro-expressions.
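The leave-one-subject-out protocol used for the baseline evaluation can be sketched with scikit-learn's LeaveOneGroupOut, which holds out all samples of one subject per fold. The features below are random stand-ins for LBP-TOP vectors (LBP-TOP extraction itself is not shown), and the subject count and feature dimension are illustrative:

```python
# Sketch of the baseline evaluation protocol: leave-one-subject-out (LOSO)
# cross-validation with an SVM. Features are random stand-ins for LBP-TOP
# vectors; real LBP-TOP extraction from the videos is not shown, and the
# subject count and feature dimension are illustrative assumptions.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_dims, n_classes = 247, 59, 5
X = rng.normal(size=(n_samples, n_dims))         # stand-in LBP-TOP features
y = rng.integers(0, n_classes, size=n_samples)   # emotion class per sample
subjects = rng.integers(0, 26, size=n_samples)   # subject ID per sample

correct = 0
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    # Train on all other subjects, test on the held-out subject's samples.
    clf = SVC(kernel="linear").fit(X[train_idx], y[train_idx])
    correct += int((clf.predict(X[test_idx]) == y[test_idx]).sum())

accuracy = correct / n_samples
print(f"LOSO accuracy: {accuracy:.2%}")
```

With random features the accuracy hovers around chance; the point of the sketch is the evaluation structure, where no subject ever appears in both training and test sets.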

    To get the CASME II Database, please fill out the application form and upload a scanned copy of the signed license agreement.

CAS(ME)² Database

     The main contributions of the Chinese Academy of Sciences Macro-Expressions and Micro-Expressions (CAS(ME)²) database are summarized as follows:
  • This database is the first publicly available database containing both macro-expressions and micro-expressions in long videos, which facilitates the development of algorithms for spotting micro-expressions in long video streams.
  • All macro-expression and micro-expression samples were collected from the same participants under the same experimental conditions, which enables researchers to develop more efficient algorithms that extract features discriminating macro-expressions from micro-expressions and to compare the differences in their feature vectors.
  • The differences in AUs between macro-expressions and micro-expressions can be studied with the algorithms tested on this database.
  • The database combines FACS AUs, the emotion type of the elicitation videos and participants' self-reported emotions for each expression sample. After the expression-inducing phase, the participants were asked to watch the recordings of their facial expressions and provide a self-report for every expression. This procedure enabled us to exclude almost all emotion-irrelevant facial movements and obtain relatively pure expression samples. The self-reported emotion for every expression sample is also provided and can be compared with the AUs displayed by the participants.
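As a toy illustration of the spotting task that long videos enable, one generic baseline is to compute a per-frame motion signal and flag the intervals where it exceeds a threshold. This is only a sketch of that generic idea, not a method shipped with the database; the toy "video" and threshold are invented:

```python
# Generic spotting sketch: threshold a per-frame motion signal to find
# candidate expression intervals in a long video. Not the database's own
# method; the toy video and threshold below are invented for illustration.
import numpy as np

def motion_signal(frames: np.ndarray) -> np.ndarray:
    """Mean absolute pixel difference of each frame from the first frame."""
    base = frames[0].astype(float)
    return np.abs(frames.astype(float) - base).mean(axis=(1, 2))

def spot_intervals(signal: np.ndarray, threshold: float) -> list:
    """(start, end) frame intervals (end exclusive) where signal > threshold."""
    above = np.concatenate(([False], signal > threshold, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    return [(int(s), int(e)) for s, e in zip(edges[::2], edges[1::2])]

# Toy 100-frame "video": uniform gray with a brief change around frames 40-45.
frames = np.full((100, 8, 8), 128.0)
frames[40:46] += 30.0
print(spot_intervals(motion_signal(frames), threshold=5.0))  # [(40, 46)]
```

Real spotting methods additionally apply duration constraints to separate very brief micro-expressions from longer macro-expressions, which is exactly the distinction this database's labels support.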
    To get the CAS(ME)² Database, please fill out the application form and upload a scanned copy of the signed license agreement.