Looking for Facial Expression Datasets? Here’s the Ultimate List!

Explore a comprehensive collection of publicly available facial expression databases, each with detailed descriptions, direct download links, and related research papers—all in one place! Whether you're working on AI, computer vision, or emotion recognition, this list has everything you need to advance your research.


1. 3D Twins Expression Challenge (3D-TEC) (2011)

This database comprises 3D face scans of 107 pairs of twins, totaling 214 individuals. Each person has a scan with a smiling expression and a neutral expression, resulting in 428 scans. The scans were acquired using a Minolta Vivid 910.

πŸ”— Database Link: 3D-TEC Database
πŸ“„ Related Paper: 3D Twins and Expression Challenge

2. AffectNet (2017)

AffectNet contains over 1 million facial images collected from the internet using 1,250 emotion-related keywords in six languages. Approximately 440,000 images were manually annotated for seven discrete facial expressions and the intensity of valence and arousal.

πŸ”— Database Link: AffectNet
πŸ“„ Related Paper: AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild
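
Since only part of AffectNet carries manual annotations, a common first step is to filter the label files before training. Below is a minimal sketch; the CSV schema it assumes (`image_path`, `valence`, `arousal` columns) is a placeholder, so map the names to whatever the release you download actually provides.

```python
import csv

def select_high_arousal(annotation_csv, arousal_min=0.5):
    """Collect image paths whose annotated arousal exceeds a threshold.

    Assumes a CSV with image_path, valence, and arousal columns;
    adjust the column names to match the actual AffectNet release.
    """
    selected = []
    with open(annotation_csv, newline="") as f:
        for row in csv.DictReader(f):
            valence = float(row["valence"])
            arousal = float(row["arousal"])
            # Values outside [-1, 1] are treated as unannotated here.
            if -1.0 <= valence <= 1.0 and arousal >= arousal_min:
                selected.append(row["image_path"])
    return selected
```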

3. Affectiva-MIT Facial Expression Dataset (AM-FED) (2013)

This dataset consists of 242 facial videos (168,359 frames) recorded in real-world conditions. It includes frame-by-frame labels for 10 symmetrical FACS action units, 4 asymmetric action units, head movements, smiles, general expressiveness, and gender.

πŸ”— Database Link: AM-FED
πŸ“„ Related Paper: Affectiva-MIT Facial Expression Dataset (AM-FED): Naturalistic and Spontaneous Facial Expressions Collected "In-the-Wild"

4. Natural Visible and Infrared Facial Expression Database (NVIE) (2010)

This database contains spontaneous and posed facial expressions from over 100 subjects, captured using both visible-light and infrared thermal cameras. The dataset aims to address challenges related to illumination variations and differences between posed and natural expressions. Images were recorded under three different illumination conditions (front, left, and right). The posed database includes sequences with and without eyeglasses.

πŸ“Œ Version Updates:
Version 3.0 (2012): Most visible and infrared thermal samples annotated with facial feature points.
Version 2.0 (2011): Added neutral expression images (with and without eyeglasses) in both infrared and visible spectrum.
Version 1.0 (2010): Initial release with spontaneous and posed expressions from 103–107 subjects.

πŸ”— Database Link: NVIE
πŸ“„ Related Paper: A Natural Visible and Infrared Facial Expression Database for Expression Recognition and Emotion Inference

5. AR Face (1998)

This database includes over 4,000 color images of 126 individuals (70 men and 56 women) with frontal views under different facial expressions, illumination conditions, and occlusions (e.g., sunglasses, scarves).

πŸ”— Database Link: Access to this database may require direct contact with the authors. More details can be found in the related research paper.
πŸ“„ Related Paper: The AR Face Database

6. Belfast Naturalistic Emotional Database (2012)

The Belfast Naturalistic Emotional Database consists of 298 audiovisual clips from 125 speakers (31 male, 94 female). The clips capture spontaneous emotional expressions, each lasting between 10 and 60 seconds. The dataset is designed to provide context for understanding emotion display peaks and their development over time.

πŸ”— Database Link: Access to this database may require direct contact with the authors. More details can be found in the related research paper.
πŸ“„ Related Paper: Belfast Naturalistic Emotional Database

7. Binghamton University Facial Expression Databases (2006-2023)

Binghamton University has developed a series of comprehensive facial expression databases for research in affective computing, computer vision, human-computer interaction, security, biomedicine, and psychology. These datasets encompass both static and dynamic 3D facial expressions, as well as multimodal emotion analysis.


BU-3DFE (2006) – Binghamton University 3D Facial Expression Database
A large-scale 3D facial expression database featuring high-quality 3D facial scans from 100 subjects (ages 18-70, diverse ethnic backgrounds). Each subject performed seven expressions (neutral, happiness, disgust, fear, anger, surprise, and sadness) at four intensity levels (except neutral).
Total Data: 2,500 3D expression models, each paired with two-view texture images (+45° and -45°).

BU-4DFE (2008) – Binghamton University 4D Facial Expression Database
A high-resolution 4D dynamic facial expression database capturing detailed facial movements over time.
Subjects: 101 individuals performing six dynamic expressions.
Data Volume: 606 expression sequences (approx. 100 frames per sequence), totaling ~60,600 high-resolution 3D frames.

BP4D-Spontaneous (2014) – 3D Dynamic Spontaneous Facial Expression Database
A spontaneous 3D facial expression dataset designed to capture natural, unposed facial behavior.
Subjects: 41 individuals (ages 18-29, diverse racial backgrounds).
Emotion Elicitation: Eight tasks designed to induce spontaneous emotional expressions.
Data Volume: 2.6TB of synchronized 2D and 3D video sequences with FACS-coded action units, head pose tracking, and facial landmarks.
BP4D+ (2016) – Multimodal Spontaneous Emotion Corpus
A multimodal extension of BP4D-Spontaneous featuring 140 participants, with synchronized 3D, 2D, thermal, and physiological recordings (e.g., heart rate, respiration, and skin conductance).

BP4D++ (2023) – Large-Scale Multimodal Facial Expression Database
An expanded multimodal dataset featuring 233 participants across 10 emotion-inducing tasks, including synchronized 3D, 2D, thermal, and physiological data.

ReactioNet (2023) – Facial Behavior Dataset with Stimuli and Subjects
A large-scale dataset of 2,486 reaction video clips capturing real-time emotional responses to various stimuli.
Subjects: 1,566 individuals (ages 20-70, globally diverse).
Stimuli: Eight stimulus categories (e.g., animation, films, games, interviews) and 59 subcategories.
Annotations: Facial landmarks, head pose tracking, gaze tracking, and FACS-coded action units.

πŸ”— Database Link: Binghamton University Facial Expression
For access, contact Dr. Lijun Yin (lijun@cs.binghamton.edu). Some datasets require a formal agreement and are available for non-commercial research only.
πŸ“„ Related Paper: BP4D-Spontaneous: a high-resolution spontaneous 3D dynamic facial expression database


8. Biwi 3D Audiovisual Corpus of Affective Communication (2010)

This corpus contains high-quality dynamic 3D scans of faces recorded while subjects pronounced English sentences, with induced affective states.

πŸ”— Database Link: Biwi 3D
πŸ“„ Related Paper: A 3-D Audio-Visual Corpus of Affective Communication

9. Cohn-Kanade AU-Coded Facial Expression (2000)

This database includes 486 sequences from 97 subjects, each starting with a neutral expression and progressing to a peak expression.

πŸ”— Database Link: Cohn-Kanade
πŸ“„ Related Paper: Comprehensive Database for Facial Expression Analysis

10. Dynamic and Spontaneous Emotional Facial Expression Database (DynEmo) (2013)

The DynEmo database offers dynamic and natural emotional facial expressions from ordinary individuals, with detailed annotations and contextual information.

πŸ”— Database Link: DynEmo
πŸ“„ Related Paper: DynEmo: A video database of natural facial expressions of emotions

11. Denver Intensity of Spontaneous Facial Actions (DISFA) (2013)

This database contains stereo videos of 27 adults, with frame-by-frame annotations of 12 FACS action units' intensities.

πŸ”— Database Link: DISFA
πŸ“„ Related Paper: DISFA: A Spontaneous Facial Action Intensity Database
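
Working with DISFA usually starts by assembling the per-frame labels into a frame-by-AU matrix, as in the sketch below. The `SN001_au1.txt` file naming and the `frame,intensity` line format are assumptions about the release layout, so confirm them against your copy.

```python
import numpy as np

DISFA_AUS = [1, 2, 4, 5, 6, 9, 12, 15, 17, 20, 25, 26]  # the 12 coded AUs

def load_au_matrix(label_dir, subject="SN001", n_frames=4845):
    """Build an (n_frames, 12) matrix of AU intensities (0-5).

    Assumes one label file per AU named like SN001_au1.txt containing
    'frame,intensity' lines; adjust to the actual release layout.
    """
    matrix = np.zeros((n_frames, len(DISFA_AUS)), dtype=np.int8)
    for col, au in enumerate(DISFA_AUS):
        with open(f"{label_dir}/{subject}_au{au}.txt") as f:
            for line in f:
                frame, intensity = line.strip().split(",")
                matrix[int(frame) - 1, col] = int(intensity)
    return matrix
```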

12. EURECOM Kinect Face Dataset (EURECOM KFD) (2014)

This dataset comprises multimodal facial images of 52 individuals captured with a Kinect sensor, covering variations in facial expression, lighting, and occlusion. It provides RGB, depth, and 3D face data, along with manual landmark annotations of key facial features.

πŸ”— Database Link: EURECOM Kinect Face
πŸ“„ Related Paper: KinectFaceDB: A Kinect Database for Face Recognition

13. Extended Yale Face Database B (B+) (2001)

This database contains 16,128 images of 28 subjects under 9 poses and 64 illumination conditions.

πŸ”— Database Link: Extended Yale Face Database B
πŸ“„ Related Paper: Acquiring linear subspaces for face recognition under variable lighting
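
In the commonly distributed release, pose and illumination are encoded directly in each filename (e.g., `yaleB11_P00A+010E-20.pgm` is subject 11, pose 00, light azimuth +10°, elevation -20°). A small parser under that assumption:

```python
import re

# Filename convention assumed: yaleB<subject>_P<pose>A<azimuth>E<elevation>
NAME_RE = re.compile(
    r"yaleB(?P<subject>\d{2})_P(?P<pose>\d{2})"
    r"A(?P<azimuth>[+-]\d{3})E(?P<elevation>[+-]\d{2})"
)

def parse_yaleb_name(filename):
    """Extract subject, pose, and light direction from a Yale B filename."""
    m = NAME_RE.match(filename)
    if m is None:
        raise ValueError(f"unexpected filename: {filename}")
    return {
        "subject": int(m["subject"]),
        "pose": int(m["pose"]),
        "azimuth_deg": int(m["azimuth"]),      # horizontal light angle
        "elevation_deg": int(m["elevation"]),  # vertical light angle
    }

print(parse_yaleb_name("yaleB11_P00A+010E-20.pgm"))
# {'subject': 11, 'pose': 0, 'azimuth_deg': 10, 'elevation_deg': -20}
```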

14. Facial Expressions in the Wild (2011 and 2012)

AFEW (2012): A dynamic facial expressions dataset extracted from movies, simulating real-world conditions.
SFEW (2011): Static facial expressions selected from AFEW frames.

πŸ”— Database Link: Facial Expressions in the Wild
πŸ“„ Related Paper: Static Facial Expressions In The Wild: Data and Experiment Protocol

15. Sayette Group Formation Task (GFT) Spontaneous Facial Expression Database (2017)

The Sayette Group Formation Task (GFT) database includes 172,800 video frames from 96 participants in unscripted group interactions, annotated for facial expressions.

πŸ”— Database Link: GFT Spontaneous Facial Expression
πŸ“„ Related Paper: Sayette Group Formation Task (GFT) Spontaneous Facial Expression Database

16. Indian Movie Face Database (IMFDB) (2013)

IMFDB is a large unconstrained face database with 34,512 images of 100 Indian actors, annotated for various attributes.

πŸ”— Database Link: IMFDB
πŸ“„ Related Paper: Indian Movie Face Database: A Benchmark for Face Recognition Under Wide Variations

17. Indian Spontaneous Expression Database (ISED) (2016)

This database contains high-resolution, near-frontal face recordings of spontaneous emotions along with metadata such as participant gender, ground truth for emotional clips, intensity levels, and peak emotion intensity frames. Head movements in all directions were allowed. The dataset includes four emotions: Happiness, Disgust, Sadness, and Surprise.

πŸ”— Database Link: ISED
πŸ“„ Related Paper: The Indian Spontaneous Expression Database for Emotion Recognition

18. Japanese Female Facial Expression (JAFFE) (1998)

This dataset consists of 213 images of 7 facial expressions (6 basic emotions + 1 neutral) posed by 10 Japanese female models. Each image was rated on 6 emotional adjectives by 60 Japanese subjects. The dataset was developed at Kyushu University by Michael Lyons, Miyuki Kamachi, and Jiro Gyoba.

πŸ”— Database Link: JAFFE
πŸ“„ Related Paper: Coding Facial Expressions with Gabor Wavelets

19. Karolinska Directed Emotional Faces (KDEF) (1998)

The Karolinska Directed Emotional Faces (KDEF) contains 4,900 images of human facial expressions captured under controlled conditions. Developed at Karolinska Institutet in Stockholm, Sweden, the dataset is widely used in psychological and medical research, particularly for studies on perception, attention, and emotion recognition. It includes 70 individuals, each displaying 7 different expressions captured from 5 different angles.

πŸ”— Database Link: Access to this database may require direct contact with the authors. More details can be found in the related research paper.
πŸ“„ Related Paper: The Karolinska Directed Emotional Faces: A validation study
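
KDEF filenames pack session, gender, subject ID, emotion, and camera angle into a short code such as `AF01ANS.JPG` (session A, female model 01, angry, straight view). The decoder below follows the convention as it is usually documented; verify the codes against the README that ships with the dataset.

```python
# Emotion and angle codes as usually documented for KDEF -- treat these
# as assumptions and double-check them against the dataset's README.
EMOTIONS = {"AF": "afraid", "AN": "angry", "DI": "disgusted",
            "HA": "happy", "NE": "neutral", "SA": "sad", "SU": "surprised"}
ANGLES = {"FL": "full left", "HL": "half left", "S": "straight",
          "HR": "half right", "FR": "full right"}

def parse_kdef_name(name):
    """Split a KDEF filename like 'AF01ANS.JPG' into its fields."""
    stem = name.upper().split(".")[0]
    return {
        "session": stem[0],                                # A or B
        "gender": "female" if stem[1] == "F" else "male",
        "subject": int(stem[2:4]),
        "emotion": EMOTIONS[stem[4:6]],
        "angle": ANGLES[stem[6:]],
    }

print(parse_kdef_name("AF01ANS.JPG"))
```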

20. MMI Facial Expression (2005)

This database contains over 2,900 videos and high-resolution still images from 75 subjects. It is fully annotated for the presence of facial Action Units (AUs) and coded at frame level, identifying different phases (neutral, onset, apex, offset). A portion of the dataset is annotated for audiovisual laughter.

πŸ”— Database Link: MMI Facial Expression
πŸ“„ Related Paper: Induced Disgust, Happiness and Surprise: an Addition to the MMI Facial Expression Database

21. ND-2006 Data Set (2005)

The ND-2006 dataset contains 13,450 images depicting six different facial expressions: Neutral, Happiness, Sadness, Surprise, Disgust, and Other. It includes images from 888 individuals, with some subjects having up to 63 images.

πŸ”— Database Link: ND-2006
πŸ“„ Related Paper: Using a Multi-Instance Enrollment Representation to Improve 3D Face Recognition


22. Radboud Faces Database (RaFD) (2010)

The Radboud Faces Database (RaFD) is a high-quality facial expression dataset featuring 67 models of various ethnic backgrounds (Caucasian and Moroccan-Dutch). The models display 8 emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise, Contempt, and Neutral) while varying gaze directions and camera angles. The dataset is widely used in behavioral science and psychology.

πŸ”— Database Link: RaFD
πŸ“„ Related Paper: Presentation and validation of the Radboud Faces Database


23. SN-Flip Crowd Video Dataset (2014)

The SN-Flip dataset includes 190 individuals recorded across 28 crowd videos over a two-year period. The dataset captures variations in illumination, facial expressions, focus, and pose. The videos were recorded with Cisco Flip cameras, providing a representation of real-world web video quality. The dataset includes ground truth annotations for subject identities and social groups.

πŸ”— Database Link: SN-Flip
πŸ“„ Related Paper: Active Clustering with Ensembles for Social Structure Extraction

24. The Bimodal Face and Body Gesture (FABO) (2006)

The FABO dataset was created for the analysis of human nonverbal affective behavior, capturing facial expressions and upper-body gestures simultaneously. It consists of video recordings of participants performing directed affective actions under controlled conditions.

πŸ”— Database Link: FABO
πŸ“„ Related Paper: A Bimodal Face and Body Gesture Database for Automatic Analysis of Human Nonverbal Affective Behavior

25. The Bosphorus Database (2008)

The Bosphorus Database is designed for research in 3D and 2D facial analysis, including expression recognition, action unit detection, and face recognition under challenging conditions. The dataset includes 105 subjects and 4,666 facial images with a rich repertoire of 35 different expressions, various head poses, and face occlusions (e.g., beard, mustache, glasses, hands covering face).

πŸ”— Database Link: Access to this database may require direct contact with the authors. More details can be found in the related research paper.
πŸ“„ Related Paper: 3D Face Recognition Benchmarks on the Bosphorus Database with Focus on Facial Expressions

26. Child Affective Face Set (CAFE) (2015)

The CAFE dataset is a large-scale collection of children’s facial expressions, consisting of 1,200 photographs of over 100 child models (ages 2-8). The dataset captures 7 different emotions: Happiness, Anger, Sadness, Fear, Surprise, Neutral, and Disgust. It is a valuable resource for studying emotional expression in young children.

πŸ”— Database Link: CAFE
πŸ“„ Related Paper: The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults

27. CMU Multi-PIE Face (2009)

The CMU Multi-PIE dataset consists of over 750,000 images of 337 subjects, recorded across multiple sessions over five months. It includes variations in facial expression, illumination (19 conditions), and viewpoint (15 angles). High-resolution frontal images are also included. This dataset is widely used for face recognition and expression analysis.

πŸ”— Database Link: CMU Multi-PIE
πŸ“„ Related Paper: Multi-PIE

28. Color FERET Database (USA) (1993-1996)

The FERET dataset was collected in 15 sessions between 1993 and 1996 and contains 14,126 images of 1,199 individuals, including 365 duplicate sets taken on different days. This time-spaced design allows researchers to study long-term variations in facial appearance.

πŸ”— Database Link: Color FERET
πŸ“„ Related Paper: The FERET evaluation methodology for face-recognition algorithms

29. MUG Facial Expression (2010)

The MUG dataset includes image sequences of 86 subjects performing facial expressions under controlled lighting conditions. Each sequence captures transitions between neutral and peak expressions. The dataset includes a diverse range of expressions and is widely used in affective computing.

πŸ”— Database Link: Access to this database may require direct contact with the authors. More details can be found in the related research paper.
πŸ“„ Related Paper: The MUG facial expression database

30. UNBC-McMaster Shoulder Pain Expression Archive (2011)

The UNBC-McMaster dataset is designed for pain expression analysis and includes 200 video sequences with spontaneous facial expressions, 48,398 FACS-coded frames, and pain intensity annotations. The dataset also provides 66-point Active Appearance Model (AAM) landmarks.

πŸ”— Database Link: Access to this database may require direct contact with the authors. More details can be found in the related research paper.
πŸ“„ Related Paper: PAINFUL DATA: The UNBC-McMaster Shoulder Pain Expression Archive Database
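
A minimal loader for the landmark files, assuming the 66 AAM points are stored as one whitespace-separated `x y` pair per line (a common export layout; check the archive's documentation for the exact format):

```python
import numpy as np

def load_aam_landmarks(path):
    """Read a 66-point landmark file into a (66, 2) float array.

    Assumes one 'x y' pair per line; verify against the archive docs.
    """
    points = np.loadtxt(path)
    if points.shape != (66, 2):
        raise ValueError(f"unexpected landmark shape: {points.shape}")
    return points
```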

31. Yale Face Database (1997)

The Yale Face Database consists of 165 grayscale images of 15 individuals, each captured under 11 different conditions, including variations in facial expressions (Happy, Sad, Surprised) and lighting conditions. It is widely used in face recognition research.

πŸ”— Database Link: Yale Face
πŸ“„ Related Paper: The Yale Face Database

32. 10k US Adult Faces Database (2013)

This database contains 10,168 natural face photographs and detailed annotations for 2,222 faces, including memorability scores, psychological attributes, and facial landmark points. The dataset is available in JPEG format with metadata in MATLAB, Excel, and TXT files. It also includes a software tool for creating custom image sets based on attributes like gender, race, and emotion.

πŸ”— Database Link: 10k US Adult Faces
πŸ“„ Related Paper: The Intrinsic Memorability of Face Photographs
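
The MATLAB metadata can be read from Python with SciPy. The filename below is an illustrative placeholder; list the keys first to see what the distribution actually contains before indexing into it.

```python
from scipy.io import loadmat

# "face_attributes.mat" is a placeholder name -- substitute the actual
# metadata file shipped with the 10k US Adult Faces distribution.
mat = loadmat("face_attributes.mat", squeeze_me=True, struct_as_record=False)
print(sorted(key for key in mat if not key.startswith("__")))
```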

33. Specs on Faces (SoF) Dataset (2019)

This dataset includes 42,592 images of 112 individuals wearing glasses under various lighting conditions, designed to challenge face detection, recognition, and classification. It features natural occlusions (eyeglasses) and synthetic occlusions of the nose and mouth, three image filters (Gaussian noise, Gaussian blur, and posterization), and metadata with facial landmarks, age, gender, and expression labels.

πŸ”— Database Link: SoF
πŸ“„ Related Paper: AFIF4: Deep Gender Classification Based on AdaBoost-based Fusion of Isolated Facial Features and Foggy Faces

34. Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) (2018)

RAVDESS is a multimodal dataset containing 7,356 files (24.8 GB) of 24 actors (12 male, 12 female) vocalizing speech and song with a range of emotional expressions (neutral, calm, happy, sad, angry, fearful, surprise, and disgust). Each expression is available in audio-only, video-only, and audio-visual formats. Every recording is rated for emotional validity, intensity, and genuineness.

πŸ”— Database Link: RAVDESS
πŸ“„ Related Paper: The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English
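
Every RAVDESS file encodes its content in a seven-field numerical filename (modality, vocal channel, emotion, intensity, statement, repetition, actor), so labels can be recovered without a separate annotation file. A decoder following the documented convention:

```python
# Field codes per the RAVDESS filename convention,
# e.g. "03-01-06-01-02-01-12.wav".
MODALITIES = {"01": "audio-video", "02": "video-only", "03": "audio-only"}
EMOTIONS = {"01": "neutral", "02": "calm", "03": "happy", "04": "sad",
            "05": "angry", "06": "fearful", "07": "disgust", "08": "surprised"}

def parse_ravdess_name(name):
    """Decode one RAVDESS filename into a dict of its labels."""
    modality, channel, emotion, intensity, statement, repetition, actor = (
        name.split(".")[0].split("-")
    )
    return {
        "modality": MODALITIES[modality],
        "channel": "speech" if channel == "01" else "song",
        "emotion": EMOTIONS[emotion],
        "intensity": "strong" if intensity == "02" else "normal",
        "statement": statement,
        "repetition": int(repetition),
        "actor": int(actor),
        "actor_sex": "male" if int(actor) % 2 == 1 else "female",
    }

print(parse_ravdess_name("03-01-06-01-02-01-12.wav"))
# audio-only speech, fearful, normal intensity, actor 12 (female)
```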

35. BAUM-1: Bahçeşehir University Multimodal Face Database (2016)

BAUM-1 is a multimodal dataset containing audio-visual clips of 31 subjects expressing spontaneous and acted emotional and mental states in Turkish. The dataset includes six basic emotions (happiness, sadness, anger, disgust, fear, surprise) and additional states like boredom, contempt, confusion, interest, and concentration.

πŸ”— Database Link: BAUM-1
πŸ“„ Related Paper: BAUM-1: A Spontaneous Audio-Visual Face Database of Affective and Mental States

36. Indian Semi-Acted Facial Expression Database (iSAFE) (2018)

iSAFE is a facial expression dataset designed for human emotion recognition in human-computer interaction research. It contains 395 video clips of 44 volunteers (ages 17-22) displaying eight emotions (happy, sad, surprise, disgust, fear, angry, uncertain, and neutral). Expressions were self-annotated by participants and cross-annotated by external evaluators.

πŸ”— Database Link: iSAFE
πŸ“„ Related Paper: Indian Semi-Acted Facial Expression (iSAFE) Dataset for Human Emotions Recognition
πŸ“© Contact: shivam9935@gmail.com

37. Grammatical Facial Expressions Dataset (2014)

This dataset supports automatic interpretation of grammatical facial expressions in Brazilian Sign Language (Libras). It contains 27,965 instances extracted from 18 Kinect-recorded videos, where users perform sentences requiring grammatical facial expressions. Data includes 100 facial landmark coordinates (x, y, z) for eyes, eyebrows, nose, mouth, face contour, and iris, along with manually labeled ground truth for classification tasks.

πŸ”— Database Link: Grammatical Facial Expressions
πŸ“„ Related Paper: Grammatical Facial Expressions Recognition with Machine Learning
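
A loading sketch for one performance, assuming each datapoints line holds a frame identifier followed by 300 values (x, y, z for each of the 100 landmarks) and that the matching targets file holds one binary label per frame; verify both assumptions against the UCI description.

```python
import numpy as np

def load_gfe_performance(datapoints_path, targets_path):
    """Return a (frames, 100, 3) landmark array and per-frame binary labels.

    The leading identifier column and the exact file layout are
    assumptions -- check the UCI documentation for your copy.
    """
    raw = np.loadtxt(datapoints_path)
    landmarks = raw[:, 1:].reshape(-1, 100, 3)  # drop the id column
    labels = np.loadtxt(targets_path, dtype=int)
    if len(landmarks) != len(labels):
        raise ValueError("datapoints and targets disagree on frame count")
    return landmarks, labels
```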

38. Database of Faces (ORL) (1992-1994)

The Database of Faces (formerly ORL Database of Faces) contains 400 grayscale images of 40 subjects, captured between 1992 and 1994 for face recognition research. Each subject has 10 images with variations in lighting, expressions (smiling/not smiling, open/closed eyes), and accessories (glasses/no glasses). All images are 92x112 pixels, 256 grayscale levels, and taken against a dark homogeneous background in a frontal pose.

πŸ”— Database Link: ORL Faces
πŸ“„ Related Paper: Parameterisation of a Stochastic Model for Human Face Identification
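
Because the database is small and uniformly structured, it can be loaded in a few lines. The sketch assumes the common distribution layout of `s1/` through `s40/` directories, each holding `1.pgm` through `10.pgm`; adjust the paths if your copy differs.

```python
import numpy as np
from PIL import Image

def load_orl(root):
    """Load ORL into a (400, 112, 92) image stack plus subject labels."""
    images, labels = [], []
    for subject in range(1, 41):          # 40 subjects: s1 ... s40
        for shot in range(1, 11):         # 10 images per subject
            img = Image.open(f"{root}/s{subject}/{shot}.pgm")
            images.append(np.asarray(img))
            labels.append(subject)
    return np.stack(images), np.array(labels)
```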

39. Facial Expression Research Group 2D Database (FERG-DB) (2016)

FERG-DB is a 2D facial expression dataset containing 55,767 images of six stylized characters (Ray, Malcolm, Jules, Bonnie, Mery, Aia), modeled in MAYA and rendered for expression analysis. Each character displays seven facial expressions (anger, disgust, fear, joy, neutral, sadness, surprise), making it a valuable resource for deep learning-based emotion recognition.

πŸ”— Database Link: FERG-DB
πŸ“„ Related Paper: Modeling Stylized Character Expressions via Deep Learning

40. Multi-modal Affective Facial Expressions in the Wild (MAFW) (2022)

MAFW is a large-scale multimodal dataset of dynamic facial expressions collected from movies, TV shows, and short videos across different countries and genres. It contains 10,045 video clips with audio, independently annotated by multiple raters for compound emotional categories (11 emotions, including core expressions like happiness, sadness, and additional ones such as contempt, anxiety, helplessness, and disappointment). Each clip includes bilingual textual descriptions of the observed expression.

πŸ”— Database Link: MAFW
πŸ“„ Related Paper: MAFW: A Large-scale, Multi-modal, Compound Affective Database for Dynamic Facial Expression Recognition in the Wild

41. Dynamic Facial Expression in-the-Wild (DFEW) (2020)

DFEW is a large-scale dynamic facial expression dataset extracted from over 1,500 movies. It contains 16,372 short video clips (with audio) capturing facial expressions in real-world conditions. The dataset is annotated into 7 core emotional categories (six universal emotions + neutral), making it highly valuable for training real-world FER models.

πŸ”— Database Link: DFEW
πŸ“„ Related Paper: DFEW: A Large-Scale Database for Recognizing Dynamic Facial Expressions in-the-Wild

42. FERV39k (2022)

FERV39k is a large-scale facial expression dataset in video format, designed for FER in diverse real-world settings. It contains 38,935 video clips, covering 4 primary scenes subdivided into 22 sub-scenes (e.g., talk shows, formal events, live performances). The goal is to study how context influences expression recognition.
Modality: RGB short video clips (face-focused frames, ~336×504 or 224×224 resolution); no audio.

πŸ”— Database Link: FERV39k
πŸ“„ Related Paper: FERV39k: A Large-Scale Multi-Scene Dataset for Facial Expression Recognition in Videos

43. SAMM Long Videos (2020)

SAMM Long Videos is the first dataset focusing on long-duration recordings containing both micro and macro-expressions. It includes 147 high-speed videos (recorded at 200 FPS) with an average duration of ~35 seconds per video. The dataset captures 343 macro-expression events and 159 micro-expression events, all occurring spontaneously in response to stimuli. Each expression is annotated using the Facial Action Coding System (FACS), providing a detailed breakdown of Action Units (AUs). This dataset is particularly useful for micro-expression spotting and emotion recognition in subtle facial movements.

πŸ”— Database Link: SAMM Long Videos
πŸ“„ Related Paper: SAMM Long Videos: A Spontaneous Facial Micro- and Macro-Expressions Dataset
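
Since the videos run at 200 FPS, annotated onset/offset frame indices convert to wall-clock durations by dividing by the frame rate, as in this small helper (frame-index annotations are an assumption here; check the metadata spreadsheet that accompanies the dataset):

```python
FPS = 200  # SAMM Long Videos were recorded at 200 frames per second

def expression_duration_s(onset_frame, offset_frame, fps=FPS):
    """Duration in seconds between annotated onset and offset frames."""
    return (offset_frame - onset_frame) / fps

# A micro-expression spanning frames 1210-1290 lasts 0.4 s:
print(expression_duration_s(1210, 1290))
```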

44. 4DFAB (2018)

4DFAB is a large-scale 4D facial expression dataset designed for facial expression analysis and biometric applications. It contains over 1.8 million 3D face meshes collected from 180 subjects across four different sessions over five years (2012–2017). The dataset captures both spontaneous and posed facial expressions using a high-resolution 3D scanning system, resulting in detailed temporal 3D sequences of facial movements. 4DFAB is useful for 3D facial expression recognition, biometric identification, and facial motion modeling.

πŸ”— Database Link: Access to this database may require direct contact with the authors. More details can be found in the related research paper.
πŸ“„ Related Paper: 4DFAB: A Large Scale 4D Facial Expression Database for Biometric Applications
