TY - JOUR
T1 - Estimation of Personality Traits from Portrait Pictures Using the Five-Factor Model
AU - Moreno-Armendariz, Marco A.
AU - Duchanoy Martinez, Carlos Alberto
AU - Calvo, Hiram
AU - Moreno-Sotelo, Miguelangel
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020
Y1 - 2020
N2 - This work presents a model based on Deep Neural Networks for the prediction of apparent personality. It quantifies personality traits under the Five-Factor model (Big Five) from a portrait image. To evaluate the effectiveness of this approach, a new corpus of 30,935 portraits with their associated personality traits was extracted from an existing resource of videos (First Impressions, ChaLearn) tagged with redundant pairwise comparisons to ensure consistency. We propose several models using Convolutional Neural Networks to automatically extract features from a portrait that are indicators of personality traits; the models then classify these characteristics into a binary class for each Big Five factor: openness to experience (O), conscientiousness (C), extraversion (E), agreeableness (A), and neuroticism (N). In addition, we experiment with feature encoding and transfer learning to enrich the representation of images with additional untagged portraits (45,000 and 200M), reaching an accuracy within the state of the art (albeit not directly comparable), obtaining 65.86% as a classifier averaging the five factors (O=61.48%, C=69.56%, E=73.23%, A=60.68%, N=64.35%). Compared to human judgment (mean accuracy of 56.66%), the model obtained higher average performance and higher accuracy in 4 of the 5 factors of the Big Five model. Furthermore, in comparison with the state of the art, this model shows several advantages: (1) it requires only a single portrait to make the prediction, a non-invasive and easily accessible resource (e.g., selfies); (2) the extraction of features from the portrait is done automatically; (3) a single model performs both feature extraction and classification.
AB - This work presents a model based on Deep Neural Networks for the prediction of apparent personality. It quantifies personality traits under the Five-Factor model (Big Five) from a portrait image. To evaluate the effectiveness of this approach, a new corpus of 30,935 portraits with their associated personality traits was extracted from an existing resource of videos (First Impressions, ChaLearn) tagged with redundant pairwise comparisons to ensure consistency. We propose several models using Convolutional Neural Networks to automatically extract features from a portrait that are indicators of personality traits; the models then classify these characteristics into a binary class for each Big Five factor: openness to experience (O), conscientiousness (C), extraversion (E), agreeableness (A), and neuroticism (N). In addition, we experiment with feature encoding and transfer learning to enrich the representation of images with additional untagged portraits (45,000 and 200M), reaching an accuracy within the state of the art (albeit not directly comparable), obtaining 65.86% as a classifier averaging the five factors (O=61.48%, C=69.56%, E=73.23%, A=60.68%, N=64.35%). Compared to human judgment (mean accuracy of 56.66%), the model obtained higher average performance and higher accuracy in 4 of the 5 factors of the Big Five model. Furthermore, in comparison with the state of the art, this model shows several advantages: (1) it requires only a single portrait to make the prediction, a non-invasive and easily accessible resource (e.g., selfies); (2) the extraction of features from the portrait is done automatically; (3) a single model performs both feature extraction and classification.
KW - Personality traits estimation
KW - convolutional neural networks
KW - five-factor model
KW - image analysis
UR - http://www.scopus.com/inward/record.url?scp=85096336159&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2020.3034639
DO - 10.1109/ACCESS.2020.3034639
M3 - Article
AN - SCOPUS:85096336159
SN - 2169-3536
VL - 8
SP - 201649
EP - 201665
JO - IEEE Access
JF - IEEE Access
M1 - 9244051
ER -