The materials consisted of grayscale photographs of facial expressions from 34 female and 35 male identities, taken from the Karolinska Directed Emotional Faces database (KDEF; Lundqvist et al.). The name of one of the six basic emotions was printed in the center of the screen. Participation was voluntary and no monetary reward was offered. The presentation time of the targets during the learning period varied with the number of targets displayed, ranging from 30 to 60 s. Each emotion occurred both as a target and as a distracter.
In this task, we required participants to determine the mixture ratios of two prototypical expressions of emotion. This test aims to measure individual differences in recognizing complex affective states (Vellante et al.). Results are summarized in Table 5.
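The paper does not specify how the expression mixtures were produced; as a minimal sketch, a mixture ratio can be illustrated as a linear pixel blend of two grayscale face images (real morphing software additionally warps facial landmarks, which is omitted here). The function name and arguments are hypothetical.

```python
import numpy as np

def blend_expressions(img_a, img_b, ratio):
    """Linearly mix two grayscale face images (hypothetical helper).

    ratio = 0.0 returns img_a unchanged; ratio = 1.0 returns img_b.
    A 70/30 happiness-fear morph would use ratio = 0.3 with
    img_a = happiness, img_b = fear.
    """
    img_a = np.asarray(img_a, dtype=float)
    img_b = np.asarray(img_b, dtype=float)
    return (1.0 - ratio) * img_a + ratio * img_b
```

A participant's task is then to report the ratio that was used, e.g. judging a `blend_expressions(happy, fearful, 0.3)` stimulus as "70% happiness, 30% fear".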
This task was designed by adapting the FaMe-N task, using stimuli containing emotional instead of neutral faces. Means and standard errors of the mean of accuracy and reaction times on the face and house part-to-whole matching task are reported split by age group. The original version of the Reading the Mind in the Eyes test consisted of a set of 25 photos showing the area around the eyes, with a choice of two possible mental states for each photo. Gender, on the other hand, does not seem as influential, but this article provides guidelines and data for both gender and age groups regardless. The following numbers of outliers were discarded; upright face parts:
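The exact outlier criterion is not given in this excerpt; a common convention in reaction-time research, shown here only as an assumed example, is to discard trials whose RT lies more than 2.5 standard deviations from the participant's mean.

```python
import numpy as np

def flag_rt_outliers(rts, n_sd=2.5):
    """Flag reaction times beyond n_sd standard deviations of the mean.

    The 2.5-SD cutoff is an assumption for illustration; the paper's
    actual exclusion rule may differ.
    """
    rts = np.asarray(rts, dtype=float)
    z = (rts - rts.mean()) / rts.std()
    return np.abs(z) > n_sd
```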
Models were also requested to inspect their facial expression in a mirror. Anger faces were recognized considerably worse than sadness faces. Models were asked to imagine a personally relevant episode of their lives in which they strongly experienced an emotional state corresponding to one of the six emotions (happiness, surprise, fear, sadness, disgust, and anger). Therefore, with the 1-back task using emotional expressions, we aimed to assess the speed of recognizing emotional expressions from working memory, and we expected accuracy to be at ceiling. Further studies are needed to investigate whether learning and recognizing emotion morphs taps the same ability factor as learning and recognizing prototypical expressions of emotion.
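The decision logic of a 1-back task can be sketched as follows: on each trial after the first, the participant judges whether the current stimulus shows the same emotion as the immediately preceding one. This is a generic sketch of the paradigm, not the paper's implementation; the response labels are assumptions.

```python
def one_back_responses(trial_emotions):
    """Correct responses for a 1-back emotion task (generic sketch).

    For each trial after the first, the correct answer is 'same' if
    the emotion repeats the preceding trial, otherwise 'different'.
    """
    return [
        "same" if current == previous else "different"
        for previous, current in zip(trial_emotions, trial_emotions[1:])
    ]
```

For example, the sequence anger, anger, fear yields the correct responses "same" then "different".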