Facial EMG Analysis to Quantify Facial Mimicry in order to Facilitate Communication at Birmingham City University


Project Description:

Good communicators recognise and mimic the facial expressions of emotion of the person they are interacting with. Using this ‘facial mimicry’ is important for, e.g., Mental Health professionals. However, many people struggle with facial mimicry, especially those with autistic spectrum disorder. We aim to develop a training system to improve facial mimicry, and with it effective communication.

This can be done through neurofeedback, where the trainee receives a signal that depends on the degree of facial mimicry. To achieve that, we need to be able to express the electrical activity of facial muscles, recorded from trainee and trainer (live or off-line), as a single vector. The mathematical correlation between the trainee’s vector and the trainer’s gives a single number, the ‘facial mimicry quotient’ (FMQ), which is more positive the more their facial expressions are in sync and negative when they are out of sync. The average FMQ can be used to quantify facial mimicry in an individual or to compare groups. Because the FMQ fluctuates during communication, it can also serve as an instantaneous feedback signal that informs the trainee of the quality of communication through, e.g., the colour (warm/cold) of the ambient light.
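
As a rough illustration of how such a quotient might be computed, the sketch below correlates the 3-D muscle-activity traces of two people over sliding windows. This is a minimal sketch under stated assumptions, not the project’s actual method: the function and variable names are hypothetical, and it assumes the raw EMG has already been rectified, smoothed into envelopes and normalised per person.

```python
import numpy as np

def sliding_fmq(trainee, trainer, window=250):
    """Hypothetical sliding-window facial mimicry quotient (FMQ).

    trainee, trainer : (n_samples, 3) arrays of smoothed, per-person
        normalised EMG envelopes, one column per facial muscle.
    window : number of samples per correlation window.

    Returns one Pearson correlation per window: near +1 when the two
    expressions move in sync, negative when they move in opposition.
    """
    n = min(len(trainee), len(trainer))
    fmq = []
    for start in range(0, n - window + 1, window):
        a = trainee[start:start + window].ravel()
        b = trainer[start:start + window].ravel()
        fmq.append(np.corrcoef(a, b)[0, 1])  # correlation of flattened segments
    return np.array(fmq)

# Toy usage: 10 s at 250 Hz in which the trainee closely mimics the trainer
rng = np.random.default_rng(0)
trainer_emg = rng.random((2500, 3))
trainee_emg = trainer_emg + 0.05 * rng.standard_normal((2500, 3))
print(sliding_fmq(trainee_emg, trainer_emg).mean())  # close to +1
```

In a live neurofeedback setting, each windowed value could drive the feedback signal, for example by mapping positive FMQ to warmer and negative FMQ to colder ambient light.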

To obtain a reliable FMQ, we need an experimental approach that allows us to optimise the accuracy and speed of the analysis method that quantifies facial expressions. We will start under highly controlled conditions and then adapt the analysis to work in realistic communication settings, which requires collaboration between psychologists, Mental Health specialists, biomedical scientists and engineers. To be successful in this project, the researcher should be acquainted with quantitative analysis; MATLAB experience would be a great asset.

Anticipated Findings and Contribution to Knowledge:

Various methods for quantifying facial mimicry have been established, but expressing facial EMG as a 3-D vector is novel. The direction of the vector is determined by the relative activity of three muscles, which differs for each type of emotion. The magnitude of the vector depends on the physical and behavioural properties of the participant, which determine the maximal amplitude of the facial EMG recorded during intentional expressions. Normalising the 3-D vector to these individual properties is novel and would substantially improve the quantification method.
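
As a small illustration of that idea, the sketch below scales each muscle’s envelope by the person’s own maximal intentional-expression amplitude and splits the result into direction and magnitude. All names here are hypothetical assumptions; the actual normalisation procedure is what the project would develop.

```python
import numpy as np

def normalised_emg_vector(envelope, max_amplitude):
    """Hypothetical per-person normalisation of the 3-D facial EMG vector.

    envelope      : length-3 smoothed EMG envelope, one value per muscle.
    max_amplitude : length-3 per-muscle maxima measured for this person
                    during intentional (maximal) expressions.
    """
    v = np.asarray(envelope, dtype=float) / np.asarray(max_amplitude, dtype=float)
    magnitude = np.linalg.norm(v)                  # overall expression intensity
    direction = v / magnitude if magnitude else v  # relative muscle activity
    return v, direction, magnitude
```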

Combining the normalised 3-D vectors of two persons communicating to calculate the facial mimicry quotient (FMQ) is innovative and is likely to provide a valuable parameter for quantifying the level of facial mimicry, and so the effectiveness of communication.

There are gender, age and race differences in how emotions are expressed, which can lead to ineffective communication and misunderstanding between individuals from different gender, age and race groups. The average FMQ can be used to explore and quantify those differences and to relate the quality of facial mimicry to psychological traits, such as empathy, and to disorders such as autism.

Facial mimicry is partly subconscious, shaped by both nature and nurture, but it can be learned and altered. Using the dynamic FMQ for neurofeedback is an exciting new idea that has the potential to be developed into training programmes that help people who struggle with facial mimicry to communicate better. It may even help to improve communication and avoid misunderstanding between gender, age and race groups.

Contact (and Director of Studies for this project): Dr Martin Vreugdenhil

Closing date: 23:59 on 30 September 2023
