Emotion classification from vocal expressions is an important area of research. This study numerically quantifies emotions from vocal expressions, opening a new line of inquiry into the medical diagnosis of Alzheimer's disease.

Emotions are fundamental to human beings, shaping perception and everyday activities such as communication and decision making. They are expressed through speech, facial expressions, gestures, and other nonverbal cues. Speech emotion recognition is the process of analyzing vocal behavior, with emphasis on the nonverbal aspects of speech. Differences between emotional states can be considered an important criterion for evaluating cognitive performance, especially in decision making and action tendency. In the early stages of Alzheimer's disease the patient suffers from intermittent memory deterioration, leading to deficits in the cognitive and perceptual aspects of speech and language, such as sentence construction. Because the memory disorders are mild at this stage, patients and their relatives are often unable to connect the symptoms with Alzheimer's disease and instead attribute the cognitive changes to aging. Typically, 2 to 3 years pass between the onset of symptoms and the first medical consultation. Emotion also plays a significant role in influencing motivation and focus of attention.

A methodology is proposed for early detection of Alzheimer's disease through fractal analysis of speech. Fractals are of two types: monofractals and multifractals. Monofractals have the same scaling properties in different regions of the system, whereas multifractals are more complicated self-similar objects consisting of differently weighted fractals with different non-integer dimensions. Multifractal detrended fluctuation analysis and fractal dimensions (Hurst exponents) of 1400 speech signals spanning six basic emotions were computed, revealing significant differences between the angry and sad emotions for normal subjects.
Hence any aberration from the multifractal spectrum widths obtained for the different emotions in normal subjects can be attributed to the onset of neuro-cognitive impairments such as Alzheimer's disease, for which a model is proposed in this work. Further testing and validation of the model with diseased subjects is, however, required to arrive at biomarkers for detecting Alzheimer's disease from speech.
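The Hurst-exponent estimation underlying the analysis can be illustrated with detrended fluctuation analysis (DFA), the monofractal special case (q = 2) of the MF-DFA used in the paper. The sketch below is illustrative: the function name, scale choices, and linear detrending order are assumptions, not the paper's exact procedure.

```python
import numpy as np

def hurst_dfa(signal, scales=None):
    """Estimate the Hurst exponent of a 1-D signal via DFA (a sketch)."""
    x = np.asarray(signal, dtype=float)
    # Step 1: build the cumulative-sum "profile" of the mean-centred signal.
    profile = np.cumsum(x - x.mean())
    n = len(profile)
    if scales is None:
        # Illustrative choice: ~12 log-spaced window sizes from 8 to n/4.
        scales = np.unique(np.logspace(np.log10(8), np.log10(n // 4), 12).astype(int))
    flucts = []
    for s in scales:
        # Step 2: split the profile into non-overlapping windows of length s.
        n_win = n // s
        segments = profile[: n_win * s].reshape(n_win, s)
        t = np.arange(s)
        rms = []
        for seg in segments:
            # Step 3: remove a linear trend in each window and keep the
            # root-mean-square residual (the local fluctuation).
            coeffs = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        flucts.append(np.mean(rms))
    # Step 4: the slope of log F(s) versus log s estimates the Hurst exponent.
    h, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return h

# White noise is uncorrelated, so its Hurst exponent should be near 0.5;
# long-range-correlated signals (e.g. emotional speech envelopes) deviate.
rng = np.random.default_rng(0)
h_noise = hurst_dfa(rng.standard_normal(4096))
```

MF-DFA generalizes this by raising the windowed fluctuations to varying powers q before averaging, yielding a spectrum of exponents whose width is the quantity proposed here as a candidate marker.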


Authors: Susmita Bhaduri, Rajdeep Das and Dipak Ghosh
Journal of Neurology and Neuroscience (Probable Impact Factor: 1.45)
DOI: 10.21767/2171-6625.100084
J Neurol Neurosci 2016: Volume 7, Issue 2 (84)
Status: Published
Area: Speech Analysis