Adaptive Emotion Recognition Framework Leveraging Lightweight Convolutional Networks and Fuzzy Logic for Enhanced Interpretability and Efficiency in Low-Resource Environments
| Published in | 2025 International Conference on Intelligent Control, Computing and Communications (IC3) pp. 808 - 813 |
|---|---|
| Main Authors | , , , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 13.02.2025 |
| DOI | 10.1109/IC363308.2025.10957318 |
| Summary: | This work presents a lightweight CNN-based emotion recognition system that employs a fuzzy inference system to improve facial expression recognition. The envisioned framework uses the CNN to extract objective input features capturing the critical facial characteristics that express emotion, and the fuzzy system to resolve ambiguity in the visual cues and improve classification accuracy. This dual approach increases the reliability of emotion detection compared with commonly used approaches, particularly where fine differences in facial expressions must be distinguished. Designed for environments with limited computational capabilities, the system runs seamlessly on low-end hardware, making it applicable to healthcare diagnostics, human-robot interaction, and adaptive educational technology. Trained on the FER+ and JAFFE datasets, the introduced system exhibits considerable efficacy in identifying prominent emotions across subjects. Built with the OpenCV, TensorFlow Lite, and skfuzzy libraries, this method lays a basic framework for large-scale, real-time emotion recognition systems in smart applications. |
|---|---|
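To make the described pipeline concrete, the sketch below wires together the three libraries named in the summary: OpenCV for face-image preprocessing, TensorFlow Lite for the lightweight CNN, and skfuzzy for the fuzzy inference stage that tempers ambiguous predictions. It is only an illustrative reading of the abstract, not the authors' implementation; the model file name, the 48x48 grayscale input shape, the softmax output, the FER+ label order, and the fuzzy rule set are all assumptions.

```python
# Illustrative sketch only: a CNN front end (TensorFlow Lite) scored by a
# fuzzy inference stage (skfuzzy), as described in the summary above.
# The model file, 48x48 grayscale input, label order, and rules are assumptions.

import cv2
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl
from tflite_runtime.interpreter import Interpreter  # or: tf.lite.Interpreter

# Assumed FER+-style label order; a real model defines its own.
EMOTIONS = ["neutral", "happiness", "surprise", "sadness",
            "anger", "disgust", "fear", "contempt"]

# --- CNN stage: load a (hypothetical) quantized emotion model -------------
interpreter = Interpreter(model_path="emotion_cnn.tflite")  # assumed file name
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def cnn_scores(face_bgr):
    """Preprocess a face crop with OpenCV and return per-emotion CNN scores.

    Assumes the model takes a 1x48x48x1 float input and outputs softmax
    probabilities, one per entry in EMOTIONS.
    """
    gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (48, 48)).astype(np.float32) / 255.0
    interpreter.set_tensor(inp["index"], gray.reshape(1, 48, 48, 1))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])[0]

# --- Fuzzy stage: rate how trustworthy the CNN's top prediction is --------
top_score = ctrl.Antecedent(np.linspace(0, 1, 101), "top_score")
margin = ctrl.Antecedent(np.linspace(0, 1, 101), "margin")  # top minus runner-up
confidence = ctrl.Consequent(np.linspace(0, 1, 101), "confidence")
top_score.automf(3)  # generates 'poor', 'average', 'good' membership functions
margin.automf(3)
confidence["low"] = fuzz.trimf(confidence.universe, [0.0, 0.0, 0.5])
confidence["high"] = fuzz.trimf(confidence.universe, [0.5, 1.0, 1.0])

rules = [
    ctrl.Rule(top_score["good"] & margin["good"], confidence["high"]),
    ctrl.Rule(top_score["good"] & margin["average"], confidence["high"]),
    ctrl.Rule(top_score["average"] & margin["average"], confidence["low"]),
    ctrl.Rule(top_score["poor"] | margin["poor"], confidence["low"]),
]
fis = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))

def classify(face_bgr):
    """Return (emotion label, fuzzy confidence in [0, 1]) for a face crop."""
    scores = cnn_scores(face_bgr)
    order = np.argsort(scores)
    best, second = scores[order[-1]], scores[order[-2]]
    fis.input["top_score"] = float(best)
    fis.input["margin"] = float(best - second)
    fis.compute()
    return EMOTIONS[int(order[-1])], float(fis.output["confidence"])

# Example use (hypothetical image path):
#   label, conf = classify(cv2.imread("face.jpg"))
```

In this sketch the fuzzy stage only grades how trustworthy the CNN's top prediction is, from its absolute score and its margin over the runner-up; a fuller system in the spirit of the paper could instead feed per-emotion activations directly into the rule base.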