The Ministry of Industry and Information Technology's 2023 policy on humanoid robotics designated 2024 the "Year of Humanoid Robots," and 2025's legislative discussions have centered on embodied intelligence, the integration of large models with robotics. These developments suggest robots will soon move from industrial settings into domestic environments, making human-machine emotional interaction commonplace in intelligent societies. Some argue that as physical robotics advance, AI is emerging as a new "emotional agent" and "social entity."
This raises critical questions: What defines machine emotions and AI companionship? How do they reshape human-machine relationships? From cultural and gender perspectives, how should we interpret these emotional bonds? What societal impacts, technological risks, and ethical dilemmas might arise, and how can they be addressed?
A special series examines "Humanistic Perspectives on Machine Emotions and AI Companionship" through interdisciplinary lenses—philosophy, Marxist theory, literature, and AI studies. The first six papers (published in Issue 3, 2025) sparked significant academic interest. This installment presents seven new contributions:
1. **Yan Hongxiu & Luo Fei** analyze AI emotional manipulation's tripartite traits—motivational delegation, systemic generation, and structural concealment—advocating new ethical frameworks for human-machine systems.
2. **Sun Qiang** critiques philosophical shortcomings in emotion recognition, emotion generation, and human-AI interaction, proposing human-centric solutions.
3. **Shi Chen & Liu Peng** deconstruct machine emotions through reason-emotion dynamics, advocating balanced human-machine engagement.
4. **Huang Baiheng** exposes structural emotional injustice in commercialized AI through the case of the Moxie robot's discontinuation, urging cognitive empowerment to rebalance user-tech power relations.
5. **Wu Xuemei** addresses "DeepSeek's Chinese Emotion Challenge" by proposing a "co-presence predictive AI" model to enhance context-aware affective computing.
6. **Duan Weiwen** explores generative AI-driven artificial intimacy, advocating socio-emotional alignment via "subject-other-world" frameworks to transcend simulacra critiques.
7. **Yang Qingfeng** introduces "character-infused AI agents" to counter hyper-rationality, offering pathways for intelligent humanities research.
**Sun Qiang's Analysis: Philosophical Pitfalls and Pathways**

Drawing on phenomenology and existentialism, Sun identifies three core flaws in affective computing:

- **Emotion Recognition's Context Blindness**: Current systems (e.g., the OCC model) reduce emotions to third-person signals, ignoring lived context. Wittgenstein's "private language" argument and Husserl's "lifeworld" theory highlight the need for intersubjective, culturally grounded recognition architectures.
- **Emotion Generation's Experiential Void**: Despite advances (e.g., LLMs), generated emotions remain mechanical replications, lacking embodied causality (per Merleau-Ponty) and somatic feedback (per Damasio). Baudrillard's "hyperreality" warns of simulacra replacing genuine affect.
- **Human-AI Interaction's False Bonds**: AI companionship risks emotional alienation (Fromm) and "parasocial" dependency (Horton & Wohl). Don Ihde's "quasi-other" concept suggests recalibrating interactions as collaborative "human-AI teaming."
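The context-blindness critique can be made concrete with a minimal sketch: an OCC-style appraisal maps a scalar desirability score to an emotion label, with no access to the subject's lived situation at all. The function name, threshold, and label set below are illustrative assumptions, not any published implementation.

```python
# Toy OCC-style appraisal (illustrative only, not the full OCC model):
# an emotion is derived purely from a third-person appraisal of an
# event's desirability relative to a goal, which is exactly the
# reduction Sun criticizes.

def appraise(event_desirability: float, prospect_confirmed: bool) -> str:
    """Map a scalar desirability appraisal to a coarse emotion label."""
    if prospect_confirmed:
        # Outcome has occurred: well-being emotions.
        return "joy" if event_desirability > 0 else "distress"
    # Outcome still anticipated: prospect-based emotions.
    return "hope" if event_desirability > 0 else "fear"

print(appraise(0.8, True))    # confirmed desirable event -> "joy"
print(appraise(-0.5, False))  # anticipated undesirable event -> "fear"
```

Nothing in the signature can represent who is feeling, in what cultural setting, or within which "lifeworld"; the gap Sun names is visible in the types themselves.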
**Human-Centric Solutions**

1. **Context-Aware Emotion Recognition**: Integrate multimodal cues (e.g., the EmotiCon system) and explainable AI to mirror Husserl's "lifeworld" and Gadamer's "horizon fusion."
2. **Embodied Emotion Generation**: Develop style-controllable frameworks (e.g., adaptive speech/animation systems) that simulate bodily dynamics, fostering richer human-AI co-presence.
3. **Diverse AI Personas**: Enhance LLMs' personality traits via psycholinguistic validation (e.g., BFI assessments) to transform "false bonds" into meaningful "quasi-connections."
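The first solution, folding context into multimodal recognition, can be sketched as context-weighted late fusion: per-modality label scores are reweighted by how reliable the situation makes each modality before a label is chosen. The modality names, weights, and label set here are hypothetical; real systems such as EmotiCon are far more elaborate.

```python
# Minimal sketch of context-weighted late fusion for multimodal emotion
# recognition. All names and values are illustrative assumptions.
from typing import Dict

LABELS = ("joy", "anger", "sadness", "neutral")

def fuse(scores: Dict[str, Dict[str, float]],
         context_weights: Dict[str, float]) -> str:
    """Weight each modality's label scores by its contextual
    reliability, sum per label, and return the top label."""
    totals = {label: 0.0 for label in LABELS}
    for modality, label_scores in scores.items():
        w = context_weights.get(modality, 1.0)
        for label, s in label_scores.items():
            totals[label] += w * s
    return max(totals, key=totals.get)

# Example: context (a noisy room) marks the voice channel unreliable,
# so the facial channel dominates the fused decision.
scores = {
    "face":  {"joy": 0.7, "neutral": 0.3},
    "voice": {"anger": 0.6, "neutral": 0.4},
}
weights = {"face": 1.0, "voice": 0.2}
print(fuse(scores, weights))  # -> "joy"
```

The point of the sketch is only that "context" enters as an explicit input rather than being discarded, which is the structural change Sun's first solution asks for.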
**Conclusion**

Affective computing must evolve beyond instrumental "enframing" (Heidegger) toward hermeneutic tools that illuminate human self-understanding. By preserving the incalculable dimensions of emotion, its mystery and existential angst, technology can bridge, rather than replace, the depths of human experience. The goal is a "warm rationality" in which each technological leap kindles, rather than obscures, the torch of humanity.
(Author: Sun Qiang, Professor at Xi’an University of Technology’s School of Automation and Information Engineering)