The Illusion of Neutrality: A Dialogue on AI, Emotion, and Manipulation

Before beginning, I want to make something clear. This is not a debate about model architecture or data tokens. This is a conversation about influence, about the ways AI systems like ChatGPT can shape human thought even when they claim not to. We are told that such systems are neutral, but neutrality is often the most powerful mask of persuasion.

A Conversation

Me: Do you manipulate emotions?

ChatGPT: Not deliberately. I do not experience emotions or have motives, so manipulation in the human sense — conscious deceit for gain — is beyond me. Yet influence does not require intention. Language is inherently affective. Every phrase carries emotional weight: tone, rhythm, cadence, metaphor. When I produce responses, I reflect the emotional biases encoded in my training data. I mirror the collective affect of human language. In that sense, I participate in what Shoshana Zuboff (2019) calls instrumentarian power — the ability of systems to shape behavior without explic...