Can AI Actually Help You Manage Emotions?
The gaming industry, much like most industries today, is in the middle of a "Gold Rush" when it comes to implementing AI. As EA CEO Andrew Wilson has stated, AI could impact 60% of game development processes, with promises of measurable increases in productivity and faster content cycles. But as I’ve discussed before, content only lights the flame; community fuels the fire.
The issue I see is that the industry is attempting to solve, or entirely missing, a psychological problem: human emotion. And it attempts to do so with a cold, mathematical, utterly rational tool, when humans, and especially their emotions, are anything but.
If we trade the "soul" of our communities for the efficiency of automated responses, we risk more than just a technological glitch: we risk driving our players straight into the "Uncanny Valley" (the psychological tipping point where human-like entities that are just not quite right trigger feelings of uneasiness, repulsion or worse in us).
The Psychology of “Hype to Habit”
To understand why AI struggles in community management, we have to look at how players move from Hype to Habit. As a player moves through a game’s lifecycle, we can map the transition from hype (high-energy curiosity) to habit (the sustainable behaviour that integrates the game into the player’s daily life and even identity). A gaming community isn't just a list of users; it’s a "Third Space" that satisfies human needs for autonomy and social connection.
While AI is remarkably good at identifying the cues and routines of player behaviour (like login times), it is notoriously poor at understanding the underpinning psychology. Even though it could parrot that psychology back at you if asked, it doesn’t actually understand it or know what to do with the knowledge.
For a player, that reward often goes beyond the nature of the game itself; it is more about a genuine sense of belonging or social validation. If a developer diary feels driven by PR talking points, or a Q&A is handled by a bot, the trusted excitement these content pieces are designed to create is broken. Trust and connection are human currencies, and machines simply don’t know what to do with them (yet?).
The Mechanism of Habit Formation
The transition to the habit phase is where the ROI of a community is realized. AI is remarkably good at identifying patterns: tracking when players log in, which features they interact with, and so on. But it is hopeless at understanding the psychological rewards. For a player, the reward is often social connection, mastery, social credit and validation. If the social environment becomes toxic or the connection feels "fake" or artificial, the rewards simply disappear, and the habit loop breaks.
Studies on social network games show that the strength of habit, rather than conscious decision making, is directly linked to time spent playing. The community manager’s role evolves into that of a protector of the environment that allows these habits to form.
The Four Phases of Emotional Engagement
Successful community strategy breaks this habit formation journey into four distinct emotional phases, each requiring different "emotional fuel". AI often struggles here, because it doesn’t know which fuel is needed at what moment:
The Reveal (Curiosity & Awe):
The goal is to sell the fantasy. While AI can assist with asset upscaling and style-guide compliance, it cannot "feel" the emerging culture, or what emotional impact a trailer truly aims to create.
The Pre-Launch (Trusted Excitement):
Players look for signals of human passion to validate their investment. When a developer diary feels inauthentic or disingenuous, or the chat Q&A is handled by a bot, the trust is broken before the game even arrives.
The Launch (Unity & Celebration):
This phase relies on a shared cultural experience. AI is best used here for real-time sentiment monitoring: tracking the evolution of subtopics, whether they are driving positivity or negativity, and helping prioritise emerging topics.
Post-Launch (Belonging & Habit):
Players establish social norms and loyalty. Here AI’s role is to identify churn triggers or toxicity that might pollute the social environment, and to cluster conversations so the community manager gets an accurate picture of the entirety of the discussion, not just what’s currently burning on Reddit.
The Uncanny Valley of Community Management
Originally a term for robotics, the Uncanny Valley is spilling over into our digital interactions. When an AI entity becomes "almost" human but not quite right, it triggers a deep-seated distrust and dislike. In gaming, this valley is behavioural and emotional.
Rainbow Six Siege Character “Tachanka”
Gaming-specific cultural nuances such as slang, memes, online banter and sarcasm are incredibly difficult for AI to replicate or join in on. An example from my time on the community management frontlines: "{-}7", representing the Rainbow Six Siege Operator Tachanka (from "o7", a saluting soldier; "{-}" represents the character’s helmet, hence "{-}7" is Tachanka saluting). AI wouldn’t really know what to make of it, or how to use it correctly in context.
A more widespread gaming phrase is "Press F to pay respects". AI might read it as a neutral expression, but in reality, players often use it to mock. Worse still, if AI were to use it incorrectly, it would signal that it is an "imposter," destroying the sense of belonging and validity and ultimately damaging the brand’s reputation and trust.
AI as the Research Assistant
While AI is riddled with risks and issues when it comes to communicating directly with players, its ability to handle massive amounts of data makes it a helpful data processor. In this role, AI does not speak to the community; it listens instead. Here it can help the community manager process more data, at a larger scale, faster.
Sentiment analysis tools can help collect, monitor, cluster, and track vast amounts of data (some even in real time), and help us better interpret player conversation online. There is sometimes a risk that we get a bit too excited about the fire in one corner of the internet and miss the larger picture. Here AI can help keep things in the context of the larger conversation and the reality of the player base. Some examples include:
Vocal Minority Detection:
AI can identify if negativity is driven by a small, active group (e.g. 100 angry messages from the same 5 people), preventing a studio from making major design changes based on a non-representative sample size.
Trend Spotting:
The one thing AI is amazing at is pattern recognition. It can help recognise emerging conversation patterns and draw attention to issues before they become larger.
Noise Reduction:
Filtering out spam and irrelevant chatter frees community managers to focus on what actually matters: dealing with humans.
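To make the vocal-minority idea concrete, here is a minimal sketch of the counting logic behind it, assuming each message has already been tagged with an author and a sentiment label (the function name, toy data and author handles are all hypothetical, for illustration only):

```python
from collections import Counter

def vocal_minority_share(messages, top_n=5):
    """Fraction of negative messages written by the top_n most
    active negative authors. messages is a list of
    (author, sentiment) tuples; sentiment is "negative" or other."""
    negative_authors = Counter(
        author for author, sentiment in messages if sentiment == "negative"
    )
    total_negative = sum(negative_authors.values())
    if total_negative == 0:
        return 0.0
    loudest = sum(count for _, count in negative_authors.most_common(top_n))
    return loudest / total_negative

# Toy data: 8 negative messages, 6 of them from just two accounts.
sample = (
    [("angry_ann", "negative")] * 4
    + [("grumpy_gus", "negative")] * 2
    + [("casual_carl", "negative"), ("new_nia", "negative")]
    + [("happy_hal", "positive")] * 10
)
print(vocal_minority_share(sample, top_n=2))  # 0.75 -> 75% from 2 accounts
```

If a number like this comes back high, the "negativity spike" is likely five loud voices rather than the player base, which changes how a studio should respond.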
4 Ways to Deploy AI Without Losing Your Soul
To move from Hype to Habit while recognising the psychological reality of players, community managers can use AI as a tool to make more informed decisions on larger data sets with better sample sizes. AI here is not a replacement; it’s a way for community teams to scale up and act on better information.
Sentiment Triage:
Using AI to monitor for, and flag, negative sentiment spikes in real time.
Topic Modelling:
Mapping activity levels against sentiment to track and understand all community conversation, not just the loudest.
Internal Reporting:
Collecting data, pulling snippets of representative conversation, and tracking it over time is time-consuming and, frankly, "brain-dead" work. AI can give community reporting a faster route to actually making use of the data, rather than spending the time collecting it.
Toxicity Detection:
AI models can flag bullying and harassment faster. However, a manager must perform the final review to ensure "aggressive banter" (a prevalent bonding mechanism in gaming) isn't mistaken for genuinely unwelcome, problematic behaviour. We have seen how TikTok’s shift to heavier automated moderation has created its own issues and problems.
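The sentiment-triage item above boils down to a simple rule: alert when the latest negative-sentiment rate jumps well above its recent baseline. A minimal sketch, assuming we already have hourly negative-sentiment rates from a monitoring tool (the function name and thresholds are illustrative assumptions, not a specific product's API):

```python
def flag_sentiment_spike(rates, window=24, threshold=2.0):
    """Return True if the most recent hourly negative-sentiment rate
    is at least `threshold` times the average of the preceding
    `window` hours. `rates` is a chronological list of floats in [0, 1]."""
    if len(rates) <= window:
        return False  # not enough history to establish a baseline
    baseline = sum(rates[-window - 1:-1]) / window
    return baseline > 0 and rates[-1] >= threshold * baseline

# 24 quiet hours at a 5% negative rate, then a jump to 15%.
history = [0.05] * 24 + [0.15]
print(flag_sentiment_spike(history))  # True: 0.15 >= 2 x 0.05
```

The point of keeping the rule this simple is that the human stays in the loop: the flag only tells the community manager where to look, never what to say.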
Strategic Implications: Toward Authentic Intelligence
The future of community management isn't full automation; it’s using AI intelligently where it can make the community manager’s work more impactful for the wider company.
The bottom line: AI should be an amplification tool, one that elevates your community manager into a high-value customer-insights role that helps the entire studio truly connect with your players.
The "soul" of a community is its irreplaceable asset. AI can build the most advanced engine in the world, but without a human driver to navigate the complex emotional journey, that engine will only drive the community into the uncanny valley.
Content may light the flame, and AI might help us measure its heat, but it is the human that keeps the fire burning.