What Game Designers Know That AI Engineers Don’t
You know what I noticed?
It is 2026 and AI tools are everywhere: chatbots, coding assistants, writing helpers, analytics platforms, and more. Billions in funding. Millions of users signing up. And yet most people (including me) try them once and never come back.
But here’s the thing: game designers solved this problem 30 years ago.
I’ve spent years building games. Creating experiences that make people want to come back every single day. Systems that reward progress. Features that make accomplishment feel real. Feedback loops that keep people engaged.
Now I’m watching the AI industry, one of the fastest-growing and most hyped markets, struggle with retention, engagement, and getting people to actually use what they’ve built.
So let me tell you what game designers know that (some) AI engineers don’t (yet).
The Real Problem: AI Tools Work, But People Don’t Use Them
Let’s start with the simple stuff. Not the advanced agentic AI everyone’s talking about, just the basic tools that already exist.
Writing assistants. Coding copilots. Image generators. Research tools. They work. The technology is solid. The outputs are genuinely useful.
So why do most people use them once and abandon them?
Because there’s no progression. No sense of achievement. No rewards. No goals.
You write something with an AI assistant. It’s great. Then… what? There’s no dopamine hit. No acknowledgment that you just accomplished something. No visible progress toward a goal. It just sits there like a vending machine waiting for your next query.
Compare that to a game. Even the simplest mobile game knows how to make you feel accomplished. You complete a level, you see fireworks. You unlock a new ability that you didn’t know you wanted until you try it out. You get a notification celebrating your winning streak. You see your progress bar fill up.
AI tools? Nothing. Just a blank input field waiting for the next command.
What’s Missing: The Game Design Fundamentals
Here’s what game designers have known for decades that AI engineers are ignoring.
Missing Feature #1: Progression Systems
Every good game shows you your progress. Experience bars. Level indicators. Skill trees. Achievement counters.
Why? Because humans are wired to respond to visible progress. We’re motivated by seeing how far we’ve come.
Most AI tools and products don’t have this. You could be using a coding assistant for 6 months, writing thousands of lines of better code, learning new patterns, getting more productive, and the tool gives you zero acknowledgment of any of it.
Imagine if your AI writing assistant showed you:
- Total words written this month
- Your writing streak (days in a row you’ve written)
- Documents completed
- New writing skills you’ve practiced
- How your clarity scores have improved over time
- Or even let you easily share your writing with others to get real HUMAN feedback
Suddenly it’s not just a tool. It’s a companion tracking your growth. People don’t build only for the end result; they build for the process.
One thing that stuck with me drives this home: as kids we always wanted to BUILD a sand castle or BUILD a Lego structure. Afterwards, we usually broke it apart and started over. Why? Because we are builders and enjoy the process of creation. That instinct is what we need to nurture in these tools.
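To make the idea concrete, here is a minimal sketch of the kind of progression tracking a writing assistant could keep. Everything in it (the `WritingProgress` class, its methods) is invented for illustration, not any real product’s API:

```python
from datetime import date, timedelta

class WritingProgress:
    """Hypothetical progression tracker for a writing assistant."""

    def __init__(self):
        self.sessions = {}  # date -> words written that day

    def log_session(self, day: date, words: int):
        self.sessions[day] = self.sessions.get(day, 0) + words

    def words_this_month(self, today: date) -> int:
        # Total words across all sessions in the current calendar month.
        return sum(w for d, w in self.sessions.items()
                   if d.year == today.year and d.month == today.month)

    def current_streak(self, today: date) -> int:
        # Consecutive days, ending today, with at least one session.
        streak, day = 0, today
        while day in self.sessions:
            streak += 1
            day -= timedelta(days=1)
        return streak
```

A few dozen lines of bookkeeping like this are enough to power a streak counter, a monthly word total, and a visible progress bar.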
Missing Feature #2: Rewards and Recognition
Games reward you constantly. Not just at the end of a level, but during it. Small wins. Combo notifications. “Nice shot!” “Perfect timing!” “You’re on fire!”
This isn’t manipulation. It’s feedback. It tells players they’re doing well, which reinforces learning and makes the experience enjoyable.
Most AI tools are silent. You could write a brilliant piece of code, and the AI says nothing. You could finish a complex analysis, and there’s no acknowledgment. You could use the tool flawlessly for weeks, and it treats you the same as someone who just signed up today.
For example, I recently started to use Boardy. Boardy is an emerging tech startup that uses AI to help professionals make meaningful business connections. It functions as an AI‑powered networking assistant that engages users in conversational matching to introduce them to relevant contacts, mentors, partners, investors, etc.
It allows me to connect with other humans and I like that. But it could be so much better. What if it could also:
- Track the users with the most connections, most introductions, or highest engagement.
- Offer networking challenges: “Make 5 high-quality connections this week,” or “Help 3 people get introduced.”
- Give micro-feedback: “You helped 3 startups grow today — that’s +15 XP!”
- Show you a weekly visual of your network growing like a tree or constellation.
It sounds simple and in some cases it is, but these moments of recognition completely change how people feel about using a tool.
Missing Feature #3: Notifications That Matter
Games know when to ping you. Not randomly—strategically. “Your energy has refilled.” “Your teammate needs help.” “New daily quests available.”
These notifications serve a purpose: they bring you back at the right moment, when there’s something meaningful for you to do.
Some AI tools I have tested recently either spam you with irrelevant updates or stay completely silent. They don’t understand rhythm. They don’t know when you’d actually want to hear from them.
Imagine if your AI tools sent you smart notifications:
- “You haven’t written in 3 days—want to keep your streak alive?”
- “New research published on the topic you were exploring last week”
- “You’re close to completing 100 code reviews—milestone achievement available”
- “Based on your usage, here’s a new feature that might help”
That’s not spam. That’s intelligent engagement, and it is lacking in most of the tools I have tested.
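The difference between spam and a notification that matters is that the latter fires only when a concrete condition holds. As a sketch, the trigger rules above could be written as explicit checks; the `UserState` fields and thresholds here are invented for illustration:

```python
from datetime import date
from dataclasses import dataclass

@dataclass
class UserState:
    last_active: date
    streak: int
    code_reviews: int

def pending_notifications(user: UserState, today: date) -> list[str]:
    """Fire only when there is something meaningful to act on."""
    messages = []
    idle_days = (today - user.last_active).days
    # Streak-rescue nudge: only if a streak exists and is about to lapse.
    if user.streak > 0 and idle_days == 3:
        messages.append(
            f"You haven't written in 3 days. Keep your {user.streak}-day streak alive?")
    # Milestone proximity: only when the user is genuinely close.
    if 95 <= user.code_reviews < 100:
        messages.append(
            f"You're {100 - user.code_reviews} reviews away from the 100-review milestone.")
    return messages
```

A user with no streak and no nearby milestone simply gets nothing, which is exactly the point: silence until there is a reason to speak.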
Missing Feature #4: Goals and Quests
Every RPG ever made knows this: people need clear objectives.
Main quests. Side quests. Daily objectives. Clear steps. Visible progress. Rewards at the end.
AI tools? They just sit there. No suggested goals. No challenges. No structure. Users have to create their own motivation entirely from scratch.
What if your AI writing tool suggested:
- “Daily Quest: Write 500 words on your current project”
- “Weekly Challenge: Complete three blog posts this week”
- “Learning Quest: Try writing in a new genre”
- “Mastery Path: Improve your business writing skills”
Suddenly the tool isn’t passive. It’s actively helping you build better habits and achieve your goals.
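Mechanically, a quest is just a named target with visible progress and a reward attached. A minimal, hypothetical sketch (all names invented):

```python
from dataclasses import dataclass

@dataclass
class Quest:
    """One suggested objective with a target and visible progress."""
    name: str
    target: int
    progress: int = 0
    reward: str = ""

    def advance(self, amount: int):
        # Progress is capped at the target so the bar never overfills.
        self.progress = min(self.target, self.progress + amount)

    @property
    def complete(self) -> bool:
        return self.progress >= self.target

daily = Quest("Daily Quest: write 500 words", target=500, reward="streak +1")
daily.advance(320)
daily.advance(250)  # capped at 500; the quest is now complete
```

The data model is trivial; the design work is in choosing quests that match what the user actually wants to get better at.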
Missing Feature #5: Retention Loops
Game designers build retention into everything. Daily login bonuses. Weekly events. Monthly challenges. Limited-time content.
It’s not about addiction; it’s about creating reasons to return, encouraging good habits, and making the tool part of your routine.
AI tools have none of this. There’s zero incentive to use them consistently. No benefit to daily engagement versus sporadic use. No reason to come back tomorrow if you don’t have an immediate task.
The result? People forget the tool exists. They remember it only when they have a problem, and by then they may already be searching for alternatives and never come back.
Why This Matters for Advanced AI Too
Now let’s talk about the future. Agentic AI. Autonomous systems. Multi-agent coordination.
The market knows this is huge. We’re looking at a $5.25 billion market in 2024 growing to $199 billion by 2034—a 43.84% compound annual growth rate. 45% of Fortune 500 companies are already piloting these systems[1].
But here’s what is frustrating: only 2%[2] have actually deployed them at scale.
Why?
Same problems as above, but at a greater scale. These advanced systems are solving technical challenges: goal planning, multi-agent coordination, autonomous decision-making. But they’re ignoring the human side.
Nobody trusts them. Nobody understands what they’re doing. Nobody wants to hand over control to a black box.
Problem #1: Goal Management
The AI industry is asking fundamental questions: “How do agents break down complex goals? How do they plan? How do they adapt when plans fail?”
Notably, planning and goal management are now the fastest growing components of the agentic AI stack, with learning and adaptation frameworks accounting for approximately 29% of market share[3].
What is striking is that the solution already exists, and has for decades. It is embedded in game engines. Today, more than 50% of game studios use AI for behavior generation built on these exact patterns.
The answer is quest systems. Games have been solving this problem since the 1990s. Every RPG breaks massive objectives into manageable chunks: a main quest, supporting side quests, defined steps, checkpoints, and continuous feedback at every stage.
Problem #2: Multi-Agent Coordination
How do we get multiple agents to work toward shared goals? How do they communicate? How do we handle conflicts?
Real-time strategy games may have the insights needed.
You think managing AI agents in an enterprise is hard? Try coordinating 200 units in StarCraft, each with different abilities, different roles, different priorities, all working toward a single objective while an opponent tries to destroy them.
MMO raid bosses coordinate NPC teams that adapt to player behavior in real-time. City builders simulate thousands of citizens with individual needs and collective goals.
Game designers have been building these systems for decades. Now 66.4% of the agentic AI market is focused on coordinated agent systems, and 40% of Fortune 100 firms are using Microsoft’s AutoGen for multi-agent coordination[4].
Companies are paying millions to rebuild what games already built, and I am not sure they have even considered the parallels to work that has already been done in this space.
Problem #3: Trust and Transparency
This one bothers me most.
Only 25% of Americans trust AI to provide accurate information[5].
Everyone asks: “How do we build trust? How do we make systems transparent?”
Game designers spent 30 years making sure players always know what’s happening.
UI feedback systems. Intent telegraphing. Visual indicators. Status bars. Cool-down timers. Ability to pause. Ability to intervene.
When an NPC is about to attack, you see the wind-up animation. When a companion is low on health, you see the red indicator. When the game auto-saves, you see the icon.
Transparency isn’t optional in game design, it’s fundamental, because if players don’t trust the system, they quit.
Same with AI. The solution is better design. Show people what the agent is doing. Let them intervene if needed and make the system and what is happening behind the scenes more understandable. This transparency builds trust.
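Borrowing the game pattern of telegraphing intent, an agent can announce what it is about to do and leave a window for the user to pause or cancel before anything executes. A toy sketch, with all names invented:

```python
from dataclasses import dataclass
import enum

class AgentState(enum.Enum):
    IDLE = "idle"
    PLANNING = "planning"
    ACTING = "acting"
    PAUSED = "paused"

@dataclass
class TelegraphedAction:
    """Game-style 'wind-up': announce intent before executing,
    leaving a window for the user to intervene."""
    description: str
    state: AgentState = AgentState.IDLE

    def announce(self) -> str:
        self.state = AgentState.PLANNING
        return f"About to: {self.description} (pause to intervene)"

    def pause(self):
        self.state = AgentState.PAUSED

    def execute(self) -> str:
        if self.state is AgentState.PAUSED:
            return "skipped: user intervened"
        self.state = AgentState.ACTING
        return f"executing: {self.description}"
```

The announce-then-act gap is the software equivalent of an NPC’s wind-up animation: the user always sees what is coming and always has a moment to step in.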
The Missing Piece: Why AI Companies Need Game Designers
Here’s where the AI industry is getting it wrong.
They’re hiring brilliant engineers: PhDs in machine learning, experts in distributed systems, people who can optimize algorithms and scale infrastructure.
And that’s important. You need those people.
But in many cases, they’re missing the other half of the equation: designers who understand human behavior, motivation, and engagement.
Engineers build what’s possible. Designers build what people actually want to use.
And right now, the AI industry is building incredibly powerful tools that people don’t want to use. Not because the technology is bad, but because the experience is bad.
Every AI company should have game designers on staff. NOT as an afterthought. NOT as consultants brought in to “make things pretty,” but as core team members from day one.
Here’s what that looks like in practice:
- Engineers design the AI model. Designers design the progression system that shows users they’re improving.
- Engineers build the autonomous agent. Designers build the transparency layer that shows users what it’s doing and why.
- Engineers optimize the algorithm. Designers create the reward system that makes using it feel satisfying.
- Engineers handle the backend. Designers craft the notifications that bring users back at exactly the right moment.
This isn’t about dumbing things down. It’s about making powerful tools accessible, engaging, and trustworthy.
I have been working on an AI product in a design capacity, and it has shown me how early design involvement can rapidly focus a product around the user.
The best AI products will be built by teams where engineers and designers work collaboratively from the beginning. Where technical capability and user experience are given equal weight. Where “does it work?” and “do people want to use it?” are both questions that must be answered yes.
Where This Is All Going
By 2028, 68% of customer service interactions will be handled by agentic AI[6]. Multi-agent systems will be standard in enterprise.
But more importantly, simple AI tools, the ones people use every day, will finally understand what makes people want to use them.
The companies that figure this out first, the ones that pair engineering excellence with design excellence, will dominate.
The UX patterns will come from games:
- Progression systems that make growth visible
- Rewards that acknowledge and reinforce achievements
- Smart notifications that re-engage users at the right moment
- Quest systems that provide clear, structured goals
- Retention loops that help build better habits
- Tutorial systems designed for intuitive onboarding
- Character-driven design principles embedded in every agent
- Social features that encourage participation and shared progress
The convergence is already happening.
50% of game studios are using generative AI. NVIDIA ACE is building AI for digital humans in games. ACE is part of NVIDIA’s broader AI platform and can be used to power everything from AI NPCs in games that respond to player questions and actions, to digital humans in customer service, healthcare, and virtual experiences. It integrates technologies like NVIDIA Riva for speech, NeMo for language models, and Audio2Face for facial animation, and can run on the cloud or on devices accelerated by NVIDIA GPUs.
Inworld AI is a platform that lets developers create generative, AI-powered NPCs that act, speak, and behave like real characters in games, VR, and interactive experiences. Unlike traditional scripted NPCs, Inworld characters have distinct personalities, goals, and memories, allowing them to respond dynamically to player input with contextually appropriate dialogue, gestures, and emotional expressions. The result is more immersive, lifelike interactions, where NPCs adapt to the story, the environment, and the player’s actions, making worlds feel alive and reactive in real time.
Ubisoft developed Ghostwriter, a proprietary AI tool for generating NPC dialogue (especially short lines known as “barks”), used internally to help narrative teams write dialogue more efficiently. Built in-house by Ubisoft’s R&D arm, La Forge, it generates first drafts of short NPC speech that human writers can then choose from and edit to fit the game’s needs. It is specifically designed to assist with repetitive dialogue tasks like crowd chatter and triggered lines in open-world games.[7]
Game companies are naturally early adopters and they have the staff and expertise to solve these challenges. What is needed now is for AI and product focused companies to make better products by using entertainment and gamification techniques.
Smart companies are hiring game designers to help with their AI tools because they’re finally realizing what we’ve known all along: engagement and intelligence aren’t separate problems, they go hand in hand.
The Opportunity
Think about the possibility:
Your coding assistant tracks how many bugs you’ve prevented, shows your improvement over time, celebrates when you write elegant solutions, and suggests daily coding challenges matched to skills you want to improve.
Your research tool maintains a quest log of your current projects, notifies you when relevant new information appears, rewards you for thorough analysis, and shows your knowledge progression across different topics.
Your writing assistant tracks your streak, celebrates milestones, suggests daily prompts, shows your style evolution, and makes the act of writing feel like progress toward mastery.
These aren’t fantasy features. They’re straightforward applications of game design principles that have been used for decades.
And the returns? Average ROI of 171%. US enterprises achieving 192%. Conversion rates improving 4-7x[8].
Yet 40% of projects fail due to inadequate infrastructure and inadequate design, often because teams build from scratch with an engineer-first approach and little design involvement.
Hiring a designer for your AI product isn’t just about aesthetics. It is about making your complex models usable, understandable, and trustworthy. A designer turns technical outputs into intuitive interfaces, clear visualizations, and actionable workflows, reducing rework and ensuring users actually benefit and have fun using AI tools.
Game designers bring creative problem-solving, spot UX gaps engineers might miss, and use data-driven feedback to refine experiences, so your product not only works technically but delivers real value users want to rely on and interact with.
Iteration
The future of AI, both simple tools and complex agents, isn’t being invented in a lab somewhere. It’s being discovered by looking at what game designers already built and adapting and iterating for a new way of doing things.
We spent 30 years making systems that people actually want to use. Systems that feel rewarding, that build habits, that make progress visible, that celebrate achievement, and, most importantly, that bring people back.
Every problem the AI industry is facing, from basic tool adoption to advanced agent deployment, game designers have already solved.
But designers can’t solve these problems alone. Engineers can’t either.
The breakthrough will come when designers and engineers work collaboratively. When AI companies understand that a game designer is as essential to the team as a machine learning engineer. When “how does it feel to use?” is asked in the same breath as “how does it work?”
The companies that win won’t be the ones with the most PhDs or the best designers in isolation. They’ll be the ones that bring both together from day one, understanding that design and engineering aren’t separate phases, but collaborative partners in building something people actually want.
Because at the end of the day, it doesn’t matter how intelligent your AI is if nobody wants to use it.
Game designers know this. Some engineering teams are still learning. I believe that the best results will be from designers and engineers working together on tools and experiences that we want to use and ideally that help make the world a better place.
And the future of AI needs this unity.
[1] Precedence Research (2025, December 1) “Agentic AI Market Size, Share and Trends 2025 to 2034” Precedence Research. https://www.precedenceresearch.com/agentic-ai-market
[2] Pandey, Santanu. (2025, July 7). “200+ AI Agents Statistics: Usage, ROI, & Industry Trends.” Tenet. https://www.wearetenet.com/blog/ai-agents-statistics
[3] Precedence Research (2025, September 4) “Agentic AI market size to reach USD 199.05 billion by 2034, driven by autonomous decision making and North America’s leadership” [Press Release]. Global Newswire. https://www.globenewswire.com/news-release/2025/09/04/3144393/0/en/Agentic-AI-Market-Size-to-Reach-USD-199-05-Billion-by-2034-Driven-by-Autonomous-Decision-Making-and-North-Americas-Leadership.html
[4] Market.us. (2026, January). “Agentic AI market size, share, trends, and forecast analysis.” https://market.us/report/agentic-ai-market/
[5] Fullview. (2025, November 12). “200+ AI statistics & trends for 2025: The ultimate roundup.” Fullview. https://www.fullview.io/blog/ai-statistics
[6] Cisco. (2025, May 27). Agentic AI poised to handle 68% of customer service and support interactions by 2028 [Press release]. Cisco Newsroom. https://newsroom.cisco.com/c/r/newsroom/en/us/a/y2025/m05/agentic-ai-poised-to-handle-68-of-customer-service-and-support-interactions-by-2028.html
[7] Capitol Technology University. (2025, October 23). AI in video game development: From smarter NPCs to procedural worlds. CapTechU Blog. https://www.captechu.edu/blog/ai-in-video-game-development
[8] Saks, D. (2026, January 5). 39 agentic AI statistics every GTM leader should know in 2026. Landbase. https://www.landbase.com/blog/agentic-ai-statistics
