Feedback is one of the hardest things to get right in the world of work. It’s so much easier in video games. So just copy how games do it! I have an entire chapter about how you should (and should not) copy games when it comes to giving feedback in my book, The Engagement Game: Why Your Workplace Culture Should Look More Like a Video Game.
I sometimes get asked why video games are so popular. Part of my answer is that we love getting good feedback about our performance because it helps us track our mastery of skills and progress towards goals. Not only do we love getting this kind of feedback, but we typically don’t get enough of it in our jobs or schools. At work, feedback comes from things like performance reviews, customer surveys, and sales reports — infrequent, delayed, and often vague.
So we turn to video games, which scratch this psychological itch far more effectively.
This is because video games are engineered to give us really good feedback about our performance, which lets us adjust our strategies, change our behaviors, and reach our in-game goals. Specifically, performance feedback in video games is typically:
- Immediate, not delayed
- Frequent, not intermittent
- Focused on outcomes, not people’s identities
- A mix of positive and negative
- Useful for showing progress towards goals (think progress bars)
These are all qualities that psychologists studying performance feedback have found we love, and that lead us to actually use the feedback to improve.1 So, I guess feedback in video games is wholly superior to workplace feedback, eh? Guess that’s why all them kids are playing the Fortnite instead of pursuing their careers.
Actually, no. The feedback we get in games is typically missing a critical component.2
I’ve been reading about what organizational psychologists know about effective performance feedback.3 While all of those things above are indeed characteristics of effective feedback, there’s one major feature of effective feedback that video games rarely, if ever, deliver:
Focus on process, not just results.
Let me explain. It’s fine to get feedback that lets you know outcomes: you hit your sales numbers, you killed the boss. But outcome feedback alone doesn’t tell you which of your behaviors to change to do better next time.
At the end of a game of Heroes of the Storm, for example, the game may tell you your team lost and give you some stats. But it won’t say “Hey, when playing as Sonya you should have spent more time taking merc camps.”
But I have found one exception: a fan-made tool for World of Warcraft called WoWAnalyzer.
World of Warcraft (WoW), as you may know, is a massively multiplayer game where players team up to tackle challenges like big boss battles. Especially at high-level play, WoW offers many different approaches to these challenges and many different roles for players to fill. Each player on such a raid must not only properly equip and prepare themselves, but also perform well when it’s time to sling spells or swing swords. You can perform well, or you can perform poorly. WoWAnalyzer helps you figure out how to improve.
The tool was originally created by Dutch software developer Martin Hols. He had been playing WoW and even programming add-ons for it for years, but in 2016 Hols became serious about improving his Holy Paladin spec. So he joined a guild and got to work. “As a part of my desire to improve I joined the (Holy) Paladin Discord server,” Hols told me via e-mail. “After observing for a bit, I joined the conversation and quickly became a regular there. I started doing some basic analysis using Warcraft Logs (a site that allows you to view logs of boss fights you did) and this slowly got more and more advanced.”
Hols soon noticed that manual review of combat logs was impractical but that a lot of the information was suitable for automation. One could feed these logs into a program and have it return specific feedback. Seeing it as an interesting challenge, he eventually created a proof of concept for a “Holy Paladin Mastery Effectiveness Calculator.” His work quickly gained attention and eventually expanded to include feedback for dozens of specs, equipment, and classes. To reflect this broader application, Hols swapped out the project’s somewhat clunky name along the way for the much more direct “WoWAnalyzer.”
What makes WoWAnalyzer impressive to me is how it provides the kind of process feedback that video games usually neglect. It looks at your combat logs and then gives you very specific feedback and recommendations on what to stop doing and what to start doing more of. A report for a Fire Mage build, for example, might look at her recent performance and note “You cast Fireball instead of Scorch when the target is under 30% health 11 times. When using Searing Touch always use Scorch instead of Fireball when the target is under 30% health since Scorch does 150% damage and is guaranteed to crit.” And when players do something well, they’re congratulated for it.
The logic and content of these reports are built by dozens of contributors who know different player specs inside and out. To help make feedback reports consistently useful, Hols and his team put together report writing guidelines that recommend what he describes as “concise suggestions that allow users to quickly understand what potential issues and changes they need to make to improve.” Suggestions should:
- Explain what was found
- Make a suggestion that is future oriented
- Explain why the suggestion is important
- Suggest a specific, better behavior to take as an alternative to what the player did
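To make the guidelines above concrete, here is a minimal sketch of what one such analysis rule might look like, using the Fire Mage Scorch example from earlier. This is not WoWAnalyzer’s actual code (which I haven’t seen); the event structure, names, and thresholds are all invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CastEvent:
    """One spell cast pulled from a (hypothetical) combat log."""
    spell: str             # e.g. "Fireball" or "Scorch"
    target_health: float   # target's health fraction at cast time, 0.0-1.0

def scorch_suggestion(events: list) -> Optional[str]:
    """Check a Fire Mage log for Fireball casts below 30% target health."""
    mistakes = sum(
        1 for e in events
        if e.spell == "Fireball" and e.target_health < 0.30
    )
    if mistakes == 0:
        return None  # nothing to flag; this is where you'd congratulate instead
    return (
        # Guideline 1: explain what was found
        f"You cast Fireball instead of Scorch {mistakes} times while the "
        "target was under 30% health. "
        # Guidelines 2-4: future-oriented, explains why, names the better behavior
        "Next time, use Scorch when the target is under 30% health: with "
        "Searing Touch it does 150% damage and is guaranteed to crit."
    )

log = [CastEvent("Fireball", 0.8),   # fine: target above 30%
       CastEvent("Fireball", 0.2),   # mistake: should have been Scorch
       CastEvent("Scorch", 0.1)]     # correct play
print(scorch_suggestion(log))
```

The structure mirrors the guidelines directly: the rule counts a specific behavior, and the message it emits leads with what was found, then pivots to what to do next time and why.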
In this way, WoWAnalyzer uses what psychologists studying feedback call “feedforward.” This very effective technique provides information about how to do better next time instead of focusing on what was done in the past. It’s a subtle shift in focus, but people tend to react well to this kind of feedforward approach.4
But this kind of analysis and feedback on performance is really hard to do and doesn’t fit all kinds of games. In general, video games just aren’t equipped to provide it. But maybe that will change. Maybe advances in artificial intelligence will get to the point where a virtual coach can examine your performance in real time and give you feedback and feedforward after every match, play, or level. Maybe it could tell you something like “You kept trying to use the shotgun at medium to long range, resulting in an accuracy of only 22%; try switching to a rifle for longer range engagements.” Or “Your team’s tank is susceptible to physical damage and died more often than normal; if you want to be an effective support choose the Iron Clad ability so that you can grant additional armor.”
Stats, data, and logs are great, but something that helps players make sense of them and apply lessons would be extremely appealing.
2 thoughts on “How Video Games Do Feedback Well (and Poorly)”
This is great but I wonder if excluding the process part of the feedback equation is a reason *why* games are so compelling. Players essentially have to fill in that part for themselves, which requires self-reflection, learning from past mistakes, etc. Figuring out why something did or didn’t work is part of what makes them compelling. Too much data might in fact take some of the challenge (and fun) away. It’s gratifying to figure things out for yourself and then prove the lesson in practice through adjusted play.
If you’re playing competitively it’s a different story, but if you’re playing “just for fun” then I’d think this information might be detrimental to the experience.
You’re right that too much data can be problematic, but all of the research I’ve read says that specific feedback about specific behaviors and what to change helps people improve their performance.
Of course, if they’re not in it to improve performance and just want to mess around, then you’re probably right that they don’t care!