I recently picked up the 2016 Hitman reboot. I had heard that the game had lots of systems that players could gleefully smash into each other on their quest to assassinate Agent 47's deserving targets. There's a lot of freedom in the game and a lot of different tools at your disposal, plus a whole stealth mechanic, disguise mechanics, and other things to figure out. It's a complex game, so it starts off with a couple of "guided training" levels to ease players in.
The first thing the tutorial asked me to do was find a disguise so I could sneak past security guards. The nice lady in my ear started walking me through it, step by step.
In the spirit of science, I ignored her and decided to march right past the security guards to see what would happen. The game stopped and I got this screen:
Essentially, the game tried to stop me from making the mistake of infiltrating without a disguise. I had to pause, read the message about what I was doing wrong, and then tap the Enter key to sheepishly acknowledge my error.
Most modern game tutorials and early levels try to hold players' hands like this so that they can't really screw up, fail, or make errors until they're done learning the basics of the game. There are guided prompts, highlighted paths, and in the most extreme examples, the game practically plays itself until it feels the player is ready to take over the controls. In educational psychology, this approach is sometimes called "errorless learning."
This may be a mistake. Especially if they want players to quickly grasp concepts and possibilities. In some cases, games may be better off giving players information about what to do (e.g., “Infiltrate that yacht”) and then making them fumble around until they figure it out. This is because of a concept in psychology called “error management.”
I’m thinking specifically of a 2003 study published in Personnel Psychology where the authors aimed to test how effective it was to integrate what they called “error instructions” into training novices on how to use Microsoft Excel.1 The idea was that making mistakes was instructive if you could get people not to get so stressed out about it. If you could convince people that making errors was not only okay but expected as part of the training program, then you could mitigate the stress, heartburn, and other inhibiting emotions that might otherwise get in the way of learning. They did this by telling learners things like “Errors are a natural part of the learning process!” and “The more errors you make, the more you learn!”
This was all relative to people who were given exhaustive, step-by-step instructions designed to avoid errors. They also had tutors who would literally jump in to keep learners from trying to do something the wrong way, much like how Hitman stopped me from trying to waltz past security without a disguise.
So who mastered Microsoft Excel better? The short version: it was the group that was forced to make mistakes and told that doing so was totally normal. Weird, right?
The explanation the researchers offered was that errors help learners create and test more complete mental models of how the software works and what is possible with it. That is, errors helped them develop a more complete and detailed understanding of how the software operated, what its design conventions were, and what to expect if they tried something new. Those mental models helped the subjects learn the software better and use it more effectively, thanks to the errors and experimentation they made while trying to fill in the models' blank spaces themselves.
This is also similar to a point made by those studying how feedback interventions affect our ability to learn, relative to just learning through trial, error, and discovery.2 Again, the theory is that getting feedback from experimenting with how to do the task provides a richer understanding than simply being handed the correct procedure up front.
So given this, should game developers allow players to make more mistakes and try to convince them that this is all part of the tutorial process? Maybe. There’s certainly something to be said for proper onboarding of players and making sure that they don’t get frustrated early on to the point where they give up and stop playing because they don’t know what’s going on. Unlike workers or students, gamers have more leeway to just quit. Game developers probably wouldn’t be well served by dumping everything on players at once and then punishing them when they can’t take it all in.
But if a game involves learning a lot of different systems, making strategic decisions, making preparations, and applying knowledge, then game developers might find that this kind of error management gets players up to speed more effectively IF mistakes are truly not punishing and recovering from them is both instructive and frictionless. And the game needs to explicitly tell players that what they did was an error or an ineffective strategy. That piece of the feedback loop is critical, lest players simply think that the game is too difficult.
So, what do you think? When would error management be a good strategy for teaching people how to play a game?