Your Video Game Is Gaming You – Hunting "Whales"
The widespread use of video games on personal mobile devices has fundamentally altered gaming. Hidden beneath the colorful cartoons is often an aggressive sales agent.
Traditional video games like Super Mario Bros., Donkey Kong, and Pac-Man operate on a single-pay model: you buy the game once, and then you own it. This applies to console games, where you buy a cartridge and play it as often as you like, and to arcade games, where the gamer pays a fixed fee per session. However, as games have moved onto our cellphones, video game companies have gained more access to both our identities and our credit cards. In this environment, a new pay model has become increasingly popular. The “freemium model” grants free access to a game but then sells in-game purchases for things like upgrades and shortcuts. This model is also known as the “free-to-play” or “F2P” model, and game companies can generate staggering profits if they successfully monetize their product.
While every game needs a revenue stream to remain viable, the open-ended nature of F2P economics relies on marketing strategies that range from reasonable to exploitative. In some well-publicized cases, game manufacturers and platforms targeted children by hiding expensive items in children’s games and allowing credit card authorization to persist after parents had handed the phone to their children. This resulted in big profits from unauthorized purchases. The problem became so severe that it led to a pair of FTC actions against Apple and Google for deceptive business practices, of $32.5 million and $19 million, respectively. Both companies ended the deceptive practice, but the fact that damages accumulated to tens of millions of dollars before any corrective action was taken conveys the magnitude of the problem.
What Else Are F2P Game Companies Up To?
Some of the more sophisticated game manufacturers are tracking user data to identify, target, and exploit the weaknesses of individual gamers who are likely to spend more. Similar to methods used by casinos, game manufacturers identify and target big-spending customers known as “whales.” While specific cases of player manipulation may not be discussed publicly, the techniques employed in F2P games are widely known. One tactic involves changing the nature of the game as the player progresses deeper into it. A skill-based game can be turned into a pay-based one by sharply increasing its difficulty. As the difficulty rises, the game offers the player the option to pay for items that make the game easier.
The practice of increasing a game’s difficulty until the player begins to lose, and then offering a purchase that relieves that stress, is called “fun pain.” Directly manipulating a player’s emotional state as part of game play is an ethically grey area, especially when it involves sums of money large enough to significantly impact most household budgets. Consider, for example, the report of a whale who invested $600 in virtual walls to fortify a virtual city, only to see it promptly leveled by rival players.
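A minimal sketch of how a “fun pain” loop could work, purely for illustration: difficulty ramps with both progress and consecutive losses, and a purchase prompt appears once frustration is likely to peak. Every name, formula, and threshold here is invented; no real game’s code is shown.

```python
def difficulty(level: int, recent_losses: int) -> float:
    # Difficulty ramps with progress and with consecutive losses.
    return 1.0 + 0.2 * level + 0.25 * recent_losses

def should_offer_purchase(recent_losses: int, threshold: int = 2) -> bool:
    # Surface a paid "relief" item right when frustration peaks.
    return recent_losses >= threshold

losses, offers = 0, []
for level in range(1, 6):
    d = difficulty(level, losses)
    if d < 1.5:          # stand-in for the player actually winning
        losses = 0
    else:
        losses += 1
    if should_offer_purchase(losses):
        offers.append(level)

print(offers)  # levels at which a purchase prompt appears: [4, 5]
```

Note how the prompt never appears while the player is winning; the monetization moment is timed to the engineered losing streak, which is exactly what makes the tactic hard for players to perceive.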
Another trick is the use of paywalls, which are effective when players have progressed to a point where they stand to lose all of their invested time and effort if they do not move on to the next stage. An example is charging for an expansion pack that players need to continue a game. All of these techniques become more powerful when companies combine them with analysis of a player’s commitment level, spending limits, and tolerance for risk.
What Can Be Done?
A helpful way to view the problem of F2P games is through the lens of The Belmont Report, a 1979 report from the U.S. government that outlines ethical practices regarding the treatment of human subjects in research studies and experiments. It is not unreasonable to apply the principles detailed in this report to F2P games, since manipulating players by altering game play can be considered a form of experimentation.
According to the report, three basic ethical principles should be followed when conducting research on people: “respect for persons,” “beneficence,” and “justice.” Respect for persons, the most basic of these principles, entails acknowledging an individual’s right to consent regarding whether or not to participate in a study. Additionally, this choice must be made with informed consent; that is, the subject must be provided with accurate and thorough information concerning the risks and benefits of the activity. The absence of informed consent is an indication of harm and relates to beneficence, which requires that a person’s well-being be protected. In this setting, justice requires that everyone be treated equally. While the violation of these principles is not yet a matter of public record, experimentation on players without their knowledge or permission appears to be integral to the basic business practices employed by F2P app companies. F2P app monetization tactics are likely to violate all three guidelines by failing to provide notification, obtain permission, or disclose the extent and goal of in-game manipulations, and by treating “whale” users unequally.
When it comes to F2P games, The Belmont Report’s guidelines can be implemented by providing notice to users and through a simple ratings system. Notice is a common concept in ethical guidelines, and F2P game developers should be prompted to take significant steps to provide it to consumers. Notice would include a clear delineation of in-game alterations to game play and their intended effects. To make notice clear and conspicuous, a simple ratings system needs to be created that concisely describes data collection and analysis behaviors, game play techniques used for monetization, and a single number indicating the distribution of income generated from players. For example, F2P developers would disclose the percentage of their income generated from the top 1 percent of players.
In order to be effective, the ratings and categorization need to reflect the concerns of consumers. When TV ratings were first implemented, it became clear that consumers wanted a rating system that provided specifics about the type and intensity of content they might find objectionable. We suggest that app makers disclose a similar level of detail to empower consumers to make informed decisions. How much of their revenue comes from a few players? Is the difficulty of game play fixed, or does it change based on spending criteria? Is information pulled in from other games to identify certain players? With the introduction of a privacy and game play ratings app, markets could give consumers the ability to apply filters that exclude applications with unacceptable ratings.
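The proposed disclosure-and-filter scheme could be sketched as a small data structure plus a consumer-set filter. The field names, example titles, and thresholds below are all hypothetical, invented to show how such a ratings app might exclude games with unacceptable ratings.

```python
from dataclasses import dataclass

@dataclass
class GameRating:
    """Hypothetical ratings disclosure for one F2P game."""
    title: str
    top1_pct_revenue_share: float  # share of revenue from the top 1% of players
    dynamic_difficulty: bool       # does difficulty change based on spending?
    cross_game_tracking: bool      # is data pulled in from other games?

def passes_filter(r: GameRating, max_whale_share: float = 0.30,
                  allow_cross_game_tracking: bool = False) -> bool:
    """Consumer-set filter: exclude apps with unacceptable ratings."""
    if r.top1_pct_revenue_share > max_whale_share:
        return False
    if r.cross_game_tracking and not allow_cross_game_tracking:
        return False
    return True

catalog = [
    GameRating("Puzzle Pals", 0.12, False, False),
    GameRating("Castle Clash Clone", 0.62, True, True),
]
acceptable = [r.title for r in catalog if passes_filter(r)]
print(acceptable)  # ['Puzzle Pals']
```

The point of the single whale-revenue number is visible here: one comparable figure per game lets a marketplace filter mechanically, without requiring consumers to audit each developer’s practices themselves.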