Inside the casino, the house is always watching
Cultural anthropologist Natasha Dow Schüll explains how casinos use surveillance technology and algorithms to monitor and manipulate players and convince them to wager more.
Most casinogoers probably assume they are being watched for security reasons. But many casinos are now using surveillance technology in an effort to manipulate players, too.
Modern casinos are deploying real-time behavior analytics, algorithms, and player tracking techniques to get gamblers to spend more, says Natasha Dow Schüll, an associate professor at the Massachusetts Institute of Technology's Program in Science, Technology, and Society and author of "Addiction by Design: Machine Gambling in Las Vegas."
I recently spoke with her about personalized gambling. Edited excerpts follow.
Selinger: By now, we’ve all heard of personalized ads. But what is personalized gambling?
Schüll: When casino patrons use player tracking cards, slot machines become portals for gathering robust data sets filled with precise information about their gambling histories – what games they like to play, how they respond to different game functions, even how fast they tend to press the buttons. Personalized gambling happens when casinos use this information to manipulate a particular customer’s behavior.
Selinger: Is this a new trend?
Schüll: Yes, although it rests on the bedrock of player tracking, which casinos ported over from airline frequent flyer programs in the 1980s. Player tracking was originally an aggregate marketing system: poll everybody, put them into tiers, and then target certain groups with certain kinds of deals to keep them coming back. This still goes on and it’s become very sophisticated – but, increasingly, you see a shift away from these wide broadcast techniques toward individualized marketing. That has become possible for a number of reasons: more precise data collection and faster transmission, better behavioral analytics and behavioral intelligence software, and the shifting of game content from chips embedded in machines to servers in the cloud.
Selinger: Tell me more about that. Do these games look any different from earlier slot machines?
Schüll: Not a bit. From the outside they appear exactly the same. But the machine cabinets are largely empty inside, without pre-programmed content. In server-based gaming, the content comes from a jukebox in the cloud. This means casinos don’t have to anticipate what groups are going to be on the floor. Instead, after consulting their real-time behavioral information systems, they can actually see what kind of games customers on the floor at that time want to play, and in 20 seconds download new content – or allow an individual gambler to choose her own game characteristics.
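To picture the architecture she describes, here is a minimal, hypothetical sketch (not part of the interview) of server-based content delivery: the cabinet holds no fixed game, and a backend picks a title from a cloud library based on real-time floor analytics and pushes it down on demand. All names, titles, and numbers below are invented for illustration.

```python
# Toy sketch of server-based gaming: cabinets hold no fixed game; a backend
# chooses titles from a content server ("jukebox") based on real-time floor
# analytics and pushes them to machines on demand. Everything here is invented.

CONTENT_SERVER = {            # hypothetical game library in the cloud
    "penny_low_volatility": "game_bundle_v1.bin",
    "video_poker_classic": "game_bundle_v2.bin",
    "progressive_jackpot": "game_bundle_v3.bin",
}

def pick_content(floor_demand: dict[str, int]) -> str:
    """Choose the title currently most in demand on the floor."""
    return max(floor_demand, key=floor_demand.get)

def push_to_cabinet(cabinet_id: str, title: str) -> None:
    bundle = CONTENT_SERVER[title]
    print(f"cabinet {cabinet_id}: downloading {bundle} ({title}) ...")

# Invented real-time analytics: low-volatility penny games are what the
# current crowd is playing, so an idle cabinet gets re-skinned to match.
demand = {"penny_low_volatility": 42, "video_poker_classic": 17, "progressive_jackpot": 5}
push_to_cabinet("A-113", pick_content(demand))
```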
Selinger: Can you give an example of the game characteristics in question?
Schüll: Well, an important one is game volatility, or how “risky” a game feels. A game that’s “high hit frequency, low volatility,” for instance, means its mathematical pay tables are programmed to dribble out a lot of small wins, instead of making you sit through a long dry spell to get a large jackpot. With continuous customer surveillance it’s possible to figure out where you fit on a volatility index – how risk averse you are, for example. In fact, player tracking data “knows you” better than you know yourself in that regard. When you’re sitting in front of a device clicking away, sometimes over 1,200 times an hour, you’re giving casinos what laboratory scientists call “key press data.” The consumer touch points are massive. They far exceed what an airline can collect, or even an online seller like Amazon. All this granular behavioral information generates rich customer profiles that casinos can capitalize on – through traditional marketing and, increasingly, in real time.
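As a rough illustration of the volatility idea (this example is not from the interview), the sketch below compares two made-up pay tables that share the same expected payback but have very different hit frequencies; all prizes and probabilities are invented.

```python
# Two hypothetical pay tables for a 1-credit bet, both returning about 90%
# on average, but with very different "feel": one dribbles out frequent small
# wins, the other is long droughts punctuated by a rare jackpot.

def pay_table_stats(table):
    """table: list of (payout_in_credits, probability) pairs for a 1-credit bet."""
    expected = sum(payout * prob for payout, prob in table)
    hit_frequency = sum(prob for payout, prob in table if payout > 0)
    variance = sum(prob * (payout - expected) ** 2 for payout, prob in table)
    return expected, hit_frequency, variance ** 0.5  # std dev as a volatility proxy

# High hit frequency, low volatility: lots of small wins.
low_vol = [(2, 0.30), (5, 0.06), (0, 0.64)]

# Low hit frequency, high volatility: a rare but large jackpot.
high_vol = [(1000, 0.0009), (0, 0.9991)]

for name, table in [("low volatility", low_vol), ("high volatility", high_vol)]:
    payback, hits, vol = pay_table_stats(table)
    print(f"{name}: payback={payback:.2f}, hit frequency={hits:.2%}, volatility={vol:.1f}")
```

Both tables return about 90 cents per credit wagered on average, but the first pays something back on roughly a third of spins while the second almost never does; that gap is what players experience as volatility.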
Selinger: What would a real-time marketing adjustment look like?
Schüll: Imagine someone comes over to a game and puts in a player card.
Selinger: Wait, why would they do this?
Schüll: Eighty percent of people today play with cards – not in cash or coins – because they’ve bought into what I call the “bonuses for bytes” mentality that the casino industry promotes. The industry makes enrollment in tracking programs seem like the most rational thing to do, because of the reward points you can earn.
Selinger: OK. Go on.
Schüll: So, you stick your card in and the machine says hi and addresses you by name. It knows who you are, who your partner is, what hours you gamble, and what your personal “pain points” are – in other words, the game events that typically make you decide to stop a session. Casinos have come up with ways to use this knowledge in real time to get customers to stay and continue gambling.
For instance, one casino franchise developed a system in which a casino manager, observing from the backroom that a customer was nearing a pain point and statistically likely to leave, would send out a luck ambassador – a live person who provided an incentive for you to stick around, like a bonus certificate. The problem was, players were annoyed to be interrupted and didn’t want to visit a kiosk to cash in their bonus.
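To make the “pain point” idea concrete, here is a hypothetical sketch of how a backroom system might flag a player who is statistically likely to leave. The features, thresholds, and rule below are invented stand-ins, not any casino’s actual model.

```python
# Hypothetical "pain point" flag based on session telemetry. The features,
# thresholds, and rule are invented for illustration; the interview does not
# describe any operator's actual model.
from dataclasses import dataclass

@dataclass
class Session:
    credits_lost: float          # net loss so far this session
    minutes_since_last_win: float
    presses_per_minute: float    # slowing play can signal disengagement

@dataclass
class PlayerProfile:
    typical_loss_tolerance: float   # learned from past sessions
    typical_press_rate: float

def nearing_pain_point(session: Session, profile: PlayerProfile) -> bool:
    """Crude rule-of-thumb stand-in for a statistical churn model."""
    loss_ratio = session.credits_lost / profile.typical_loss_tolerance
    slowing_down = session.presses_per_minute < 0.7 * profile.typical_press_rate
    long_drought = session.minutes_since_last_win > 10
    return loss_ratio > 0.8 and (slowing_down or long_drought)

if nearing_pain_point(Session(190, 12, 14), PlayerProfile(200, 22)):
    print("dispatch an incentive (e.g., a 'luck ambassador')")
```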
Selinger: Why don’t casinos incentivize players directly, from within the game itself?
Schüll: They would love to do that! But they’ve been frustrated by legal prohibitions against changing game odds once a session of play has commenced. That may all change soon, because right now patents are in development that would circumvent that prohibition by creating algorithmic versions of luck ambassadors who operate from within the game to sense an individual’s tolerance for loss and incentivize him accordingly so that he’ll continue to play.
Selinger: But doesn’t dispensing rewards from within the game violate the legal prohibition against changing the odds within a session of play?
Schüll: My thoughts exactly! But patent engineers have come up with a very tricky, mathematically complicated way to get around – or appear to get around – these laws.
The way it works is this: The machine is set to pay out at the minimum legal payback percentage, or lowest odds allowed – but all the while it pays an extra percentage into a “bonus pot.” When the tracking algorithm senses that you’re reaching your pain point and need an incentive to stay, it draws from the pot that’s been accumulating – that you yourself have been funding – to sweeten your rewards.
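To make the arithmetic concrete, here is a simplified sketch of the accounting structure Schüll describes: the base game is fixed at the legal minimum payback, a further slice of every wager accrues to a side pot, and that pot is paid back out when the system decides the player needs an incentive to stay. The percentages are invented for illustration.

```python
# Sketch of the "bonus pot" accounting described above. The percentages are
# invented; the point is the structure: the base game pays at the legal
# minimum while a slice of each wager accrues to a side pot that the system
# later returns as a "reward" when it wants the player to keep going.

MIN_LEGAL_PAYBACK = 0.85   # base game odds, fixed for the session
BONUS_POT_RATE = 0.03      # extra slice of each wager diverted to the pot

class BonusPotMachine:
    def __init__(self):
        self.bonus_pot = 0.0

    def wager(self, amount: float) -> float:
        """Base game pays at the minimum; the pot quietly accumulates."""
        self.bonus_pot += amount * BONUS_POT_RATE
        return amount * MIN_LEGAL_PAYBACK   # expected payout from the base game

    def intervene(self) -> float:
        """When the tracking algorithm flags a pain point, pay out the pot
        as a bonus, funded by the player's own wagers."""
        bonus, self.bonus_pot = self.bonus_pot, 0.0
        return bonus

machine = BonusPotMachine()
for _ in range(200):
    machine.wager(1.0)             # 200 one-credit wagers
print(f"accumulated bonus pot: {machine.intervene():.2f} credits")  # 6.00
```

In this toy version, after 200 one-credit wagers the player has unknowingly funded a 6-credit “bonus” that can be released back at a strategically chosen moment.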
Selinger: How is that not changing game odds in the middle of a session? Where’s the loophole?
Schüll: The legal sleight of hand here is to classify the holding tank of bonus funds as a “marketing module” that doesn’t interfere with the pay tables of the base game. To my mind, that’s both technically disingenuous and ethically dubious. In the experience of the player, there’s no distinction at all! All of this is happening invisibly and seamlessly behind the scenes. The algorithm is detecting your preferences and calibrating game results to optimize your game spending – and you don’t even know it’s happening.
Selinger: This sounds like what goes on in massive multiplayer online games where profiling occurs to keep folks engaged. If a game is too easy, they’ll be bored and go somewhere else. But if a game is too difficult, they’ll give up and quit. So the game experience gets adjusted.
Schüll: Right. It’s called dynamic game balancing and it also happens in social games on Facebook. In consumer retail it’s called price discrimination. All these techniques involve surveillance of customer behavior and the use of algorithms to responsively tailor product cost or experience so as to maximize profit. I find player tracking and customized volatility algorithms particularly problematic ethically because what’s at stake isn’t a single purchase like a pair of shoes or a free game like Candy Crush – instead, it’s a moment-to-moment modulation of your experience based on continuous data-tracking and machine-learning algorithms.
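Dynamic game balancing is straightforward to sketch in generic form (the controller below is an invented illustration, not a description of any specific game): track the player’s recent success rate and nudge difficulty to keep it inside a target band.

```python
# Generic illustration of dynamic game balancing: adjust difficulty so the
# player's recent success rate stays near a target "engagement" band.
# The target, window, and step size are arbitrary illustrative choices.
from collections import deque

class DifficultyBalancer:
    def __init__(self, target_win_rate=0.45, window=20, step=0.05):
        self.target = target_win_rate
        self.recent = deque(maxlen=window)   # 1 = player won the round, 0 = lost
        self.step = step
        self.difficulty = 0.5                # 0 = trivial, 1 = brutal

    def record_round(self, player_won: bool) -> float:
        self.recent.append(1 if player_won else 0)
        win_rate = sum(self.recent) / len(self.recent)
        if win_rate < self.target - 0.05:    # losing too much: ease off
            self.difficulty = max(0.0, self.difficulty - self.step)
        elif win_rate > self.target + 0.05:  # winning too easily: tighten up
            self.difficulty = min(1.0, self.difficulty + self.step)
        return self.difficulty

balancer = DifficultyBalancer()
for outcome in [False, False, True, False, False, False]:   # a losing streak
    level = balancer.record_round(outcome)
print(f"difficulty after streak: {level:.2f}")   # eased off from the 0.50 start
```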
Selinger: Do you think casinos are going to be called out on their tracking and use of personal data to customize gaming experiences?
Schüll: Probably not. The people who sit on gaming boards generally have no idea how algorithms work and what’s going on in the guts of machines, and state legislators have even less technical knowledge. The gaming labs performing regulatory functions have no incentive to make the approval process for new games difficult or slow; on the contrary, they see their role as facilitating the process so as to help raise money for states. Through interviews like this, I’m hoping to raise public awareness and draw some much-needed scrutiny to these ethically and legally questionable trends.
Selinger: Can’t the knowledge of how to use algorithms and surveillance to micro-manipulate people migrate from casinos to other domains? Is it fair to say that folks should worry about this more generally?
Schüll: Absolutely. These techniques are turning up in banking and financial management, the gamification of office tasks, and places we might not expect like online education, medication compliance, and employer wellness programs. In each case, intimate behavioral information is used by customization algorithms so as to incentivize engagement, retain attention, and keep people hooked in – whether they’re gamblers, financial clients, students, patients, or employees.
Evan Selinger is an associate professor of philosophy at Rochester Institute of Technology. Follow him on Twitter @EvanSelinger.