Can AI help casinos cut down on problem gambling?

A few years ago, Alan Feldman wandered onto the exhibition floor at ICE London, a major event in the gaming industry.

Feldman had spent the past 30 years as an executive with MGM Resorts International, focusing on problem gambling and its financial, personal and professional repercussions. Before his departure from the company, he helped build a nationwide responsible-gambling program that focused on helping players shift their behaviours to reduce the risk of becoming problem gamblers.

While on the floor at ICE, he noticed a few companies promoting new products that would use artificial intelligence to not only identify problem gambling, but predict it. Feldman was immediately sceptical. AI, he thought, might do a lot of things, but he had never heard of a use that predicted a state of mind.

AI as a solution to problem gambling “raised far more questions than it did answers,” said Feldman, now a distinguished fellow in responsible gambling at the International Gaming Institute at the University of Nevada, Las Vegas. “It was slick. It was interesting. It was compelling intellectually. But whether or not it was really going to do anything, I thought, very much remained at question.”

Another question, this one obvious to any observer: Isn’t a problem gambler exactly what a casino wants financially? In short: no. Even putting aside regulatory issues — gambling operators can be fined or lose their licences if they fail to monitor problem gambling and act when necessary — it is, counterintuitively, not in their best financial interest.

“Casinos need to have customers in order to sustain themselves,” Feldman said. “And the only way to have customers is to have customers who themselves are healthy and thriving and able to pay their bills and come back the next time.” Problem gamblers “always end the same way,” he added. “The end of the road is the exact same with all of them: They have no money.”

In a more general sense, the pairing of AI and gambling makes perfect sense: unlimited and constant data, decision-making, computerised systems. With the explosion of online gaming, the opportunities to harness this combination for a public good seem endless. The reality — interpreting human behaviour, navigating privacy laws, addressing regulatory issues — is much more complicated.

At the same time that Feldman was questioning those potential solutions, Danish researchers were trying to solve the same problems. Mindway AI, a company that grew out of Aarhus University, does exactly what Feldman was sceptical of: It predicts future problem gambling. Built using research at Aarhus University by its founder, Kim Mouridsen, the company uses psychologists to train AI algorithms in identifying behaviours associated with problem gambling.

One significant challenge is that there is no sole indicator of whether someone is a problem gambler, said Rasmus Kjærgaard, Mindway’s CEO. And at most casinos, human detection of problem gambling focuses on just a few factors — mostly money spent and time played. Mindway’s system takes into account 14 different risks. Those include money and time but also cancelled bank withdrawals, shifts in the time of day the player is playing and erratic changes of wagers. Each factor is given a score from 1 to 100, and the AI then builds out a risk assessment of each player, improving itself with each hand of poker or spin of the roulette wheel. Players are scored from green (you’re doing fine) to blood red (immediately step away from the game).
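The article does not spell out how those factors are combined, but the mechanics it describes — a set of behavioural factors, each scored from 1 to 100, rolled up into a colour-banded risk assessment — can be sketched in a few lines of code. Everything below (the factor names, weights and colour thresholds) is illustrative guesswork, not Mindway’s actual model.

```python
# Hypothetical sketch of a multi-factor risk score in the spirit described
# above: each behavioural factor is scored 1-100, the scores are combined
# into a weighted average, and the result is mapped to a colour band.
# Factor names, weights and band thresholds are illustrative assumptions.

FACTOR_WEIGHTS = {
    "money_wagered": 0.15,
    "time_played": 0.15,
    "cancelled_withdrawals": 0.10,
    "time_of_day_shift": 0.10,
    "erratic_wager_changes": 0.10,
    # ...the real system reportedly tracks 14 factors in total
}

def overall_risk(factor_scores: dict) -> float:
    """Combine per-factor scores (each 1-100) into one weighted 1-100 score."""
    total_weight = sum(FACTOR_WEIGHTS[f] for f in factor_scores)
    weighted = sum(FACTOR_WEIGHTS[f] * s for f, s in factor_scores.items())
    return weighted / total_weight

def colour_band(score: float) -> str:
    """Map a risk score onto the green-to-blood-red spectrum."""
    if score < 40:
        return "green"
    if score < 60:
        return "yellow"
    if score < 80:
        return "red"
    return "blood red"

player = {
    "money_wagered": 72,
    "time_played": 65,
    "cancelled_withdrawals": 90,
    "time_of_day_shift": 55,
    "erratic_wager_changes": 60,
}
score = overall_risk(player)
print(f"risk score {score:.0f} -> {colour_band(score)}")
```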

In order to tailor the algorithm to a new casino or online operator, Mindway hands over its data to a group of experts and psychologists trained in identifying such behaviour. (The company said they were independent, paid consultants.) They assess each client’s customers, and that assessment serves as a sort of baseline model. The algorithm then replicates their diagnoses across the full customer database.
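As a rough illustration of that calibration step — experts label a sample of a new client’s customers, a model is fitted to those labels and then applied to everyone else — one could imagine something like the following. The features, the labels and the choice of logistic regression are assumptions made for the sketch, not a description of Mindway’s method.

```python
# Illustrative only: a supervised-learning style calibration. Experts label a
# small sample of a client's customers; a model is fitted to those labels and
# then applied to the full customer database. Features and labels are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Behavioural features for the expert-reviewed sample
# (columns: money wagered, hours played, cancelled withdrawals).
expert_sample = np.array([
    [120.0, 1.5, 0],
    [900.0, 6.0, 3],
    [200.0, 2.0, 0],
    [1500.0, 9.0, 5],
])
expert_labels = np.array([0, 1, 0, 1])   # 0 = no concern, 1 = at risk (expert judgment)

model = LogisticRegression().fit(expert_sample, expert_labels)

# The fitted model is then rolled out across every customer in the database.
full_database = np.array([[300.0, 2.5, 1], [1100.0, 7.0, 4]])
print(model.predict(full_database))      # e.g. [0 1]
```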

“As soon as a player profile or player behaviour goes from green to yellow and to the other steps as well, we are able to do something about it,” Kjærgaard said. The value in the program isn’t necessarily just identifying those blood-red problem gamblers; by monitoring the jumps along Mindway’s colour spectrum, it predicts and catches players as their play devolves. Currently, he said, casinos and online operators focus their attention on the blood-red gamblers; with Mindway, they can identify the players before they ever reach that point.
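The early-warning idea Kjærgaard describes — acting on the move from green to yellow rather than waiting for blood red — amounts to watching for escalation in a player’s colour history. A minimal sketch, with a made-up escalation rule, might look like this:

```python
# Illustrative only: watch a player's colour band over successive sessions and
# flag escalation before the player reaches "blood red". The escalation rule
# (two upward moves across recent sessions) is an assumption, not Mindway's.

BANDS = ["green", "yellow", "red", "blood red"]

def should_intervene(history: list) -> bool:
    """Return True when recent sessions show the player climbing the risk bands."""
    levels = [BANDS.index(b) for b in history]
    if levels and levels[-1] == len(BANDS) - 1:
        return True                                  # already at blood red
    upward_moves = sum(1 for a, b in zip(levels, levels[1:]) if b > a)
    return len(levels) >= 3 and upward_moves >= 2

print(should_intervene(["green", "green", "yellow"]))   # False: one shift so far
print(should_intervene(["green", "yellow", "red"]))     # True: steady escalation
```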

The trickiest step, though, according to Brett Abarbanel, director of research at UNLV’s International Gaming Institute, is taking that data and explaining it to a player.

“If my algorithm flags someone, and it thinks that they’re a problem gambler, I’m not going to send them a note and say, ‘Hey, great news: My algorithm has identified you as potentially a problem gambler. You should stop gambling right away!’” The response would be obvious, Abarbanel said, deploying a middle finger: “That’s what will happen.”

How to actually communicate that information — and what to tell the gambler — is an ongoing debate. Some online gaming companies use pop-up messaging; others use texts or emails. Kjærgaard hopes that clients take his data and, depending on the level of risk, reach out to the player directly by phone; the specificity of the data, he said, helps personalise such calls.

Since starting in 2018, Mindway has contracted its services to seven Danish operators, two each in Germany and the Netherlands, one global operator and a US sports-gambling operator, Kjærgaard said. Online gambling giants Flutter Entertainment and Entain have both partnered with Mindway as well, according to the companies’ annual reports.

Since this technology is so new and there’s no regulatory body setting a standard, Mindway and similar companies are, for now, essentially on their own. “We wanted to be able to say to you, to anybody else — operators, obviously — that not only do we provide this scientific-based software, but we also want to have a third party to test the validation of what we do,” Kjærgaard said. “But it is a paradox that there’s no specific requirements which I can ask my team to fulfil.”

Currently, Mindway’s technology lives mostly in online gambling. Operators attach Mindway’s GameScanner system to their portal, and it analyses not only individual risks but also total risks for the system. Applying that level of oversight to in-person gambling is much more difficult.

One example of a measure of success can be found in Macao. Casino operators there use hidden cameras and facial recognition technology to track gamblers’ betting behaviour, as well as poker chips enabled with radio frequency identification technology and sensors on baccarat tables. This data then heads to a central database where a player’s performance is tracked and monitored for interplayer collusion.

This, Kjærgaard said, is the future: The financial incentives will drive success. “Smart tables” and efforts to address money laundering and financial regulations may eventually provide the data that will supercharge the application of AI to in-person gambling.

(It also highlights another difficulty in applying AI to gambling: cultural differences. In Chinese casinos, Abarbanel said, the players are used to this level of monitoring; not so in the United States.)

AI would certainly work for casinos when it came to marketing, promotions and game suggestions, Feldman said, but despite progress in recent years, he remains sceptical of its use to help problem gamblers. Such a tool may be better applied personally rather than broadly, he believes, much like the “Your spending is 25% higher than last month” reminders that pop up in online banking accounts.

“It’s sort of like drinking. Is there anyone you know who hasn’t gotten drunk once in their life? Doesn’t mean they’re an alcoholic,” he said. “But maybe that one drink a night that’s kind of become one and a half, sometimes two, sometimes three — maybe you want to bring that in a little bit. But you don’t want to have the bar tracking every record here, right?”