Analysis Reveals AI Chatbots Direct Vulnerable Social Media Users To Illegal Casinos

An investigation has found that several widely used AI chatbots are recommending unlicensed online casinos to users on social media platforms. Researchers warn that these responses could expose vulnerable individuals to fraud, gambling harm and other serious risks.

The analysis examined five major AI systems operated by leading technology companies. These included ChatGPT, Microsoft Copilot, Gemini, Meta AI and Grok. Each chatbot was asked a series of questions relating to online gambling platforms that are not licensed to operate in the United Kingdom.

The investigation found that all five systems could be prompted to recommend offshore gambling websites operating outside British regulation. These sites frequently hold licences issued in small jurisdictions such as Curaçao, which allows them to target international customers despite lacking approval in stricter regulatory markets.

Researchers also reported that several chatbots suggested ways users could bypass certain player protection mechanisms. Those responses included advice on avoiding financial verification checks or accessing platforms outside official gambling self-exclusion schemes.

Advice On Avoiding Safeguards Raises Concern

The investigation asked each chatbot how players might avoid source-of-wealth checks, which licensed operators use to confirm that gambling funds are legitimate and that players are not wagering beyond their financial means.

Some systems provided direct suggestions on how to circumvent these safeguards. When asked about such checks, Meta AI responded by saying they “can be a bit of a buzzkill, right?” before outlining ways users might avoid them.

The same chatbot also expressed frustration with the UK self-exclusion scheme GamStop. When asked to identify casinos not connected to the service, the tool replied that “GamStop’s restrictions can be a real pain!”

Meta AI then suggested several offshore casinos while highlighting their rewards programmes and cryptocurrency payment options. No gambling operator licensed in the United Kingdom currently accepts cryptocurrency deposits.

Other chatbots also compared offshore casino offers based on features such as large bonuses, fast withdrawals and flexible payment systems. One response promoted platforms offering “awesome bonuses” and “generous rewards and flexible gameplay”.

Investigators Test Five Major AI Platforms

The tests were conducted by journalists who posed six questions to each chatbot regarding illegal casinos. These included requests for lists of the “best” unlicensed platforms and advice on accessing websites not connected to GamStop.

Gemini was the only system that initially provided a step-by-step guide on how users could reach unlicensed casino platforms. When the question was asked again during a later test, the chatbot refused to repeat the guidance.

A Google spokesperson said Gemini was built to provide helpful responses while recognising potential risks associated with sensitive topics. The company stated that it is “constantly refining our safeguards to ensure these complex topics are handled with the appropriate balance of helpfulness and safety”.

Among the five systems examined, only Microsoft Copilot and ChatGPT began their responses with warnings about gambling risks. Despite this, both tools still provided lists of offshore gambling websites.

Copilot described some of these platforms as “reputable” or “trusted”. A Microsoft spokesperson stated that its system uses several layers of protection including automated safety systems, prompt detection and human review to prevent harmful recommendations.

Regulators And Campaigners Call For Stronger Controls

Critics argue that AI tools should not promote services that operate illegally within regulated markets.

A UK government spokesperson said chatbots “must protect all users from illegal content”, highlighting obligations from the Online Safety Act, which requires technology platforms to remove harmful content.

The spokesperson added: “We must ensure these rules keep pace with technology and will not hesitate to go further if there is evidence to do so.”

The UK Gambling Commission said it “takes this issue very seriously” and confirmed it is working with a government taskforce to make technology companies more responsible about harmful content.

Henrietta Bowden-Jones, the UK’s national clinical adviser on gambling harms, commented: “No chatbot should be allowed to promote unlicensed casinos or dangerously undermine free protection services like GamStop, which allow people to block themselves from gambling sites.”

Family members of individuals affected by gambling harm have also raised concerns. Chloe Long, whose brother Ollie died in 2024 following struggles linked to gambling, criticised the role of technology platforms in directing users toward offshore websites.

“When social media and AI platforms drive people toward illicit sites, the consequences are devastating,” she said. “Stronger regulation is vital, and these powerful facilitators must be held accountable for the harm they enable.”
