"Passport verification in games is no longer a question of 'if,' but rather 'when' and 'where'," says Natalia Vozian from Xsolla about the tightening of regulations
We spoke with Natalia Vozyan, VP of Legal & Operations at Xsolla, about how governments worldwide are beginning to tighten verification rules in games.
Natalia Vozyan
Alexander Semenov, App2Top: News reports increasingly cover the tightening of verification tools on the internet. Let's start with when this topic first emerged.
Natalia Vozyan, Xsolla: The idea of user verification has its roots in the discussion about child protection in the digital space. This agenda gained momentum around 2020, with every reputable legal journal writing about the risks for children online.
Who was the first to seriously tackle verification? China?
Natalia: No, China is among the pioneers, but the first significant step was the Children’s Online Privacy Protection Act (COPPA) in the U.S., which took effect on April 21, 2000. Then came China, and only relatively recently, in 2018, the General Data Protection Regulation (GDPR) came into force in the European Union.
Did it become a milestone?
Natalia: Absolutely. GDPR effectively introduced the concept of "know your customer" (KYC) in relation to personal data, including that of children. After its implementation, many countries took the European experience as a basis and adopted local regulations based on it.
Where is verification being actively implemented today?
Natalia: The most recent example is the UK Online Safety Act, whose key provisions came into force in 2025. It is aimed at regulating content for children and requires verification for the sale of digital content.
Brazil has introduced the most radical measures, literally this year. There, the usual "I confirm I am 18" is no longer sufficient — verification through real documents or biometrics is required. Self-declaration is completely banned.
You just mentioned self-declaration. Before we move on, let's pause on that. What did you mean?
Natalia: Self-declaration is one of the verification tools. They can be divided into three types. The first is self-declaration: the user just checks the "I am 18" box. Fast, convenient, but absolutely unreliable. The second is verification through data: credit card, phone number, ID. Slightly more reliable. The third is verification through documents or biometrics: uploading a passport, Face ID, cross-referencing with government databases. The most reliable.
And now, is there a shift away from self-declaration?
Natalia: And it's quite logical: modern children start using phones before they can speak and understand perfectly well that when asked "are you 18?", they just need to answer "yes" to access content. Regulators have realized this and started moving toward more reliable tools. Additional urgency came from mass complaints by parents about unauthorized spending by children, particularly in-game purchases made with parents' credit cards for significant sums. This created public pressure and accelerated the shift to stricter identity verification methods.
Is this happening in Russia as well?
Natalia: The situation here is different: user verification remains conditional, and the protection of specific groups is mostly regulated by the advertising law. However, public initiatives increasingly call for mandatory identification to access the internet.
Earlier, you noted that tightening verification is logical. From the perspective of the state as an institution, yes. But I'm curious what you personally think about the current situation?
Natalia: If we're talking about my position, the balance of interests is clearly disrupted here. There are industries where the same KYC is absolutely justified. But the gaming industry is not the financial sector. A game developer does not need to know a user's passport details to provide their services. Nonetheless, in Brazil, they are now required to store biometric data. This is both an operational burden and an enormous regulatory risk.
Regulators are essentially shifting the responsibility for controlling and monitoring children from parents to businesses. Monitoring what content a child consumes and how much time they spend in games should primarily be a parental function. Businesses should comply with reasonable restrictions, but when a game developer is required to collect biometrics and verify the age of every user, it's a disproportionate burden that doesn't fit the nature of the business.
I am not questioning the importance of compliance and protecting vulnerable groups — these are fundamental values. The issue is one of proportionality: regulation should consider the specifics of the industry and not create barriers where lighter tools are sufficient.
You made a good point that for businesses this is a disproportionate burden. But do I understand correctly that verification primarily concerns gaming platforms rather than the developers themselves?
Natalia: Look, the current wave of verification restrictions primarily impacts two major categories.
The first is financial services: banks, payment systems, cryptocurrency exchanges. Here, user verification has long been the norm, and KYC and other procedures are industry standards; no one is particularly surprised.
The second category is entertainment services. Here the situation is complicated. It involves streaming platforms, social networks, online gambling, and, of course, video games. These are under the greatest regulatory pressure today because regulators see major risks for children and adolescents there, whether it be access to inappropriate content, excessive online time, or in-game purchases.
The regulators' logic is understandable: if a platform allows minors to access content or sell something to them, it should control this. The question is whether the proposed tools are proportionate to this task.
What are regulators demanding from them or hoping to achieve?
Natalia: The stated goals are noble: protecting minors from inappropriate content, combating fraud, ensuring transparency in the digital environment. On paper, verification appears as a tool for the public good.
In practice, the picture is more complex. By requiring businesses to know their users, the regulator gains a ready-made infrastructure for collecting citizen data at no expense of its own. Any company is obligated to hand over accumulated information in response to a government request. Convenient.
Add to this the punitive component: non-compliance with verification requirements results in serious sanctions that become a significant source of revenue for the budget.
I wouldn’t say that the legal goal is purely pretty words. But for regulation to truly work in society's interest, and not just on paper, a dialogue between the regulator and business is crucial. This balance is currently lacking.
Speaking about changes in verification within the gaming sector, let's focus on China, where restrictions were truly implemented earlier than anywhere else (I’m referring not to legislative activity, but to where developers and users first faced restrictions in practice). Tell us about the experience of Chinese game developers, the challenges they faced, and how they addressed them?
Natalia: China can indeed be considered a true pioneer in this area. Back in 2007, several ministries jointly proposed a system of restrictions for minors in video games to protect their physical and mental health. A time-based gradation was introduced: how many hours a day representatives of a particular age group could spend in the game.
However, as is often the case, developers were left to tackle implementation on their own without specific tools. For several years, major market players experimented independently: collecting data from identification documents, implementing biometrics, all at the cost of conversions and colossal operational expenses.
It was only in 2021 that the state finally addressed the problem systematically. The NPPA (National Press and Publication Administration) launched a centralized national verification infrastructure, fundamentally changing the model. Now the developer can query the state database, which in real-time returns the user’s age and automatically enforces the appropriate restrictions. The burden of storing sensitive data was removed from businesses.
Nonetheless, it cannot be said that the problem is completely resolved. Workarounds still exist: children use their parents' documents, and some parents consciously help their children bypass verification. Developers identify such users after the fact, for example during support calls or refund requests, recognizing minors by their manner of speaking and phrasing.
I also heard South Korea attempted a similar effort, but it seems they decided to abandon this practice?
Natalia: Korea is a prime example of how strict regulation can evolve into a more balanced approach. In 2007, the country introduced an internet verification system based on real names: users of major sites had to verify their identity using a resident registration number. Later, the so-called Shutdown Law, enacted in 2011, prohibited children under 16 from playing online games from midnight to six in the morning.
However, in 2012 the Constitutional Court struck down the real-name verification system, deeming it an excessive restriction on freedom of speech and the right to anonymity.
The Shutdown Law lasted longer but was repealed in 2021. Compulsory restrictions were replaced with parental choice: parents now decide when and how long their child can play.
Thus, Korea transitioned from state control to parental responsibility, and, in my view, that's a logical direction for the gaming industry.
We began our discussion essentially with the European Union. Now I want to return there concerning the measures adopted in Asia. Compared to China or how things were a few years ago in South Korea, the gaming sector in the EU doesn’t feel a strong shift in the verification approach. Is that true?
Natalia: Overall, yes, that's true. I think the European industry is still digesting the consequences of GDPR implementation. The regulation imposed a colossal load: the documentation alone, describing all data processing activities, requires significant time and financial resources. The industry isn't experiencing new large-scale shocks yet.
In the gaming sphere, PEGI — the Pan European Game Information age rating system — serves as a protective buffer. Combined with data protection requirements and refund rules, it allows for maintaining reasonable order without rigid verification.
However, GDPR already includes an important provision: processing the personal data of a child under 16 without explicit parental consent is not permitted. Formally, this implies age verification requirements. But until they are enshrined in separate binding acts for the gaming industry, businesses balance between these norms without resorting to passport checks for every user.
You mentioned the UK Online Safety Act earlier. It seems that verification is stricter in the UK than in continental Europe. I know even Steam has accounted for its requirements.
Natalia: The UK is one of the most serious examples of how a regulator truly intends to enforce what’s on paper.
The new provisions of the Online Safety Act ban self-declaration, verification via payment methods that do not guarantee age, and limitations through user agreements — all falling short of the regulator’s "highly effective verification" standards.
And yes, Valve responded to the law and updated Steam's policy: UK users now need to verify their identity to access 18+ rated games. Valve chose the approach of using linked credit cards, an elegant solution with minimal impact on conversion. Although many experts doubt whether it will pass as "highly effective."
The UK regulator also recently fined AVS Group, which manages 18 adult sites, £1 million over improper age verification. The issue wasn't so much the absence of verification as its quality: AVS's system accepted photo uploads without checking "liveness," allowing children to simply show someone else's photo to the camera.
The law is already changing major players’ behavior: YouTube, Spotify, and several other platforms have implemented verification via identity documentation.
Harsh. What about in the States? You mentioned they were the first to tackle verification, but what's happening now?
Natalia: Discussing the US is challenging because developments occur simultaneously at both state and federal levels, moving in different directions.
At the state level, the process is rapidly advancing. In 2025, laws requiring mandatory age verification came into effect in nine states, including Florida, Georgia, and Missouri. For now, regulators primarily target platforms with adult content and social networks. The gaming industry proper hasn’t been reached yet, but the trend is clear: the boundaries are continually expanding.
On a federal level, there are attempts to create a unified standard — for example, in November 2025, a congressman proposed the Safer GAMING Act, mandating online game developers to integrate parental control tools, like blocking chat between children and other users. A national standard would be a sensible solution since monitoring each state’s requirements and adjusting to them individually is practically unmanageable.
As for prospects, it’s ambiguous. The First Amendment, guaranteeing freedom of speech, already acts as a real barrier: courts have blocked several state laws based on this. Studies also indicate that verification doesn’t achieve its goals — users simply shift to other platforms or use VPNs. After a law was introduced in Florida, VPN demand increased by over a thousand percent.
With all these practices tightening, should we expect full passport verification in high age-rated games?
Natalia: Full passport verification in games with high age ratings is no longer a question of "if," but "when" and "where." Brazil has practically already reached this point. The UK is moving in the same direction. China has implemented a centralized model, with the state itself acting as the verification operator.
Technically, it's feasible — the question is at what cost. A platform with a long-standing audience, where users have built accounts over years and invested real money, would suddenly need to ask all these people to show their passports. The audience's reaction is predictable, and conversion losses are inevitable. Additionally, the developer becomes an operator of sensitive personal data with all that entails — storage systems, data breach protection, compliance with local data processing requirements.
The question isn’t whether it will happen, but whether the industry will be ready when it does.
Oh, tough times await us. Thanks for the conversation!
