
Sony is preparing an AI capable of censoring your games in real time, without going through developers



This is far from Sony's first ambitious patent, but this one is drawing particular attention. The Japanese giant has imagined a system powered by artificial intelligence capable of censoring games on the fly, analyzing their content in real time. The technology does not just block a word or two: it can cut a scene, blur a character, or even generate a deepfake to hide elements deemed inappropriate.

This AI does more than filter. It intervenes dynamically in the game, modifying it second by second according to the preferences defined on the user's account. Violent scenes, nudity, crude language: everything can be erased or replaced on the fly, all without involving the developers.

A patent with dizzying technical possibilities

The patent, spotted by several specialist sites, describes a tool capable of automatically identifying sensitive elements in a gaming session. A combat scene that is too graphic? The AI can blur it. Crude dialogue? It can be cut or rewritten. An outfit deemed too suggestive? It can be swapped for an alternative model.
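The patent's details are not public in this article, but the logic it describes, matching each flagged category of content to a mitigation per user profile, can be sketched in a few lines. Everything below (the category names, the `Profile` type, the `mitigate` function) is illustrative and hypothetical, not taken from Sony's filing:

```python
# Hypothetical sketch of the rule mapping the patent reportedly describes:
# a profile declares which content categories it blocks, and each flagged
# element is matched to a mitigation (blur, cut, replace a model).
# All names here are illustrative, not from Sony's patent.
from dataclasses import dataclass

# Mitigations the article mentions: blurring, cutting, model replacement.
ACTIONS = {
    "graphic_violence": "blur",
    "crude_language": "cut",
    "suggestive_outfit": "replace_model",
}

@dataclass
class Profile:
    blocked_categories: set[str]

def mitigate(flagged: list[str], profile: Profile) -> dict[str, str]:
    """Return the action applied to each element this profile blocks."""
    return {
        element: ACTIONS[element]
        for element in flagged
        if element in profile.blocked_categories
    }

# A child profile blocks violence and language but not outfits:
child = Profile(blocked_categories={"graphic_violence", "crude_language"})
print(mitigate(["graphic_violence", "suggestive_outfit"], child))
# {'graphic_violence': 'blur'}
```

The point of the sketch is the separation the article emphasizes: the rules live in the user's profile, not in the game's code, which is why no developer involvement is needed.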

This technology goes far beyond traditional parental controls. Parents could set up a "family profile" directly through the PlayStation Network account, adjusting what can be seen or heard based on the child's age. But the system would also apply to all profiles, including adults', extending the logic of personalization to every account.

According to the sources, this system could also relieve studios of the burden of producing multiple versions of their game for markets with strict censorship laws, such as China or Germany.

When AI invades artistic freedom

Impressive as the technology is, it is equally worrying. An AI that modifies works in real time, without any intervention from developers, raises a central question: what becomes of a game's original intent? Sony openly positions this artificial intelligence as an intermediary between the game and the player. "This system would fundamentally transform the way sensitive content is managed in video games, but it also imposes a new layer of algorithmic interpretation between the work and the player."

In a narrative game, every word and every scene carries meaning. Deleting or transforming an element is not trivial: it amounts to altering the message the work conveys. Some are already speaking of an "algorithmic gag", and fears of creative impoverishment are beginning to surface.

Another extract summarizes this concern: “An AI that modifies a work without the approval of its author becomes, in a certain way, a co-author… or an invisible censor.”

A personalized experience… or fragmented?

With this technology, two players could experience radically different versions of the same game. A shocking scene for one, a blurred passage for the other. Explicit violence here, an entirely masked sequence there. This level of personalization raises a new question about the unity of the video game experience.

Critics ask: “When each player experiences a different version of a game, can we still speak of a shared work or collective experience?”

Collective discussion around games, from reviews to forums, could lose its meaning if everyone sees a different game. What might look like progress toward inclusiveness may hide a risk: the dilution of the subject, of the work, even of the game itself.

A controversial innovation awaiting realization

For now, it is only a patent. No official announcement from Sony confirms that this system will appear in the next consoles or upcoming titles. But the fact that the company is seriously considering it marks a major turning point.

Between creative freedom, public protection and algorithmic personalization, Sony is opening a technical and ethical Pandora's box. An AI capable of censoring on the fly may appeal to some families, but it risks causing lasting concern among creators and fans of games with a strong message.