Gaming giant Epic has removed a player-created scene from the Fortnite gameplay map after the game’s fans reported that it depicted seriously problematic content.
TheGamingEconomy will offer a brief warning here: this article necessarily makes occasional, passing references to suicide.
Back in 2018, Epic revealed that an area of the Fortnite map would be given over to a region called The Block. The idea is that Epic uses The Block to showcase some of the best player-made Fortnite environments, rotating the content displayed in the area. In other words, The Block is a curated, interactive gallery of the best fan content, playable as part of the wider Fortnite setting.
As reported by Polygon, on 2 April this year The Block was updated to show a large building titled The Mysterious Mansion. Recently, however, Epic removed The Mysterious Mansion from The Block, replacing it with older fan-made content.
The removal appears to be motivated by player concerns that The Mysterious Mansion depicted a scene of suicide. The area was conceived by Fortnite player FuryLeaks, who shared a video tour of their creation on Twitter. In that video tour, there appears to be a scene showing a concluded hanging, realised in the game’s art style. It is not a particularly realistic or detailed scene, but it is hard to refute the depiction. You can see FuryLeaks’ video here on Twitter, though viewer discretion is advised.
It is worth noting that in the playable version of The Mysterious Mansion, the room where the scene in question exists showed only a dangling noose, and the character model had been removed. As such, in Fortnite itself an allusion to suicide only existed for as long as the fan-made content was live. Presumably, FuryLeaks’ video tour was of an earlier, unpublished version of The Mysterious Mansion.
In a statement to Polygon, an Epic spokesperson said: “This creator’s content was removed from the Block because it did not adhere to our content-creation guidelines.”
As such, it’s a relatively simple story. Epic blocked inappropriate and potentially distressing fan-made content from its game, for very clear reasons. While some will question how Epic ever approved the content for a showcase, the dangling rope seen in the playable version is somewhat different from what was seen in the video tour.
The situation does, however, raise a number of questions about hosting user-generated content (UGC). UGC is nothing new, and for years developers have worked to use automation, artificial intelligence (AI), player reporting, community management, and more, to detect, block, and remove problematic content made by players and then shared within a game.
Currently, however, developers and publishers must contend with the interplay between UGC, social media, and streaming services like Twitch. Add to that the potential for problematic player behaviour in online realms where thousands of games may be played in a day, and it can sometimes feel like a game can grow beyond anything a single game company can reasonably manage. Equally, though, the onus must be on game outfits to make sure the worlds they provide are safe places.
It seems clear that AI will have to play a larger role, working in parallel with human systems like moderation, reporting, and community support. Established automation for moderating game worlds typically relies on pre-selected keywords added to a list of problem terms. When a player uses a listed word, an automatic response is triggered, be it a warning, suspension, or ban. Similar approaches also detect problematic imagery in fan-made content. While such methods play an important role, they rather lack a capacity for context and nuance.
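The keyword-list approach described above can be sketched in a few lines. This is a minimal, illustrative example only: the term list, strike thresholds, and action names are all hypothetical, not drawn from any real game's moderation system.

```python
# Minimal sketch of keyword-list moderation with escalating responses.
# All terms, thresholds, and action names are illustrative placeholders.

BLOCKED_TERMS = {"badword1", "badword2"}  # pre-selected problem terms

def moderate_message(message: str, prior_strikes: int) -> str:
    """Return an action for a chat message: 'allow', 'warn', 'suspend', or 'ban'."""
    # Normalise: lowercase and strip common punctuation from each word
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & BLOCKED_TERMS:
        # Escalate based on how many times the player was previously flagged
        if prior_strikes == 0:
            return "warn"
        if prior_strikes == 1:
            return "suspend"
        return "ban"
    return "allow"

print(moderate_message("hello there", 0))      # allow
print(moderate_message("hello badword1", 0))   # warn
print(moderate_message("hello badword1", 2))   # ban
```

The sketch also makes the article's point concrete: a bare word match has no sense of context, so the same mechanism that catches abuse will also flag innocuous uses of a listed term.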
With outfits like British tech company SpiritAI endeavouring to develop technologies like Ally, there is much potential for the power of artificial intelligence here. Ultimately, Ally will let AI-powered non-human characters loose in game worlds to converse naturally with players, check on their wellbeing, and ask about their experience of a problem. We may yet see technologies that can similarly understand the nuance of UGC, and differentiate genuinely problematic content from architecture that may simply look a little inappropriate when viewed from a certain angle.
Perhaps Epic would have benefited from advanced AI that can appreciate the context in which a rope is displayed in a game, for example.
We are not there yet, but smart developers and publishers would be wise to keep an ear to the ground with regard to technological solutions that meet the challenge of creating and hosting game worlds that remain appropriate for users.
Certainly, for games that rely on advertising and in-game monetisation to generate revenue, keeping content appropriate is absolutely essential.