Facebook Horizon's moderators can invisibly observe and punish users in real-time
Facebook has announced that it is expanding access to the invite-only public beta test of its virtual reality world, Facebook Horizon. Though the user base will grow considerably, the platform's current surveillance and content moderation policies, which many may find overly intrusive, will remain in place.
Multiple measures invisible to the end user are implemented to moderate Facebook Horizon. For example, the last few minutes of each user's video and audio feed are continuously recorded on a rolling basis to that user's Oculus VR headset. When a user is reported, data from the relevant headsets is sent to Facebook's servers for review and potential moderation.
In addition to these passive invisible measures, more active yet equally invisible surveillance and moderation actions are in place. If someone reports another Horizon user, or if Facebook detects what it deems "unusual activity", a Facebook moderator may watch that user's activity and conversations to determine whether they violate Facebook's Community Standards. This monitoring is completely invisible to the user being watched.
In response to wider reporting on Facebook Horizon's moderation practices, the company released a video showcasing these invisible moderation features. Facebook had not announced any changes to its moderation policy or tactics as of the publication of this story.