Discord, a popular nonfree instant messaging and voice/video call
social platform with more than 150 million active users each month,
has joined other social platforms in announcing an age identification
policy. Age verification policies are promoted as being necessary
for protecting kids and teens online, but in reality these policies
force users of all ages to interact with nonfree, invasive
programs. Discord’s new age verification policy follows this
worldwide trend: it includes a background-running “age inference
model” for confirming each user’s age group:
“We leverage an advanced machine learning model developed at Discord
to predict whether a user falls into a particular age group based on
patterns of user behavior and several other signals associated with
their account on Discord. We only use these signals to assign users
to an age group when our confidence level is high; when it isn’t,
users go through our standard age assurance flow to confirm their
age. We do not use your message content in the age estimation
model.”
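Discord has not published its model, so the details are unknown, but the quoted description amounts to a confidence-gated classifier: a model scores account signals, and only high-confidence predictions are used, with everyone else routed to a manual "age assurance flow." The following is a minimal sketch of that pattern; the threshold value, the signal names, and the `infer_age_group` stand-in are all assumptions, not Discord's actual implementation.

```python
# Hypothetical sketch of a confidence-gated age-inference flow, based only on
# Discord's public description. The model, signals, and threshold are
# assumptions for illustration; Discord has not disclosed its implementation.

from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.95  # assumed value; Discord has not published one


@dataclass
class AgePrediction:
    age_group: str    # e.g. "under_13", "13_to_17", "18_plus"
    confidence: float


def infer_age_group(behavior_signals: dict) -> AgePrediction:
    """Stand-in for the machine learning model. A real system would score
    account signals (account age, activity patterns, etc.); here the score
    simply grows with the number of signals available."""
    score = min(1.0, 0.5 + 0.1 * len(behavior_signals))
    return AgePrediction(age_group="18_plus", confidence=score)


def assign_age_group(behavior_signals: dict) -> str:
    prediction = infer_age_group(behavior_signals)
    if prediction.confidence >= CONFIDENCE_THRESHOLD:
        # High confidence: assign the inferred age group automatically.
        return prediction.age_group
    # Low confidence: fall back to the "standard age assurance flow",
    # i.e. asking the user for biometric data or a government ID.
    return "requires_manual_age_assurance"
```

Even in this toy form, the privacy concern is visible: the decision depends entirely on how many behavioral signals the platform collects and feeds into the model, and the user never sees which ones.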
Discord is asking users of all ages for a lot of information while
offering very little reliable reassurance that their data won’t be
misused, or even saying what data will be gathered. It is unclear from
Discord’s press release which types of data will be collected and
run through its “age inference model,” whether the same forms of data
will be used for all users, or how much data could be examined
automatically before biometric data or a government ID is required.
Based on Discord’s press release, it seems that not just a couple of
data points will be run through its opaque age inference model but
quite possibly a great majority of all behavior on the platform.
Also notably absent from Discord’s most recent press release is any
way for users to opt out of background data scanning, other than
deleting their accounts. The only choice offered to users so far is
whether to submit biometric data or a government ID should they fail
the automatic age verification process.
While Discord has stated it will publish a blog post explaining
the mechanics behind its automatic age verification process, there is
no telling how transparent the platform will be or when to expect this
blog post.
Why trust matters
Discord and its vendors looking through our data isn’t just creepy —
it is dangerous. Use of Discord requires running a nonfree client
program, which means anything could be happening in the background
without your knowledge. Once a platform like Discord or its vendors
has collected data about you or your computing, it can be stored,
misused, leaked, or hacked. When a proprietary software provider
stores data about a user (especially data that could be linked to
offline activity), it can be very risky for the user. Many people rely
on anonymity to speak freely online, gather with other like-minded
people, and even to live in safety. If users can’t trust that Discord
isn’t surveilling their online activity and misusing their data, they
are forced to either accept the risk and possibly change how they
interact or log off Discord forever.
Discord’s track record is hardly encouraging
Discord has already failed to protect user data from bad actors, and
has demonstrated at least some comfort with mass surveillance.
Just a few months after age verification in the UK went into effect,
Discord contacted over 70,000 users regarding a data breach of
one of its third-party customer service providers. In the breach, a
wide variety of data was accessed, including:
Discord login and contact information; some payment information; IP
addresses; and messages and attachments sent to customer service
agents. The data this unauthorized party obtained was limited to the
users whose information this particular Discord vendor had stored.
Depending on how quickly any data used to verify
age is actually deleted by Discord and all its partners, the data of
more than 200 million monthly users could be at risk in the next data
breach.
Before Discord announced it would be delaying the rollout of age
verification, some users reported verification prompts from Persona,
an identity verification service. Persona, backed by Peter Thiel (the
co-founder of the surveillance firm Palantir), isn’t just a neutral
middleman vendor: it could be a tool for mass
surveillance for the US government on hundreds of millions of
users worldwide. Just a few weeks ago, an uncompressed version of
Persona’s front-end code was found on a federally authorized
server. While Persona denied ties to US government agencies
(including the DHS, ICE, and NSA), Discord ultimately ended its
partnership with the controversial company. Discord may not be working
with Persona anymore, but it has not ruled out working with other mass
surveillance vendors.
Public pressure makes a difference
In response to intense public backlash, Discord made a major update to
its original press release: age assurance would be delayed until the
latter half of 2026. In this update, Discord stated that it would:
“expand verification options, increase vendor transparency, and
publish detailed technical documentation, while continuing to meet
regulatory requirements where needed.”
Public pressure carries a lot of weight, and it seems to have been at
least partly responsible for Discord updating its original press
release. We need to keep putting pressure on Discord to become as
transparent as possible by freeing its code. No matter how many
promises Discord makes and how much information it offers up
voluntarily, users cannot put full trust in a platform that has hidden
code. While Discord has promised that its vendor partners will delete
this data quickly, we cannot be sure exactly how quickly the data is
deleted, whether it is copied, and who sees it before it is supposedly
erased forever.
Discord can repeat endlessly that any data used to verify a user’s age
will only be held temporarily, but as long as the code remains opaque
users cannot trust that Discord is telling the truth. Without full
transparency, users can only guess as to how much Discord is
surveilling its users, and for whose benefit. But, even if Discord does
free its code at some point, users must be allowed to communicate with
others on Discord or any other service, including with users who
refuse to submit to verification. Discord is one of the latest to
gather identification data about its users, but it likely will not be
the last. If Discord wants our trust, it needs to earn it by freeing
its code and respecting users who don’t want to submit to an invasive
age verification process to continue using Discord.
Eko KA Owen
Outreach & Communications Coordinator
“[Meta limits facial recognition on Facebook, but will keep using it in its future products][13]” © 2021 by Gibrán Aquino. This image is licensed under a [Creative Commons Attribution 4.0 International][14] license.
