Mastodon, Decentralized Solutions, and Privacy
A friend of mine, who joined my server beach.city, asked me the other day about privacy concerns regarding Mastodon and similar decentralized, volunteer-run software. One of his biggest concerns is who can read what, and what the implications of that could be. We talked for a while and ended the conversation that day, as it was New Year's Eve and the ball was dropping and all... But I thought I should go into more detail about my thoughts and understanding of privacy and related issues around Mastodon.
Let's start by talking about what Mastodon privacy currently looks like, after which we'll compare that to corporate social media, then consider some possible scenarios regarding privacy violations, how they might occur, and what the consequences would be for the admins.
What Mastodon's privacy looks like
Mastodon, as you may know, is a decentralized social network. Each individual server (or “instance”) of Mastodon is completely self-contained, owned and operated by a volunteer individual or group. The implication of this is substantial: a Mastodon instance runs on a server owned by the instance administrator, and that server stores all your private DMs as well as all the images that you share.
That means that an administrator could, if they really wanted to, read your DMs and look at your pictures. It's not easy to do that, though. These images and private messages are stored in databases, keyed by various user IDs and internal identifiers. In fact, the standard Mastodon software does not give an admin any way to do this through its interface. An admin would have to write their own software to gain access to this content, or know the right commands to run against the database and which files to hunt down.
This concern is multiplied by the fact that the same is true for any server you interact with. If @email@example.com sends a private DM to @firstname.lastname@example.org, both I (the admin of beach.city) and Sky (the admin of yiff.life) have access to that DM, should we dig into the database and files to find it.
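To make that concrete, here's a toy model of what I mean. This is not Mastodon's actual code or schema; it's just a sketch showing why a DM between users on two instances ends up readable by the admins of both: federation delivers a copy of the message to each server involved.

```python
# Toy model (NOT real Mastodon code) of how a direct message ends up
# stored on BOTH the sender's and the recipient's instance.

class Instance:
    def __init__(self, domain, admin):
        self.domain = domain
        self.admin = admin
        self.messages = []  # stands in for the instance's database

    def store(self, sender, recipient, text):
        self.messages.append({"from": sender, "to": recipient, "text": text})

    def admin_can_read(self, text):
        # An admin digging through the raw database could find the text.
        return any(m["text"] == text for m in self.messages)


def send_dm(sender, sender_instance, recipient, recipient_instance, text):
    # Federation delivers a copy of the DM to each instance involved.
    sender_instance.store(sender, recipient, text)
    recipient_instance.store(sender, recipient, text)


beach_city = Instance("beach.city", admin="me")
yiff_life = Instance("yiff.life", admin="Sky")

send_dm("alice@beach.city", beach_city,
        "bob@yiff.life", yiff_life,
        "hi, this is private")

# Both servers now hold the message, so both admins could dig it up.
print(beach_city.admin_can_read("hi, this is private"))  # True
print(yiff_life.admin_can_read("hi, this is private"))   # True
```

The point of the sketch is just that there's no single copy of a DM you can point to: once it crosses instances, trusting your DM means trusting every admin whose server it touched.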
How does this compare to commercial social networks?
Pretty much any Twitter/Instagram/Facebook employee (with appropriate access credentials) could gain access to your private messages in much the same way that admins at a particular Mastodon instance could do so. The company itself can also use its access to your private messages to determine your demographics and interests for targeted advertising.
Both corporate and non-corporate social networks have the common risk that you're turning your information over to another entity. The question you need to ask is “Who do you trust with your information?”. With private, volunteer-run social media, you have to ask yourself if the admins, and their surrounding communities, are trustworthy. With corporate social media, you have to ask yourself if the company is trustworthy.
I frequently look at motivations to answer that question. Corporate social media is profit driven. They want to make money off of your personal data, and off the opportunity to advertise to you. Individual admins are often motivated by a desire to contribute to the larger Mastodon community.
Both private and corporate social media are subject to the laws and regulations where they are located, as well as international law in the case of users and admins in different jurisdictions. Any violation of your privacy by the owners of a social media network can potentially be redressed via the local legal system(s), where applicable and appropriate.
Where they differ regarding this is in the extra layers before legal redress. In the case of a corporate social network, the company can take action against bad actors inside their company, such as firing an employee that abuses their access. In private social networking, the extra layer is really the meta-layer social network between network administrators. That is, if it is revealed that a mastodon administrator is abusing their users' privacy, other administrators can block access to that server to protect their users and isolate that admin away from themselves.
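That blocking mechanism is simple enough to sketch. Again, this is an assumed, simplified model rather than Mastodon's actual implementation: each instance keeps a list of blocked domains, and incoming messages from a blocked domain are simply refused, which is what isolates a misbehaving admin from the rest of the network.

```python
# Toy sketch (assumed behavior, not Mastodon's real federation code) of
# how a domain block isolates an abusive instance.

# "bad-actor.example" is a hypothetical misbehaving instance.
blocked_domains = {"bad-actor.example"}

def accepts_delivery(sender_handle):
    """Refuse any incoming message whose sender is on a blocked domain."""
    domain = sender_handle.rsplit("@", 1)[-1]
    return domain not in blocked_domains

print(accepts_delivery("someone@bad-actor.example"))  # False
print(accepts_delivery("friend@beach.city"))          # True
```

When enough admins add the same domain to their block lists, that instance is effectively cut off from the wider network, which is exactly the collective response described above.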
These are different, and neither is really better than the other. Personal and community bias are more likely in the private case, whereas impersonal bias is more likely in the corporate case.
How it might play out
The problem with the corporate case is that what I'm calling “impersonal bias” is really the company's inability to understand the needs of small groups of people or to adjudicate their conflicts. Without understanding the nuances and cultural context, it can be difficult to tell who is wielding power and who is not in, for example, a fight between TERFs and trans people.
The private social media case enables moderators and admins to have better visibility into the particular social issues that plague a particular group. Moreover, groups are capable of self-selecting who they do and do not want to be interacting with, which can allow peace between groups that have toxic interactions. Of course, the challenge with this is that personal biases, such as not liking the right fan fiction slash pairing, can wind up coming into play if an admin has strong feelings about that, and there's no corporate team around them protecting folks from that.
However, one saving grace from this is the community aspect of moderation. If an instance administrator is moderating things in a way that other admins deem inappropriate, or worse, is violating people's privacy, admins that find out will suspend or ban that instance. We've already seen multiple examples of this, where an instance admin was not moderating well, where an instance admin was talking about violating user privacy to mine data, and more. In every case, admins highlighted the behavior, shared it with other admins, and collectively blocked access between that instance and their own.
For myself, I am of the opinion that Mastodon is a safer social network than the corporate social networks. I trust a community of individuals far more than I'd ever trust a corporation to make good decisions for my community. I know that the admins I work with aren't trying to make money off me. They're not selling my data for a profit. They're not abusing my privacy. And I know that they'll understand more nuances about my demographic situation if and when a conflict arises.