Dennis Schubert

Why 'unlimited limited' posts are not a thing on diaspora*

2016-12-01 diaspora, software

In diaspora*, when you publish a post, you have to decide on the visibility of your post. You can choose between limited and public. If you decide to share your contents with a limited set of people, you get to choose the aspects (that is our fancy slang for contact groups) you want to share with. If you choose public, that’s it. Your post will be on the internet.

People often get confused when talking about our public visibility, since they sometimes have wrong expectations. As you might know, diaspora* is pretty privacy-oriented, and we try to give users as much control over their contents as possible. However, if you choose to make your post public, we make it public. Really, public. The post will probably get indexed by Google very soon, and everyone, even blocked people, can see the contents.1

You might be confused about us making stuff that public. If you are, no worries, you are not alone. We get questions about that very often, and “please implement a ‘diaspora* only’ share visibility” feature requests are very common. People want to be able to publish their posts into a scope where the posts are not visible to outside people or search engines. Basically, they want an “everyone but X” share visibility.

So, why is it still not implemented? It’s simple: we are not able to design this feature in a way we feel comfortable with.

You might say “but hold on, Network X did implement that!”, and you might be right. I am very well aware that some members of the federated social network club have implemented a feature where you can publish a post that is visible to all members of a given network, but not to anonymous users or search engines. diaspora* does not offer such a choice and most likely never will, since I think it is a pretty dangerous feature.

Before I go into details on why I think this is dangerous, let me explain the technical difference between limited and public posts in diaspora*.

  • Limited posts have a fixed set of recipients. For each recipient, we take the post’s contents, sign them, and encrypt them with the public key of that specific recipient. We then deliver those encrypted messages to the recipients’ pods.
  • Public posts do not really have a fixed set of recipients, and their contents are not encrypted, only signed. We generate a list of initial recipient pods by looking into a user’s contacts and into the information we have about who is sharing with the post’s author, but that’s only the starting point. If one of those original recipients reshares that post, diaspora* in turn sends the post to all of the resharer’s contacts. That way, the post gets distributed to pods the post author’s pod did not have a connection with. (Both delivery paths are sketched below.)
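
To make the difference concrete, here is a minimal sketch of the two delivery paths. All names are hypothetical and the cryptography is reduced to string placeholders; the real federation layer is considerably more involved.

    from dataclasses import dataclass


    @dataclass
    class Recipient:
        public_key: str  # stand-in for the recipient's real public key
        pod: str         # hostname of the recipient's pod


    def sign(contents: str, author_private_key: str) -> str:
        """Placeholder: the real protocol signs the payload with the author's key."""
        return f"signed[{contents}]"


    def encrypt(payload: str, recipient_public_key: str) -> str:
        """Placeholder: the real protocol encrypts for one specific recipient."""
        return f"encrypted[{payload}]"


    def send_to_pod(pod: str, payload: str) -> None:
        print(f"delivering to {pod}: {payload}")


    def deliver_limited_post(contents: str, author_key: str, recipients: list[Recipient]) -> None:
        # Limited: one signed, individually encrypted copy per recipient.
        for recipient in recipients:
            payload = encrypt(sign(contents, author_key), recipient.public_key)
            send_to_pod(recipient.pod, payload)


    def deliver_public_post(contents: str, author_key: str, initial_pods: set[str]) -> None:
        # Public: one signed, plaintext payload, sent to the initial set of pods
        # (the author's contacts and the people sharing with the author).
        payload = sign(contents, author_key)
        for pod in initial_pods:
            send_to_pod(pod, payload)

The important difference is not the code but the payload: a limited post only ever exists in copies that a specific recipient can decrypt, while a public post travels as signed plaintext that any receiving server can read and store.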

Our distribution model for public posts is great, since it is not limited by the number of connections the author has. But here comes the catch: we have no idea who is going to see that post, and we have absolutely no chance of policing the distribution in any way.
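
The reshare step is what makes the reach unpredictable. Here is a hypothetical sketch of what happens on a remote pod when one of its users reshares a public post (the function and variable names are made up for illustration):

    def reshare_public_post(signed_payload: str, resharer_contact_pods: set[str]) -> None:
        # The resharer's pod forwards the very same signed, plaintext payload
        # to its own contacts' pods -- servers the author's pod may never have
        # heard of and has no way to enumerate afterwards.
        for pod in resharer_contact_pods:
            print(f"forwarding to {pod}: {signed_payload}")

Each of those pods can in turn be the origin of another reshare, so the set of servers holding a copy only ever grows, and the author’s pod never learns what that set is.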

In theory, we could add a simple switch that says “please do not show this post publicly” to a public post’s metadata, and we would meet our users’ requirement. We would probably get a bit of user love, and some people would be pretty happy, I guess.
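
Such a switch would be nothing more than an advisory field riding along with the otherwise unchanged, plaintext payload. Purely for illustration (no such field exists in the protocol), it might look like this:

    hypothetical_public_post = {
        "author": "alice@pod.example",
        "public": True,
        "text": "Hello, world!",
        # The requested "unlimited limited" switch would be just another field:
        "show_to_anonymous_viewers": False,
    }

Every pod that receives this payload still gets the full text; the flag merely asks it nicely not to show the post to outsiders.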

However, by implementing that, we would lull our users into a false sense of security. I am fairly certain a lot of people would use that “unlimited limited” share visibility to avoid being searchable on the internet, and they would feel pretty comfortable with that. However, the post would still get sent around to servers all over the world, and probably to a lot of different software packages. Our protocol is not very hard to implement, and sooner or later, there will be software that simply does not care about our pseudo-private flag. And suddenly, all those posts would be public again, visible to search engines, visible to random people looking to start harassment campaigns.
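
To spell out why the flag cannot be enforced, here is a hypothetical receiving implementation. A well-behaved server checks the advisory field; a server that does not know about it, or does not care, simply renders the plaintext it already holds:

    def render_for_anonymous_visitor(post: dict) -> str:
        # A cooperative implementation would honour the advisory flag:
        #
        #     if not post.get("show_to_anonymous_viewers", True):
        #         raise PermissionError("hidden from anonymous visitors")
        #
        # A non-cooperative (or merely unaware) one skips that check and
        # serves the contents to anyone, including search engine crawlers.
        return f"<article>{post['text']}</article>"

Nothing on the sending side can prevent the second behaviour, because the contents arrived unencrypted in the first place.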

Explicitly whitelisting recipients is the only way to have true control over your contents; implementing blacklist-like behaviour is a bit like mounting a picture on an advertising pillar and asking some people not to look at it.2 Simulating privacy by providing a false sense of control is not what diaspora* is about. It’s about having control, and about making it clear when you do not have control.

Footnotes

  1. Do not get confused, though: not everyone can interact with your posts. People blocked by you will not be able to like or comment on your posts.

  2. At least that’s my current understanding. I have yet to be proven wrong (I’d love to be wrong!), and I am looking forward to actual technical arguments and ideas on how this could be achieved.