Facebook is to invest tens of millions in “community leaders” – on the condition that the community leader uses the “Facebook family of apps and services”. We argue for reducing the role of Facebook in your community.
Facebook is currently under fire due to, among other things, the furore surrounding fake news and the co-option of the platform by terrorists. At a time when numerous critics have concluded that Facebook divides us, Mark Zuckerberg now sees the platform’s future in its “groups”. Accordingly, the company is investing in a friendlier image.
One thing the company is doing is trying to make its services safer for users, through the expansion of its “Community Safety Team” and the launch of new functionalities, including a designated area for groups’ rules of behaviour. On the face of it, this is a welcome development: it is important to reflect on how platforms are designed. Which functions serve which people? How can clever design at early stages of product development minimise the amount of objectionable content being posted?
Of course, measures that make Facebook safer to use are welcome. Still, this development leaves us feeling uneasy. By investing in “community leaders”, Facebook is meddling with society’s most essential category: its communities. It is an intervention we can do without. These are our main objections.
Not a neutral conduit
Facebook shapes interactions between you and your community, and between the members of your community. Sometimes it does this explicitly, for example by offering certain functions and not others. But it also does this invisibly, by indirectly influencing your decisions (“nudging”). In both cases, the user has little or no insight into how they are being “guided”, let alone any control over it.
Facebook controls what you do and don't discuss
Do we really want to let our freedom of expression be dependent on the terms and conditions of a multinational corporation? Because it is Facebook that decides what is acceptable and what is not. The company claims it wants to be a “friendly platform” that “brings people together”. That automatically means the prohibition of certain types of statements. In the past Facebook has, for example, considered it necessary to block cartoons of the Turkish president Recep Tayyip Erdogan – they clearly didn’t bring people together enough! YouTube, in turn, deemed it important to label LGBTQ-related content as not family-friendly.
Platforms are under increasing pressure to combat the spread of “undesirable content”, a term now being applied to everything from copyrighted material to extremist propaganda and from unpopular opinions to hate speech. If the current trend continues, we can expect to encounter censorship on platforms with increasing frequency in the future.
Around the world, communities are doing important work and taking a stand on issues they are passionate about. In doing so, they utilise the means available to them. It pays to look critically at those means. Do they work equally well for all members of the community? And where large platforms are involved, how can you prevent them from becoming the gatekeeper between the members of the community?
Invest in technology you control – and keep improving
Bits of Freedom struggles with this, too. We choose to use Facebook and Twitter, but with some restrictions: We do not profile individuals through advertisements, nor do we upload pictures of people or invite them to events. In addition, we invest in channels of communication we control, such as our newsletter and email groups. Finally, we organise gatherings offline.
We realise our communication channels work better for some people than for others. For some, they might not work at all. If we want to increase our ability to rally support for our issues, we need to find more diverse ways of engagement. Where do we start?