There has been much hype around Jack Dorsey’s latest announcement of the Bluesky project, which aims to develop an open and decentralised standard for social media. On the one hand it’s not surprising, following the uproar over a few high-profile bans on Twitter; but we think this moment represents a deeper existential conflict inside social media platforms.
Once upon a time, we were all startups in search of users — hundreds of millions of users…
Jack points to two articles as the inspiration behind Bluesky: one by Stephen Wolfram, drawn from his Senate hearing testimony on AI, and the other by Mike Masnick, arguing for protocols over platforms. Both are timely and try to create alternatives to the way social media operates today. Before we dig any deeper, though, first principles force us to ask: what problem are these efforts trying to solve? Jack says:
- They’re too centralised now: ‘for a variety of reasons, all reasonable at the time, we took a different path and increasingly centralised Twitter.’
- This centralisation makes it very difficult, for example, to enforce global policy (read: they are trying to act as a ‘global police’ while making legal and legitimate decisions in every territory. That is difficult. So, without paving the way to moral relativism, establishing a mechanism for local or separate organisations to do that within their own ‘jurisdictions’ in a decentralised structure may be feasible. Whether that can prevent further misinformation, trolling, and damage to personal, public, and institutional trust is up for debate; it would probably only limit the distribution of that sort of content.)
- Users don’t have a choice and are exposed to whatever the proprietary algorithms of these platforms serve them, which often leads to polarisation and controversy
- Technology has developed to a point where decentralisation looks feasible
What he is actually trying to say is this: all of these ‘automated content selection businesses’ were born in the heyday of the Internet, when open protocols were abundant and the number of users was the main metric of growth (at least, that’s what the venture capitalists cared about most). In the end, social is where the people are; once you gather the people and keep their attention on your content for longer and longer, advertisers will eventually come, as they need the eyeballs for exposure to their brands.
There’s a reason why YouTube’s main KPI is ‘watch time’, or Facebook’s is MAU: they need to retain their users relentlessly. So these startups preferred to build proprietary, advertising-biased algorithms and became platforms on top of those protocols, which form their business models today.
How to fit a triangle into a square?
What are the potential alternatives that Wolfram and Masnick propose to fix the issues that centralisation and integrated AI have created?
In Wolfram’s case, he focuses on how the algorithms work and proposes splitting part of the content selection between the platforms and third parties:
- either allowing users to choose among ‘final ranking providers’: the platforms do their AI magic of understanding user information and content features, but the final ranking of matched content is done by somebody else;
- or ‘constraint providers’: more complicated than the first option, as the automated content selection algorithm stays intact as a system, but third parties come in to customise the constraints (e.g. to create balance in news) before the matching is completed.
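Wolfram’s first option is easiest to picture as a pluggable interface: the platform still computes its per-item scores, but the user chooses which provider produces the final ordering. Here is a minimal sketch in Python; every name in it (the fields, the providers, the 0.6/0.4 weighting) is an illustrative assumption, not any real API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ScoredItem:
    item_id: str
    relevance: float         # the platform's AI-derived relevance score
    source_diversity: float  # 0..1: how far the source is from the user's usual diet

# The platform keeps computing features and scores, but the *final ordering*
# is delegated to a provider the user picked (all names here are hypothetical).
RankingProvider = Callable[[list[ScoredItem]], list[ScoredItem]]

def engagement_provider(items: list[ScoredItem]) -> list[ScoredItem]:
    """Default platform behaviour: rank purely by predicted relevance."""
    return sorted(items, key=lambda i: i.relevance, reverse=True)

def diversity_provider(items: list[ScoredItem]) -> list[ScoredItem]:
    """A third-party alternative: trade some relevance for source diversity."""
    return sorted(
        items,
        key=lambda i: 0.6 * i.relevance + 0.4 * i.source_diversity,
        reverse=True,
    )

def build_feed(items: list[ScoredItem], provider: RankingProvider) -> list[str]:
    """The platform assembles the feed using whichever provider the user chose."""
    return [i.item_id for i in provider(items)]
```

Swapping the provider changes the feed order while the platform keeps its scoring models to itself, which is exactly where the tension with the business model appears.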
It’s a fascinating read, so please go ahead and have a look. What you’ll realise in the end, though, is the painful need to fit these solutions into the existing business models of social media platforms. Wolfram quite frankly admits this and imagines these third parties increasing their ad revenue through, for example, better targeting (does anyone think Google and Facebook have targeting issues?).
And of course, these providers will be paid in ad commissions, and in each case the platforms will spin up separate businesses as providers themselves. If history teaches us anything, the platforms’ own providers will become the ‘default’ option for users, who won’t bother to switch to any other provider. That is, IF the platforms are willing to even consider these options.
The fact is that the content selection and ads targeting AIs of these platforms are incredibly integrated, and are by now bigger than, and intrinsic to, the platforms themselves. Just imagining the massively integrated algorithm behind all of Google Search everywhere (mobile, video, apps) is enough to drop the idea right there.
In Masnick’s case, he proposes bringing back the golden days of the ‘open Internet’, when open protocols were the standard, before companies, for ‘some reason’, began to build closed, proprietary platforms around them. The reason was, again, of course, mainly the business models of these companies and their need to tightly control their platforms from a product perspective.
Open protocols could indeed help solve some of the issues created by deep centralisation around the platforms; however, how these new ‘interface providers’ and ‘new social media companies built on the social media protocol’ will generate revenue is still unclear. Our guess: it will be either paying for new variants of the same protocol, or the same old advertising business model based on targeting.
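The ‘protocols, not platforms’ idea can be sketched the same way: many competing clients render the same open wire format, so none of them owns the data. The message shape below is an illustrative, loosely ActivityPub-flavoured assumption, not the real specification:

```python
import json

# A post in a shared, open wire format. The field names are an
# illustrative assumption, loosely modelled on ActivityPub-style JSON.
raw = json.dumps({
    "type": "Note",
    "author": "alice@example.org",
    "content": "hello, open social web",
})

def minimal_client(wire: str) -> str:
    """One hypothetical interface provider: plain-text rendering."""
    msg = json.loads(wire)
    return f"{msg['author']}: {msg['content']}"

def curated_client(wire: str) -> str:
    """Another provider: same protocol, different presentation and policy."""
    msg = json.loads(wire)
    return f"[{msg['type']}] {msg['content']} (via {msg['author']})"
```

Because the data lives at the protocol layer, neither client can lock users in; the open question remains how either of them makes money.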
Pseudo-solutions and flying elephants on the platforms
The fact that society has begun to ask for alternatives and to question how these algorithms work (with the bias they create and the polarisation they deepen) is a positive sign in itself. However, some of the proposed solutions could make the issues we have right now even worse: Wolfram’s ‘final ranking providers’, for example, could immediately lead to ever more closed echo chambers, a risk he himself points out.
Social media platforms need to have deeper, honest conversations about their ad-based business models and the super-integrated AIs they have created. Science and engineering can solve the issues around the algorithms and how to balance them so as not to drive the public crazy down wormholes, but they can’t fix the problem of having to keep users engaged on these platforms in order to serve them more ads.
You simply can’t have both.