Meta's introduction of paid verification may ultimately be a clever move, anticipating the cost of policing content on its platforms under proposed legislation.
It may seem counterintuitive that while YouTube and TikTok appear to be going out of their way to court creators, Instagram and Facebook are levying fines in the form of subscriptions.
YouTube pays out around $10 billion a year to creators, while TikTok has just introduced the Creativity Program Beta - its Creator Fund 2.0. Meta Verified, on the other hand, charges creators for access to better authentication and security, along with the promise of increased visibility and reach on the platforms.
But perhaps Mark Zuckerberg has seen which way the wind is blowing and has totted up the cost. The Meta CEO has dubbed 2023 the ‘year of efficiency’. In a comment on his blog post announcing the test launch of Meta Verified, Zuckerberg explained, “Verifying government IDs and providing direct access to customer support for millions or billions of people costs a significant amount of money. Subscription fees will cover this.”
So why charge now? Well, apart from recurring revenues smoothing the P&L in a way that cyclical advertising revenue cannot, proposed legislation imposes new obligations on big tech to police the content on their platforms.
The UK’s Online Safety Bill has begun its final stages in the House of Commons. The EU’s Digital Services Act will be directly applicable across the EU from 17 February 2024. Regulators around the world are watching these bills closely with the intention of tailoring them to their own jurisdictions’ needs.
The Competition and Markets Authority, too, has issued guidance to social media platforms on how they should deal with hidden advertising in creator content. Platforms are advised to “take appropriate, proportionate, proactive steps and use available technology to prevent hidden advertising from appearing on their site” and to promptly remove content which has been confirmed as hidden advertising.
If applied, this guidance will open platforms up to a deluge of complaints from creators dissatisfied that their content has been removed. Platform support and moderation will require both tact and better funding - especially at a time when Meta has made 11,000 employees redundant and announced a further 10,000 redundancies this year.