As the U.S. emerges from perhaps the most divisive election in living memory, Facebook Inc.’s role in perpetuating social fragmentation should come back into focus. The episode has resurrected a question that has long puzzled me: Why does Chairman and Chief Executive Officer Mark Zuckerberg seem to care so much about his shareholders?
He has built a business behemoth, is rich surely beyond his wildest dreams, and his control of the company means there’s almost nothing investors can do to replace him. And yet time and again, when presented with a choice between making Facebook (and society) a gentler, better place and fueling user growth and engagement, he seems to opt for the latter.
In doing so he is essentially prioritizing shareholder value. “But that’s his fiduciary duty!” I hear the Milton Friedman purists among you holler. But is it?
Facebook’s dual-class share structure gives Zuckerberg control of the voting stock even though he owns a minority of the company’s economic interest. The advantage of such an arrangement is supposed to be that it lets a founder-CEO make decisions that are in the long-term interest of the company even if they destroy some short-term value. Investors knew that was the deal when they bought the stock, and if they don’t like it, they can take their money elsewhere.
This should mean that Zuckerberg needn’t worry too much about near-term business headwinds. Yet a slew of reports this year has suggested that the company continues to pursue outsize profits at the expense of the broader social good.
The Wall Street Journal, New York Times, Washington Post, Verge, BuzzFeed News and plenty more have recounted how Facebook’s own research showed its platform was deepening social divisions and spreading misinformation — and still the company tempered or scrapped tools that would have helped, because deploying them would also have reduced user engagement.
It is fully within Zuckerberg’s power to decide that such metrics are a secondary or tertiary consideration. Perhaps he feels burned by the experience of 2018, when he announced that investments in content moderation and security were starting to “significantly impact” profitability. The stock lost 43% of its value over the subsequent five months.
It has since recovered, however, and risen far beyond its peak from that year. Revenue and profit have continued to grow apace, meaning that, over the longer term, investing in moderators was the right choice, even if there’s far more to do. The lesson to be taken is this: Cutting down on toxicity is not bad for business. Nor is it about censoring partisan content. Rather, it’s making a conscious decision to improve the newsfeed presented to users: less bilious, more beneficent.
What’s more, companies now broadly acknowledge that investors are not their only stakeholders. In 2019 the Business Roundtable, a group of major U.S. companies, agreed on a new “Purpose of a Corporation,” which included delivering value to customers and supporting the communities in which they work. Zuckerberg was not one of the 181 CEOs who signed it. He may want to rethink that.
Irrespective of his controlling stake, Zuckerberg also has little to fear in terms of shareholder retaliation in the courts. The business judgment rule — a strand of corporations law that presumes an executive is motivated to act in the best interest of the company — means that, short of him burning piles of investor cash or blatantly self-dealing, a court is unlikely to hold him accountable for a corporate strategy that investors dislike.
Remarkably, BP Plc may be a model for the sort of reset that’s needed at Facebook. For sure, the oil major is facing tougher headwinds, with revenue declining and a loss expected this year. But since his appointment in February, CEO Bernard Looney has started a pivot toward renewable energy sources. The market has punished him with a 45% stock decline, but he has otherwise gained plaudits for making tough, but good, strategic choices.
Facebook already has the necessary tools to stop accentuating partisanship and to cut down on misinformation. Just this week, the company said it would remove posts that contain debunked claims about Covid-19 vaccines. That’s a welcome change for a platform whose recommendation algorithms were at one stage pushing new mothers toward anti-vaxxing conspiracy groups, a classic example of the way its systems can not just facilitate but exacerbate division.
As Facebook tries to ingratiate itself with the incoming Joe Biden administration, Zuckerberg & Co. would do well to underpin the effort with more actions that support the rhetoric. Zuckerberg has been justifiably criticized for his concentrated power at the helm of the company. But that power also gives him more scope to make decisions that aren’t driven purely by short-term financial imperatives. Doing so would be less painful than he might imagine.