Google: It’s hard to believe Google or Facebook could learn something from Wall Street. But the Internet's two giant success stories may want to consider the way banks set up internal committees to balance short-run temptations against long-term risks. Tapping into users’ information promises lots of advertising dollars. But going too far, as Google’s Buzz did, can trigger privacy concerns and send customers fleeing.
Buzz integrates social networking features into Gmail, Google's popular email service. Among its features was a nifty tool that compiled a list of the contacts a user emailed or chatted with most often, then made that list public. For customers whose top contact was a mistress, shrink, or defense attorney, that's hardly helpful. Google quickly made changes to Buzz, but the firm's reputation for keeping customer data private rightly took a dent. Facebook has had similar problems.
While both firms are sensitive to users' privacy concerns today, the worry is that they won't be in the future. Sharper customer-targeting technology, combined with rising ad revenues, could tempt them to relax barriers against sharing customer information. Giving in to that temptation would be a mistake: it would risk inviting new rivals offering similar services and applications with far greater discretion.
A good bulwark would be to establish bank-like conflict committees that can kibosh plans that look lucrative in the short term but could put the company at greater risk down the road. Within a bank, this ideally works by having a committee of senior executives adjudicate whether a deal, say a loan dripping with fees, could wind up imploding later, or whether financing a particular company could hurt the firm's reputation among its wider client base.
While it's true that Wall Street's conflict committees failed on numerous occasions, they offer a model of sorts that companies like Google and Facebook, which trade in other people's private information, could build upon. First, simply by asking pointed questions, a committee's members would force application developers to think more carefully about what users want. Second, since no developer wants a product sent back for revision, it would encourage privacy safeguards to be built in from the start.
These sorts of safeguards aren’t only “not evil” — they are also good business.