When it was founded two decades ago, Google established an unusual corporate practice. Nearly all of its internal documents were widely available for workers to review. A programmer working on Google search could, for instance, dip into the software scaffolding of Google Maps to crib an elegant block of code to fix a bug or replicate a feature. Employees also had access to notes taken during brainstorming sessions, candid project evaluations, computer design documents, and strategic business plans. (The openness doesn’t apply to sensitive data such as user information.)
The idea came from open-source software development, in which the broader programming community collaborates on code by making it freely available for anyone to alter and improve. The philosophy came with technical advantages. “That interconnected way of working is an integral part of what got Google to where it is now,” said John Spong, a software engineer who worked at Google until this July.
A culture of transparency
Google has flaunted its openness as a recruiting tool and public relations tactic as recently as 2015. “As for transparency, it’s part of everything we do,” Laszlo Bock, then the head of Google’s human resources, said in an interview that year. He cited the immediate access staff have to software documentation, and said employees “have an obligation to make their voices heard.”
Google’s open systems also proved valuable for activists within the company, who have combed them for evidence of controversial product development and then circulated their findings among colleagues. Such investigations have been integral to campaigns against the company’s projects for the Pentagon and in China. Some people involved in this research refer to it as “internal journalism.”
Management would describe it differently. In November, Google fired four engineers who it said had been carrying out “systematic searches for other employees’ materials and work. This includes searching for, accessing, and distributing business information outside the scope of their jobs.” The engineers said they were active in an internal campaign against Google’s work with U.S. Customs and Border Protection, and denied violating the company’s data security policies.
Rebecca Rivers, one of the fired employees, said she initially logged into Google’s intranet, a web portal open to all staff, and typed in the terms “CBP” and “GCP,” shorthand for Google Cloud Platform. “That’s how simple it was,” she said. “Anyone could have stumbled onto it easily.”
In an internal email describing the firings, Google accused one employee of tracking a colleague’s calendar without permission, gathering information about both personal and professional appointments in a way that made the targeted employee feel uncomfortable. Laurence Berland, another of the fired employees, acknowledged he had accessed internal calendars but said they were not private; he used them to confirm his suspicions that the company was censoring activist employees. Berland, who first joined Google in 2005, added that he felt the company was punishing him for breaking a rule that didn’t exist at the time of the alleged violations.
Secrecy change
Google declined to identify the four employees it fired, but a company spokeswoman said the person who tracked calendars had accessed information they were not authorized to view.
Other employees say they are now afraid to click on certain documents from other teams or departments because they are worried they could later be disciplined for doing so, a fear the company says is unfounded. Some workers have interpreted the policies as an attempt to stifle criticism of particular projects, which they allege amounts to a violation of the company’s code of conduct. These employees point to a clause in the code that actively encourages dissent: “Don’t be evil, and if you see something that you think isn’t right—speak up!” Workers are “trying to report internally on problematic situations, and in some cases are not being allowed to make that information useful and accessible,” said Hahne. There is now a “climate of fear” inside Google offices, he said.
Google’s permissive workplace culture came to exemplify Silicon Valley’s brand of employment. But transparency is hardly universal. Apple Inc. and Amazon.com Inc. demand that workers operate in rigid silos to keep the details of sensitive projects from leaking to competitors. Engineers building a phone’s camera may have no idea what the people building its operating system are doing, and vice versa. Similar restrictions are common at government contractors and other companies working with clients who demand discretion.
The specifics of Google’s business operations traditionally haven’t required this level of secrecy, but that is changing. Google’s cloud business in particular requires it to convince business clients that it can handle sensitive data and work on projects discreetly, which has brought it more in line with its secrecy-minded competitors. The protests themselves have also inspired new restrictions, as executives have looked to cut off the tools of activists the company argues are operating in bad faith.
Google’s leaders have acknowledged the delicacy of adjusting a culture that has entrenched itself over two decades. “Employees today are much, much more active in the governance of the company,” Eric Schmidt, Google’s former CEO and chair, said at an event at Stanford University in October.
Amy Edmondson, a professor of leadership and management at Harvard Business School, said that Google’s idealistic history increases the burden on its executives to bring along reluctant employees as it adopts more conventional corporate practices. “It’s just really important that if you’re going to do something that is perceived as change that you’re going to explain it,” she said.
Bock, the company’s former HR director who is now CEO of Humu, a workplace software startup, suggested that Google hasn’t succeeded here. “Maybe Alphabet is just a different company than it used to be,” he wrote in an email to Bloomberg News. “But not everyone’s gotten the memo.”
(With assistance from Josh Eidelson.)