A self-described “technology optimist,” Roger McNamee was one of the most influential investors in Silicon Valley, mentoring many of the Valley’s hottest young leaders, including a very young Mark Zuckerberg in the early, critical years of Facebook’s development (including introducing him to Sheryl Sandberg).
He found “Zuck” brilliant, driven—and more than a little quirky. What he didn’t pick up on, McNamee writes in Zucked: Waking Up To The Facebook Catastrophe, his richly detailed assault on the company and the people who built it, was how parts of Zuckerberg’s character—specifically, what McNamee describes as a lack of empathy and world-class hubris—might eventually get baked into every facet of Facebook, and Silicon Valley, from culture to code.
The result, he says, is a host of social ills, including technology addiction, assaults on democracy by its adversaries and a rending of the social fabric across the globe. The book is a must-read primer for the digital age, packed with regulatory prescriptions to rein in the data age’s worst practices and a hopeful vision for using—rather than being used by—emerging technologies like artificial intelligence.
Corporate Board Member asked McNamee what directors, investors and corporate leaders should take away from what he’s learned. What follows is edited for length and clarity:
CBM: There’s not a lot about the Facebook board in here. Why not?
McNamee: My focus in writing the book was to share the narrative of my discovery of what was wrong. Keep in mind, I spent 34 years as a tech investor and a technology optimist. I was a huge fan of Facebook because at one time, early in Facebook’s life, I was an advisor to Mark Zuckerberg and Sheryl Sandberg. I was a cheerleader, and all of a sudden I started to notice, in the context of the 2016 election and other issues, bad actors exploiting the architecture and business model of Facebook to harm innocent people.
It began with election issues early in the year, then moved on to civil rights issues and things in housing. I reached out to Mark and Sheryl in the fall of 2016. It was to warn my friends of something where I thought they were the victims. As I learned more and more about it, I discovered that, in fact, yes, there were bad actors leveraging the platform, but that Facebook had made choices that made that inevitable.
Worse, when faced with evidence of that problem, Facebook initially denied it, then deflected it, and then dissembled. So I look at what’s going on, and it’s super important to ask the question of the board of directors, “What is the right role of the board in providing guidance to young management teams when they’re faced with issues they’ve never seen before?”
I think this issue applies equally at both Google and Facebook. I was perplexed when the chairman of Alphabet was on a book tour about leadership at a time when his CEO was refusing to appear in front of a hearing of Congress. That shocked me because Google would have looked relatively good in that hearing. I don’t see any way a CEO of any other industry would have failed to appear. Yet, in this industry, the CEOs of Google and Facebook believe that they should only appear when they want to.
That just strikes me as symbolic of a failure of judgment, and of just why it’s so important that the boards step up and remind the relatively young executives of these companies that they do have responsibility here—that if their products are causing or enabling harm, they should get ahead of that and make the substantive changes to the business model that will be necessary to fix the problems.
How much of this is a downstream symptom of what’s become the standard in Silicon Valley: dual-class shares? They’ve led to an era of imperial CEOs and a concentration of power at the top, which you describe so well in the book.
I couldn’t agree more. I believe that the notion of having a separate class of stock that gives the founder control for life is a test that we’ve now run, and it has failed decisively. It’s now time to prohibit that everywhere. It didn’t have to fail, but it has been abused. I fear that the ineffectiveness of the boards of these companies is the direct result of the fact that they have no real power. They only have the power to persuade.
What I worry about is that they do not have an independent view of the problems I’m describing—that they get all their information from the very executives whose behavior is in question. It seems to me that, for boards of all companies at this time, it’s really important to have access to independent perspectives whenever an issue comes up. I don’t see that at Facebook and Google, and on the boards I’ve served on, you know, it was harder to get than I would have liked.