Over the last couple of weeks, two big tech companies in Silicon Valley – Uber and Facebook – have faced major crises. Here’s a sampling of headlines from major news organizations about these crises.

On Uber’s self-driving car accident,

The Washington Post:

Self-driving Uber vehicle strikes and kills pedestrian.

The New York Times:

Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam.

Fox News was less judge, jury, and executioner, instead attributing the statement to the police:

Self-driving Uber car kills Arizona pedestrian, police say.

Uber’s self-driving car | Image Source: Uber.com, ATG

And on Facebook’s Cambridge Analytica “data breach” controversy,

The Washington Post:

Everything you need to know about the Cambridge Analytica-Facebook debacle.

The New York Times:

Facebook’s Role in Data Misuse Sets Off Storms on Two Continents.

The Guardian:

Facebook’s week of shame: the Cambridge Analytica fallout.

The Economist:

The Facebook scandal could change politics as well as the internet.

The Economist’s Cover on Facebook | Image Source: (Ironic, but) The Economist’s Facebook Page

In both cases, if you don’t read past the fear-mongering headlines (kills pedestrian, week of shame, data misuse, scandal), you would bet that both these corporations are pure evil. However, on closer examination, it becomes clear that the crux of the Uber story is this: on a not-so-well-lit, multi-lane corridor where cars travel at 45 mph, a pedestrian crossing the road outside the crosswalk (walking her bicycle as she crossed) was fatally struck by an Uber self-driving vehicle with an in-vehicle operator (meant to take over the self-driving car during emergencies).

The Facebook story, on the other hand, is a bit more complicated: in 2010, Facebook, pursuing a platform strategy, provided access to its graph data to any developer willing to build software applications on top of Facebook (think Candy Crush!).1 Not only did Facebook provide developers access to this graph data, it also announced the access publicly at its F8 conference, and several news organizations (many of whom are now gunning for Mark Zuckerberg and Facebook) reported exactly what data was being shared and why. To access this data, a developer had to seek permission from only one user, and the developer would then get access to all of that user’s friends’ info as well. This was originally intended to make your website experience better. For example, if a bookseller wanted to let you know that a Facebook friend of yours likes a book, they had to know (1) who your Facebook friends are and (2) what books they like, and you had to give them access to find out (since your friends had already “told you” via their profiles what books they like).

A question we should all ask is – who owns the data of mine that my friends have access to (my phone number, my birthday, my likes, my dislikes), and can my friends share it with others, for money or otherwise? Your answer to this question will determine whether you think there was an actual data scandal in this whole Facebook-Kogan-Cambridge Analytica episode. This post provides a well-reasoned, logical view of why this scandal is no scandal at all. Although Facebook prohibited developers from passing along or selling this information, there was and is no way to enforce that once the data leaves Facebook’s servers. I am unsure why Facebook users are alarmed now, in 2018, by the fact that, since 2010, developers (1) had access to your information if your friends permitted them to get it, and (2) could easily share this information with others despite Facebook prohibiting them from doing so. Dick Costolo’s tweet perfectly sums this up.
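
To make the mechanics concrete, here is a minimal, illustrative sketch of that “one user consents, all their friends are exposed” flow, written in Python. The endpoint paths and response shapes are approximations of the pre-2015 Graph API from memory, not a verbatim record of Facebook’s API, and this style of friend-data access was removed in later API versions; treat it as a conceptual sketch only.

```python
import requests

GRAPH = "https://graph.facebook.com"  # Graph API root (v1.0-era, pre-2015)


def fetch_friends_likes(user_token: str) -> dict:
    """Illustrative sketch: with a single consenting user's access token
    (granted the old friends_likes-style permission), an app could list
    that user's friends and then read each friend's likes -- without any
    consent from the friends themselves. Paths/fields are approximate."""
    # Step 1: enumerate the consenting user's friends.
    friends = requests.get(
        f"{GRAPH}/me/friends",
        params={"access_token": user_token},
    ).json().get("data", [])

    # Step 2: pull each friend's likes using the same token.
    likes_by_friend = {}
    for friend in friends:
        resp = requests.get(
            f"{GRAPH}/{friend['id']}/likes",
            params={"access_token": user_token},
        ).json()
        likes_by_friend[friend.get("name", friend["id"])] = [
            page["name"] for page in resp.get("data", [])
        ]
    return likes_by_friend
```

The point of the sketch is the asymmetry it illustrates: one user clicks “OK” once, and data about many people who never interacted with the app leaves Facebook’s servers, after which Facebook’s “don’t pass it along” policy is unenforceable in practice.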

Long story short, a researcher (Prof. Kogan) developed an app that asked users to share their info, who their friends are, and what their interests are, and then passed this data along to Cambridge Analytica, which subsequently used it to influence (some say manipulate) the US Presidential election.2 One thing is very clear: every Facebook user (yours truly included) accepted the terms and conditions of using Facebook (who reads them anyway?), which clearly stated that Facebook stores info about my interests, my likes, my posts, who my friends are, etc., and that this info can be provided to third parties.

Now, make no mistake, whether giving developers access to its data graph was a sound strategy for Facebook is a completely different question. Posts from James Allworth and Ben Thompson are must-reads on Facebook’s platform strategy. Facebook wanted to be a platform, and in order to attract developers to build on top of Facebook, it shared its graph data with them. It’s not a new strategy; to get developers on board, some companies throw money at them (hello there, Microsoft!) and some, like Facebook, share(d) access to their data.

By now, it should be plainly obvious that I believe Uber and Facebook have gotten the short end of the stick from traditional media. In Uber’s case, a more truthful headline would have read “Pedestrian on poorly lit street hit by self-driving Uber” or “Pedestrian crossing outside crosswalk hit by self-driving Uber vehicle.”3 And in Facebook’s case, a more neutral headline would have been “Facebook developer partner sells Facebook graph data” or perhaps even “Facebook’s decision to open graph data to developers backfires”. One thing that’s clear from the contrast between my made-up headlines and the actual ones is that people are warier about tech than ever before, so one of the easiest ways to get clicks and views is through click-bait headlines about big bad tech and polarizing political news. Clearly, my made-up headlines above are too bland to garner enough clicks and views, and wouldn’t have made the cut at any major news organization.

I digress. The point of this post isn’t to debate whether Facebook erred in sharing data with Prof. Kogan or Cambridge Analytica. It is to highlight the difference between how Uber’s CEO, Dara Khosrowshahi, and Facebook’s CEO, Mark Zuckerberg, managed their PR nightmares.

Uber could very well have argued that the pedestrian crossed outside of the crosswalk, or that the street wasn’t well lit, or that the operator wasn’t paying attention, or pointed to any of the numerous externalities that may have caused this incident. Yet their CEO, Dara Khosrowshahi, and Uber’s communications team did not blame anyone else; they issued statements expressing sympathy for the victim and preemptively grounded all their self-driving cars, without waiting for a trial by the media and Twitterati, and well before any government regulation forced them to take these cars off the street. I was initially skeptical when Dara Khosrowshahi took over from a product-focused leader like Travis, but so far he’s shown that he’s a skilled diplomat: not only is he a master deal maker (more on that in another post), he also knows exactly when to be combative and when to just lay low.

On the other hand, after the so-called Facebook-Cambridge Analytica scandal, Mark Zuckerberg did not show up for several days. And when he finally did, he went on a PR blitz through interviews with all the major news organizations and newspaper ads (which did include an apology). Yet Facebook continued to play the combative blame game; for example, in their first post about this episode they blamed Prof. Kogan and Cambridge Analytica:

Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent.

Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules.

Cambridge Analytica, Kogan and Wylie all certified to us that they destroyed the data. Contrary to the certifications we were given, not all data was deleted.

They opened the next post by playing the victim:

What happened with Cambridge Analytica was a breach of Facebook’s trust.

After news reports of this episode explained how people could check what data Facebook had about them, people started downloading their data and noticed it included their likes, their messages to people, and, on Android devices, logs of their calls and texts (but not the actual content of those calls or texts), prompting Facebook to post one more update:

You may have seen some recent reports that Facebook has been logging people’s call and SMS (text) history without their permission. This is not the case.

In each update, they just made things worse by blaming others (even if that may be the truth). Exactly the opposite of what Diplomat Dara did: he did not say Uber was to blame, but he also didn’t lay the blame on the pedestrian or anyone else.

M. G. Siegler puts it best when he calls Facebook the “Foot-in-Mouthbook”.

To make matters worse, Facebook clearly has a bad case of foot-in-mouth disease. Whenever they try to respond to a situation, they just exacerbate the issue.

As an engineer and tech worker, I’ve always wondered why some people think with their hearts and not with their heads. Why can’t they see the logic that a car driven by a human would likely also have hit the pedestrian in the Uber accident, or that the number of miles driven without an accident is several times higher for automated cars? Why can’t they see that if nearly 300,000 people clicked “Ok, grant permissions” on Prof. Kogan’s app and shared their information and their friends’ basic information with the app developer, it’s not Facebook’s fault? Yet sometimes, even when you’re not in the wrong, even when you wonder why others can’t be logical and see that someone else is at fault, it takes a diplomat to skillfully manage public opinion by not blaming others and not fighting the public mood.

When faced with a crisis that you know may not be your fault, it takes both skill and tact to stop yourself from fighting the criticism with logical reasoning and to simply empathize with those most affected by the incident. That’s exactly what Dara Khosrowshahi has managed, and for that, Mr. Khosrowshahi, hats off to you!4


  1. This graph data is Facebook’s competitive advantage over Google’s or Amazon’s data. Google and Amazon primarily know only about you and your searches or your product purchases. Facebook, on the other hand, knows not only about you but also about your relationships with other users who also like or dislike something. For example, through its graph data, Facebook can determine that you don’t usually watch action movies, but when you do, you only watch them with your buddies from high school (Facebook thanks you for checking in with your 5 high school buddies at all those action movies last year). So what about this data? Well, if I am selling action movies, I can now not only target regular action-movie goers, but also influence you through the exact 5 high school buddies who are most likely to get you to watch something. Ever seen those ads on Facebook which say a particular friend likes a page?

  2. In my opinion, though, their influence or manipulation is still up for debate, given how miserably they failed to help their first client, Senator Ted Cruz.

  3. Death is tragic and this made-up headline is by no means trying to downplay the tragic nature of the incident.

  4. I’m not suggesting Uber is not at fault at all here. I’m only suggesting that even though there are other factors Uber could blame, they have chosen not to do so.

    Also, the two crises are fundamentally different in nature, so they don’t make for an ideal comparison. But I make the comparison only to suggest that Facebook and Mark Zuckerberg could learn a thing or two from Uber’s new CEO about diplomacy and laying low during an onslaught.