Broken Facebook needs to grow up
The New European
The data scandal embroiling the social network perfectly illustrates the dangers of naïvety, arrogance and evangelism to which all the tech giants are vulnerable, says former industry insider GEOFF SUTTON
Mark Zuckerberg is clearly a smart guy. He got over the decision-making challenge of what to wear by donning precisely the same style of grey T-shirt every single day.
His reasoning is simple, somewhat reassuring to his supporters and yet breathtakingly arrogant: 'I really want to clear my life to make it so that I have to make as few decisions as possible about anything except how to best serve this community... I'm in this really lucky position, where I get to wake up every day and help serve more than a billion people. And I feel like I'm not doing my job if I spend any of my energy on things that are silly or frivolous about my life.'
He said this in 2014 when Facebook as a company was arguably at its peak: massive growth in audience and revenue numbers, especially on mobile phones; ubiquitous; popular; a largely uncritical public and media perception; everyone clamouring to be a friend of The Zuck.
But what he said about his one-style-fits-all clothing and need to serve a grateful billion people reveals a lot about how Silicon Valley's giant companies operate, with a dangerous mixture of idealism, naivety, arrogance and leadership that borders on dictatorship. It is this that has created just the conditions for the hot water in which Facebook now find themselves, amid a swirl of allegations over the harvesting and use of personal data from its users by political consultants Cambridge Analytica – and whether it was used to influence the outcome of the US presidential election or the Brexit referendum.
Both Cambridge Analytica and Facebook deny any wrongdoing, but with the social network's share price on the slide and Zuckerberg facing calls to testify before the US Congress about how his firm will protect users, this does seem to be a moment of reckoning for the company, and a warning for the rest of the 'Big Five' tech firms – Apple, Alphabet, Microsoft and Amazon. Their weaknesses have long been apparent, even if they are not always clear to all those working on the inside. I worked at Microsoft for 18 largely happy years, from 1996 to 2014, and have deep knowledge of how seemingly brilliantly-run companies can walk into the type of nightmare scenario now unfolding for Facebook from the Cambridge Analytica allegations.
Facebook's 2014 uncritical peak highlights the most dangerous moment for that handful of giant technology platform companies, who have such a massive influence on everything that happens in the western world (China has its own versions of these firms), when optimism is unchecked, and self-questioning is overlooked. When Zuckerberg talked complacently about 'serving' more than one billion people, there is little indication he understood how the platform he had created – where all those people share data and information – would spew up the sort of challenges that are now bearing down on his company like an army of Irish front row forwards.
Facebook, Google, Microsoft, Apple and Amazon are great companies. They have changed the world, mostly in a good way. But there are common characteristics in their make-up which mean that there are times when they are inevitably blind to what is really going on out in a cyberworld that is largely unpoliced and populated by very many bad guys.
It is up to the authorities to conclude whether Cambridge Analytica are among the 'bad guys', but what is already clear is that their problems are also Facebook's problems. The Californian firm is finding out what happens when it remains blind to – or uninterested in – what else is going on out there.
The Silicon Valley giants have each been led by undoubted geniuses – Zuckerberg, Sergey Brin and Larry Page, Bill Gates, Steve Jobs and Jeff Bezos – men who retained an iron rule and microscopic control over their companies even as they became too large for one or two people to understand all the details of what could go wrong. The fact that Zuckerberg's personal 2018 resolution is to 'fix' Facebook's problems demonstrates that the company allowed too many issues to go unchecked – but that he still sees himself as the individual to resolve them.
Until they are forced to grow up, all of the West Coast mega-platforms are undermined by an inherent culture of wide-eyed idealism that fails to see the dangers hidden in the very products they are successfully creating.
I have a number of very good friends who work for Facebook. Some previously worked at Microsoft, Google and other big tech firms. Not surprisingly there is a merry-go-round of job-swapping around those firms and a recruitment war to bring the best ones on board from their rivals. All of them have an evangelical zeal at the heart of their culture that ensures employees are totally committed to the cause. They believe.
Facebook's mission statement, until 2017, was 'to give people the power to share and make the world more open and connected'. It is now to 'give people the power to build community and bring the world closer together'. They are classic platform mission statements, supported wholeheartedly and uncritically by everyone who works there.
But the original left the company vulnerable. If part of Zuckerberg's goal is to make the world open and connected, if he and his software engineers build a platform that can do just that, then there should be no surprise that politicians, criminals, intelligence agencies, the military, marketers and many others will want to use that openness for their own agenda. Even now that some of those darker consequences are becoming clearer, the evangelical belief of the Facebook zealots is unshakable. The idea that being open and connected is the right thing – end of story – is bought into totally at the company's Menlo Park campus, in California, and its offices around the world. And it leads to an arrogance that is inherent in Zuckerberg's belief that he wakes every morning to serve a billion people – and spreads around the company.
Two years ago, prior to Brexit and the election of Donald Trump, I spent quite a lot of time discussing with Facebook people and other observers how the firm should work with media partners, and the dangers of the company's laid back, tunnel-visioned 'dive into our platform, the water's warm' approach. Emily Bell, the former Guardian journalist and now an academic at Columbia University's Graduate School of Journalism, was presciently warning that 'Facebook is eating the world', but the people inside the company didn't care and were utterly unworried as they marched forward into a world of 'fake news' and 'echo chambers'. The revenue was rolling in, with more than 80% of mobile ad revenue going to Facebook and Google, so why worry?
One reason for this apparent nonchalance, apart from the naivety and arrogance, is the way Facebook likes to self-identify as just a 'platform'. This is what every tech firm wants to be. It means your software is the basis for other companies and consumers to come and create whatever they want. The platform companies aim to lock you in to their platform, so that you use their services and no-one else's. It is why they all have their own operating system, browser, communication services, music and content services and hardware such as phones and Alexa-type devices. Every little thing that you do on that company's platform locks you in a bit more. It is why most people are either Apple iPhone users or Google Android owners: once you are in, it is very hard to get out. One of the problems with the 'we are just a platform' argument is that it allows companies to absolve themselves of responsibility for the bad stuff that is going on.
A newspaper is not a platform. It publishes what its editors believe are the right things to publish for their audience. Those editors are legally accountable for that content. The tech platforms are desperately trying to argue that they are not publishers but platforms; that what other people publish or do on their platform is nothing to do with them. It is exactly this attitude that allowed 'fake news' to establish itself.
It is like saying that if you are a phone company and two crooks discuss a bank raid over the phone, that is nothing to do with you. We had a classic and difficult platform issue in the early 2000s at Microsoft's MSN. This was relatively early in the history of the internet and we had chatroom technology that was fun, interesting, a bit scary and mostly unpoliced. In the UK, we started to get a lot of heat about how paedophiles were going into these chatrooms to groom children.
The official line was that we were a platform, and that the problem was nothing to do with us. The bosses in the US were not very interested. They didn't see it as a big issue... 'We are just the platform'.
I am glad to say that we took a stand with our leadership in Redmond, Washington State, and got them to agree that we could close down those chatrooms in 2003. We made the right decision, recognising that we were not just a platform and had a responsibility to do the right thing. I don't see Facebook yet fully recognising that they have to take responsibility for their own platform.
That internal Microsoft dispute over chatrooms also indicates another of the challenges for these global tech giants: those that arise when the technology is built centrally and distributed to local teams. It is an economically brilliant model: essentially, one size fits all around the globe. Just see how Amazon's global platform is essentially destroying high streets, shopping malls and retail chains across continents.
At MSN, we always had local editorial teams who had full oversight as to what was being published in each market: a local versus global balance that meant that we could ensure local sensibilities were taken into account. Facebook doesn't appear to have this to the same extent. They have local sales and marketing teams, but in other areas the company can appear very clumsy. One high profile example came in 2016, when Facebook demonstrated a lack of local understanding in its mishandling of the publication, in Norway, of the iconic 1972 photograph showing a naked Vietnamese girl running along a road after a napalm attack. Facebook initially removed the image, saying it violated its community standards barring child nudity. The incident even led to the Norwegian prime minister having a post deleted.
Such an impersonal intervention also hints at another problem for Facebook and others – the subjugating of common sense to the power of technology. The top software engineers are not only brilliant people; they also believe utterly in technology and the power that it brings. They believe that Artificial Intelligence will bring only good. They may say that they are concerned at the effects that AI will bring, but they will push on regardless. I had a brilliant genius of a boss at Microsoft who believed that we would be able to replace all journalists and editors with algorithms in due course. He has moved on now, but the process is continuing.
If tech firms involve humans in their processes they not only create expenses, but also throw up other problems for the companies that it would be simpler to avoid. Such sites use technology to identify naked skin in pictures. At Microsoft, back in the day, we had a team in the Philippines to then check if the image was decent or not. The law and regulations operated on the basis of 'notice and take down': if you were alerted to bad content, then you should take it down; if you didn't notice, then it wasn't your issue. These are classic platform rules. Once again, we had issues with potential paedophile content. The lawyers told us that we couldn't mount our own campaign to 'notice' the content actively. If we did, then we could be liable. We went covert. We found bad content. We got rid of it. We alerted authorities when we needed to. We didn't tell anyone what we were doing. It was common sense. And the technology was not in charge.
Another weakness that these companies have is their very size. One of the fascinations of working for the Big Five is that the job you do has an impact on millions and millions of people. The attractions of working at these places are well documented and all true: you are well compensated in terms of salary and stock grants; the kitchens have free food, Diet Coke, fruit and chocolate; many have gyms and fitness areas; it is fine to work from home regularly. But it is also seductive to be working on products that are used by millions of people. I remember introducing my then 14-year-old son to a mate from Facebook. He was instantly a total fanboy. But for those lucky men and women who fill those jobs there is a danger they get so wrapped up in the kudos of their role, making more and more money, hitting their key performance indicators and growing around the world, that they forget how it can go wrong. And when it goes wrong, it affects so many people: it can change governments, force companies out of business, result in young children being abused online. Not enough people in these organisations are focused on ensuring the bad stuff doesn't happen.
Many people will have read Dave Eggers' The Circle or seen the movie, starring Emma Watson and Tom Hanks. It shows the platform company as a cult, where the core is rotten and the staff are believers. It looked very far-fetched but really it isn't. I don't believe the core of Facebook, or Microsoft or the others is rotten, although I never bought Google's 'don't be evil' claim. But as the Cambridge Analytica scandal shows – and the fake news problems of the past few years have exposed – there is a vulnerability at the heart of these firms that they seem unable to fix themselves, because they are simply unable to see it.
Microsoft changed radically when faced with break-up action from the Department of Justice and the EU regulators, around the turn of the millennium. It is hard to describe just how much of a wake-up call that was for everyone in the organisation everywhere. It was a near-death experience that, I believe, made it a much better and more responsible company.
It feels like Facebook is now edging closer to that scenario itself. Maybe Zuckerberg will eschew his grey T-shirt and dig out his old 2009 tie when he faces the Commons Select Committee rather than sending along his minions.
Geoff Sutton is a journalist, digital media veteran and chairman of Virtual Reality company Spinview