Tom Fletcher has been a diplomat (he was the youngest senior British ambassador for 200 years), a foreign policy adviser (to three UK Prime Ministers) and an academic — he is currently Visiting Professor at New York University and becomes Principal of Hertford College, Oxford, in September. He is also a member of the Global Tech Panel, created to encourage a new collaborative approach between diplomacy and technology.
He is uniquely placed to understand the impact of digital technology on a global scale, the way it is changing power and what the implications of this rapid evolution are for international diplomacy. Four years ago he put his thoughts on paper in ‘The Naked Diplomat,’ a book that went on to become a best seller.
The premise of ‘The Naked Diplomat’ is that diplomacy and international relations are more important than ever as the world struggles to plot a future shaped by digital technology. Yet, as he points out in this interview, we have arrived at a critical point in human history, one where several of the major international powers seem to be pulling in different directions.
In this interview Tom also discusses the role that the UN can play and how future stability could depend on closer ties between governments, global organisations and representatives of the big tech companies. As he says, “if you had a conference to deal with the challenges and opportunities of AI, you couldn’t just have the five big countries in the room. The tech companies would — rightly — laugh at the idea.”
Tracking the rollout of artificial intelligence across the globe, as well as the way it is regulated and governed, is central to the work of the Intelligence team at Tortoise.
Our Global AI Index, which we unveiled last year, is the first index to benchmark nations on their level of investment, innovation and implementation of artificial intelligence, measuring factors related to education, funding, research, governance and talent. Yet while we map the rise of artificial intelligence, we are also very aware of the issues that surround it, such as the role of education in preparing societies for its implementation and, crucially, its impact on power structures.
It is these themes that Tom explores in this interview.
Alexandra Mousavizadeh: You wrote in ‘The Naked Diplomat’ “Digital technology is changing power at a faster rate than any time in history. Distrust and inequality are fueling political and economic uncertainty. The scaffolding built around the global order is fragile. And the checks and balances created over centuries to protect our liberty are being tested, maybe to destruction.” Four years later, do you think this still holds true?
Tom Fletcher: Yes, I do. I’m still optimistic about humanity’s ability to find a way through this period. On current trends my grandkids are 200 times less likely to die violently than my grandparents. But when I look back at that book, I would probably now turn my optimism down a couple of notches. Three big causes for pessimism have become starker.
Firstly, the big tech companies have got bigger and less manageable, and less willing to be managed. The second one is that we couldn’t have predicted quite how fast the Trump/Putin combo would undermine the whole international order on which we’ve based the organisation of humanity for the last 75 years. While we were busy building driverless cars we ended up with a driverless world. I don’t go full throttle on the Requiem for the West, but wars happen when you get an economic downturn, rising nationalism or erratic leaders. We have the full set. And thirdly, the bad guys are pretty good at using the tech. My book looked at how the good guys could deploy it more effectively. And yet for a lot of the intervening period, characters like Putin and Trump have weaponised the internet. Chaos has been a ladder for them. Declining powers are sometimes more dangerous than rising powers.
Alexandra Mousavizadeh: Tom, you talk about power in your book. Can you just walk us through the big sweep of how power has rested, and where it sits now?
Tom Fletcher: Even ten years ago, you could see that power was moving from the ‘maps and chaps’ world that I was very familiar with in diplomacy and government. It used to look like a British banquet. You knew what order the courses came in and who sat where. It’s now much more like a Lebanese meal. More chaotic, less orderly. You don’t know what’s going to turn up on the table when, or who’s going to sit where, but it’s a whole lot more enjoyable. One thing that strikes me, talking to people at the top of government, civil society and business, is that everyone assumes that power is somewhere else. No one feels powerful. Perhaps the one thing that Barack Obama and Donald Trump would agree on is the difficulty of actually exercising power. Bob Zoellick put it well: power is becoming easier to get, harder to use and easier to lose. And I think that sense of lack of power helps to explain the instability and fragility that we’re all feeling now. It’s an uncertainty over who’s actually in charge. We hear a lot about taking back control, and that’s a very human instinct.
And I think behind these day to day dramas that we tweet and agonise about — Trump’s latest clanger or Brexit or the rise of whichever party in Europe — there are these three really big trends. One is the growth of distrust in anything that looks like an institution, not just governments or politicians or the media, but doctors and schools and police services. The sense of agency that technology gives us also undermines our trust in authority.
And secondly you’ve got this growing perception of inequality. There’s no reason why the financial crash of 2008 should have less far-reaching implications than the Great Crash of the 1920s. At least in 2008/9, when I was working for Gordon Brown, we had effective international coordination. One irony of a pandemic that has been made worse by inequality and lack of international cooperation is that it has further worsened both. The virus has exposed and widened existing inequalities: countries that under-invest in health systems can’t respond effectively enough; countries where politics is failing to deliver fairness don’t have the collective will to act together; and countries that have created an underclass of unprotected workers have seen the virus spread faster as a result.
And then, of course, the third is the technological tsunami. We all talk about it, yet we don’t really comprehend the scale. How do we organise ourselves as a society and as a community in response? The combination of these dramatic trends is why these last three or four years are the new normal. There’s not suddenly going to be a reset to a nice, neat, organised, understandable time.
Alexandra Mousavizadeh: Trust in institutions has gone down, but maybe the importance of institutions has gone up? We don’t know what direction technology and AI will take the world in, and I suspect it might be fragmented, with AI being used for good in some sectors and not so good in others. So what can institutions do? Are they equipped to regulate our world? And if not, why not?
Tom Fletcher: So the short answer is — sadly — no, they’re not. We need diplomacy and these institutions more than ever: if diplomacy didn’t exist, we would need to invent it. But diplomacy is losing altitude fast, and these institutions are orphaned and gouty. You look at the UN Security Council. At least three of the five permanent members are actively disrupting the stability they are there to protect. And the other two are the UK and France, who are going through their own crises of confidence.
Three years ago, I wrote a report for UN Secretary General António Guterres on what technology would do to the UN and how the UN could respond. And as a result, we set up the Global Tech Panel — chaired by the President of Microsoft and the Vice President of the EU — to try to change the conversation between tech leaders and governments. You need translators for governments to talk to tech. It’s currently as if tech leaders are some kind of naughty schoolchild whom governments want to calm down and make behave. Meanwhile, tech talks to governments like some sort of boring uncle: getting in the way of all the really exciting disruption. This is where ‘move fast and break things’ meets ‘move (very) slowly and build things’. And that linguistic, philosophical divide is getting worse. Partly because the people at the top of government and the top of international institutions — great, well-meaning, purposeful people — don’t have what Megan Smith, Obama’s former CTO, calls TQ: tech intelligence. And they feel incredibly vulnerable. I was talking with the EU defence ministers in Helsinki and someone put a hand up and said, “Can someone tell me what AI is?”
To make it even more complicated, big tech is hoovering up talent, so the divide between tech and government, in pace and in control of data, will grow. And as a result the creators of the tech are the ones coming to governments and saying, “Please tell us where the lines are, what are the ethics around using this tech? Don’t come and find us in three decades and tell us that we were doing something wrong”. People like Mustafa Suleyman at Google DeepMind have thought a lot about these issues. Elon Musk and others wrote to the UN about the need to debate lethal autonomous weapon systems. The UN didn’t know which bit of the system was even meant to draft the response.
Alexandra Mousavizadeh: If regulatory bodies and governments can’t step in and fill that gap in terms of ethics and regulation when it comes to things such as autonomous vehicles or weapons, who will?
Tom Fletcher: So I think this is a really important question. In the short to medium term, it will fall back on national governments to take the lead on writing the new rules. So the good guys, in shorthand, will regulate their industries in an ethical way. But many of the bad guys won’t feel constrained in the same way. And so the arms race around AI and lethal autonomous weapons will be won by the wrong people. I think that should worry all of us. Whose ethics do we give the tech? Maybe we only get one chance to decide.
One related challenge is working out who actually has the convening power to bring together these discussions. With every big tech advancement in killing people — gunpowder, chemical, nuclear — the weapons have got ahead of the rules. And then it has taken time, usually slowly and imperfectly, for the rules to catch up. But there’s always been a convener who can bring together the right countries to make that deal or treaty. Two hundred years ago, when the challenge to Europe had been Napoleon, it was very easy to work out who should be in the room. No-one was saying “we need now to hear from a youth activist or a civil society representative or a corporation”. Now, if you had a conference to deal with the challenges and opportunities of AI, you couldn’t just have the five big countries in the room. The tech companies would — rightly — laugh at the idea. And so getting that right is one of the big challenges. It’s one of the things I’m working on now.