Can big tech be trusted to use the vast power it holds responsibly? That question was at the heart of our Power & Responsibility summit, held in London this month. Of the big tech five, Google joined us to make its case as a responsible business. It rightly fielded tough questions on tax and monopoly power. Julian Blake reports.
The fallout for tech business from this year’s Cambridge Analytica data and democracy scandal has been immense – and far from confined to just Facebook, the big tech company implicated in that story.
Allowing tech platforms to be used illegally to swing elections is a big charge indeed. But the impact on democracy is far from the only tech fear out there.
Cambridge Analytica let the genie of public concern out of the bottle – and, as Edelman told the summit, that concern has hit public trust in tech across a range of key issues.
Data privacy is one of those key trust fears, and that goes well beyond the CA breaches encountered at Facebook. Much big tech business growth is down to the vast pools of data it holds on us. Andrew Keen (another summit speaker) has called it “the surveillance economy.”
So public anxiety is rising about big tech business power, whether it’s monopoly control destroying competition (and jobs) across sectors like retail, advertising and media – or a public perception that not enough tax is being paid in the face of huge revenues and reserves.
The combined effect of all of this is that big tech is under the spotlight as never before – with growing moves to rein in the power of the big five GAFAM tech corporations (Google, Apple, Facebook, Amazon, Microsoft) in particular. In one of the biggest regulator moves to date, this summer Google was hit with a huge $5bn EU fine for using Android to block its rivals.
In their summit discussion paper, Eva Appelbaum and Jess Tyrrell identify 10 big challenges to move us from ‘tech fear’ to ‘human tech’. High on their list are big tech taxation, monopoly power’s economic impact and the sense that tech business models are fuelling inequality.
Google is to be congratulated for joining us at the summit, when it would have been easy to decline the invite. We wanted to hear its response to concerns, including on tax, where it knows full well that it must do more.
Earlier this year, it was confirmed that Google would pay £49m in UK corporation tax to the Treasury in 2018, up £13m on the previous year. The total value of the company’s UK sales was £5.7bn. Taxed directly at the UK’s 19% headline rate, revenue on that scale would imply annual corporation tax of around £1bn.
Google UK’s public policy and government relations manager, Katie O’Donovan, made an articulate summit case positioning Google as a responsible business.
O’Donovan’s case was two pronged: that Google’s massively popular products and services were a driver of economic growth; and that the responsible deployment of its resources was good for society.
“We want to make sure that the technology that we develop can be used for inclusive economic growth, that it can help contribute to people’s skills and their opportunities,” she said.
Engineers built products like search, maps and Gmail that were used by millions every day in the UK. “That gives a positive impact to British businesses and a positive impact to the economy, and even impacts productivity,” she said.
O’Donovan recognised the “massive challenge” in the UK, with almost 11m people lacking the basic digital skills to use its technology. That was the thinking behind the Digital Garage public education programme.
Stressing Google’s sheer economic value, O’Donovan said 40m people in the UK alone used Google, creating a consumer surplus of £37bn. “Households have told us they would rather lose a car, a TV licence or an hour’s sleep than lose access to online search,” she said.
As Google approached its 20th anniversary, O’Donovan agreed the company had made “mistakes”, but was learning from them. “We are thinking very deeply about how we can do more with the people and the institutions who are interested in us and who use us, to really create that trusted future where technology does enhance our everyday life, but in a way that we are able to shape and choose.”
She said Google was guided by clear principles, including making its technology available for everyone regardless of income – and keeping customers safe and secure. “We never sell personal information and users can remove their data,” she insisted.
“We are working with the UK government on their internet safety strategy,” she added. “We support their ambition to create a safer and more trusted internet, recognising the important role we can all play in ensuring that.”
As it moved to become an “AI-first business”, O’Donovan said Google had new “guiding principles around tackling bias and accountability” in AI models. “We have made our AI systems available and free for everyone to use through TensorFlow, extending the ability to use that technology to the smallest of companies and charities.”
O’Donovan made a compelling case for social responsibility – and much of her thinking was brought out in Google’s UK impact report this month.
But in a summit panel discussion with the heads of the Institute for Public Policy Research, Digital Catapult and Coadec she faced tougher questions on the power side of the equation.
The IPPR published its major Prosperity and Justice report last month, urging reforms to counter the monopoly power of big tech, protect against AI-driven job losses and act on the tech-driven gig economy.
“I think it would be fairly widely recognised that some companies are too powerful,” IPPR director Tom Kibasi said. “Take Google and Facebook – between them they control 61% of global [online] advertising revenue.”
Asking whether big tech power mattered, Kibasi saw a likely negative for employees. He said the evidence showed that “as corporate power goes up, shares going to workers within firms tend to go down.”
He agreed that Google and others undoubtedly innovated – but predicted less innovation to come. “All the evidence shows that over time as companies become too powerful the rate of investment and innovation actually starts to fall back,” he said.
Kibasi acknowledged that Google was “actually a rather good company” – and said that “if we really want to focus attention on the abuse of power Amazon is really the company to focus on. It’s quite a good example of how, if you don’t constrain power, it can go in the wrong direction.”
“How they [Amazon] treat their workers is a disgrace,” Kibasi said. “And this move under enormous pressure to finally put wages up while you have got Jeff Bezos becoming the world’s wealthiest man, worth $150bn. Having his workers pee in bottles in fulfilment centres because they are being tagged and tracked and it’s too far to walk to the bathroom, and they risk losing future hours. They don’t risk their jobs, because they are zero-hours contractors.”
On tax, Kibasi insisted that big tech was not paying the taxes it should. The IPPR proposed corporation taxation on the basis of revenues, “assuming that you make the same rate of profit in the UK market as you make globally”, meaning that corporation tax would be based on that assumed level of profitability.
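The IPPR proposal amounts to simple arithmetic. A minimal sketch in Python, using hypothetical figures for illustration – the 25% global profit margin is invented for the example; only the £5.7bn UK sales figure comes from the reporting above:

```python
# Illustrative sketch of the IPPR-style proposal: estimate UK profit by
# applying a company's global profit margin to its UK revenue, then apply
# the UK corporation tax rate to that assumed profit.
# All figures below are hypothetical, for illustration only.

def revenue_based_tax(uk_revenue: float, global_profit_margin: float,
                      tax_rate: float) -> float:
    """Tax owed if UK profitability is assumed to match the global margin."""
    assumed_uk_profit = uk_revenue * global_profit_margin
    return assumed_uk_profit * tax_rate

# Hypothetical inputs: £5.7bn UK sales, an assumed 25% global profit
# margin, and the UK's 19% corporation tax rate.
tax = revenue_based_tax(5.7e9, 0.25, 0.19)
print(f"Assumed corporation tax: £{tax / 1e6:.0f}m")  # → £271m
```

The point of the mechanism is that it sidesteps arguments about where profit is booked: only local revenue and the group’s global margin enter the calculation.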
“If you want to restore trust the first thing you have to do is pay your taxes,” he said. “If you have got deep pockets and you can employ an accountant to find ways to minimise what you pay, just because you can doesn’t mean that you should. There are many things that you can do that you really shouldn’t do.”
Responding, O’Donovan said all international companies had “been under really heated scrutiny for the last five years.” She insisted that “we absolutely do pay the tax that we have to in the UK. If you look at the tax we have paid over the last few years it has significantly increased”.
But she conceded that “nobody believes that to be the correct settlement. We have said for many years that there should be a new settlement on taxation from a global basis. What does make it challenging for us is if that happens on a country-by-country basis because then you have different countries doing different things which upsets our global settlement.”
Offering perspective, Digital Catapult chief executive Jeremy Silver said “we are in an amazing and unprecedented place. We have never seen companies at this scale with this kind of influence before. The very companies that were our heroes are suddenly the companies about whom we have all kinds of doubts and uncertainties.
“Any of us who have spent any time experiencing a Google or an Apple or an Amazon service know how damned good those services are and how much we value them. On the other hand, we all have enormous concerns the more we understand about the way that our individual data is being used.”
“The thing that is most challenging for this, and it is most challenging for Amazon and for Google and for those companies too, is that there are no counterbalancing global institutions who can act as global regulators.”
Coadec executive director Dom Hallas, representing startups, reminded us that not all tech businesses are the same. “We have over 220,000 digital businesses in the UK, and actually they don’t feel very powerful,” he said. “They worry that regulation aimed at much, much bigger businesses might put them out of business.”
A key proposal in DigitalAgenda’s discussion paper was that big technology businesses should be required to use their vast technology power, especially on AI, for human gain. So should big tech be forced to deliver tech for good?
O’Donovan argued that Google already does this. “We do believe that we create the potential for tech for good. If you think about standard products like search or YouTube being used in education, or Google Maps being used to track genocide, that wasn’t done by Google engineers. It was done by people using our technology that we provide for free.
“We work very hard to ensure that we are able to do that with responsibility,” she insisted.
It is hard to deny that Google products, positively applied, can and do have a good social impact. Aside from its core consumer products, projects like DeepMind are driven by a motivation to make the world a better place, and the company wants its approach to AI to be ruled by a strong ethical framework.
Problems occur, of course, when tech products have less positive outcomes – even if those consequences are unintended.
The effects of YouTube on young people’s wellbeing have raised anxieties, for instance. And the exposure of its work on Project Maven, the US military’s AI programme, can only have deepened public doubt – even if the company did abandon that work under pressure.
The dilemma that many of us have – so well articulated by Jeremy Silver – is that we have all lapped up and continue to use the brilliant technologies that big tech has provided, to make our lives and work easier. Cambridge Analytica and other scandals have given us cause to doubt the wonder in which we have held these services – and there is no doubt we worry more about our data.
Some sneered at Nick Clegg’s appointment this month as Facebook’s global affairs chief, “to build bridges between politics and tech”.
It can only be good that experienced heads are coming into big tech to strengthen links with regulators. It was great to see Google engaging in our debate about responsible business and upping its work with government. There’s a lot to do on a lot of issues, and tech and government must learn from each other.
Some will never be satisfied that big tech is doing enough. But, when it comes to winning back public trust, paying the right tax looks a pretty good place to start.
The Power & Responsibility discussion paper is open for comments until 31 October. Read the full paper and add your comments here.