Big tech companies’ apps and devices consume the world’s collective attention in pursuit of profit, with insufficient regard for the harms they cause users and for the good of society. Society’s attention is neurologically hijacked by a tsunami of devices, ‘weapons of mass distraction’, that lock us into what big tech wants us to see and want. Socially conscious Generation Z entrepreneurs, with their strong sense that businesses need a purpose beyond profit, know this and are starting to mount a ‘techlash’, developing ideas to protect us from digital overload and interference. Governments around the world are also starting to face the fact that antitrust regimes, particularly America’s, have failed to prevent massively dominant companies developing in search, advertising, news dissemination, connecting, messaging, recruiting and retailing. More recently, the big techs have started moving into financial services, where they are less regulated than the traditional banking incumbents. Companies that were start-ups less than a decade ago have become global monopolies which, according to the recent report of the US House Judiciary Subcommittee on Antitrust, wield their dominance in ways that erode entrepreneurship, degrade privacy and undermine the vibrancy of a free and diverse press. The result, the Committee said, is less innovation, fewer choices for consumers and a weakened democracy. These companies and their products and services have reached such scale and ubiquity in our daily lives that traditional antitrust remedies won’t work and new approaches are required. Big tech appears to be in denial that it needs to reform its model to embrace greater societal purpose and responsibility.
The world has a history of sleepwalking into crises: the dot-com boom and bust, the global financial crisis, the climate crisis and the failure to prepare for a widely forecast pandemic, to name a few from recent history. Big tech has grown rapidly over a decade or so but gradually, with no discontinuity – the thing which normally sparks a regulatory reset. Worryingly, over the same period as technology has become ubiquitous, adolescent wellbeing and happiness have been in continual decline and are now at record lows. Device, gaming and porn addiction (or at least compulsive use), anxiety, loneliness, unhappiness, depression, self-harm and suicide amongst this group have grown many times over and are at record highs. Causality is not proven, but many believe the correlation is no coincidence, and big tech stands accused of not cooperating with academia and civil society to allow proper examination of the data it holds, which would settle the question one way or the other. If the accused won’t cooperate, perhaps society should draw a conclusion about its motives and apply the precautionary principle – working on the basis that the burden of proof falls on big tech to show that longer screen times, excessive social media use, gaming and widespread underage access to age-inappropriate content are not the cause. Just as it took governments decades too long to create health and safety legislation for industrial society, so it is taking too long for them to catch up with the negative impacts of the behemoth tech companies that have grown out of the failure of antitrust legislation.
In this context, the UK Government is to be congratulated on the Online Harms Bill, which will make its way through Parliament this year under the stewardship of DCMS. This will place a ground-breaking statutory duty of care – the first of its kind in the world – on big tech to assess the harms its products and services may cause, particularly to children, and to take action to mitigate them. Ofcom will assess those judgements and fine companies if it deems their actions insufficient. Alongside it, the Age Appropriate Design Code, if robustly enforced by the ICO, will give children even more protection. New compulsory relationships education in schools, whilst falling short of a full course in responsible internet use, will at least highlight the differences between online and offline relationships. The new Digital Markets Unit at the Competition and Markets Authority will draw up and enforce a code setting out the limits of acceptable monopolistic behaviour, working on the basis that it is too late, or impractical, to break these companies up. The US Attorney General has sent draft legislation to Congress which would reduce the shield big tech currently enjoys from responsibility for user-created content. Other countries may follow Australia’s lead in appointing an eSafety Commissioner and forcing monopolistic platforms to pay the smaller companies that generate the news content they distribute. EU Commissioner Vestager promises similar proposals in the EU in relation to the data big tech gathers, pledging to force them to make it available to smaller companies, and good work is going on in the UK and EU on the future governance of AI and algorithms to tackle bias.
One major disappointment with the UK’s Online Harms Bill is DCMS’s refusal to include fraud in the list of online harms the proposed legislation will require big tech to protect us against. Fraud is now the UK’s largest crime category, at about 40 per cent of all recorded UK crime, yet it attracts only about 1 per cent of police spending, so the importance of prevention cannot be overemphasised. It is no surprise, therefore, that Government, law enforcement and the banking sector have recently focused on the stark contrast between the work they do in public-private partnership to tackle fraud, with huge effort and at substantial cost to bank shareholders, and the lack of a ‘same risk, same regulation’ approach in the wider technology sector. Notwithstanding their frequent role in initiating e-commerce and e-retail, the big techs are not required to prevent fraud, or ‘online burglary’, to the same degree as banks. Some social media platforms profit from placing adverts that turn out to be investment scams; others accept adverts recruiting mule account operators, often students who need money to make ends meet and have no idea they may be committing criminal offences and blighting their records. BBC Radio 4’s Money Box recently highlighted that payment scams frequently emanate from e-commerce on social media platforms whose vendors demand payment direct from customers’ bank accounts, enabling them to circumvent the traditional controls and consumer protection mechanisms developed in the cards arena.
The argument for not requiring a level regulatory playing field, attributed to DCMS, is that since such regulation is not required in other countries, requiring it in the UK could make the UK uncompetitive as a location for big tech operations. But this flies in the face of the fact that the Online Harms Bill at its core already goes beyond what almost any other country requires of big tech, in pursuit of the Government’s laudable objective of making the UK the safest place in the world to go online. Big tech has been one of the few beneficiaries of Covid, seeing profits and market capitalisations increase whilst consumers and other businesses have suffered. In this context, adding fraud into the mix does not seem unreasonable. And given the scale and importance of the UK market, it is not credible that such a move would really lead to the big techs leaving, or not coming to, the UK.
The Governor of the Bank of England and the Chief Executive of the FCA have both expressed concern that fraud is not currently included in the bill’s list of online harms. I would go so far as to say that if the UK Government is serious about its objective of making the UK the safest place to go online, and about putting in place a truly effective fraud action plan as the UK banking and finance sector has recommended, then leaving fraud out of the bill represents a strategic flaw that undermines the excellent work being done across existing public-private partnerships. It is not too late for DCMS to reconsider, and I urge it to do so.
Born Digital by Robert Wigley is out now, £19.99 (Whitefox)