It has become abundantly clear in the past six weeks, after a thorough examination of the relevant transcripts, that politicians in both the United States and the European Union have failed abysmally to come to terms with the genuine threat to democracy posed by companies such as Facebook and Google. With occasional exceptions, politicians on Capitol Hill and in Brussels alike appear severely clueless about the fundamental issues involved, asking questions which exposed either their technological naïvety or their failure to grasp the difference between the risks posed by monopolies in the manufacturing, service and financial sectors and those in the field of IT.
Both Facebook and Google have ambitions which are essentially hegemonic and limitless. With the spirit of Ayn Rand always lurking in the background, they repeatedly show contempt for liberal democracies (at every level) even while allegedly supporting them. None of this is new: I flagged up some of the ethical issues surrounding encryption as early as 1994 at the University of Essex, examined the relation between information security and public trust as part of the Royal Society's Science in Society programme in 2004 and, in 2011, provided a critical introduction to Adam Curtis's series All Watched Over by Machines of Loving Grace at Cardiff University.
Sadly, we are now light years away from Tim Berners-Lee's vision of a free and open internet. The walled-garden approach of social media is ultimately a cynical trade-off between useful functionality, admittedly appreciated by many, and the increasingly murky and unaccountable world of data gathering. Because Facebook and Google are not like traditional manufacturers or financial institutions, simple appeals to classic anti-trust legislation are not sufficient. Other approaches need to be examined and, if reasonable, put into practice. For example, companies such as Facebook or Google should only be able to employ 'smart algorithms' if these have been ethically assessed in advance (in terms of aims, expected outcomes, level of steer, granularity and so on) by an independent body representing a significant range of interests, not least those of civil society. Moreover, the algorithms themselves, if approved, should be formally entered in an independent registry open to public scrutiny. The cry of 'commercial sensitivity' simply won't wash here.
In addition, Facebook and Google should, along with fulfilling their corporate tax obligations, face severe, punitive penalties for breaches of privacy which adequately reflect not only actual harm to individuals, organisations and corporations but also realistic potential jeopardy.
Finally, the legal profession needs to conduct a sober internal debate as to whether lawyers should seek employment with companies which put both privacy and democracy so clearly at risk.
Dr Ian Kenway
Director of CIEPP