How to build trust into big data and AI: Accountability
Can we use blockchain technologies to build trust into big data/AI by building accountability into how they work? Analyst Anthony Buck explores.
Nearly everyone has a favourite pair of jeans. They look and feel great. But they can also serve an additional role: accountability. Now you may be thinking: ‘My jeans hold me accountable?’ It is not a perfect analogy to be sure, but potentially yes!
You see, our favourite pairs of jeans never change size. Since I bought my jeans while living a healthy lifestyle, whether I still fit into them is an unavoidable reckoning of how well I am maintaining that lifestyle: eating a balanced diet, exercising regularly, and sleeping enough. If any of these goes untended for too long, my jeans will no longer fit. I will be held accountable by my jeans’ inability to hold all of me.
Accountability, my humorous analogy aside, is desperately important when it comes to building trust into the way big data/AI work and into how they are used by Big Tech, governments, religions, and even everyday people. Promises, self-pronouncements, and prescriptions from these organisations are not enough to build trust unless there is some way of holding them accountable when those commitments are broken. Yet, in the way big data/AI work today, accountability is lacking.
In previous articles, I have outlined why we need to build trust into big data and AI, especially as it relates to religion. First we asked: ‘Can religions trust Big Tech and governments with big data and AI?’ and then ‘Can religions trust themselves with big data and AI?’ The powerful digital technologies of big data and AI are terrifying in their scope and in their potential for misuse, whether they are in the hands of Big Tech (e.g. Google, Facebook, Amazon, etc.),[1] [2] governments,[3] [4] [5] or even religions.[6] [7] There is an ongoing risk of abuse[8] and already some concerning unethical behaviour.[9] [10] [11] [12] [13] [14] [15]
There are five things we can do to make the future operation of these digital technologies more humanising. We have already explored three of them: make them share access, share development, and radically share profits. Now I propose the fourth element of building trust into the way the digital landscape works: accountability.
Internal and external accountability: a model of accountability from the church
No framework of human accountability can be perfect. There are always limitations. However, there are patterns of accountability that can be learned from. One of those patterns comes from the church. In many Christian traditions, there is a level of accountability for churches at the local level, but also one above the local level. There is accountability internal to a local church (e.g. safeguarding officers, layperson leadership boards, etc.) and external to it (e.g. bishops, presbyteries, denominational oversight). This gives two avenues for reporting and resolving unethical behaviour.
Big Tech often pushes for only an internal level of accountability. However, it is already clear that these companies cannot be trusted to hold themselves accountable solely through executives and ethicists inside their own organisations. Google fired the leaders of its AI ethics team when they reported undesired features of its AI algorithms.[16] [17] [18] Leaks revealed that Facebook kept secret for two years its internal research showing that Instagram was harming young girls’ mental health.[19] [20] [21] [22] Amazon created fake accounts posing as Amazon workers on Twitter to defend the company against its own workers’ reports of unethical treatment.[23] [24] [25]
Thus, internal accountability is not enough. External accountability is also essential. Certainly this includes things like legislation and investigative journalism. But such systems must hold accountable not only Big Tech, but also governments, religions, and other organisations.
The system must also be insulated from political whims, religious good intentions, and corporate cash alike. Big Tech already spends millions of euros on lobbying to influence politics.[26] [27] External accountability is not really external if the big players can buy or legislate their way out of it in practice.
Moreover, the first three elements of building trust into big data/AI (sharing access, sharing development, and sharing profits) are themselves external, structural forms of accountability. They promote a relationship of interdependence among stakeholders and prevent power imbalances: the accountability of shared knowledge, tools, and rewards. They are essential, but insufficient on their own; we still need to go beyond them. Blockchain technologies may provide just this kind of accountability, but for today we only have space to consider the model of accountability we need. Exploring the possibility of using blockchain technologies for accountability will have to wait until the next article.
Accountability and trust
Ultimately, when there is a gap of trust between parties, one essential way of building trust is building accountability into the relationship. Law serves this purpose, as do trade unions and ecclesial hierarchies. They work on internal and external levels. To build trust into the way big data/AI operate, we will need to do the same.
One possible way, uniquely available with these digital technologies, is blockchain contracts. Such contracts could potentially help ensure that these digital technologies share access, development, and profits. So tune in next time, when we will consider blockchain technologies as a way of applying this model of internal and external accountability to digital technologies.
This weekly comment was written by Anthony Buck and reflects his personal analyses and opinions, rather than those of EARS.
[1] Algorithms drive online discrimination, academic warns
[2] The Technology 202: Nuns are leading the charge to pressure Amazon to get religion on facial recognition
[3] China’s alarming AI surveillance of Muslims should wake us up
[4] China’s Orwellian War on Religion
[5] Your body is a passport – POLITICO
[6] Iglesias brasileñas adquieren tecnología de reconocimiento facial para controlar la asistencia y emociones de sus fieles [Brazilian churches acquire facial recognition technology to monitor the attendance and emotions of their faithful]
[7] Kuzzma Inteligência Artificial
[8] European Commission, White Paper on Artificial Intelligence: A European approach to excellence and trust, COM(2020) 65 final, Brussels, 19 February 2020.
[9] EU companies selling surveillance tools to China’s human rights abusers
[10] Sandra Wachter, ‘Affinity Profiling and Discrimination by Association in Online Behavioural Advertising’, Berkeley Technology Law Journal 35.2 (2020 Forthcoming). Accessed 16 Dec 2020. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3388639#
[11] Indonesian app to report ‘misguided’ religion ‘risks dividing neighbors’
[12] India’s founding values are threatened by sinister new forms of oppression | Madhav Khosla
[13] Margaret Mitchell: Google fires AI ethics founder
[14] Google fires second AI ethics researcher following internal investigation
[15] Google fires top AI ethicist
[16] Margaret Mitchell: Google fires AI ethics founder
[17] Google fires second AI ethics researcher following internal investigation
[18] Google fires top AI ethicist
[19] Facebook aware of Instagram’s harmful effect on teenage girls, leak reveals – The Guardian
[20] MP calls for Facebook to be punished if it holds back evidence of harm to users
[21] Facebook harms children and weakens democracy: ex-employee – BBC News
[22] Frances Haugen: Facebook whistleblower reveals identity – BBC News
[23] ‘Fake’ Twitter users rush to Amazon’s defence over unions and working conditions – The Guardian
[24] ‘Fake’ Amazon workers defend company on Twitter – BBC News
[25] Amazon fight with workers: ‘You’re a cog in the system’ – BBC News