How to build trust into big data/AI: Sharing Development

Reading time: 6 minutes

Do Google, Facebook, and Amazon need your help to develop big data and AI technologies? Anthony Buck argues: probably more than you might think.

This article was written by Anthony Buck and reflects his personal analyses and opinions, rather than those of EARS.

During my first year on staff at a Korean-immigrant church, there was an annual summer event for primary-school-aged children that had always been run by the parents and other adults of the congregation. That year, however, I planned for the secondary school students to design and execute it.

After the fact, I found out that a number of people were nervous about entrusting the youth with that much responsibility. By God’s grace, not only did these 12- to 19-year-olds do an amazing job – the event was even better than anyone could have imagined – but they had tons of fun getting to serve, developing their own gifts and talents, and putting them to use. But it only happened because people trusted me to trust these secondary school students. It only happened because we built trust into the event by sharing development.

In previous weekly comments, I have outlined why we need to build trust into big data and AI, especially as it relates to religion. First we asked: ‘Can religions trust Big Tech and governments with Big Data and AI?’ and then ‘Can religions trust themselves with big data and AI?’ The powerful digital technologies of big data and AI are terrifying in their scope and in their potential for misuse, whether they are in the hands of Big Tech (e.g. Google, Facebook, Amazon, etc.),[1] [2] governments,[3] [4] [5] or even religions.[6] [7] There is an ongoing risk of abuse[8] and already some concerning and unethical behaviour.[9] [10] [11] [12] [13] [14] [15]

But in the third article, we flipped the script: instead of tacking trust onto digital technologies as an afterthought, we pushed for building trust directly into how big data and AI technologies work. There are five things we can do to change how the future digital landscape operates. The first is to make them share access. The second is to make them share development.

Share development
Sharing development as a goal and a principle of building trust into big data and AI sounds, at first, crazy. But there are already people, organisations, and projects sharing development with the public, an approach known as open source.[16] [17] Some amazing digital technologies, including things often considered the domain of big tech, have been developed open source: LÖVR, a virtual reality platform,[18] the operating systems Linux[20] and Ubuntu,[19] the web browsers Firefox[21] and Brave,[22] the Apache HTTP server,[23] the encrypted messaging services Signal[24] and Telegram,[25] and the office suite LibreOffice,[26] among many other more niche or technical open-source projects. Not to mention, many big tech firms have open-source projects of their own.[27] [28] [29] [30] [31] [32]

So, in one sense, what I am suggesting is neither impossible nor radical: the world already has some digital technologies that have trust built into them by sharing their development with the public, in addition to sharing access – in fact, many of these are completely free. However, the scale and scope of shared development I am hoping for is more radical.

“When Helping Hurts”: flipping the script to sharing development
In 2009, Steve Corbett and Brian Fikkert published their book When Helping Hurts: How to Alleviate Poverty Without Hurting the Poor . . . and Yourself.[33] Its release began a cascade of self-reflection among Christians seeking to do development. However, this book was not alone.[34] During the 1990s and 2000s, those working in development across the world – from Christian missionaries and ministries to non-religious NGOs – increasingly recognised the danger of help that does more harm than good.[35] [36] [37] [38]

Helping hurts when it generates dependency on the aid and when it leads the helped to internalise a view of themselves as ‘less than’ or ‘incapable’,[39] [40] all while making the helpers feel superior.[41] But Corbett and Fikkert flip the script on development: instead of thinking in terms of needs, they push for thinking as much as possible in terms of assets. The rich and the poor both have something essential to contribute to a relationship in which both sides can flourish.[42] Thus, by sharing development, trust is built into the system rather than dependence.

Sharing development with big data and AI
Currently, big tech operates much as the missionaries and rich Christians once did. It positions itself as the one with the intellectual and technical capital both to build technologies for those without these skills and to tell them what they should want. Big tech has what the people need, but it also likes and wants to be needed. How many people use a search engine besides Google? Buying something online is nearly synonymous with Amazon. Spotify and Netflix know and market to our tastes. And, of course, unlike the missionaries, they all have a financial incentive to do so.

Unless trust is built into big data and AI by sharing development, development will be done ‘for’ rather than ‘with’. Development ‘for’ means producing technologies that expand and demand dependence rather than empowerment. Moreover, while development ‘for’ sounds like it benefits the user, it mainly benefits the developer. Sharing development thus goes against the capitalistic impulse to create a market that can be exploited. It is a collaborative process seeking the development of digital technologies for the benefit of the community, company, government, organisation, religion, nation, and world.

Sharing development in practice
So what would sharing development look like in practice? It does not mean shared development cannot lead to profit, but it does make non-profit technologies possible. Yet, as our next article will show, it will often be in everyone’s interest to make these technologies both beneficial and profitable. Moreover, it means greatly expanding open-source development. This could come through government grants or crowdsourcing for open-source projects, but also through public pressure.

Sharing development could also involve legal regulations requiring all big data and AI technologies and projects – whether run by companies, governments, religious organisations, etc. – to have a panel of representatives drawn from the public, or perhaps two advocates: one with ethical expertise and one with technical expertise. Another option is developing an open-source platform not just for people to collaborate on projects, but for the public to suggest projects as well. These kinds of actions would not just make the technologies themselves better; they would build trust into the way big data and AI work.

Sharing development assumes that everyday people have something to contribute to making the digital technologies of the future even better. It assumes that people do not exist for the benefit of technologies, companies, organisations, or governments; rather, these exist for the benefit of people. In other words, sharing development is about putting people first – not technologies, and not the entities developing and wielding them.

I have only just begun to offer some ideas, but I would love to open source this development and hear what ideas you might have to add or critique in the comments. Stay tuned for the next article on Sharing Profits. It is going to be the most radical yet.




[1] Algorithms drive online discrimination, academic warns

[2] Analysis | The Technology 202: Nuns are leading the charge to pressure Amazon to get religion on facial recognition

[3] Perspective | China’s alarming AI surveillance of Muslims should wake us up

[4] Opinion | China’s Orwellian War on Religion

[5] Your body is a passport – POLITICO

[6] Iglesias brasileñas adquieren tecnología de reconocimiento facial para controlar la asistencia y emociones de sus fieles

[7] Kuzzma Inteligência Artificial

[8] European Commission, White Paper on Artificial Intelligence – A European Approach to Excellence and Trust, COM(2020) 65 final (Brussels, 19 February 2020).

[9] EU companies selling surveillance tools to China’s human rights abusers

[10] Sandra Wachter, ‘Affinity Profiling and Discrimination by Association in Online Behavioural Advertising’, Berkeley Technology Law Journal 35.2 (2020 Forthcoming). Accessed 16 Dec 2020. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3388639#

[11] Indonesian app to report ‘misguided’ religion ‘risks dividing neighbors’

[12] India’s founding values are threatened by sinister new forms of oppression | Madhav Khosla

[13] Margaret Mitchell: Google fires AI ethics founder

[14] Google fires second AI ethics researcher following internal investigation

[15] Google fires top AI ethicist

[16] What is open source software?

[17] Open Source Software: The Complete Wired Guide

[18] LÖVR

[19] Ubuntu: Enterprise Open Source and Linux

[20] Linux Foundation – Decentralized innovation, built with trust

[21] The Mozilla Manifesto – Pledge for a Healthy Internet

[22] Brave Browser: Secure, Fast & Private Web Browser with Adblocker

[23] Welcome! – The Apache HTTP Server Project

[24] Signal >> Home

[25] Telegram Messenger

[26] Home | LibreOffice – Free Office Suite – Based on OpenOffice – Compatible with Microsoft

[27] Google Open Source – opensource.google

[28] Microsoft Open Source

[29] GitHub: Where the world builds software · GitHub

[30] Open Source – Amazon Web Services

[31] Apple Open Source

[32] Facebook Open Source

[33] Steve Corbett and Brian Fikkert, When Helping Hurts: How to Alleviate Poverty without Hurting the Poor . . . and Yourself (Chicago: Moody Publishers, 2009).

[34] Robert D. Lupton. Toxic Charity: How Churches and Charities Hurt Those They Help (and How to Reverse It) (New York: HarperOne, 2011).

[35] Øyvind Eggen and Kjell Roland, Western Aid at a Crossroads: The End of Paternalism (Basingstoke: Palgrave Macmillan, 2014), https://doi.org/10.1057/9781137380326.

[36] Peter Devereux, “International volunteering for development and sustainability: outdated paternalism or a radical response to globalisation?”, Development in Practice 18:3 (2008): 357-370. DOI: 10.1080/09614520802030409.

[37] Michael Palmer, “Viewpoints: On the pros and cons of volunteering abroad”, Development in Practice 12:5 (2002): 637-647. DOI: 10.1080/0961452022000017000.

[38] For more on the history, theories, and challenges of international development cf. Jean Grugel and Daniel Hammett, eds., The Palgrave Handbook of International Development (London: Palgrave, 2016); and Arjan De Haan, How the Aid Industry Works: An Introduction to International Development (Sterling, VA: Kumarian Press, 2009).

[39] Why Good Intentions Aren’t Enough

[40] Walking Well with Churches in the Majority World

[41] Why Good Intentions Aren’t Enough

[42] From Needs to Assets—Shifting the Dynamic of Ministry