Can religions trust themselves with big data and AI?
Religions are increasingly gaining access to the same technologies of big data and AI as the big tech firms and governments. But can they trust themselves with those powers? Our analyst Anthony Buck dives into this question in this article.
Religions must face a question of trust
‘Be fruitful and multiply.’ An ancient blessing given to humanity. But if it is pronounced over digital platforms and technologies instead, does the blessing become a curse on humanity?
Without a doubt, these digital technologies are multiplying. In my last article, we considered the growing area of concern over powerful big data and AI technologies in the hands of Big Tech firms and governments. Please check it out here for a bit of explanation on what big data and AI are. We faced the question of whether religions in particular could trust Big Tech and governments with these powerful technologies. Yet Big Tech and governments are not the only ones beginning to use, produce, and deploy these digital platforms and technologies. Increasingly, so are religious individuals, groups, and institutions.
The power differential involved in many of these technologies is inescapable – that is, the controllers of the digital technologies have most of the power and the users have almost none. An old adage says ‘power corrupts, and absolute power corrupts absolutely.’ If so, the question is, can religions – which are communities built on shared trust – trust themselves with the extraordinary power of big data and AI?
What do religions have to gain from big data and AI?
So, how are religions themselves already finding ways to use and develop big data and AI? Perhaps more than you might think.
First, religious institutions are implementing systems to harvest data and then utilise AI for analysis. In Brazil, churches are using facial recognition technology to track attendance and suggest which attendees might need a pastoral visit, based on records of the emotional expressions on their faces during services. One of these companies, Kuzzma, claims to comply with Europe’s GDPR requirements, suggesting their technology may already be in use here in Europe. Likewise, a UK app enables users to track their mental health, send automated messages to friends if their mental health declines, and guide themselves through Christian meditations. At their best, these technologies may help people in need of pastoral care connect with the support structures they need, whether from the laity or the clergy.
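To make the logic of such systems concrete, here is a minimal sketch of how a pastoral-care flagging rule might work. This is purely illustrative and not Kuzzma’s actual system: the names, data structure, and thresholds are all assumptions, and the emotion scores are stand-ins for whatever a facial-analysis pipeline might produce.

```python
# Hypothetical sketch (NOT any vendor's real system): flag attendees for a
# pastoral visit based on attendance gaps and averaged emotion scores.
# All field names and thresholds below are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Attendee:
    name: str
    weeks_absent: int  # consecutive services missed
    emotion_scores: list = field(default_factory=list)  # 0.0 (distressed) .. 1.0 (content)

def needs_visit(a: Attendee, absence_limit: int = 3, mood_floor: float = 0.4) -> bool:
    """Flag an attendee after a long absence or persistently low mood scores."""
    if a.weeks_absent >= absence_limit:
        return True
    if a.emotion_scores and sum(a.emotion_scores) / len(a.emotion_scores) < mood_floor:
        return True
    return False

flock = [
    Attendee("Ana", weeks_absent=0, emotion_scores=[0.8, 0.7, 0.9]),
    Attendee("Bruno", weeks_absent=4, emotion_scores=[0.6]),
    Attendee("Clara", weeks_absent=1, emotion_scores=[0.2, 0.3, 0.35]),
]
to_visit = [a.name for a in flock if needs_visit(a)]
print(to_visit)  # ['Bruno', 'Clara']
```

Even this toy version makes the double-edged nature of the technology visible: the same thresholds that route care to people could just as easily be tuned to profile or pressure them.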
Moreover, big data, AI, and social media are also enabling religions to connect in new ways – especially important in a post-COVID world – reach a wider audience, and (re)build public trust in religious authorities. For example, priests, rabbis, and imams have gathered large numbers of Instagram followers. There is also a game from Polish developers in which players take on the role of the pope. 
Finally, religions are often a part of the lives and value systems of the people who are developing and using big data and AI. There are even people who want to create AI priests, and even a god. Moreover, the Roman Catholic Church is working with Big Tech and governments on the ethical questions surrounding big data and AI. This benefits religions in two ways. First, religions get the opportunity to participate in shaping the future of big data and AI. Second, big data and AI have raised questions that demand input from religion: what does it mean to be human? What is a person? What is death? These questions have enhanced religions’ perceived value in the age of big data and AI, especially because Big Tech and governments seem both to be unequipped to answer them and to have a conflict of interest in doing so.
What do religions have to lose from big data and AI?
However, just as religions question the ethics and trustworthiness of Big Tech and governments using big data and AI, people might question the ethics of religious institutions developing and using big data and AI. At some point, the question becomes whether anyone can be trusted with that much power, including religious leaders and communities.
Partnership with Big Tech and governments also raises the risk that religious communities will lose trust in the moral integrity of their own institutions. This is precisely what happened among many underground Catholics in China when the Vatican approved Communist Party-appointed bishops. The perceived complicity between the government and the Vatican created a sense of danger, forcing people to wonder whether they would still be safe. That danger only becomes more poignant as religious institutions hold greater amounts of data on their members. The data harvesting and AI use discussed above may prove a double-edged sword: when does concern and the potential for pastoral care become the power to manipulate people under one’s care? And what happens if a religious institution or group offers that power to Big Tech or the state?

Such partnerships also risk breaking down trust with those outside religious communities, whether through actual compromise or through the perception of complicity in any evils done by Big Tech and governments. For example, what happens if the ethics produced by the Rome Call for AI Ethics fail to actually protect people, and in social function serve only to approve whatever Big Tech or governments do? The Roman Catholic Church might then be seen not as holding back unethical behaviour, but as supporting ethical talk without ethical practice.
Another possible breakdown in trust could come from religions collecting and using big data themselves. If someone knew a church would track their facial expressions and gauge their emotional state, they might feel uncomfortable visiting. Someone might feel uncomfortable following an Instagram clergyperson, knowing their personal data is being tracked. In other words, religions risk losing the public’s trust if their outreach and influence just become marketing and manipulation.
Is the lust for power one of the deadly sins?
Religions risk compromising their own public trust by imitating or participating in Big Tech’s and governments’ desire for the power of big data and AI. It is the human inclination to accumulate power that is so risky. If religions cannot face their own concerns about the ethics of big data and AI, or worse, if they violate their own standards outright, then the issue is not just one of questionable trustworthiness, but of hypocrisy. The question, therefore, should not be whether religions can get people to trust them with big data and AI, but whether they can trust themselves.
Please return next time as we consider how Big Tech, governments and religions might put in ethical safeguards and build trust with one another and the wider public.
This article was written by Anthony Buck and reflects his personal analysis and opinion, rather than those of EARS.
Interested in similar topics? Go to our Dashboard and receive free updates.