How to build trust into Big Data/AI: Share profits
Facebook, Google, and Amazon charge advertisers for your information and attention, but do not share the profits. So, Anthony Buck asks: is it time for Big Tech to pay you?
There it was. Gleaming in the photo. Was it a bit overpriced? Maybe. The seller said it turned on and worked. After a bit of price negotiation, I bought a beautiful and functional 20-year-old La Pavoni espresso machine on the online marketplace ebay.co.uk.
Undoubtedly eBay took a cut from that seller’s sale price. eBay provided a service connecting buyer and seller to exchange an item for a fair price. This brokerage fee arrangement is mutually beneficial and used in many areas. However, while eBay is one online broker connecting buyers and sellers of goods, many tech firms operate as brokers without paying the people whose information and attention they sell. This is the challenge we tackle in this weekly comment.
Sharing to build trust into digital technologies
I have outlined why we need to build trust into big data and AI, especially as it relates to religion. First, we asked: ‘Can religions trust Big Tech and governments with Big Data and AI?’ and then, ‘Can religions trust themselves with big data and AI?’ The powerful digital technologies of big data and AI are terrifying in their scope and in their potential for misuse, whether they are in the hands of Big Tech (e.g. Google, Facebook, Amazon), governments, or even religions. There is an ongoing risk of abuse, and already some concerning unethical behaviour.
How then do we adjust the system so that it is humanising rather than dehumanising? So that it better reflects that each person is created with infinite value in the image of God, as Christianity teaches (to say nothing of the similar inclinations expressed in other religions)?
There are five things we can do to make the future of how digital technologies operate more humanising. The first and second are to share access and to share development. Now I propose the third, most controversial, but perhaps most effective way to build trust into the way the digital landscape works: share profits.
Are we the ‘users’ or the ‘used’?
Tech giants conveniently call people on their platforms ‘users’, even though they are more the ‘used’. The ‘users’ are in fact what is sold to advertisers – though not directly. The marketing genius of calling the people being sold to advertisers ‘users’ is precisely that they often think of themselves as the people who ‘use’ Facebook, Google, Amazon, etc. They think of their use of those platforms as free services, or as services they trade their data to access. In fact, accessing these ‘services’ is how Big Tech accesses ‘users’. People rarely consider that being ‘users’ is what produces the product being sold to advertisers.
‘Users’ are performing a kind of work when they give away their information, their attention, their time. Yet Big Tech keeps all the profits. They would like people to believe it is just advertising as the world has always known it. But that is not clear. What is clear is that advertisers pay so much for their services that Big Tech dominates the list of companies with the highest market capitalisation. Lots of money is being made off our information, attention, and time. But should people be the site of commercialisation? Is this building a system of dehumanising commercialisation? If so, the way towards seeing people as infinitely valuable begins with asking: do you and I deserve a share of those profits?
Building trust into digital tech by sharing profits
Using big data and AI to harvest information from ‘users’ without compensating them sufficiently is a serious problem. The even bigger problem, however, is that the system these digital technologies make possible puts ‘users’ at the mercy of Big Tech and subjects them to structural exploitation in the digital world.
People have become the cash crop of the digital world. They functionally retain little control over their information or over what is placed before their attention. Who they are and what they do is what is being harvested. Once harvested, no ‘user’ can know how that information might be used by Big Tech, to whom it might be sold, or what the buyers might do with it. As the product, not the seller, they have few rights.
So, what would sharing profits mean? Three things. First, each ‘user’ would retain access to Big Tech’s ‘services’ (since they are simultaneously data extraction technologies and compensatory services). Second, ‘users’ would be involved in deciding what information is sold, to whom, and for how much. Finally, the majority of the value of each sale would be paid out to the ‘user’, either in the currency of sale or in a usable local currency.
Sharing profits builds trust into big data and AI because it puts all parties in a more mutually beneficial and interdependent relationship. The advertiser gets to advertise, Big Tech brokers collect a brokerage fee, and the everyday ‘user’ retains control not only of their data and attention but also a share in the possibility of converting them into multiple types of profit. If they want to commercialise their data or attention, they can. If they feel they would ‘profit’ more from their digital privacy, they can. Above all, this changes who gets to impose the value of a person’s life.
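As a rough illustration of the brokerage arrangement described above, the split below is a minimal sketch. The 70/30 division is hypothetical (loosely echoing the ad-revenue share some browsers already offer), and the function name and parameters are illustrative, not drawn from any real platform:

```python
def split_ad_revenue(sale_price, user_share=0.70, broker_share=0.30):
    """Split the proceeds of one data/attention sale.

    The 70/30 split is a hypothetical figure; in practice the terms
    would be set by regulation or negotiation, not hard-coded.
    """
    if abs(user_share + broker_share - 1.0) > 1e-9:
        raise ValueError("shares must sum to 1")
    return {
        "user": sale_price * user_share,      # paid out to the 'user'
        "broker": sale_price * broker_share,  # Big Tech's brokerage fee
    }
```

On this sketch, a £10 sale of ad access would return £7 to the ‘user’ and £3 to the platform; the point is simply that the broker takes a fee, as eBay does, rather than the whole sale.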
Profit sharing is already happening, but is it worth the risks?
Several organisations and digital technologies are already sharing profits. Today you can use a web browser that will pay you 70% of the ad revenue generated from your data and attention. Other apps offer to enhance a ‘user’s’ ability to retain control over their data and privacy and to sell access to advertisers. Meanwhile, there are sites and apps that will pay you to watch ads or even TV.
There are risks to profit sharing. Shubham Agarwal warns that Big Tech could still game the system, paying people as little as possible for their data, or turn digital privacy into a luxury good only the rich can afford. Perhaps, though, these risks could be minimised by better laws; and people would at least be getting paid something rather than nothing. Likewise, effectively unionising the profit-sharing ‘user’ base would mitigate both risks.
However, Agarwal warns of a more serious danger, namely advancing the commodification of personal information. If the problem is the commodification of who we are and what we do, then demanding to share in the profits of that commodification only reinforces the root issue of commodification. It adjusts people towards thinking in terms of commodification rather than privacy and digital rights.
Unfortunately, he is probably right about this. Even more unfortunately, it is probably already too late. Our data and attention have already been commodified and entrenched in the commercialising vortex of modern capitalistic business practices and global economic competition. Facebook even keeps data on people who do not use Facebook. We would be naive to think they were alone. Worse, as discussed before, GDPR regulations ultimately leave the balance of power on the side of Big Tech and governments, not everyday people. In sum, there is a monopoly on our information and attention, but we do not hold it.
Are we destined to be commodified?
Can this commodification be undone? I hope so, but probably not overnight. Nevertheless, sharing profits does build trust into the way that big data and AI operate. More importantly, sharing profits opens the possibility of a de-commodified future or at least one where commodification is less dehumanising than it is today. Even if sharing profits does not fix or avoid every pitfall, it is a crucial first step. But to go further, we need some accountability built into big data and AI. That is a theme we will turn to next time.
This weekly comment was written by Anthony Buck and reflects his personal analyses and opinions, rather than those of EARS.
Interested in similar topics? Go to our Dashboard and receive free updates.
Sandra Wachter, ‘Affinity Profiling and Discrimination by Association in Online Behavioural Advertising’, Berkeley Technology Law Journal 35.2 (2020, forthcoming). Accessed 16 Dec 2020. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3388639#
Gen 1.26-27; Eph 4.24
Nick Couldry and Ulises Ali Mejias, The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism, Culture and Economic Life (Stanford: Stanford University Press, 2019).
João Carlos Magalhães and Nick Couldry, “Giving by Taking Away: Big Tech, Data Colonialism, and the Reconfiguration of Social Good”, International Journal of Communication 15 (1 January 2021): 343–362. https://ijoc.org/index.php/ijoc/article/view/15995/3322
Nick Couldry and Ulises A. Mejias, “Making data colonialism liveable: how might data’s social order be regulated?”, Internet Policy Review 8.2 (2019). DOI: 10.14763/2019.2.1411. https://policyreview.info/articles/analysis/making-data-colonialism-liveable-how-might-datas-social-order-be-regulated
Alistair Fraser, “Curating Digital Geographies in an Era of Data Colonialism”, Geoforum 104 (2019): 193-200. https://doi.org/10.1016/j.geoforum.2019.04.027
D.M. Kotliar, “Data orientalism: on the algorithmic construction of the non-Western other”, Theory and Society 49 (2020): 919–939. https://rdcu.be/cxauK
Instead, we may need to take the strategy of turning these digital platforms into the utilities of the future. As cities once converted electricity, gas, and water from privately owned, for-profit companies into publicly held or publicly regulated utilities, perhaps sharing profits will offer the possibility of improving quality of life for the average person at a reasonable level of commodification.