
Why Privacy Matters

First they came for the socialists, and I did not speak out, because I was not a socialist.
Then they came for the trade unionists, and I did not speak out, because I was not a trade unionist.
Then they came for the Jews, and I did not speak out, because I was not a Jew.
Then they came for me, and there was no one left to speak for me.
Martin Niemöller (1946)

With technological innovation come great opportunities, but also great risks. Google has democratized access to flight information, made it easier to find cheap travel options and compare prices, and lowered the cost of entry for entrepreneurs by offering services like Gmail and Google Drive that enable a cheap, convenient, and efficient way of doing business (Inevitable Human, 2019). However, monopolies over online search results, chat messages, and app store downloads allow large corporations to monitor our online behaviour and influence our decision-making.

Machine learning, artificial intelligence, cloud computing and the Internet of Things are creating a constantly growing data cosmos. Amazon knows what we want to buy because it has access to our search history. Facebook (now Meta) knows what ads to present to its users because it can monitor the conversations on its social platforms, including WhatsApp and Instagram. Soon your fridge will be able to tell you are out of milk and notify Amazon, which will suggest new milk that you can pay for with Amazon Pay and have shipped straight to your home. At first glance this creates convenience for us as consumers, but the convenience of enterprise-driven technology carries a social price: the loss of privacy.

There is a discrepancy between our perception of privacy in the analog and in the digital space. In a face-to-face conversation, for example in a restaurant, we would be quite shocked if the restaurant staff listened in to find out what dessert to serve us. If a post office opened every single letter to place advertising directly in our mailbox, we would be outraged. But if Facebook (Meta) monitors messages to send us personalised ads, we accept it, even though the companies that control our data are far less accountable than governments. In the USA, for example, there is no comprehensive federal data protection law (Thompson & Warzel, 2019). We tend to leave our social values behind in the digital space.

It’s not about rejecting technology. But we as consumers should remain critical: Who learns from global data streams? What happens when authority fails? Which logic of accumulation will shape the answers to these questions?

Shoshana Zuboff (2015) explains that in the history of capitalism, each epoch had its own dominant logic of accumulation. The 19th and 20th centuries were characterised by corporate capitalism based on mass production, which gave way to finance capitalism towards the end of the century. The 21st century has introduced a new logic of accumulation: Big Data. Around three billion of the world’s seven billion people are connected to the Internet, enabling computer mediation of a variety of their daily activities. The mediation of consumer information enables predictive analytics, which companies use to increase revenue and reduce costs and capital losses. As a result of pervasive computer mediation, nearly every aspect of the world is rendered in a new symbolic dimension as events, objects, processes, and people become visible, knowable, and shareable in a new way; as Zuboff puts it, ‘the world is reborn as data’. This has created a new asset class, surveillance assets, and ushered the economy into the era of surveillance capitalism (Zuboff, 2015).

This new logic of accumulation has introduced a new paradigm. Nearly 70 years ago, the historian Karl Polanyi observed that the market economies of the 19th and 20th centuries depended on three mental inventions that he called “fictions”. First, human life was subordinated to market dynamics and reborn as ‘labor’. Second, nature was subordinated and reborn as ‘real estate’. Third, exchange was reborn as ‘money’. Polanyi argues that the very possibility of industrial capitalism depended on the creation of these three critical “fictitious commodities”, which had previously been freely available. Life, nature and exchange were turned into things to be bought and sold profitably. Zuboff adds that with the new accumulation logic of surveillance capitalism, a fourth fictional commodity emerges as the dominant feature of market dynamics in the 21st century: reality reborn as ‘behaviour’.

“Reality itself is undergoing the same kind of fictional metamorphosis as did persons, nature, and exchange. Now ‘reality’ is subjugated to commodification and monetization and reborn as ‘behavior.’ Data about the behaviors of bodies, minds, and things take their place in a universal real-time dynamic index of smart objects within an infinite global domain of weird things. This new phenomenon produces the possibility of modifying the behaviors of persons and things for profit and control. In the logic of surveillance capitalism there are no individuals, only the world-spanning organism and all the tiniest elements within it.” (Zuboff, 2015).

Our location and behavior become an identity, a digital DNA. The French philosopher Jean-François Lyotard warned in the 1980s that as knowledge becomes a commodity, private companies could begin to control the flow of knowledge and decide who can access what types of knowledge and when. In just a few decades, personal location data has gone from a record no one ever knew existed to a commodity that private companies can easily acquire. In their NY Times article “Twelve Million Phones, One Dataset, Zero Privacy,” Thompson & Warzel (2019) explain that “citizens would surely rise up in outrage if the government attempted to mandate that every person above the age of 12 carry a tracking device that revealed their location 24 hours a day. Yet, in the decade since Apple’s App Store was created, Americans have, app by app, consented to just such a system run by private companies.”

On the one hand, we gain value when we disclose our location and personal information. If you lose your phone somewhere, there is a real chance of finding it. We get highly accurate traffic navigation that anticipates delays and saves time commuting to work. We can hail a cab wherever we are, whenever we want (Inevitable Human, 2019). On the other hand, we expose ourselves. Thompson & Warzel tracked people in positions of power using publicly available information to show how easy it is to access personal data. The authors identified individual residential addresses and then tracked military officials with security clearances on their commutes. They followed police officers as they took their children to school. They tracked high-powered lawyers as they travelled by private jet to vacation properties. They were even able to make predictions about the private lives of the people they were following. By examining movement on a map, they saw evidence of faltering marriages, of drug addiction, and of visits to mental health facilities. As Thompson & Warzel showed, surveillance capitalism leaves little room for privacy.
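
To make concrete how little anonymity such location datasets actually provide, here is a minimal, hypothetical sketch in Python. The device ID, coordinates and helper names are invented for illustration; the idea is simply that the place where a phone sits overnight is almost certainly its owner’s home and the place where it sits during office hours is almost certainly their workplace, which is roughly the kind of inference Thompson & Warzel describe.

```python
from collections import Counter
from datetime import datetime

# Hypothetical "anonymized" location pings: (device_id, ISO timestamp, lat, lon).
# Real commercial datasets of this kind contain billions of such rows.
pings = [
    ("device-42", "2019-03-04T02:10:00", 40.7433, -73.9781),  # overnight -> likely home
    ("device-42", "2019-03-04T10:30:00", 40.7580, -73.9855),  # office hours -> likely work
    ("device-42", "2019-03-05T01:55:00", 40.7433, -73.9781),
    ("device-42", "2019-03-05T11:05:00", 40.7580, -73.9855),
]

def bucket(lat, lon, precision=3):
    """Round coordinates to roughly 100 m so repeated visits to the same place group together."""
    return (round(lat, precision), round(lon, precision))

def infer_home_and_work(rows, device_id):
    """Most frequent night-time bucket ~ home; most frequent office-hours bucket ~ work."""
    night, day = Counter(), Counter()
    for dev, ts, lat, lon in rows:
        if dev != device_id:
            continue
        hour = datetime.fromisoformat(ts).hour
        place = bucket(lat, lon)
        if hour < 6 or hour >= 22:   # overnight pings
            night[place] += 1
        elif 9 <= hour < 18:         # office-hours pings
            day[place] += 1
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

print(infer_home_and_work(pings, "device-42"))
```

Once a likely home location is known, a simple address or property lookup is often enough to put a name to the supposedly anonymous device ID.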

Shoshana Zuboff quotes US Supreme Court Justice William O. Douglas, who articulated his view of privacy in 1967: ‘Privacy involves the choice of the individual to disclose or to reveal what he believes, what he thinks, what he possesses …’ (Warden v. Hayden, 387 U.S. 294, 323 (1967), Douglas, J., dissenting, quoted in Farahany, 2012: 1271). Data privacy rights mean decision-making rights; privacy creates choices. It should allow you to decide where on the scale between secrecy and transparency you want to be in a given situation. In a free society it is up to everyone whether and how they share personal information. But when the basic tool for exercising that freedom, money, is only available through a smartphone, an app, or other corporate- or government-controlled technological tools that expose our behavior, we become unfree; we lose the ability to selectively reveal ourselves.

As Eric Hughes famously explained in 1993, when ‘[…] we desire privacy, we must ensure that each party to a transaction has knowledge only of that which is directly necessary for that transaction. When I purchase a magazine at a store and hand cash to the clerk, there is no need to know who I am. When I ask my electronic mail provider to send and receive messages, my provider need not know to whom I am speaking or what I am saying or what others are saying to me; my provider only need know how to get the message there and how much I owe them in fees. When my identity is revealed by the underlying mechanism of the transaction, I have no privacy. I cannot here selectively reveal myself; I must always reveal myself.’ (Eric Hughes, A Cypherpunk’s Manifesto, 1993).

In the age of surveillance capitalism, it is becoming increasingly difficult to live out our personal freedoms, because money, the medium through which we are supposed to exercise those freedoms, does not guarantee privacy. The problem grows as the world moves towards a cashless society. 20% of Swedes say they have never withdrawn cash (Reid, 2019). In the UK, cash is set to fall below 10% of all payments over the next 15 years. Asia, the world’s largest and most populous continent (Asia Population Live), is home to eight of the top ten fastest growing mobile payment markets in the world. The problem is that cashless payments and mobile banking are highly dependent on two things: a bank account and a third party. In Southeast Asia, an estimated 73% of the population is unbanked. Mobile payments will help these people get a bank account, but at the same time create massive financial dependency on third parties and lead to financial surveillance (Twigg, 2019).

Cashless currently means organized data. Every cashless transaction leaves behind data that companies collect to organize information about consumers. In addition, this data can be obtained by governments. In China, Alipay, the mobile payment app from e-commerce giant Alibaba, is using its financial data to help the government build a social credit system. Citizens are monitored, rewarded and penalised based on their economic and social behavior. We need private electronic money, i.e. bitcoin, to fight financial surveillance and create a more inclusive and free global financial infrastructure. Financial surveillance will only increase with the introduction of central bank digital currencies (CBDCs). A CBDC is a digital currency issued directly by a central bank. It would allow central banks to fully monitor money flows and spending patterns, and to increase the money supply at will. Christine Lagarde, President of the European Central Bank, has expressed her interest in CBDCs numerous times. The Chinese government intends to launch the digital yuan during the 2022 Beijing Winter Olympics (Rooks, 2022). The Federal Reserve has also commented on its intention to explore a digital dollar, publishing a dedicated section titled “What is a central bank digital currency?” on its official website.

Were it not for NSA whistleblower Edward Snowden in 2013, the public would not be aware of the extent of government-sponsored surveillance and infiltration of online privacy. The National Security Agency (NSA) is a national-level intelligence agency of the United States Department of Defense. Snowden, who had worked for the NSA as an employee and contractor, leaked highly classified documents revealing widespread government surveillance by the agency. A year after Snowden’s revelations, the PEW Research Center released a report on public perceptions of privacy which found that 91% of US adults agree or strongly agree that consumers have lost control of their personal information (Madden, 2014). Before Snowden’s revelations, that number was significantly lower.

Regardless, the 2016 US presidential election showed that most people are not yet skeptical about the information they receive online, do not care about privacy in the digital realm, and do not understand how the personal information they share online is used by corporations to influence their behavior. The “Cambridge Analytica scandal” revealed that digital advisors to the Trump campaign misused the data of millions of Facebook users, psychologically profiling them in order to target voters and thus manipulate voter decision-making. Cambridge Analytica was a British political consultancy and data analytics company. The company was able to obtain information about American voters by analyzing their behavior on Facebook. The information was then sold and used to manipulate voters (Confessore, 2018).

The “Cambridge Analytica scandal” is one of many cases showing how the public’s privacy is being violated. In 2019, it was revealed that police in London had been secretly trialling live facial recognition at Romford train station. Officers installed cameras near the station and monitored every person passing by, without anyone’s consent, while matching faces against a police watch list of criminals (Murgia, 2019). Romford railway station is believed to have been one of 10 or more proving grounds for British police. During the trial at Romford, some pedestrians who noticed what was happening refused to be filmed. In response, officers imposed fines, arguing that these individuals were actively interfering with police work. It is safe to say that citizens have not only been deprived of their privacy rights, they have even been punished for actively demanding them.

The general consensus is that government surveillance doesn’t concern us as long as we have nothing to hide. The reality is that anyone in a profession where information needs to be protected – journalists, activists, lawyers, judges, doctors, politicians, even government officials and police officers – should fear government surveillance and the resulting accessibility of private information (Inevitable Human, 2019).

State media in China claim that Skynet, the country’s connected system of surveillance cameras with facial recognition software, is the largest video surveillance system in the world (Bischoff, 2021). In the city of Shenzhen, police use facial recognition to identify pedestrians who break road rules, for example by crossing at a red light, and to send jaywalkers instant fines by text. Authorities also use giant screens to name and shame offenders. The system uses 7-megapixel cameras and facial-recognition technology to identify pedestrians from a database; a photo of the offence, the offender’s family name, and part of their government identification number is then displayed on a screen (Baynes, 2018).
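
The reporting does not describe the matching pipeline in detail, but facial-recognition systems of this kind typically reduce each camera frame to a numeric ‘embedding’ and compare it against a database of enrolled faces. The sketch below is a hypothetical, heavily simplified illustration of that matching step in Python; the embeddings, names, partial ID numbers and threshold are all invented for the example.

```python
import math

# Hypothetical pre-computed face embeddings. Real systems use vectors with
# hundreds of dimensions produced by a neural network; these are illustrative only.
watchlist = {
    "Zhang (ID ****1234)": [0.12, 0.80, 0.55, 0.05],
    "Li (ID ****5678)":    [0.90, 0.10, 0.30, 0.25],
}

def cosine_similarity(a, b):
    """Similarity of two embeddings: close to 1.0 means very likely the same face."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def match_against_watchlist(query, threshold=0.95):
    """Return the best watchlist entry if it clears the threshold, otherwise None."""
    best_name, best_score = None, 0.0
    for name, reference in watchlist.items():
        score = cosine_similarity(query, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Embedding extracted from a camera frame of a jaywalking pedestrian (made up).
camera_frame_embedding = [0.11, 0.82, 0.52, 0.06]
print(match_against_watchlist(camera_frame_embedding))
# A confident match is then linked to an ID record and, as described above,
# displayed publicly together with a photo of the offence.
```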

Authorities have explained that the next step is to issue fines by text with a direct debit from the violator’s bank account. Wang Jun, director of marketing solutions at the Shenzhen-based AI firm Intellifusion, told the South China Morning Post that his company is in talks with the Chinese social media platforms WeChat and Weibo about such a system. The technology will also register how many times someone has broken traffic rules and, once a set limit is reached, affect their credit score. The Chinese government has powerful influence over all private companies and uses that influence to control its population. In China, public and private surveillance go hand in hand.

History has shown the dangers of unchecked government power. In Germany from 1933 to 1945, Jews, Sinti and Roma, homosexuals, political prisoners and others were deported to concentration camps and murdered en masse. Women did not get equal voting rights in the UK until 1928 (British Library Learning, 2018). As recently as 2017, women in Saudi Arabia were granted the right to access government services without the consent of a guardian (Persio, 2017), and in 2018 King Salman of Saudi Arabia issued a decree allowing women to drive (Specia, 2019). In hindsight, many of the laws governments made were tyrannical. Enforcing such laws with the help of sophisticated surveillance technology therefore poses a serious threat to our personal freedom.

Imagine if the Nazis had been in possession of sophisticated surveillance technology during their totalitarian rule. During the occupation of Poland in World War II, the Nazis established the Warsaw Ghetto, an approximately 3.4 square kilometre district in Warsaw where around 400,000 Jews from all over Poland were confined. Some of its residents survived because the Nazis, when liquidating the ghetto, were unable to monitor the movements of each individual resident, allowing some to hide in underground tunnels and lairs. With smart cameras and facial recognition, the Nazis could have liquidated the entire ghetto population.

We should also critically consider and examine the unintended consequences of creating, processing and centralising biometric data, even in counter-terrorism and humanitarian contexts (Jakobsen, K. L.). Most recently, the biometric data of US government and military collaborators in Afghanistan has fallen into the hands of the Taliban, who are now relentlessly pursuing them.

Public and private surveillance will only increase as technology advances. People must be empowered to circumvent unfair government regulations and enhance their personal liberties. Money is the basis of our actions; without it we cannot live out our personal freedoms. We cannot move, organize or communicate freely. It takes censorship-resistant money, independent of governments and private corporations, to enable a freedom of transaction, and the resulting freedom of action, that is as innate as the freedom to think and speak. As a society, we need to understand that the social values that have governed the analog space must be carried over into the digital space. The constitutional right to privacy of communication and the sanctity of one’s home should apply to the Internet. It all starts with an unconfiscatable money, i.e. bitcoin (Bitcoin Magazine, David Bailey – 10th Anniversary).

Continue reading: An Introduction: Welcome to Bitcoin 

If you enjoyed this piece, you can send me a tip to:
⚡ law@getalby.com

Follow me 
leonwankum.com 
Twitter (X)
Nostr