These days it’s hard to disconnect from digital life, even temporarily. But Mozilla’s Cathleen Berger argues we should give more thought to how we can do that and control the personal data that companies are harvesting.
There’s a new kind of digital divide. It’s not the one we usually talk about when we lament the fact that nearly half of the world is still offline, disconnected from the internet by economic circumstance or geographic misfortune. Rather, it’s the one that makes it a luxury, or a near impossibility, for anyone to ever disconnect from the devices in their homes or in public.
While we are still in the process of connecting the unconnected, we are starting to realise that disconnecting comes at unbearable social, political, and economic cost. It’s a digital divide where only some have the means and knowledge to secure better privacy.
There are many dimensions to this problem.
Start with your communication patterns. You probably feel a social expectation to be constantly online yourself, always available to friends and colleagues through email, text messages, and social media news feeds. Buzzing smartphones have brought teenagers to “the brink of an emotional health crisis,” argues a recent feature in The Atlantic. And the addictive nature of services like Google, Twitter, and Facebook is deliberate.
It’s meant to be difficult to discontinue or switch services.
Next, let’s look at public services. Many governments now require online access for citizens to receive basic public services like tax filings, birth certificates, or visa applications. Often there are dedicated online platforms maintained by individual agencies; in some cases such public services also require the sharing of biometric data, such as fingerprints or iris scans. Practical offline alternatives are disappearing. What is more, new technologies meant to increase the efficiency of public services are often first tried out on society’s least advantaged (and most at risk) with privacy and security features added as an afterthought—if at all.
On the business front, the largest internet companies have a level of detailed knowledge about you and everyone else that is hard to fathom. In many cases, this information is used in ways that make a person’s online experience more convenient, such as showing people relevant ads and search suggestions. When data is combined from different realms, like web searches, time spent on particular websites, friend networks, email, and app and purchase history, a snapshot of a person’s evolving needs, desires, and fears starts to form that is potentially invasive.
We live with the risk that such information could be used for political, religious, or racial discrimination. Micro-targeting aimed at influencing voters during elections is just one prominent example of this kind of data use. Of course, when data is gathered and stored responsibly, it can benefit society in countless ways. Technological developments tend to be introduced with progress and good will in mind. “Smart City” concepts, for instance, which deploy sensors and cameras to streets and public areas, are a means to improve traffic management and decrease pollution in heavily populated areas. The “Internet of Things” brings convenient innovations into our homes, such as voice-enabled assistants that recommend nearby restaurants or smart meters that save energy when you’re away.
What all of these smart cameras, microphones, and neat little helpers also do is collect data—about your whereabouts, your habits, or the people you meet with. Data that can be copied and sold, just as it can be leaked and end up in unauthorised hands. Think of the evolving social credit system in China, the breaches of India’s national biometric database, or everyday global tracking for marketing purposes. If we’re honest, history suggests we should be concerned about potential misuse. In this day and age, the sheer difficulty of ever regaining control of one’s digital life is clearly problematic.
How do we afford privacy?
The sad truth is, money can buy you a lot of things. For example, you can choose to buy the latest devices with the best security standards: a $1,000 iOS phone rather than a scaled-down $30 mobile running Android. For many, the former is not even imaginable. You can also opt to pay higher insurance rates rather than share data from your fitness apps or the GPS records that help insurers assess your (reckless?) driving habits.
If you think even bigger, the number of people who can afford spacious living quarters out of reach of public cameras is small, to say the least. Similarly, it takes a certain influence and social capital to persuade your immediate network to follow your lead on using specific services, such as encrypted messaging apps like Signal. Likewise, it takes a certain level of job security and financial stability to leave messages unanswered and simply assume that people will reach out again if something is really important.
This problem is complex and merits many conversations. These are conversations we should not postpone any longer if we want to prevent our fundamental right to privacy from being slowly but surely sold out.
First, governments and the technology industry should work together to raise awareness, increase transparency, and provide the means to explain how our data is being used and to allow us to intervene if data is being collected, sold, or used against our will.
The EU’s General Data Protection Regulation (GDPR), coming into force on May 25, 2018, has the potential to become the gold standard for how to treat data -- not just within the EU, but worldwide. Additionally, more organisations should adopt and promote lean data practices, which aim to ensure that no unnecessary data is collected, nothing is stored longer than necessary for the particular service offered, and everything is protected to the highest standards.
Second, projects like “Time Well Spent” deserve more attention. They call on the technology industry to refocus on strengthening user agency rather than locking users into singular platforms in a constant battle for attention. We all know the lingering concern that we might have installed a few too many apps, agreed to too many terms of service, or lost track of just how many accounts we opened over the years -- running a step-by-step “8-day data detox” might be just what we need to regain at least some control over our digital lives.
None of the above will fix all our problems. But in order to find appropriate solutions, we have to realise and admit that we’re eroding our privacy, making it a luxury to enjoy life without constant connectivity.
Cathleen Berger is leading Mozilla’s Strategic Engagement with Global Internet Fora. In this position she is in charge of identifying emerging trends around privacy and security, digital inclusion and literacy, openness and decentralisation in order to remain aware and ahead of global tech policy developments. She tweets at @_cberger_.