Principle 10: Public Awareness

“With great power comes great responsibility” – at least according to Spider-Man. Today, data-driven companies and states hold a lot of power, due to the knowledge they have about people and their behaviour. With this power comes a responsibility to create public awareness about how data is being collected and used in our society.

The Privacy Paradox

When asked, people generally say that they care very much about their privacy. But when we look at how people actually behave, it seems that they care very little about their privacy. This is called the ‘Privacy Paradox’.

There are several possible explanations for this paradox, but one is that people are not informed about all the ways in which their privacy is, and can be, threatened. At Seluxit, we see it as our duty to create public awareness, so that people can make better informed decisions regarding their privacy.

Lack of Awareness – Examples

Many people don’t know how and when data about them is collected, or what it is used for; it remains a black box. However, some of the recent data scandals have increased people’s awareness of the data-driven threats to their privacy. For example, Edward Snowden’s revelations about how the National Security Agency (NSA) collects data about everyone spawned a massive public outcry. Suddenly, the public was made aware of how its privacy was threatened by the government’s collection of data.

Another recent example is the Cambridge Analytica/Facebook scandal, which unfolded in the aftermath of the 2016 US presidential election and the Brexit referendum in the UK. It turned out that millions of Facebook users’ data had been analyzed in order to determine voters’ political preferences. Voters were then targeted individually with political advertisements specifically designed to make them change their minds. Most people had no idea that this was going on, or even that it was possible, before the scandal was all over the news.

Pro-Active Awareness

Instead of waiting for the next scandal to occur, it is much better to increase public awareness proactively. In fact, we believe that data companies, which have the knowledge that the public lacks, have a special duty to share some of that knowledge publicly.

At Seluxit, we do several things to live up to this duty. We talk about IoT and data in public venues, we engage in educational activities, and we write about data ethics in these principles.

It is important to create awareness on issues of data ethics.

Principle 9: Compatible Purposes

When a company collects personal data for one purpose, it may be tempted to reuse that data for entirely different purposes later on. The Principle of Compatible Purposes says that the purpose of processing data should be compatible with the purpose for which the data was collected. But what does that mean in practice?

Data Analytics

Data analytics makes it possible to extract new knowledge from a given data set. This means that when a person called Smith gives a company access to his data, the company may be able to extract new knowledge from it, even knowledge that Smith didn’t know could be extracted.

According to the Principle of Consent (and the GDPR), a company should obtain consent from Smith every time it uses the collected data for a new purpose, unless Smith has already explicitly given consent for that purpose. This applies not only to collecting data, but also to processing it.

Compatible Purposes

The best way to comply with this is to consider the purpose of the data processing every time data is processed. If you are processing data for a purpose that is not the same as the purpose for which the data was collected in the first place, that is a good sign that you should stop and consider whether you need new consent.

Here, the Principle of Compatible Purposes resembles the purpose-limitation requirement in the GDPR’s Article 5(1)(b): “… collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes …”.
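
As a purely illustrative sketch, such a purpose check can be thought of as a guard that runs before any processing step. The function names and purposes below are hypothetical, not part of any real consent system.

```python
# Minimal sketch of a purpose-compatibility guard (hypothetical names and purposes).
CONSENTED_PURPOSES = {"smith": {"targeted_marketing"}}  # purposes each data subject consented to


def may_process(subject_id: str, purpose: str) -> bool:
    """Return True only if the subject has consented to this exact purpose."""
    return purpose in CONSENTED_PURPOSES.get(subject_id, set())


def process_data(subject_id: str, purpose: str) -> None:
    if not may_process(subject_id, purpose):
        # Stop here and obtain new consent before processing any further.
        raise PermissionError(f"New consent needed for purpose: {purpose}")
    print(f"Processing data for {subject_id} under purpose '{purpose}'")


process_data("smith", "targeted_marketing")   # allowed: Smith consented to this purpose
process_data("smith", "political_profiling")  # raises PermissionError: new, incompatible purpose
```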

Let’s look at an example to see the practical implications of this.

An Example

Smith is a user of a social media platform. He gave consent to the company collecting certain personal data for the purpose of targeted marketing. Jones, a data analyst at the company, discovers that the users’ data can be used to determine their political preferences. He starts employing data analytics on all users’ data, including Smith’s, and now the company holds precise political profiles of all its users.

Here, the company should inform Smith about the change in purpose and ask for new consent. After all, he never knew that his consent implied that the company would create a political profile of him.

Complying with the Principle of Compatible Purposes

It can be very hard to determine whether two purposes are compatible. In order to be on the safe side, it’s a good idea to get new consent from the data subject whenever the purpose of data processing is not exactly the same as the purpose for collecting the data.

At Seluxit, we try our best to inform the data subjects and collect new consent every time the purpose of processing their data changes. And we strongly encourage our customers to do the same.

The purpose of processing data should be compatible with the purpose of collecting the data.

Principle 8: Accountability

Any company that collects and uses personal data should be accountable to its data subjects and the society around it. But what does it mean to be accountable in a data context?

What is Accountability?

Simply put, accountability has to do with taking responsibility for one’s actions and the consequences thereof. In a data context, it often means being able to explain and justify how, when, what and why data is collected. Legally speaking, a company must take certain organisational measures to answer these questions and demonstrate that it complies with the relevant laws and regulations (see the EU’s independent data protection authority). But accountability also means that the company must take responsibility for unforeseen consequences of its use of data. Let us look at an example.

Example

Imagine an online dating app which profiles people in order to better connect potential partners with each other and target users with relevant advertisements. Due to poor data security, a political campaign gains access to users’ personal data. The campaign collects the data and uses it to determine people’s political preferences, in order to better target them with political advertisements and spread fake news about opposing parties.

Damage Control and Remedies

In the example above, the political campaign that ‘stole’ the data is certainly responsible for a lot of wrongdoing. But that doesn’t exonerate the app company from blame: it should have protected its users’ data better. The app company should, first of all, do all that it can to stop the problem before it unfolds further. Second, it should inform all the affected data subjects about the situation. Third, it should fix the security problem so that it doesn’t happen again. Finally, it should try to remedy the consequences of the breach. Such remedies can include accepting legal consequences, voluntarily compensating the ‘data victims’, etc.

Compliance with the Principle

At Seluxit we are accountable for all our actions, and we take the responsibility that comes with holding and using data very seriously. We comply with all relevant data regulations, including the GDPR, and should our use of data have unforeseen bad consequences in the future, we will inform everyone involved and do our best to resolve the issue.

It is Important to Take Responsibility for How You Handle Data.

Principle 7: Security

When you tell a secret to a friend, you normally don’t want your friend to tell that secret to anyone else. Depending on what the secret is, someone’s reputation or even physical safety can be threatened, if your friend doesn’t keep the secret. The same goes when you give your personal data to a company or a state.

What is Security?

Simply put, security is about protecting data from unauthorized access, use, alteration or deletion. Different measures can be taken to make sure that data is secure. For instance, you can physically locate the data somewhere safe, you can encrypt the data so that it is completely incomprehensible without the correct decryption key, and so on.
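
As a minimal sketch of the encryption measure mentioned above (an illustration of the idea, not a description of Seluxit’s actual security setup), the snippet below uses symmetric encryption via the Fernet scheme from the third-party Python cryptography package. Without the key, the stored ciphertext is unreadable.

```python
# Minimal sketch: encrypting sensitive data at rest with symmetric encryption.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # store this key separately and securely, never next to the data
cipher = Fernet(key)

plaintext = b"heart rate: 92 bpm, location: 57.05N 9.92E"  # made-up example data
ciphertext = cipher.encrypt(plaintext)                      # unreadable without the key

# Only someone holding the key can recover the original data.
assert cipher.decrypt(ciphertext) == plaintext
```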

Why is Security So Important?

Security is important for several reasons. One reason is that it helps protect people’s privacy. If customers give their personal data to a company, it is normally with the expectation that the company protects the data. If the data in question are very personal, a security breach can lead to an immense invasion of privacy.

Security is not only an important feature of privacy protection; it is also important for avoiding blackmail, financial fraud, espionage, and so on.

Example #1

Smith owns and uses a wearable fitness tracker, which collects data about Smith’s activity level, heart rate, sleep patterns, location, etc. Unfortunately, the manufacturer who collects all the data has very poor data security. One day, a hacker named Jones gains access to all of Smith’s data. Jones can see that Smith’s heart rate normally goes up for 30 minutes every Tuesday night at the same location. Jones makes a quick Google search to see who lives at that location. It turns out that it is one of Smith’s colleagues. Jones now uses this information to blackmail Smith: he wants $10,000, or he will reveal to Smith’s wife, children and friends that Smith is having an affair with the colleague.

Example #2

Smith owns a robotic vacuum cleaner, which is connected to the internet. As it moves around in Smith’s house, it collects data about when it is used, where it is used, etc. After a few months of vacuuming, the manufacturer of the vacuum cleaner has a detailed map of Smith’s house, and it knows when Smith is usually not home. Unfortunately, this manufacturer also has very poor data security, so Jones is able to get access to Smith’s vacuum data. He sells the data to a group of burglars, who break into Smith’s home the next day, when Smith is not there.

Bad Business and Harmful Consequences

Any company or state that collects data, especially personal data, should make sure it has a sufficient level of security. For a company, the consequences of a data breach can be devastating, since a breach can completely undermine customers’ trust in the company. This is of course bad for the company. But as the examples above demonstrate, security breaches can also have outright harmful consequences for the data subjects.

Make Sure that Data Is Secure.

Principle 6: Profiling

Many companies and states create very detailed and precise profiles of you, for many different reasons. Some uses of profiling are completely unproblematic, while others are very unfair. The Principle of Profiling is simple: Avoid Unfair Use of Profiling.

What is Profiling?

Profiling refers to the automated processing of data about a person, often with the intention of making statistical predictions about that person. In order to build precise profiles, companies and states collect huge amounts of data and apply statistical algorithms (data analytics) to the data in order to find patterns.

These patterns can be used to predict whether people are likely to buy a certain product, vote for a certain political party, commit a crime, and so on. In short, profiling is used to make all sorts of important decisions which can have a very significant impact on people’s lives.
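
To make the mechanics concrete, here is a toy sketch of profiling as statistical prediction, using logistic regression from scikit-learn. The features, numbers and labels are invented for illustration; real profiling systems use far more data and far more complex models.

```python
# Toy sketch of profiling: predicting behaviour from personal data (all numbers invented).
from sklearn.linear_model import LogisticRegression

# Each row describes a person: [age, monthly spend in a product category].
X = [[25, 40], [31, 10], [45, 90], [52, 5], [29, 70], [60, 15]]
# Label: did the person buy the product? 1 = yes, 0 = no.
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# The fitted model is, in effect, a profile: it estimates how likely a new person is to buy.
probability_of_buying = model.predict_proba([[35, 60]])[0][1]
print(f"Estimated probability of buying: {probability_of_buying:.2f}")
```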

When is Profiling Problematic?

Profiling is not problematic in itself. We all make profiles of each other in our minds: we get information about each other, and we automatically form opinions and make predictions based on it. So when is profiling problematic? Simply put, profiling is problematic when it is used in a harmful or unfair way. Let us look at an example.

An Example

Imagine that Smith and Peter both apply for long-term loans in the same bank, in order to buy new homes for their respective families. They both have the same age, the same income, and so on. In order to decide whether they are eligible for the loan, the bank profiles Smith and Peter. The algorithm recommends that Smith is not eligible for the loan, while Peter is.

Smith complains to the bank advisor Jones, explaining that he and Peter have the same age, the same income, etc., so he doesn’t understand why he is not eligible while Peter is. Jones explains that, historically, people living in Smith’s neighbourhood tend to have a bad track record when it comes to paying back long-term loans. This, and only this, is the reason why the algorithm declined Smith’s application but approved Peter’s.

Fairness

Most people will probably agree that the way profiling is used in the example above is unfair. The profiling of Smith and Peter is not problematic in itself, but it is used in a discriminating and unfair way, by treating them differently due to how people in Smith’s neighbourhood have behaved in the past.

All things being equal, profiling should be used fairly in a way that treats like cases alike. If profiling is used to determine whether someone is eligible for a loan, it should treat people of similar age and similar financial circumstances alike. Similarly, if profiling is used to determine whether someone is eligible for a life-saving medical procedure, it should treat people with similar symptoms and health conditions alike, and so on.

When humans make decisions, they do not always treat like cases alike either. Humans are biased in all sorts of ways. Profiling can help decrease such biases, but it can also confirm or amplify them.
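
One way to make ‘treat like cases alike’ operational is a simple consistency check: give the decision model two applicants who are identical on the relevant factors and differ only on a factor that should not matter, and flag the model if the outcomes differ. The sketch below uses a hypothetical score_applicant function, invented here purely to illustrate the check.

```python
# Sketch of a 'like cases alike' check (score_applicant is a hypothetical stand-in model).
def score_applicant(age: int, income: int, neighbourhood: str) -> bool:
    """Stand-in for a bank's loan model; returns True if the loan is approved."""
    return income > 30000 and neighbourhood != "riverside"  # deliberately unfair rule, for illustration


smith = dict(age=40, income=45000, neighbourhood="riverside")
peter = dict(age=40, income=45000, neighbourhood="hilltop")

if score_applicant(**smith) != score_applicant(**peter):
    # Same age and income, different outcome: the model does not treat like cases alike.
    print("Warning: the decision depends on neighbourhood, a factor that should not matter here.")
```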

Biased Algorithms

Besides treating like cases alike, the algorithms used for profiling should not confirm or amplify biases. In the example above, it might be true that people in Smith’s neighbourhood tend to have bad track records when it comes to paying back long-term loans. But the neighbourhood itself is probably not the real cause. It is more likely that some people with bad track records tend to move to that neighbourhood for some other reason, for example because it is cheap. When Smith’s application is turned down, the algorithm confirms the pattern that people from that neighbourhood should be denied loans.

As much as possible, profiling algorithms should be based on real causes, rather than on spurious correlations with factors like location, race, ethnicity, physical appearance etc. In general, we must avoid the unfair use of profiling.

Avoid unfair use of profiling.

Principle 5: Public Data

Just because data about people are publicly available, it does not always mean that a company may collect or use these data. Publicity is not a carte blanche. At the same time, there is certain data that ought not to be completely private, such as data about people’s contagious diseases like the swine flu or COVID-19.

Waiving Rights

In the Principle of Data Ownership (Principle 1), we saw that people often lose their control rights over their data, when they make these data publicly available. When Smith walks down the street, and Jones can see Smith’s physical appearance, Smith has waived his control right over data about his physical appearance. All else equal, Jones may now tell his friend Peter about Smith’s physical appearance.

There are many cases where it is straightforward to determine whether someone has waived control rights over data. Let’s consider a few examples.

Examples

Imagine that Smith is keeping a diary. Unbeknownst to Smith, Jones opens the diary and takes pictures of the pages and uploads them on a social media platform. Much personal information about Smith is now publicly available.

Does that mean that anyone may now use this data as they wish? Clearly not. If for example Peter sees the pictures, and he knows that Smith did not want them to be published, then Peter should not distribute them further. And, Peter should not store the pictures for future use.

Similarly, if people’s social security numbers – or other personal information about them – are leaked for public display, due to an accident or a cyber attack, then other companies should not access this information or store it for future use in their databases.

What these examples show is that data is not always published voluntarily by the data subject. In those cases, the data subject has not waived the rights over this data, so others should abstain from accessing, storing, using or further distributing this data.

Borderline Cases

Most people will probably find the Principle of Public Data fairly uncontroversial. It seems intuitive enough that if people publish personal information voluntarily, then they lose their right to control the information. And if people’s personal information is published involuntarily, then they have not lost this right.

But in some cases, it is not so straightforward to determine exactly what the limit is between being private and being publicly available.

Imagine that Smith is sitting on a bench in a public park with his friend Carl. A few people are walking by, but no one seems to be paying attention to Smith and Carl’s conversation. Smith tells Carl all sorts of personal information about his sex life. Unbeknownst to Smith and Carl, Jones is eavesdropping on their conversation.

Smith has decided to talk about personal information in a public space, but does that mean that Jones may eavesdrop on the conversation and use and distribute the information further as he wishes? People may have different moral intuitions about this case.

In order to be on the safe side, and in order to avoid breaking the Principle of Trust (Principle 4), it is a good idea to only collect publicly available data if these data have clearly been published voluntarily by the data subjects themselves. In addition, it’s a good idea to inform the data subject that the data is being collected. At a time when personal information is flowing more or less freely online, it is important to remember: It’s Not Always Right to Use Data Just Because It’s Public.

It’s Not Always Right to Use Data Just Because It’s Public.