January 16, 2021
- The issue of data privacy has come under the spotlight, with debates triggered by the police’s use of TraceTogether data and the privacy policy changes of WhatsApp
- Besides your name, age, gender, education and employment, data companies know what vehicle you own, the size of your home, your socioeconomic status, the websites you visit, your tendency to default on loans, and even your health problems
- The enormous use of personal data by corporations and governments has become part and parcel of today’s connected society and quality of life, but the data also risks being misused or falling into the wrong hands
- Data privacy law differs from country to country. Experts say Singapore’s Personal Data Protection Act is not designed to ensure data privacy as its chief aim, but was conceived to achieve a narrower goal of protecting personal user data
- The onus is also on the individual to keep asking questions about their personal data and not simply sign away their privacy, experts added
SINGAPORE — When messaging app WhatsApp’s new privacy policy sparked a global exodus from its services, Mr Darren Chin’s company — a local tech firm — decreed that its entire staff was to cease using the Facebook-owned platform for work.
“We’ve all been using WhatsApp for many years, but the company’s top management decided to ban it and we cannot say no,” said Mr Chin, 52.
“Everyone in the tech chat groups in my company switched to Signal, so I did so as well. I agree with the decision too: Data privacy is important to me,” the IT operations specialist told TODAY — ironically in an interview conducted over Facebook at his request.
Signal and Telegram are rival encrypted messaging apps that have surged in popularity amid the fiasco as alternatives to WhatsApp.
WhatsApp has since come forward to clarify that its new terms will not allow its parent company to access the app users' messages, and delayed its February deadline for users to accept the terms. But the new policy will still allow WhatsApp to share more information with Facebook and roll out advertising and e-commerce.
Around the same time, a similar controversy unfolded involving Singapore’s contact-tracing system TraceTogether, after Singaporeans found out earlier this month that their Bluetooth proximity data could be used for criminal investigations.
This led some users to switch off their apps or leave their tokens at home, despite the fact that such data is critical in fighting the unprecedented Covid-19 pandemic, TODAY previously reported.
The dust has yet to settle on both controversies. The Singapore Government announced on Jan 8 that it would introduce new legislation to limit how the police might use TraceTogether data to investigate only serious offences, such as murder, terrorism and rape.
Still, the events of the past two weeks have stirred up intense interest among many in Singapore on the topic of personal data, leading an enterprising few to pore over the privacy policies of their social media services or probe the Criminal Procedure Code, the law that empowers the police here to access all kinds of data for investigative purposes.
Work and social chat groups have also been uprooted from WhatsApp as some users switched to a different platform before continuing their spirited discussions on the topic.
A survey by accounting firm KPMG in 2016 found that compared with other citizens in Asia, Singaporeans felt the most defenceless over the way organisations handled and used their personal information, and were also among the most concerned about it.
Despite this, most people still have no qualms clicking through message prompts that say “I consent to the use of my data” whenever they are using a new device, downloading a new application, or visiting a website for the first time.
Professor Simon Chesterman, dean of the National University of Singapore (NUS) Faculty of Law, said: “Something that has long puzzled those who study data protection laws is the wide gap between what people say and what they do.
“Everyone claims to care about privacy in theory, but then in practice they share the most intimate details of their life with telco companies, or through social media, the whole world.”
[See this comment on how people are unthinking in the internet age. Quote:
Yes, you can do everything allowed by Facebook or whatever else social media you are using to minimise exposure. So tell me, how do you ensure that those privacy settings and policies are enforced and maintained? What happens when those policies are changed? Can you stop them from being changed? How?
We post and share all sorts of private information on Social Media. Then get indignant or surprised that people know so much about us.]
While awareness of data privacy has grown among Singaporeans since the Personal Data Protection Act (PDPA) was enacted in 2012, there is still a lack of understanding of whether their digital footprint is being used by others for fair or foul purposes.
With the help of technologists, lawyers, privacy researchers and advocates, TODAY explores the debate that lies at the intersections of commerce, law and ethics.
THE BUSINESS OF DATA
In October 2000, Google launched its advertising platform AdWords, a service that allows advertisers to target potential customers by matching ads to their Google searches.
By 2019, advertising accounted for a whopping 83 per cent of Google’s revenue of US$160.7 billion (S$213.03 billion), all the while allowing people to Google things on the internet for free.
To many people, it is a boon to get a customised online experience without paying a single cent, because your device knows what you like and predicts what you want to see.
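To illustrate the basic idea in code, here is a minimal sketch of keyword-matched ad selection in Python. The advertisers, bids and scoring rule below are all invented for illustration; Google's actual auction is far more elaborate.

# A toy search-ad matcher: advertisers bid on keywords, and the ad whose
# keywords best overlap the search query (weighted by bid) is shown.
# All data here is hypothetical.
ads = [
    {"advertiser": "ShoeMart", "keywords": {"running", "shoes"}, "bid": 0.80},
    {"advertiser": "BookBarn", "keywords": {"books", "novels"}, "bid": 0.50},
]

def pick_ad(query):
    """Score each ad by keyword overlap with the query, weighted by its bid."""
    terms = set(query.lower().split())
    scored = [(len(ad["keywords"] & terms) * ad["bid"], ad) for ad in ads]
    score, best = max(scored, key=lambda pair: pair[0])
    return best if score > 0 else None

ad = pick_ad("best running shoes")
print(ad["advertiser"] if ad else "no ad matched")  # prints: ShoeMart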
“Data has become crucial to providing enhanced user experiences, creating a more connected society and improving our quality of life,” said Ms Rachel Ler, general manager and vice-president of data protection firm Commvault.
Google’s success story mirrors that of the other Big Tech firms, Amazon and Facebook, as well as the multi-billion dollar industry of smaller-sized data brokers that has blossomed since the dawn of the internet.
Associate Professor Lim Yee Fen, an author of six books on privacy law and technology, said: “The amount of information about you held by private companies and governments is enormous — they know more about you than you know about yourself.
“It’s no question that an entity like Facebook can easily build a full profile about you, about where you live, where you work and also what you are,” said the business law professor at Nanyang Technological University (NTU)’s Nanyang Business School.
Today, there are more than 4,000 data brokerage firms around the world that collate databases of people by scraping the data from online records or buying them outright.
Acxiom, one of the largest data brokers and based in the United States, holds detailed lists of around 2.5 billion people in more than 60 countries, a figure that it proudly boasts on its corporate website.
Associate Professor Terence Sim, the assistant dean of communications at the NUS School of Computing, said: “Companies are willing to pay for such data to better target their advertisements, or to discern customers' needs, or simply to increase its base of customers.”
It is hard to pin down a dollar value on how much personal data is worth, Assoc Prof Sim said, noting that a substantial amount of personal data is also traded on the dark web.
The variety of data that feeds this industry is staggering — Acxiom knows 11,000 things about each person in its database, for example.
Besides your name, age, gender, education and employment, these companies know what vehicle you own, the size of your home, your socioeconomic status, the websites you visit, your tendency to default on loans, and even your health problems.
And these brokers’ databases arguably pale in comparison with what Facebook, Alibaba, Google, or even your local telco and credit card company know about you, since their users voluntarily submit personal information in order to use these services, said experts.
Mr Alfred Siew, who co-founded tech-centric news portal Techgoondu, said: “The scary thing is the market power that some of these private companies have, especially when they expand into adjacent industries that they previously had no business in — like Facebook rolling out its own cryptocurrency, for example.”
He cited a 2012 case when Target, a chain of department stores in the US, sent coupons for baby clothes and cribs to a girl who was still in high school, prompting complaints from her father.
The store manager apologised for the error, but as it turned out, it was Target’s pregnancy prediction system that correctly guessed that the girl was pregnant.
“Her father just didn’t know. Target knew about his daughter’s pregnancy before he did,” said Mr Siew.
Target’s prediction model, as reported by The New York Times, allowed it to expand its sales to a demographic of expecting parents, whose shopping habits “are more flexible than at almost any other time in their adult lives”.
[Does Sherlock Holmes' deductive (and inductive) reasoning impress you? Well, what Target does is the same. Its "predictive model" extrapolates from behavioural changes that strongly suggest a shopper is pregnant - maybe she buys new vitamins, or stops buying sanitary pads, or shows some other combination of behaviours. BUT (and this is important) the predictive model can ONLY WORK if Target has a baseline or trend of your past behaviour and your current behaviour AND CAN LINK YOUR PAST AND CURRENT SHOPPING BEHAVIOUR. Which means identifying the shopper with the credit card used, or membership card presented. Personally, I try to pay in cash as often as possible, and not to "sell" my personal details for membership privileges.]
All the store needed to put together was shoppers’ profiles, data about their shopping habits and their home addresses, in order to send discount coupons that targeted their preferences.
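To make the mechanics concrete, here is a minimal sketch, in Python, of how such purchase-based prediction could work. The products, weights and threshold are invented for illustration; Target's actual model is proprietary and far more sophisticated.

# A toy purchase-based prediction score. Feature names and weights are
# hypothetical; the key point is that a loyalty-card or credit-card ID is
# what links past and current baskets to one shopper.
purchases = {
    "card_1234": {"unscented_lotion": 2, "prenatal_vitamins": 1, "cotton_balls": 3},
    "card_5678": {"beer": 6, "crisps": 2},
}

# Hypothetical weights: products whose purchases tend to rise early in a
# pregnancy get larger positive weights.
weights = {"unscented_lotion": 0.4, "prenatal_vitamins": 1.5, "cotton_balls": 0.2}

def pregnancy_score(basket):
    """Sum weighted counts of signal products; higher means more likely."""
    return sum(weights.get(item, 0.0) * count for item, count in basket.items())

for card_id, basket in purchases.items():
    score = pregnancy_score(basket)
    if score > 1.0:  # arbitrary threshold for this sketch
        print(f"{card_id}: send baby-product coupons (score={score:.2f})")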
At the time, people were already calling such data collection and number crunching a step too far for companies.
But today, there is a wide range of scenarios in which such data collection can be misused, with users none the wiser, said experts.
Worse, firms that collect data, whose business it is to know how their users’ data is handled, often do not know either.
In 2018, the personal data and medical records of around 1.5 million SingHealth patients were hacked in a state-sponsored attack. An inquiry later found that system vulnerabilities and staff lapses had enabled the data breach to occur.
That same year, Facebook faced a strong backlash from users when the Cambridge Analytica scandal was uncovered.
The British political consulting firm — now defunct — had improperly obtained the private psychological profiles of millions of Facebook users and provided data analytics to Mr Donald Trump’s campaign in the 2016 US presidential election.
By the time Facebook revealed that it had found out about the leaked data and acted to stop Cambridge Analytica’s data collection, the damage was already done.
Assistant Professor Reza Shokri, who is the NUS Presidential Young Professor of Computer Science, said: “(Such misuses) could have a severe effect, for example, a loss of trust, and having a hypocritical society where citizens feel that they need to always hide something from others due to this lack of trust.”
Assoc Prof Sim added that another danger comes from a phenomenon known as “function creep”, in which personal data is used for a purpose other than the one for which it was first given.
“One example is using TraceTogether data for criminal investigation when its original purpose was for contact tracing. Function creep is a breach of privacy, unless explicit consent is obtained from the person concerned for the secondary usage of his data,” he said.
[This is a concern. BUT "The Righteous Man does not fear the knock on his door at midnight because his TraceTogether showed him in the vicinity of a Robbery or some other [serious] crime." This is concerning because it will soon be mandatory to use TraceTogether to contain and control the spread of this pandemic. BUT the use of TraceTogether data by the State is not something we can stop or prevent (easily). However, this is not the only avenue open to the police to investigate. I am sure they use CCTV/security cameras, traffic cams, EZ-Link card and credit card use and all sorts of data to paint a picture or to locate a suspect at the scene of a crime. If the police use the information from your In-Vehicle Unit to show that you drove your car into a carpark nearest the scene of a murder, are you going to be able to successfully get the court to rule the evidence as inadmissible? Even in a US court?]
THE AMORPHOUS CONCEPT OF DATA PRIVACY
While data privacy is a concept that originated in Western countries and is protected by their respective constitutions, the laws in Singapore, namely the PDPA, were not designed with data privacy as their chief aim.
In general, data privacy concerns the right of a person to be “left alone” by data firms, though its definitions are still amorphous and being debated by scholars, said Mr Steve Tan, a partner at law firm Rajah & Tann.
Mr Tan said the notion of privacy in the US is attached to the concept of freedom and individual rights, while in Europe, it is about the honour and dignity of the person.
Internet laws in these jurisdictions thus prioritise data privacy, such as the European Union’s General Data Protection Regulation (GDPR), which imposes harsh penalties on firms that collect data without following stringent conditions.
On the other hand, the PDPA was conceived to achieve a narrower aim of ensuring data protection, which is about safeguarding personal data from misuse and unauthorised access. Data protection is a subset of data privacy, Mr Tan added.
Around 2010, the authorities here recognised that there was no such law in Singapore, and sought to strike a balance between giving companies the ability to harness personal data for business innovation and giving individuals the right of consent.
“For Singapore, the impetus for us to come up with data protection regulation was because of the digital economy and the globalisation of such an economy, where you see a proliferation of data being transferred all around the world,” said Mr Tan, an NUS adjunct professor who lectures about privacy and data protection laws at the university’s law faculty.
In a nutshell, the PDPA governs the collection, use, disclosure and care of personal data, and recognises both the rights of individuals to protect their personal data and the needs of organisations to use personal data for legitimate and reasonable purposes.
While the PDPA does not apply to the Government, Mr Tan said that even if it did, there would still be an exception when it comes to crime and security. This is similar to GDPR exemptions for law enforcement and national security.
The use of aggregated personal information and public surveillance data to fight crime and deter security threats is a growing trend around the world, though such Minority Report-esque measures are not always greeted with enthusiasm.
One oft-cited example is China’s social credit scoring system, which uses a combination of personal data and state surveillance to reward or punish behaviours, and has attracted the ire of privacy advocates.
In the Netherlands, police are trialling a data project that uses a network of sensors to measure noise levels and emotional tones in voices, which can trigger a police intervention, as an increase of these levels above certain thresholds implies greater crime risk.
In 2019, Second Minister for Home Affairs Josephine Teo also spoke about the potential of using data analytics to make “predictive policing” a reality, though she gave few details of what this entails.
Some of those interviewed said more privacy safeguards are needed, which would ultimately foster trust among individuals, corporations and governments.
One key tenet of data legislation is that personal data belongs to the person who generates it, and not the organisations that collect and compile it, they said.
“Ensuring privacy is a way to establish trust and balance the power between individuals and organisations,” said Dr Shokri from NUS.
“Without privacy, the balance of power changes towards the governments and big corporations, which can significantly influence individuals in many different ways… Studies show that individuals under observation are more risk-averse and less creative.”
Nevertheless, the debate is not about the irreconcilable differences between an individual’s privacy and the greater good of society, said experts.
With TraceTogether, for example, contact tracing can be performed without the need to collect and share contacts in real time with the authorities, said Dr Shokri.
“It is a fact that most of the ‘good’ can still be achieved with privacy in mind,” he added.
And although societies have not agreed on what is the right path to take, analysts noted that the rest of the world is gradually moving towards Singapore’s version of data protection laws, rather than continuing to beat the drum of protecting users’ privacy above all else.
One 2018 study from Tilburg University in the Netherlands concluded that the EU’s GDPR is “growing so broad that the good intentions to provide the most complete protection possible are likely to backfire in a very near future, resulting in system overload”.
The study by Associate Professor Nadezhda Purtova argued that the GDPR has become bogged down by compliance rules that constrict innovation and are impossible to maintain, since most data useful to businesses can be considered personal data and can trigger GDPR rules.
NUS’ Prof Chesterman said: “I think the world has largely shifted away from privacy — in the sense of being able to isolate yourself from the public gaze — to data protection.
“Rather than wanting to be ‘left alone’, what most of us want is some measure of control over how information about us is collected, used, and disclosed. It’s telling, for example, that not only the PDPA but also the EU’s GDPR don’t really use the word ‘privacy’ any more.”
SAFEGUARDS ARE A WORK-IN-PROGRESS
Giving individuals some control means putting in place better safeguards that can give them greater confidence over how their data is being used and ensuring that companies that hold data are transparent.
For example, the Monetary Authority of Singapore is developing “fairness metrics” for the use of artificial intelligence (AI) and data analytics in credit risk scoring.
The move comes as more financial technology (fintech) firms adopt automated machine-learning AI to assess a person’s creditworthiness based on personal data not usually used by traditional banks, such as his or her spending habits, professional background and lifestyle.
Rajah & Tann’s Mr Tan said: “AI decision-making is still at a very nascent level and there are of course a lot of privacy issues. The key issue is that the AI could decide wrongly, or have a bias, causing unfair outcomes.”
Last year, the Infocomm Media Development Authority (IMDA) and the Personal Data Protection Commission (PDPC) also launched a guide to help companies share data securely with trusted partners.
Commvault’s Ms Ler added that having the right tools and robust regulations is key to ensuring data is collected, used and stored properly.
“Just as companies need to gain the trust of customers and partners when they hold on to data for business purposes, transparency needs to be built on a national level as well,” she said.
Using the Government’s pledge to delete sensitive TraceTogether data once it is no longer needed as an example, she said this requires a framework of sophisticated data management to ensure that datasets are appropriately tagged, protected and accessed.
But while the state may have the muscle and know-how, many businesses that handle customers’ data still “don’t get it”.
NTU’s Assoc Prof Lim said: “The big problem is that for a lot of organisations here who claim to understand the government requirements, in practice, they only see data protection as a compliance burden or cost. They don’t realise that it could end up being a cybersecurity issue that could bankrupt them.”
Cybersecurity solutions are expensive, but the penalties and consequences of data breaches are higher. Firms with an annual turnover exceeding S$10 million can be fined up to 10 per cent of their revenue following the most recent amendments to the PDPA passed last year.
There is also the reputational hit that a business can suffer if it emerges that customers’ private data is poorly guarded.
Mr Tan, who has assisted many organisations with investigations by the PDPC, said that there are plenty of small- and medium-sized enterprises (SMEs) which are too resource-strapped to comply.
“That's the reality. A lot of SMEs are under the radar until there's a complaint or when something happens and they are taken to task. It is not because they aren’t aware of data protection laws, but are so focused on survival that they decide to take a calculated risk,” he said.
Another chink in Singapore’s data protection law is that the PDPA does not consider anonymised data as personal data, the experts noted.
Data anonymisation is a way to strip out personal identifiers such that the person the data comes from cannot be identified. Data analysts use such data to look for trends and patterns in a population without needing to identify the source of the data.
But anonymised data is easily re-identifiable by combining multiple sources of anonymous data, said NUS’ Assoc Prof Sim.
This is because data related to an individual is almost always unique to that person, even if no identifying attributes such as names or ID numbers are used.
Dr Shokri said: “On average, it is enough to know the five locations that you routinely visit during a day — your home, your workplace, the gym you go to, the restaurant you go for dinner, the school where you dropped off your child — to identify you among everyone on Earth.
“Knowing eight friends on social media also makes you unique. The same is true about your medical record, the people you call or message regularly, the movies you watch, the websites you go, the people that you meet and have contact with on a daily basis.”
There is no way data can be truly anonymised, unless noise and uncertainty are intentionally added to the dataset, thereby “damaging” the data.
[Anonymised and aggregated. No individual information, only aggregated information. And there must be a minimum size for aggregates - so you cannot, for example, drill down to a group of just five or 10 individuals.]
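For the technically curious, here is a minimal sketch of those two safeguards: a minimum group size before an aggregate may be released, and Laplace noise added to released counts (the basic mechanism of differential privacy). The dataset and parameters are invented for illustration.

# Suppress small groups and perturb released counts. Hypothetical data only.
import numpy as np

K_MIN = 10          # never release an aggregate covering fewer than K_MIN people
NOISE_SCALE = 2.0   # scale of the Laplace noise added to each released count

def release_count(group_size):
    """Return a noisy count, or None to suppress a too-small group."""
    if group_size < K_MIN:
        return None  # cannot drill down to a handful of individuals
    return round(group_size + np.random.laplace(0.0, NOISE_SCALE))

# Hypothetical counts of patients per postcode.
aggregates = {"postcode_123": 4, "postcode_456": 57, "postcode_789": 210}
for postcode, size in aggregates.items():
    print(postcode, release_count(size))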
A recent US study found that with 15 demographic attributes, it is possible to identify 99.98 per cent of Americans.
Prof Chesterman said: “One thing that we’re increasingly realising is that even small amounts of information can be pieced together to build a pretty accurate picture of you as an individual.”
But amid the furore over the TraceTogether data and WhatsApp privacy policy, several experts disagreed with the position some people took, saying that comments such as “I have nothing to hide, therefore I have nothing to fear” are a fallacy that comes from wilful ignorance.
["The Righteous Man..."]
Assoc Prof Lim said: “It is not about hiding. It is about whether you value your personal safety or your family’s safety, it’s about preventing unsavoury characters from showing up at your door, or telemarketers bugging you on the phone, because you don’t know how your data can be misused.”
[Then there are those people who posted photos of themselves gathered in groups of more than eight people (not allowed under Phase 3) on social media. And there are criminals who post photos of themselves committing crimes - the D.C. insurrectionists, for example. Obviously, they are aware of their First and Second Amendment rights, but are a bit hazy on the Fifth... probably couldn't count that high.]
But this also does not mean that people should quit technology altogether, out of an inordinate fear that their personal data will be abused.
Rather, experts said the onus is also on the individual to keep asking questions about their personal data and not simply sign away their privacy. When all is said and done, users should try to understand why they are consenting to data use and privacy policies when prompted.
Techgoondu’s Mr Siew said: “We’ve come to get used to the conveniences of things like Google Maps that it will be hard to give it up. But you also do not have to make a binary decision between having to stop using the service completely, versus using it blindly.”
Faced with consumer pressure over privacy concerns, Google has become more transparent about its privacy policies and the type of personal data that it holds, he said.
“That is why it is a good thing that for the past two weeks, people are asking questions about WhatsApp and about TraceTogether, and are holding people and organisations accountable,” said Mr Siew.
----------
The truth about your WhatsApp data
January 14, 2021
NEW YORK — There was a backlash to WhatsApp in recent days after it posted what appear to be overhauled privacy policies. Let me try to clarify what happened.
Some people think the messaging app will now force those using it to hand over their personal data to Facebook, which owns WhatsApp.
That’s not quite right.
WhatsApp’s policies changed cosmetically and not in ways that give Facebook more data. The bottom line is that Facebook already collects a lot of information from what people do on WhatsApp.
The confusion was the result of Facebook’s bungled communications, mistrust of the company and America’s broken data-protection laws.
Here is what changed with WhatsApp and what didn’t:
- Facebook bought WhatsApp in 2014, and since 2016, almost everyone using the messaging app has been (usually unknowingly) sharing information about their activity with Facebook.
- Facebook knows the phone numbers being used, how often the app is opened, the resolution of the device screen, the location estimated from the internet connection and more.
- Facebook uses this information to make sure WhatsApp works properly and to help a shoe company show you an ad on Facebook.
- Facebook can’t peer at the content of texts or phone calls because WhatsApp communications are scrambled by end-to-end encryption. Facebook also says that it doesn’t keep records of who people are contacting in WhatsApp, and WhatsApp contacts aren’t shared with Facebook.
[The key link is your phone number. With WhatsApp, you need to provide your phone number for the app to work properly. BUT... Facebook doesn't require it. So IF you have provided FB with your phone number, your FB account and your WhatsApp are LINKED.
I have an avatar for BOTH FB and WhatsApp, and the avatars (photos) for the two are different. No information in WhatsApp and FB is the same, except for my name, which is not very distinctive. Uncommon but not unusual.]
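As a minimal sketch of what that linking looks like in practice, consider a join on phone numbers. The records below are invented, and this is the generic idea rather than Facebook's actual matching pipeline.

# A toy record-linkage join: the phone number is the key that ties a
# WhatsApp account to a Facebook profile. All records are hypothetical.
whatsapp_users = [
    {"phone": "+6591234567", "device": "Android", "last_seen": "2021-01-14"},
]
facebook_users = [
    {"phone": "+6591234567", "name": "D. Chin", "interests": ["tech", "cycling"]},
    {"phone": None, "name": "A. Tan", "interests": ["cooking"]},  # no number given
]

# Index Facebook profiles by phone number, then join WhatsApp activity to them.
fb_by_phone = {u["phone"]: u for u in facebook_users if u["phone"]}
for wa in whatsapp_users:
    fb = fb_by_phone.get(wa["phone"])
    if fb:
        print(f"Linked: {fb['name']} <- WhatsApp activity on a {wa['device']} device")
    else:
        print("No Facebook profile supplied this number: accounts stay unlinked")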
WhatsApp has a lot of positives. It is easy to use, and communications in the app are secure. But yes, WhatsApp is Facebook, a company many don’t trust.
There are alternatives, including Signal and Telegram — both of which have gotten a surge of new users recently. Digital privacy group Electronic Frontier Foundation says Signal and WhatsApp are good choices for most people.
The reason WhatsApp recently notified app users about revised privacy rules is that Facebook is trying to make WhatsApp a place to chat with an airline about a missed flight, browse for handbags and pay for stuff.
WhatsApp’s policies changed to reflect the possibility of commercial transactions involving the mingling of activity among Facebook apps — a handbag you browse in WhatsApp could pop up later in your Instagram app, for example.
Unfortunately, WhatsApp did a terrible job explaining what was new in its privacy policy. It took me and my colleague Kashmir Hill, a data-privacy rock star, a good amount of reporting to understand.
I also want to touch on deeper reasons for the misunderstandings.
First, this is a hangover of Facebook’s history of being cavalier with our personal data and reckless with how it is used by the company or its partners. It is no wonder that people assumed Facebook changed WhatsApp policies in gory ways.
Second, people have come to understand that privacy policies are confusing, and we really don’t have power to make companies collect less data.
“This is the problem with the nature of privacy law in the United States,” Ms Hill said. “As long as they tell you that they’re doing it in a policy that you probably don’t read, they can do whatever they want.”
That means digital services including WhatsApp give us an unappealing choice. Either we give up control over what happens to our personal information, or we don’t use the service. That’s it.
CLEARING UP MORE WHATSAPP CONFUSION
Another false belief floating around about WhatsApp — and again, this is WhatsApp’s fault, not yours — is that the app is just now removing an option for people to refuse to share their WhatsApp data with Facebook.
Not quite right.
Yes, when Facebook made major changes to WhatsApp privacy policies in 2016, there was a brief moment of choice. People could check a box to order Facebook not to use their data from WhatsApp for commercial purposes.
Facebook would still collect the data from WhatsApp users, as I explained above, but the company would not use the data to “improve its ads and product experiences,” like making friend recommendations.
But that option in WhatsApp existed for only 30 days in 2016. That was a lifetime ago in digital years and approximately 4 million Facebook data scandals ago.
For anyone who started using WhatsApp since 2016 — and that is many people — Facebook has been collecting a lot of information without an option to refuse.
“A lot of people didn’t know that until now,” Ms Gennie Gebhart of the Electronic Frontier Foundation told me. And, she said, we are not to blame.
Understanding what happens with our digital data feels as if it requires advanced training in computer science and a law degree.
And Facebook, a company with oodles of cash and a stock value of more than US$700 billion (S$928 billion), didn’t or couldn’t explain what was happening in a way that people could grasp.
THE NEW YORK TIMES
"Firstly, I don't post really personal information. You think you are sharing with you friends, BUT, have you set your privacy to restrict access to your posts to just your friends? Or is it set to "public"? So anyone can see it?
And even if you intend to share it only with friends, note that Cambridge Analytica got to 50 million profiles thru friends of the original 270,000 users who used their app.
Secondly, limit the use of FaceBook, and limit your "digital footprint". I stopped "liking" stuff on FB and other social media when I realised that they used those "likes" to target ads at you. And then they can predict and influence your choices.
Here are advice from another website (more detailed than my advice) on how to limit the private info you share with FB. For one, turn off your location setting.
Same for quizzes and surveys. Most of them are bunk, and none of them will actually improve your life. And all of them are a back door into your personal information. So why?
Authentic Psychologists who carry out surveys have to comply with ethical rules to protect your privacy. Unscientific "fun" surveys don't. Cambridge used a researcher's app to carry out a "personality test". This invariably means a self-administered questionnaire. Or survey.
To tell you which muppet you are most like, or which Game of Thrones character you would be, or which Star Trek character you most resemble. None of these information will improve your life. Or if it might, you have a very, VERY sad life.
Thirdly, who are your friends? I used to "accumulate" friends on FB. Mainly to play games. I think I got to about 170+ and then I just didn't want to care anymore. And the games were getting really silly and time-consuming. And a waste of time.
I stopped accumulating "friends" (who were no more than "player characters" for the games I played. I stop playing "social media" or facebook games, and have trimmed my "friends" down to less than 120. I could do more, but most of my friends are not really active on FB. If the 270,000 original users of the app could lead to 50 million people's profile being captured, then each person would have about 185 friends. Not unusual.
Nothing is for free. There is no free lunch. Free games are usually "paid for" by ads. Why is your time worth that much to an advertiser? Or the person selling your viewership to the advertiser? It seems worthless to you? Maybe there is something else that you are selling?
And finally, maintain your privacy. And it's not just FaceBook. Any social media has vulnerabilities, and the greatest vulnerability is "social hacking" - people hacking YOU."]