A Blog by Jonathan Low

 

Apr 7, 2020

Should People Be Giving All Their Personal Data To Google And the CDC?

There needs to be a quid pro quo: yes, you can use personal data to help address the pandemic's spread.

But in return, the sensible tradeoff is to demand safeguards on data usage that last beyond this crisis. JL

Tiffany Li reports in Slate:

There are positive uses for data, and there may be ethical obligations to donate your data. But we need to rethink privacy itself. We must continue to push back against companies and governments who use this crisis as an excuse for unprecedented power. So, take all my data. If it can help solve or mitigate the harms of this historic crisis for humanity, take my data and use it for that purpose—and that purpose alone.
I am a privacy lawyer and law professor. I have gone on record warning the public about companies like Google, Facebook, or Amazon taking too much of our data, especially our sensitive health data.
Now, in this time of pandemic, I have only one message for tech companies like Google and government agencies like the Centers for Disease Control and Prevention: Take all my data. If it can help solve or mitigate the harms of this historic crisis for humanity, please take my data and use it for that purpose—and that purpose alone.
We are going to need more data to fight the coronavirus and associated social ills (including loneliness from isolation). We will use more technology and do more research, and all of that will mean more collection of data—and more potential abuses of data. We can influence this rise in data collection in a privacy-protective way, but there will be privacy harms, and there will be bad actors who seek to exploit this crisis, no matter how hard we fight.
This is why it’s important that we approach privacy in the time of pandemic in three ways: First, we must recognize that there are positive uses for data, and indeed, there may be ethical obligations to donate your data (e.g., for medical research). Second, we must continue to push back against companies and government actors who use this crisis as an excuse for unprecedented power. Finally, we need to shift our societal understanding of privacy to keep pace with the modern world, so that we can continue to protect privacy in a way that is practicable and beneficial to society.
We can protect privacy in the time of pandemic and still use technology and data for public health and public good. Telecom companies around the world have been weighing whether to give consumer location data to governments, to track movements and virus spread—a move that comes with huge risks both to individual privacy and of government overreach. Companies and grassroots groups are creating COVID-19 symptom trackers, like the crowdsourced ones popping up now, as well as diagnostic triage tools, like the ones being developed by Google and Amazon. Soon, home testing kits could allow people to test themselves for the novel coronavirus from home, freeing up key health care resources. All of these technological solutions come with privacy risks, which we must protect against, while still allowing for these innovations to flourish.
In addition to data-driven technologies that can help us mitigate the spread of this virus, there are also technologies that can keep society running in the meantime. Consumer products like tablets and laptops are crucial for remote work and education. Also important right now are video products like Facebook Portal, Houseparty, and Zoom that allow us to connect, even if we cannot travel or gather with family and friends. We should support the public and private sectors in using technology for good, even if it means using our data—with the caveat that data use must be restricted and protected.
We must implement these and other technological solutions carefully, preserving as much privacy as possible. It is possible—technically, medically, legally, and ethically—to collect and use data to help the fight against COVID-19 while still preserving privacy protections. It may be tempting to trade privacy for the sake of controlling the spread of the virus or helping to minimize its harm to society. But we cannot turn back the clock after all of this is over and regain what privacy rights we lose. The data that is out there will be out there, potentially identifiable forever, even if we delete it from machine learning systems. The powers that companies and governments gain during this time of emergency likely will be retained after the emergency has passed.
To preserve privacy through this pandemic, we need to rethink privacy itself. We can no longer rely on the outdated understanding of privacy as guarding your data from the eyes of the public. Data can and should be used for good, for this pandemic and in the future.
We must move past the notice-and-consent model of privacy protection. Making individual notification and consent the standard for allowing collection and use of data is not feasible in a world where data is being collected on all of us, all the time, often without our own knowledge. Our location data is already being tracked by our devices, our apps, our cellphone providers. Our face photos have been collected by companies like Clearview AI for use in facial recognition algorithms. Our addresses and phone numbers and property records are already out there, much of it being collected by shadowy data brokers. We need to shift our focus from preventing data from leaking to the public, to preserving privacy values when data is already out in the public view.
We need to change our existing laws and create new laws that will reflect new definitions of privacy. Instead of calling for simple minimization of data collection, we can change our focus to promoting other factors that will protect privacy: security of data, limitations on usage and transfer of data, individual rights over how data is used, and recourse when data is used against us. We should emphasize the use of technical privacy solutions (like differential privacy, encryption, and on-device processing) and limitations on nonessential uses of data and downstream uses, sales, and transfers of data. These technical protections can prevent data from being used against us while still allowing good faith actors to use our data for public health.
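Of the technical solutions named above, differential privacy is probably the least familiar, so here is a minimal sketch (in Python) of its simplest form, the Laplace mechanism, which lets an aggregator publish a count, say of symptom reports in a region, without revealing whether any single person is in the tally. The function name, the epsilon value, and the raw count are illustrative assumptions for this sketch, not anything described in the article.

import numpy as np

def noisy_count(true_count, epsilon=0.5):
    # Adding or removing one person's record changes a count by at most 1,
    # so Laplace noise with scale 1/epsilon gives epsilon-differential privacy.
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical raw tally held by the aggregator: how many users in one
# region reported symptoms today. Only the noisy figure is released.
reported_symptoms = 1432
print(round(noisy_count(reported_symptoms)))

The point of the sketch is the division of labor: the raw tally never leaves the aggregator, and the published number remains useful for tracking public health trends while being deliberately too noisy to implicate any individual.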
We also need protections against systemic harms related to use of data, such as unchecked law enforcement use of facial recognition, and we need algorithmic accountability protections for when algorithmic systems are used to make decisions that touch on key rights (as we have to an extent with financial credit scoring). Additionally, instead of focusing regulations on first-party collection and use of data, we can implement greater regulations for the downstream use of data and data aggregators that compound existing privacy harms. These solutions can help protect us against third parties who seek to use our data against us.
It is important that we continue to stay vigilant about threats to privacy, but it is also important to allow for necessary, potentially life-saving innovation in technology and science. We must reimagine what privacy is, and when the dust settles, we must rebuild with privacy laws that adapt to the realities of our modern, data-driven world. We do not have to choose between privacy and public health. We should fight for our right to have both.
