A Blog by Jonathan Low

Oct 25, 2014

The Dark Market for Personal Data

"Which office do I go to to get my reputation back?" So asked former US cabinet secretary Raymond Donovan after he was cleared of corruption charges brought against him.

There are two big problems with data: the first is that we don't always know where it comes from, and the second is that we tend to believe numbers whether or not they have been substantiated.

Belief systems remain more powerful than information systems. If a set of numbers confirms what we think, we tend to dismiss any criticism of it. And even if the information can be proven to be false, or questionable, we are often still inclined to give it the benefit of the doubt - because it's data, not an 'opinion.'

One of the issues is that unlicensed but powerful brokers traffic in data - and make a handsome living doing so. They are unlicensed because, as the following article explains, there is no one to license them. Like most brokers, they make money from transactions, not from verifying the quality of what they are selling.

A number of other challenges cascade down from the first two: innocent people or organizations may be unfairly maligned, and the impact of all that falsity may create additional problems for individuals, institutions and nations. Which is not to say that data is or should be unimportant: simply that it is not inherently correct. JL

Frank Pasquale comments in the New York Times:

Uncovering problems in Big Data (or decision models based on that data) should not be a burden we expect individuals to solve on their own.
The reputation business is exploding. Having eroded privacy for decades, shady, poorly regulated data miners, brokers and resellers have now taken creepy classification to a whole new level. They have created lists of victims of sexual assault, and lists of people with sexually transmitted diseases. Lists of people who have Alzheimer’s, dementia and AIDS. Lists of the impotent and the depressed.
There are lists of “impulse buyers.” Lists of suckers: gullible consumers who have shown that they are susceptible to “vulnerability-based marketing.” And lists of those deemed commercially undesirable because they live in or near trailer parks or nursing homes. Not to mention lists of people who have been accused of wrongdoing, even if they were not charged or convicted.
Typically sold at a few cents per name, the lists don’t have to be particularly reliable to attract eager buyers — mostly marketers, but also, increasingly, financial institutions vetting customers to guard against fraud, and employers screening potential hires.
There are three problems with these lists. First, they are often inaccurate. For example, as The Washington Post reported, an Arkansas woman found her credit history and job prospects wrecked after she was mistakenly listed as a methamphetamine dealer. It took her years to clear her name and find a job.
Second, even when the information is accurate, many of the lists have no business being in the hands of retailers, bosses or banks. Having a medical condition, or having been a victim of a crime, is simply not relevant to most employment or credit decisions.
Third, people aren’t told they are on these lists, so they have no opportunity to correct bad information. The Arkansas woman found out about the inaccurate report only when she was denied a job. She was one of the rare ones.
“Data-driven” hiring practices are under increasing scrutiny, because the data may be a proxy for race, class or disability. For example, in 2011, CVS settled a charge of disability discrimination after a job applicant challenged a personality test that probed mental health issues. But if an employer were to secretly use lists based on inferences about mental health, it would be nearly impossible for an affected applicant to find out what was going on. Secrecy is discrimination’s best friend: Unknown unfairness can never be detected, let alone corrected.
These problems can’t be solved with existing law. The Federal Trade Commission has strained to understand personal data markets — a $156-billion-a-year industry — and it can’t find out where the data brokers get their information, and whom they sell it to. Hiding behind a veil of trade secrecy, most refuse to divulge this vital information.
The market in personal information offers little incentive for accuracy; it matters little to list-buyers whether every entry is accurate — they need only a certain threshold percentage of “hits” to improve their targeting. But to individuals wrongly included on derogatory lists, the harm to their reputation is great.
The World Privacy Forum, a research and advocacy organization, estimates that there are about 4,000 data brokers. They range from giants like Acxiom, a publicly traded company that helps marketers target consumer segments, to boutiques like Paramount Lists, which has compiled lists of addicts and debtors.
Companies like these vacuum up data from just about any source imaginable: consumer health websites, payday lenders, online surveys, warranty registrations, Internet sweepstakes, loyalty-card data from retailers, charities’ donor lists, magazine subscription lists, and information from public records.
It’s unrealistic to expect individuals to inquire, broker by broker, about their files. Instead, we need to require brokers to make targeted disclosures to consumers. Uncovering problems in Big Data (or decision models based on that data) should not be a burden we expect individuals to solve on their own.
Privacy protections in other areas of the law can and should be extended to cover consumer data. The Health Insurance Portability and Accountability Act, or Hipaa, obliges doctors and hospitals to give patients access to their records. The Fair Credit Reporting Act gives loan and job applicants, among others, a right to access, correct and annotate files maintained by credit reporting agencies.
It is time to modernize these laws by applying them to all companies that peddle sensitive personal information. If the laws cover only a narrow range of entities, they may as well be dead letters. For example, protections in Hipaa don’t govern the “health profiles” that are compiled and traded by data brokers, which can learn a great deal about our health even without access to medical records.
Congress should require data brokers to register with the Federal Trade Commission, and allow individuals to request immediate notification once they have been placed on lists that contain sensitive data. Reputable data brokers will want to respond to good-faith complaints, to make their lists more accurate. Plaintiffs’ lawyers could use defamation law to hold recalcitrant firms accountable.
We need regulation to help consumers recognize the perils of the new information landscape without being overwhelmed with data. The right to be notified about the use of one’s data and the right to challenge and correct errors is fundamental. Without these protections, we’ll continue to be judged by a big-data Star Chamber of unaccountable decision makers using questionable sources.
