A Blog by Jonathan Low

 

Jan 16, 2020

Should Colleges Really Be Putting Smart Speakers In Dorms (Or Anywhere Else)?

The larger issue is where the line falls between an individual’s rights and an institution’s demands. Who owns such information, to what degree should the subject be able to protect it, and who gets to decide how it is used - and for how long?

These questions are especially urgent for young people just beginning their life's journey. JL


Kathryn Miles reports in MIT Technology Review:

“Some schools believe Alexa will bolster enrollment, reduce dropout rates, increase students’ success and boost happiness. (But) the devices may record the conversations students have before or after speaking to them. As voice identification skills improve, it will become possible to link these recordings to an individual. That’s like a school recording in perpetuity everything that’s ever been in your locker, and what you and your friends said every time you opened it, and then letting commercial entities search that information.”
When Mateo Catano returned for his second year as an undergraduate at Saint Louis University in the fall of 2018, he found himself with a new roommate—not another student but a disembodied brain in the form of an Amazon Echo Dot.
Earlier that summer, the information technology department at SLU had installed about 2,300 of the smart speakers—one for each of the university’s residence hall rooms, making the school the first in the country to do so. Each device was pre-programmed with answers to about 130 SLU-specific questions, ranging from library hours to the location of the registrar’s office (the school dubbed this “AskSLU”). The devices also included the basic voice “skills” available on other Dots, including alarms and reminders, general information, and the ability to stream music.
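For a sense of what such a skill amounts to technically, here is a minimal sketch of a campus FAQ handler written with Amazon’s ask-sdk for Python. The intent names and answers are invented for illustration; they are not SLU’s actual AskSLU configuration.

```python
# Hypothetical sketch of a campus FAQ Alexa skill, in the spirit of AskSLU.
# Intent names and answers are invented; the real skill's setup may differ.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

# Each campus-specific question is modeled as one intent with a canned answer.
CAMPUS_FAQ = {
    "LibraryHoursIntent": "The library is open from 7 a.m. to midnight.",
    "RegistrarLocationIntent": "The registrar's office is in the main hall.",
    "DiningHoursIntent": "The dining halls are open until 8 p.m. tonight.",
}

class CampusFaqHandler(AbstractRequestHandler):
    """Answers any intent that appears in the FAQ table."""

    def can_handle(self, handler_input):
        return any(is_intent_name(name)(handler_input) for name in CAMPUS_FAQ)

    def handle(self, handler_input):
        intent = handler_input.request_envelope.request.intent.name
        return handler_input.response_builder.speak(CAMPUS_FAQ[intent]).response

sb = SkillBuilder()
sb.add_request_handler(CampusFaqHandler())
handler = sb.lambda_handler()  # deployed as an AWS Lambda entry point
```

Note that none of this logic lives on the Dot itself: every utterance is shipped to the cloud, matched to an intent there, and answered from the lookup table, which is what makes the privacy questions below more than hypothetical.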
For Catano, the Dot was a welcome addition. He liked hearing the weather first thing in the morning and knowing which dining halls were open. And, if he’s being honest, he liked the company. “Living in a single, AskSLU definitely made me feel less lonely,” he says. “And I liked the status of being at the first university to do this.”
Catano’s reaction was exactly what SLU administrators were hoping for. This fall, the Jesuit institution announced plans to broaden the voice skills of its Echo Dots by including both text messaging and chatbot functions.

No idea of the long-term effects

We’re on the verge of a new era of smart speakers on campus. Schools as wide-ranging as Arizona State University, Lancaster University in the UK, and Ross University School of Medicine in Barbados have adopted voice-skill technology on campus. Some, including Northeastern University, have taken the technology a step further and now give students access to financials, course schedules and grades, and outstanding fees via voice devices.
In late 2018, Boston’s Emerson College announced it was one of 18 recipients of a grant from Amazon to advance voice-enabled technology on campuses, part of the tech giant’s Alexa Innovation Fellowship. Emerson has created a dedicated voice lab where students can interact and experiment with Alexa skills, and it plans to install Alexa devices in places like theaters and outside elevator banks.
Administrators at some of these schools told me they believe Alexa will bolster enrollment and reduce dropout rates. Several also said they believe voice technology can increase their students’ success and boost their overall happiness. However, there are plenty of people on campus who see a dark side.
“When it comes to deploying listening devices where sensitive conversations occur, we simply have no idea what long-term effect having conversations recorded and kept by Amazon might have on their futures—even, quite possibly, on their health and well-being,” says Russell Newman, an Emerson professor who researches the political economy of communication and communications policy.
Other faculty members I spoke to echoed Newman’s objections. What if data harvested from students’ conversations affected their chances of getting a mortgage or a job later on? What if it were used against foreign students to have them deported, possibly to home countries where they could be imprisoned for their political views?
Right. So given all the risks, why are colleges so eager to fill their campuses with AI-enabled microphones? What’s in it for them?

AI to the rescue

Colleges and universities face several looming crises. After years of soaring enrollment numbers, US schools are seeing declines in admissions, a trend expected to worsen over the next decade. A November 2019 special report by the Chronicle of Higher Education predicts rapid decreases at even the country’s most selective institutions. Institutional revenue has stalled—Moody’s Investors Service issued a negative outlook for higher education for fiscal year 2019, with the exception of universities in the South. For three years, the Department of Education has sought to slash billions from financial aid and support for poorer students, though Congress has rejected the cuts. State contributions to public university budgets have lagged since the last recession. Private colleges are also struggling; more than a quarter of them are now in the red. In recent years, 20 private, nonprofit colleges closed their doors, and many more are considering merging or consolidating.
Meanwhile, half of all students who enter college fail to graduate within six years. Researchers give a variety of explanations. Nick Bowman, a professor of education at the University of Iowa, points out that today’s students are often older than the traditional 18- to 22-year-olds. Many have full-time jobs. Some care for children or siblings or aging parents. And with an average of $35,000 in student loan debt after four years in school, the prospect of dropping out can be tempting.
For many college administrators, AI offers appealing solutions to these predicaments. Winston-Salem State University, a historically black university with many low-income and first-generation college students, has had perennial problems helping each entering class hit key deadlines like submitting high school transcripts and vaccination records, completing financial aid forms, and making housing deposits. “We realized that many of our students may not understand the college enrollment process and may not be able to rely on families or support systems to decode it for them,” says Jay Davis, the university’s head of media relations.
Two years ago, WSSU partnered with a tech firm called AdmitHub to offer an AI chatbot named Winston to help students navigate the enrollment process. Davis says the app successfully answers about three-quarters of students’ questions, and that there’s been a dramatic increase in the number of students who meet their financial requirements and submit all the supporting documents necessary to complete their application. This year WSSU is hosting its largest first-year class in more than a decade, and Davis says Winston played a big role in that.

Access to your words, forever

I spent several hours playing around with chatbots at a handful of colleges and universities. They all aced questions about the school mascot, where I could find dinner, and when the next sporting or alumni networking event was. But they flubbed others. When I told one I was sick, it informed me the student health center would not issue a written excuse for missed classes. I asked it where the student health center was; it responded with university tour times for prospective students. I told another I felt depressed, and it referred me to a federal student financial aid program.
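Misfires like these are typical of intent classifiers trained on sparse data: the bot routes a message to whichever canned intent shares the most surface vocabulary, regardless of meaning. Here is a toy sketch of that failure mode, with invented intents and example phrases (real campus chatbots are more sophisticated, but they degrade the same way when coverage is thin):

```python
# Toy illustration of why thinly trained chatbots misroute questions.
# Intents and example phrases are invented for this sketch.
INTENT_EXAMPLES = {
    "tour_times": ["when are campus tours", "visit the campus", "tour the center"],
    "financial_aid": ["need help paying for school", "student aid programs"],
    "excuse_policy": ["sick note for missed classes", "excuse for absence"],
}

def classify(message: str) -> str:
    """Route to the intent whose example phrases share the most words."""
    words = set(message.lower().split())
    def best_overlap(intent: str) -> int:
        return max(len(words & set(p.split())) for p in INTENT_EXAMPLES[intent])
    return max(INTENT_EXAMPLES, key=best_overlap)

# Shares "the" and "center" with a tour phrase, so tours win on word count:
print(classify("where is the student health center"))  # -> tour_times
# Shares "need" and "help" with a financial aid phrase:
print(classify("i feel depressed and need help"))      # -> financial_aid
```

More usage data thickens the example phrases and the misroutes fade, which is exactly the dynamic the campus programmers describe: the skills improve because the students keep feeding them.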
The campus programmers on the other side of these devices all told me that the skills would improve as more students used them—which is, of course, what makes AI so effective. But it’s also what makes threats to our privacy so real, says Vitaly Shmatikov, a professor of computer science at Cornell Tech. Tech companies, says Shmatikov, are notoriously opaque about privacy and security. What he and other scholars have learned about them is largely by way of reverse-engineering and some educated guesswork, and the findings concern Shmatikov a great deal.
For starters, he says, companies like Amazon train their speech recognition algorithms on recordings of past user interactions to make them better at, for instance, understanding the intent of a question. He says all the companies involved are “very cagey” about how much data is traveling between them. “There is no promise to the user that their data won’t leave a specific device,” says Shmatikov. “We still don’t really know just how much data voice-skill hosts like Amazon—or third parties that rely on Amazon—are harvesting, or what they’re doing with that information.” Amazon didn’t respond to multiple requests for comment.
Shmatikov says it’s reasonable to assume that a company’s cloud has date- and time-stamped recordings of students’ requests to a smart speaker, and the devices may even record conversations students have with other people before or after speaking to it. As voice identification and location skills improve, it will become increasingly possible to link these recordings to an individual person. That’s not like a school searching your locker; it’s more like a school recording in perpetuity everything that’s ever been in your locker and what you and your friends said every time you opened it, and then letting a host of commercial entities search that information.
Officials at Arizona State University and Saint Louis University say they’re not linking information like students’ financials, health records, and grades (data known as “authenticated,” since it requires a student to link to personal accounts) until they are more confident about the security measures. The technology used at Northeastern was developed by a small team led by Somen Saha, then an employee at the university. Saha eventually created an independent company called n-Powered, which developed an app called MyHusky that’s available through Alexa. However, its privacy page also acknowledges, “We use Amazon’s platform to make this work. Amazon stores information about usage that can be purged upon request.”
Shmatikov says that using a university’s own software and restricting the use of chatbots to general questions may limit a tech company’s access to student information, but it won’t solve the problem entirely. He points to sensitive questions like whether the health center offers STD testing or prescriptions to treat conditions like schizophrenia: technically, these aren’t linked to a specific student, but it’s not too hard to figure out who is asking, and students may not realize these aren’t always anonymous queries. Plus, says Shmatikov, as long as a company like Amazon is converting student prompts to data signals, it has access to the student’s information—forever.
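Shmatikov’s point about “anonymous” queries is easy to make concrete. Even a log stripped of names can be joined back to individuals if it retains a device ID and a timestamp, because the institution knows which room each device was installed in. A hypothetical sketch, with all records invented:

```python
# Hypothetical re-identification sketch. All records are invented; the point
# is only that "no names in the log" is not the same as anonymity.
query_log = [
    # What a cloud host might retain: no student names, just device and time.
    {"device_id": "dot-4417", "time": "2019-10-03T23:12",
     "query": "does the health center offer STD testing"},
    {"device_id": "dot-1022", "time": "2019-10-04T08:30",
     "query": "what are the library hours"},
]

housing_records = {
    # What the university holds: where each device was installed.
    "dot-4417": {"room": "Hall A 214", "resident": "student_88321"},
    "dot-1022": {"room": "Hall C 101", "resident": "student_55107"},
}

# De-anonymizing the log is a single dictionary lookup per row.
for row in query_log:
    occupant = housing_records[row["device_id"]]
    print(f"{occupant['resident']} in {occupant['room']} "
          f"asked at {row['time']}: {row['query']}")
```

Nothing in the log names a student, yet the join is trivial, which is why restricting chatbots to general questions narrows the exposure without eliminating it.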

Scary ramifications

Privacy is a concern for any user of an AI device, but the faculty I spoke with for this story insist there are particularly scary ramifications for higher education.
“College students are perhaps the most desirable category of consumers,” says Emerson’s Newman. “They are the trickiest to reach and the most likely to set trends.” As a result, he says, their data is some of the most valuable and the most likely to be mined or sold. And for educational institutions to be complicit in the commodification of students for corporate gain is, he says, fundamentally antithetical to their missions.
Sarah T. Roberts, an assistant professor of information studies at UCLA, says schools that enter into agreements with tech companies are at least potentially putting their students’ well-being at risk. “A student’s time at a college or university is used to explore ideas and try on new identities, whether that’s political beliefs or gender and sexuality,” says Roberts. “The knowledge that they are being recorded as they do so will undoubtedly prevent students from feeling like they can speak their minds.” It’s also worth remembering, she says, that many students come from countries where it can be dangerous to reveal their sexuality or political beliefs.
At Northeastern, one student created an online petition demanding that the university remove all Alexa devices. It reads in part: “Alexas are well-documented as surreptitious listening devices that are used to help sharpen Amazon’s marketing tactics.... At the very least, Northeastern University is forcing an extraneous device in student spaces that no one asked for. At the worst, they are recklessly violating their student body’s privacy at the behest of a corporate donor.” As of early December, the petition had 125 signatures.
At Emerson, students and other faculty members have joined Newman in creating a committee to draft privacy policies for the campus. At the very least, he says, he would like to see warning signs placed wherever a listening device is located. He says so far the administration has been cooperative, and the deployment of any devices has been delayed.
“We need a safe way to experiment with these technologies and understand the consequences of their use instead of just continuing a blind march towards surveillance for the purpose of profit-making,” Newman says. “These are sophisticated applications with lifelong consequences for the individuals who are analyzed by them, to ends as yet unknown. We all need to be really judicious and thoughtful here.”
