A Blog by Jonathan Low

 

Apr 17, 2023

The Leadership Risks of Employee Neuro-Technology Monitoring

Neuro-technology - monitoring employees' brain waves - is simply the latest and most intrusive form of workplace surveillance available to leadership.

But the risks of this kind of oversight grow in proportion to the amount of data collected. Aside from the ethical implications, the legal and managerial threat rises as skilled employees - already inclined to distance themselves via remote and hybrid work - begin to understand and resent the uses to which this information can be put: compensation, promotions, recruitment, retention, etc. The question for leaders is whether this additional data is sufficiently compelling and organizationally or financially important - given the questions about its accuracy - to offset the potential workforce alienation it may engender. JL

Amy Marcus reports in the Wall Street Journal:

Companies have started to look into technology that could allow them to track fatigue levels, attention and focus. They could use it for managerial purposes, (to) make decisions about who is going to be very expensive to continue to employ over time, whose brain is slowing and less likely to be as effective. The data could track everything from a person's cognitive decline to other brain metrics, through headsets that measure brain-wave activity. (But) brain waves reveal biases that we are not conscious of and can present us in our worst possible light.

Employers can track workers’ emails, computer keystrokes and calls. What happens when they routinely start tracking employees’ brains?

Nita Farahany, 46, has been studying the possibility for years. A professor of law and philosophy at Duke University School of Law, Dr. Farahany has long been intrigued by potential legal challenges posed by devices in the workplace that measure electrical activity in the brain.

 

Over the years, these electroencephalogram, or EEG, devices, along with the software and algorithms that power them, have gotten better at tracking brain-wave signals and decoding people’s emotions and cognitive skills. Some employers use the devices to monitor employees’ fatigue and offer brain-wave tracking as part of wellness programs designed to decrease stress and anxiety, Dr. Farahany says.

But the law hasn’t kept up with the science, she says. “There is no existing set of legal rights that protect us from employers scanning the brain or hacking the brain.”

Dr. Farahany’s interest in the issue stems in part from her childhood. Her parents both came to the U.S. from Iran, her father moving in 1969 for a medical residency and her mother arriving a year later. They had planned to return to Iran in 1979 but decided against it because of the political unrest. After the Shah of Iran’s ouster, her mother’s brother, who had served in the military, was arrested. Dr. Farahany’s family often discussed politics, including the way surveillance technology can be misused by governments.

In her book coming out in March, “The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology,” Dr. Farahany argues the workplace will be a crucial arena in the fight for the future of mental privacy. She spoke with The Wall Street Journal about how employers are increasingly gathering workers’ brain data and the need for limits on how the information is used. 

How are employers already tracking our brains?

The first example that I came across was a few years ago. Train drivers in China on the Beijing-Shanghai [high-speed rail] line are required to wear caps with embedded electrodes that track their brain activity to see if they are focused or fatigued. There are even reports of tracking the emotion levels of factory workers. The workers can be sent home if their emotion levels signal they could be disruptive on the factory floor. I thought, I'm glad that isn't happening in the U.S. But it turns out that it is happening in the U.S. Some companies have started to look into technology that could allow them to track fatigue levels and also attention and focus.

There are some beneficial uses. Brain-wave activity monitors can be used by employees themselves. As your mind starts to wander, the device can give you an alert and tell you, "Hey, it is time for a brain break."

Companies are also using it to track wellness and health. Wellness programs don’t fall under the same kinds of protections that employees have from misuse of health data. The data could track everything from a person’s cognitive decline over time to a lot of other brain metrics, through brain-training games and headsets that measure brain-wave activity. 

Can you see employers gathering the data through wellness programs and then sharing a report every quarter?

They could. They could evaluate it. They could use it for managerial purposes. They can make decisions about who is going to be very expensive to continue to employ over time, whose brain is slowing and less likely to be as effective over time. There really is no check on how they use that data right now.

In many instances, we voluntarily give up this information. And in other instances, we don't have a choice; it is part of the process of applying for a job. What troubles you?

People may not recognize how much information can already be decoded from a person's brain. There are a lot of things that can be learned about an individual, like whether they suffer from cognitive decline, whether they have early-stage glioblastoma, a brain tumor, or even their cognitive preferences.

I do worry people are unwittingly giving up [information] without realizing the full implications. That is true for privacy in general, but the brain ought to hold a special place in how we think about it. It is the last space where we truly have privacy.

If employers collect brain data over time, could they go back and reanalyze the raw data?

Technologists in the field a decade ago would have told people, "Why are you worried about collecting neural data? There is so little we will ever be able to decode from surface-based electrodes, as opposed to ones that are implanted in the brain." They don't say that anymore.

They recognize that we can already do so much more than we ever expected. The better the algorithms get and the more data we amass, the more precise the models become.

Given that most of this data is being uploaded to cloud servers and kept there indefinitely, you can have very significant longitudinal data. I hired this person when they were 23 and they are 43 now: how effective is their brain at this point? Have they served their good useful lifetime of service to us?

One of the ideas in your book is that brain waves reveal biases we are not conscious of and can present us in our worst possible light. Could employers actually use that against workers?

Yes, potentially. The question is how effectively are they going to be able to do that in real time today. Can they set you up with a headset and probe your brain and figure out how you are reacting? Probably not. In the future can they do it? I think so.

Are there ways people can protect themselves?

We can and should require employers to do better. To say, here’s a transparent way that we’re planning on implementing [best practices] in the workplace. We’re giving the data to you to use. We’re not storing it. 

There is a real risk that people won’t have choices. You can’t choose to interview with the only company that doesn’t use brain-based metrics if everybody decides to use them. So I think it’s a combination of people looking out for themselves but also putting into place appropriate default rules at the government level and trying to encourage corporate responsibility.


