Employees should be involved in the “design, construction, testing and implementation” of any technologies used to control or monitor their return to work as the Covid-19 lockdown eases, according to experts.
Employers must do more to foster trust with staff when using data-intensive systems to track their movements or behaviour, attendees at a panel debate entitled Back to work: tracking social distancing were told.
“We know that the use of technology can be really helpful,” said Andrew Pakes, director of communications and research at Prospect, a specialist professional science and research union. “It can make people feel secure, it can give a record, you can ensure that safety happens – but we also know that technology introduced for one reason can end up being used for another reason.”
Pakes said employers must avoid laying the technological foundations in the name of public health for an infrastructure that allows for “much more nefarious or negative activities” afterwards, and that employees should be properly consulted as part of their organisation’s data protection impact assessment (DPIA) to circumvent the issue.
“We would argue that, under Article 35 of the General Data Protection Regulation [GDPR], there should be consultation with data subjects and their representatives, and that consultation process – demonstrating that you have spoken to your workers, involved your unions – should happen before the technology is introduced,” he said.
“If you haven’t done the consultation as part of the DPIA, then you haven’t done a DPIA, and increasingly that’s going to become a contestable position.”
But when it comes to workplace technology deployments, the trust gap between employers and employees is not the same across jobs and professions, with different contexts manifesting different power relationships.
Gina Neff, associate professor at the Oxford Internet Institute and the Department of Sociology at the University of Oxford, said: “It’s one thing to talk about professional work and going back as professionals, but it’s another when we are talking about highly surveilled low-wage workers, who already experience technology at work in a very different way.”
Neff said smartphones and other devices have long been viewed as an extension of white-collar workers’ professional identity, whereas waged or hourly workers’ use of the same devices is often tightly controlled.
“We have to take these differences in class and trust in technology already in play into account,” she said. “Some of the tools and devices that I see being developed may sound great for highly motivated professional workers who feel altruistic in sharing their data, but they would absolutely be a nightmare in environments where people have already experienced tight digital control over their workloads.
“Privacy really has to be at the centre of the conversations we have about back-to-work technologies. There is no quick and easy technical panacea for solving the problems of back to work, but we absolutely know that if we don’t build tools and devices that allow people to be in charge and in control of their data, those won’t be effective.”
To mitigate the harmful effects of such uneven power relationships, Pakes reiterated the need for widespread consultation with the workforce.
“There need to be ethical approvals within this, but how do we define what ‘ethical approval’ is?” he said. “Who gets to decide who is in the room to make decisions?
“You’ve seen this with AI [artificial intelligence] ethics and ethics committees – they tend to be drawn from C-suite or specialist or technical people, and we find this a lot with data protection impact assessments, too. It is regarded as a discrete, specialist process where the experts look at it. Rarely do they involve the workforce.”
Pakes said that without being involved in these conversations, “people will feel that the change is imposed on them”.
He also said it is helpful to think of data as an economic value rather than just information, because this highlights the power dynamics at play in these situations.
“The datafication of us, our information, is driving an economic model,” he said. “I always look at data as an economic or political issue – it’s about how it is used, it’s about power and control.
“Data is a new form of capital. All of our laws for employment are based on management of individuals – people relationships. Now our data, which is more ephemeral, is the thing that derives value. We don’t yet have a narrative around that, or a set of laws that really understand how this economy is accelerating because of our ability to use data in different ways.”
During these consultations, said the panel members, organisations should also be figuring out what data they actually need to ensure safety in their operations, and move to strictly limit the collection of data that does not directly help this objective.
Leo Scott Smith, founder and CEO of Tended, a wearable technology startup that creates AI-powered internet of things (IoT) devices to detect workplace accidents, said that technology providers, in particular, have a responsibility to help to limit data collection, because “ultimately we are the ones that can lock off what data those companies can access”.
Scott Smith added: “We should really be analysing the absolutely critical data that employers need to make their workplaces safe, and then we don’t give access to any of the other data.
“If we do that, then there isn’t much of a further discussion to have because they can’t physically access it.”
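In software terms, the pattern Scott Smith describes is a field-level allow-list applied on the provider's side, before any data reaches the employer. The sketch below illustrates the idea in Python; the field names and payload are hypothetical, not Tended's actual product or API.

```python
# A minimal sketch of field-level data minimisation, assuming a
# hypothetical wearable that reports a JSON-style sensor payload.
# Only fields the employer needs for workplace safety pass through;
# everything else is stripped before it leaves the provider's platform.

# Hypothetical allow-list of safety-critical fields.
SAFETY_FIELDS = {"device_id", "timestamp", "fall_detected", "distance_alert"}

def minimise(payload: dict) -> dict:
    """Return only the allow-listed fields from a raw sensor payload."""
    return {k: v for k, v in payload.items() if k in SAFETY_FIELDS}

raw = {
    "device_id": "w-042",
    "timestamp": "2020-06-01T09:30:00Z",
    "fall_detected": False,
    "distance_alert": True,
    "gps_position": (51.5, -0.1),  # location trail: not needed for safety
    "heart_rate": 72,              # health data: not needed either
}

employer_view = minimise(raw)
# employer_view now holds only the four safety fields; the location
# trail and heart-rate data are never exposed to the employer.
```

Because the filtering happens in the provider's pipeline rather than as an employer-side policy, the surplus data is, as Scott Smith puts it, something the employer "can't physically access".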
Echoing Neff, Scott Smith said any back-to-work technology must focus on privacy to be effective.
“These solutions aren’t just for employers, they’re for employees as well,” he said, “and if you want full adoption and buy-in, it needs to be done from the ground up with the employees, or else people aren’t going to use it and, ultimately, the solution will just be proven ineffective.”