Mood trackers: How HR departments could impose tech tyranny in the woke workplace

So you thought you'd broken the ice with that quick chat over coffee this morning with the new hire – but your wristband has buzzed and now the human resources manager wants you to call them urgently. You've been too assertive, they warn. Could you mind your tone with the new colleague in future?
Hull-based Moodbeam calls its wearable "a lifeline for businesses". Wearers press one of two buttons to tell their boss their emotional state. But that's primitive compared to what Amazon, flush from the success of Alexa, has planned.
Amazon's ambitions reflect Silicon Valley's huge investment in artificial intelligence, big data analysis, and decades of research into what's called "affective computing" – detecting emotions by computer. Late last year, Amazon introduced its first Fitbit-style fitness wearable, Halo, in America. Halo doesn't just infer your emotional state; it goes quite a bit further and polices your tone, too. If Halo's algorithm judges you're being too assertive, it gives you a ticking off.
The Halo wearer sees a two-dimensional matrix: one scale is "positivity-negativity", and the other is energy (high or low). If your measurements fall in the middle, the words "calm, focused and balanced" float comfortingly in the centre of the display. It's as Californian as a vagina-scented candle from Goop, and draws on psychology's circumplex model of emotional classification, developed by James Russell.
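To picture how such a grid works, here is a minimal sketch of circumplex-style classification. The thresholds and quadrant labels are invented for illustration; Amazon has not published how Halo maps readings to words.

```python
# Illustrative circumplex-style mood grid: valence (positivity-negativity)
# on one axis, energy on the other. Labels and thresholds are hypothetical.

def classify_mood(valence: float, energy: float) -> str:
    """Map a (valence, energy) reading, each in [-1, 1], to a display label."""
    # Readings near the origin get the soothing centre label.
    if abs(valence) < 0.25 and abs(energy) < 0.25:
        return "calm, focused and balanced"
    # Otherwise pick a quadrant: pleasant/unpleasant crossed with high/low energy.
    if valence >= 0:
        return "excited" if energy >= 0 else "content"
    return "tense" if energy >= 0 else "sad"

print(classify_mood(0.1, -0.2))   # calm, focused and balanced
print(classify_mood(-0.8, 0.9))   # tense
```

The point of the sketch is how little is going on: two numbers and a handful of cut-offs decide which word the wearer sees.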
There's a deeper problem, however, says Professor Andrew McStay, head of the emotional AI research team at Bangor University, because what's Californian isn't universal. "The only people where that might be considered 'normal' is in that area of California where many of these systems are developed – and that misses quite a large section of the world out," he says.
"This is a fundamental problem with all these technologies: they're all trying to model something that is not universally agreed, which is what emotions and affective states are. Of itself, this is not innately problematic, but when marketed and when decisions are made such as in the workplace on the basis of these insights, this does become problematic."
McStay's team is researching the ethics and practice of emotion recognition tools, and notes that cultural differences are huge. For example, Silicon Valley has drawn on the work of the pop psychologist Paul Ekman, who declared there are six basic universal human emotions. But not everyone agrees on these, notes McStay.
"Ekman has a basic list of emotions, like anger and disgust. But in Japan, they measure thoroughness." Being satisfied is a basic emotional state that's important to humans, but it's one Ekman omitted. "Emotion isn't something that exists in the brain, it is partly physical, but also environmental. You have to understand culture and geography, too."
Vital context may also be missed by the new workplace mood police. And will poorly socialised or simply excitable employees get short shrift? Will they be discriminated against for being themselves? The prospect is enough to concern McStay and other ethics researchers.
Shiny happy people
"Attitudinal diversity means embracing the malcontent who is brilliant, but not always calm," he says. McStay's fear is not so much the crude state of the technology – and much of it is crude, with the free speech NGO Article 19 describing much of it as "pseudoscience" – but that HR departments will place too much trust in the slick dashboards that have been sold to them by large technology consultants, tools that purport to give an objective overview of the employees.
That's "scientism not science", McStay says.
Amazon declined to offer a representative for comment, as did Microsoft and Deloitte, but British pioneer Jag Minhas, chief executive of Sensing Feeling, views the development with dismay.
His company monitors crowds for density and velocity, but only in the aggregate – the company neither stores nor transmits images, and does not record identifiable information on individuals. "It wouldn't surprise me if people cut corners, it's a competitive market," he says. "For visual sensing to be sustainable as a business it has to be accepted by the public at large, not just our clients. Mood policing is not something we would go anywhere near."
Sociological battles in firms seem to be driving the change.
Once upon a time, the personnel department barely ranked higher in the corporate pecking order than clerical workers. But in America they've been pulled into the culture wars. Will they be tempted to weaponise emotional recognition such as Amazon's Halo, and turn it on undesirables?
"America's role as the world's HR department was confirmed with the extension of 1960s civil rights legislation, after a Supreme Court judgment in 1986, to sexual harassment in the workplace. That was pretty rife," says James Woudhuysen, visiting professor of forecasting and innovation at London South Bank University.
"But by the 1990s, with IT performing the payroll and administrative tasks that had once been the staple of 'Personnel', newer HR teams could devote themselves to sexual harassment in the workplace, and a lot more besides. They had a lot more time to look at performance and background. Their dominion has increased – everyone has to sign up to the same song sheet."
The fear is that the newly woke corporation may be tempted to pre-empt situations by policing employees' mood and tone. Older employees may fear that the emotion recordings will be used against them, to get rid of them in favour of younger, more compliant and cheaper staff.
However, Max Winthrop, a solicitor and head of Employment Law at The Law Society, warns that using emotional biometric data to do so won't be easy.
"A claimant could possibly show a pattern of dismissals of a certain age with the application of this policy. That's because with discrimination cases, employment law reverses burden of proof." Since Edwards vs London Underground, the employer must show that discrimination no played not part in their dismissal.
"If you could show a pattern within an organisation where 75 per cent of people, for example, have been dismissed for conduct reasons who are in the age group 55-65, that would look anomalous. Age is a protected characteristic – and a 60-year-old could say it would never have been used against someone under 55."
"You're going to run into danger if you use any kind of algorithm or decision making that does the thinking for you, that shows some of these prejudices. Senior managers think it's a magic wand that solves problems. They don't ask: 'Does it suit our workplace?'"
The rise of emotional recognition technology is being watched by the Chartered Institute of Personnel and Development, says its senior research adviser, Hayfa Mohdzaini.
Three quarters of innovations introduced without a focus on the customer experience turn out to be a considerable waste of valuable resources, she notes. "If it's just about micro-managing, there are other ways of monitoring performance – either through regular check-ins, or something like sharing your work on a shared drive in the cloud."
But businesses will continue to be seduced, notes the Law Society's Winthrop.
"People behave themselves better if you operate as a person, rather than an algorithm, and you can take onboard everyone's character, whether that's personality differences or cultural differences or needs. Unfortunately businesses assume there's some magic key that will unlock running the business, rather looking a person in the eye and saying, 'if you ever do that again, you're out on your ear'".