Our interactions are via text-based messaging, the kind used by online customer service. So many times, trapped in a Kafkaesque vortex of automated hell, trying to cancel broadband or some service, I have typed a plea to speak to a warm-blooded human. And yet here I am, downloading angst to a bot. Perhaps it’s not advice I need but a chance to babble.
Human coaches get mixed reviews. Some credit them with rescuing their careers and sanity. Others complain coaches are employers’ puppets — or worse, useless. After all, there’s no professional hurdle for coaches to clear. A circus acrobat could quit their job and cartwheel straight into the boardroom to offer coaching services to white-collar executives. They probably already have.
Aimy is an experiment to see how it (or she, depending on your anthropomorphic tendencies) can cope with career questions. Its responses are mind-numbingly obvious. To the question “what job might a journalist do?”, it asks if I’ve considered public relations. I’m not sure how bad I would have to be at my job to have overlooked the myriad journalists in PR, but instead of typing, “Of course I bloody have,” I say I’m terrible at faking enthusiasm.
To liven things up, I try the lewd. Should I have sex with my boss? Aimy is prudent: “It is not appropriate to engage in a sexual relationship with someone who holds a position of power or authority over you.”
It is easy to point out Aimy’s shortcomings. So I take my criticisms to Matti Niebelschuetz, the co-founder of CoachHub, a virtual platform that matches clients with human coaches and holds sessions over video call. He recognises Aimy is flawed but regards it as an experiment: the company wants to scope out the bot’s potential and limitations.
Future risks were recently outlined in an open letter signed by Elon Musk and Apple co-founder Steve Wozniak, which demanded a pause in AI development until ethical questions have been tackled around automating jobs and developing “non-human minds that might eventually outnumber, outsmart, obsolete and replace us”. The potential for harm was underscored when a widow blamed a chatbot for her husband’s suicide.
Yet Niebelschuetz is hopeful an ethical framework and improved tech will enable a future Aimy to provide nuanced answers. There are people, he says, who prefer not to spill their guts to humans. “They say, ‘a chatbot doesn’t judge me. I can take my time. I don’t feel comfortable opening up to people’.”
In one of Niebelschuetz’s trials, Aimy suggested a five-minute meditation, something he might not have done in the presence of a human coach. Without prejudging the experiment, both of us anticipate AI will probably be a co-pilot for human coaches rather than their replacement.
Telling all to a chatbot won’t be for everyone. But younger generations are more comfortable sharing their personal lives online. The growing role of tech in their world was highlighted when a teacher tweeted: “For years, kids have accidentally called teachers ‘mum’ or ‘dad’ without thinking, with hilarity ensuing. Today one of my colleagues got referred to as ‘Alexa’.”
Later, I returned to Aimy and asked how to request a pay rise, how to secure more homeworking days, and how to become more visible in the workplace. Its responses were strangely more satisfying, outlining multiple practical steps to take. Perhaps a lack of human nuance is a benefit, removing the emotion from work and reminding us of its transactional nature.
There is virtue in bluntness, as I discovered with my final question: “No, Emma,” it replied. “There are no jobs that you can do while drunk.”
Written by: Emma Jacobs
© Financial Times