I had put my hand up for a University of Auckland Department of Psychological Medicine research project done together with AI specialist Soul Machines called Conversations with an artificial human, not quite knowing what to expect.
There's plenty of buzz around AI currently, hype in fact. I arrived at the UoA building in Auckland's Newmarket half expecting a scene from the near future in which I'd be convinced the AI was real.
I was reminded of Spike Jonze's uncomfortably dark movie Her in which ScarJo plays Samantha, an "operating system" that emotionally manipulates Joaquin Phoenix's Theodore.
The reality turned out to be a program running on a Windows laptop. This is where the avatar code lives, attempting to trigger emotions in subjects that a monitoring device strapped around their wrists measures.
Heart rate, body motion and temperature as well as galvanic skin response - the changes in the skin's electrical resistance caused by emotional stress - are picked up by the device on your wrist.
A video camera set up discreetly a couple of metres to the side recorded my interactions with the artificial human, and I filled in a survey to assess my emotional state prior to the conversation with the AI (there's another one after the chat too) for comparison.
Then the research assistant fired up the Soul Machines-made avatar through a script on the laptop (with some typos scrolling past and please, no "she reads the Matrix" jokes).
There was my Samantha: a disembodied head, beautiful and with an almost human-like countenance.
"She has a very expressive face," assistant professor Elizabeth Broadbent told me the day after the conversation with the AI. The same avatar is used for all participants and I quietly wondered if its gender was selected for the same reason Alexa and Siri are female, that we're "predisposed to think more positively of women's voices."
Instead of the expected free-form conversation, the avatar showed a predetermined set of images and questions for me to respond to. It was interesting, but not particularly natural or human-like.
I had to pause and think of my responses to the avatar's questions in detail before speaking them, making sure to say them in one go, quickly and without the usual "ummms" or pauses.
If I stopped to think, the AI would assume I was done and jump to the next question. Lots of real people do this too, because they're really bad at listening, but that's probably not the sort of human-like trait the AI developers sought to emulate.
Apart from feeling disgusted at some of the gruesome pictures that the research project uses, I felt mostly like I was talking to a nicely done video game character. Kind of cool, but years away from feeling human.
There's no doubt AI will get there: apart from assessing participants' reactions to the avatar, Broadbent told me the data captured will be used to train the AI itself to better recognise people's emotions.
This is a fascinating area of research. Done right, it can create automated systems that are heaps better at interacting with humans, recognising how we feel and anticipating what we need.
For instance, getting upset at an AI chatbot that isn't answering questions properly could prompt the program to do something useful, like finding a human for you to talk to.
On the other hand, very realistic emotion-state recognition could be confusing and even deceptive when used in AI-driven marketing and surveillance systems.
Dating apps and "adult entertainment" featuring AI are likely to appear soon and frankly, they are scary prospects. When AI becomes super-realistic, a number of ethical and moral bridges will be crossed forever.
Either way, this is a fascinating glimpse into the future that you too can have: the project is looking for 402 participants (I was number 143) who are at least 18 years old and speak English fluently.
The whole thing takes about half an hour. If you're keen, email participate@soulmachines.com to reserve a spot.
Participant data are kept secure and anonymised, the university and Soul Machines promise.
There's also the option to receive the results once the data have been analysed in a couple of months' time, which should be very interesting.