When Microsoft's Cortana launched in 2014, a lot of the questions she received were sexual in nature. When Siri was released on smartphones in 2011, it became a game to get her to call users her "master".
But, you might ask, who cares? They are simply machines, without feelings or a conscience. In any case, they sometimes deserve the abuse. Who hasn't wanted to throw their smart speaker out of the window when it played K-Pop instead of Taylor Swift?
The problem is the real-world impact this can have on women and how they are viewed. A UN report last year found AI smart speakers with female voices send a signal that women are "obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'okay'".
Particularly worrying, it said, is how they often give "deflecting, lacklustre or apologetic responses" to insults.
And that's what we're teaching our children. An entire generation has grown up barking orders and insults at female-gendered smart speakers designed to be subservient.
Tech giants have done little to address the issue. The way smart speakers have been designed means there's no place for even basic niceties, like please and thank you - in fact, that would probably confuse Alexa.
Now, children expect to get what they want simply by demanding it. Venture capitalist Hunter Walk, for instance, has written about how his Amazon Echo caused his 4-year-old to become bossy.
"Cognitively I'm not sure a kid gets why you can boss Alexa around but not a person," he wrote in his blog. "At the very least, it creates patterns and reinforcement that so long as your diction is good, you can get what you want without niceties."
Part of the problem is that these devices have been created largely without the input of women.
More than 75 per cent of computer programmers in the US are male - 83 per cent in the UK. They have around 80 per cent of the technical positions at Apple, Facebook, Microsoft and Google.
To these male-heavy tech teams, female voices are warmer and more pleasant. Daniel Rausch, the head of Amazon's Smart Home division, said that his team "carried out research and found that a woman's voice is more sympathetic".
Had they asked more women, however, they might have thought twice about the gender of their smart speaker. They could have better anticipated the abuse, and they might even have considered how children would be affected.
Things are gradually starting to change. In 2017, Amazon installed a "disengage mode for Alexa" so that she would reply to sexually explicit questions with either "I'm not sure what outcome you expected" or "I'm not going to respond to that".
Please say please
On its Echo Dot Kids device, it included a "magic word feature" that would congratulate a child if they used the word "please".
Google made a similar change in 2018 with its "Pretty Please" feature, and last year, it introduced a range of new male voices.
The real innovation, however, is happening outside of Silicon Valley boardrooms. For instance, Beeb, the BBC's digital voice assistant, has a male, northern accent to challenge gender stereotypes. Meanwhile, a team at Vice Media's Virtue creative agency has come up with the world's first gender-neutral voice assistant, known as Q.
That may go some way to helping address the social issues that smart speakers create, but it may just be too little, too late.
Smart speakers are becoming increasingly ingrained in society. This year, Ofcom said children listen to Alexa more than the radio, and research firm Gartner predicts that soon we will be having more conversations with bots than with our partners.
But we still don't truly know the impact of smart speakers on the next generation. The devices are already being used instead of babysitters to read bedtime stories. They are helping with homework and settling dinner table disputes.
These voices are shaping impressionable young minds, and making a lasting impact on the way they behave. You could argue giving voice commands is simply another way for children to programme a computer, that they know the difference between people and machines. But do we really want to take that risk? That's something trillion-dollar tech companies should have asked a long time ago.