Currently, around 9 per cent of the UK engineering workforce is female, and women make up only 20 per cent of those taking A-level physics.
"We have a problem," Professor Sharkey told Today.
"We need many more women coming into this field to solve it."
His warning came as it was revealed that a prototype programme developed to shortlist candidates for a UK medical school had selected against women and black and other ethnic minority candidates.
Professor Sharkey said researchers at Boston University had demonstrated the inherent bias in AI algorithms by training a machine to analyse text collected from Google News.
When they asked the machine to complete the sentence "Man is to computer programmer as woman is to x", the machine answered "homemaker".
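The analogy test can be reproduced with publicly available tools. The sketch below is a minimal illustration, assuming the gensim library and its copy of the word2vec embeddings trained on Google News; it is not the researchers' own code.

```python
# A sketch of the analogy probe, assuming the gensim library and the
# public word2vec model trained on Google News (the same corpus the
# researchers drew on, though not necessarily their exact model).
import gensim.downloader as api

# Downloads ~1.6 GB of pretrained vectors on first use.
model = api.load("word2vec-google-news-300")

# "man is to computer_programmer as woman is to ?" is answered by
# vector arithmetic: vec(computer_programmer) - vec(man) + vec(woman).
results = model.most_similar(
    positive=["woman", "computer_programmer"],
    negative=["man"],
    topn=5,
)
for word, score in results:
    print(f"{word}\t{score:.3f}")
# "homemaker" appears at or near the top of the list, reproducing the
# biased association the researchers reported.
```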
A separate US team built a platform intended to accurately describe pictures, having first trained it on huge quantities of images from social media.
When shown a picture of a man in a kitchen, it still labelled the image as a woman in a kitchen.
Maxine Mackintosh, a leading expert in health data, said the problem was mainly the fault of skewed data being used by robotic platforms.
"These big data are really a social mirror - they reflect the biases and inequalities we have in society," she told the BBC.
"If you want to take steps towards changing that you can't just use historical information."
In May last year, a report claimed that a computer program used by a US court for risk assessment was biased against black prisoners.
The Correctional Offender Management Profiling for Alternative Sanctions (Compas) program was much more prone to mistakenly label black defendants as likely to reoffend, according to an investigation by ProPublica.
The warning came in the week the Ministry of Defence said the UK would not support a change to international law to place a pre-emptive ban on "killer robots", weapons able to identify, target and kill without human control.
This article originally appeared in the Daily Telegraph.