Another Tesla has crashed because the driver thought its self-driving technology could actually drive the car. As we read all the stories about magical technology and then use the hyped-up products, we ought to keep in mind that the "magic" hits the market long before it lives up to its promise, and in some cases it never will. If it's new, don't expect it to work as advertised.
The Tesla in Beijing, in Autopilot mode, hit the side of an illegally parked car and kept going until driver Luo Zhen - who had taken his hands off the steering wheel - manually stopped it. The US$7500 ($10,420) repair bill was probably a tough way for Luo to learn that when he read and heard about self-driving cars, or even when he watched Tesla's Autopilot video (which tells drivers to grip the wheel at all times but shows the Model S changing lanes, taking curves and parking itself), he was essentially reading and watching sci-fi.
I'm not going to accuse Tesla of false advertising, as many did after Autopilot led to a fatal crash. The technology can do what the video shows it doing, but it can't do it in every situation, and that's why the automaker warns drivers to keep their hands on the wheel.
Nor was Microsoft really misleading customers about the ability of its Skype Translator to live-translate between Mandarin and English. It can do that when you speak slowly and clearly, avoiding complicated subjects and sentence structures, the way people do in the promotional videos.
But a Tesla cannot drive itself better than an experienced human driver can drive it. Skype Translator cannot really handle normal conversation the way even a middling simultaneous translator could. Nor can "big data" predict election outcomes or real-world economic phenomena better than traditional tools. And Pokemon Go isn't quite augmented reality.