KEY POINTS:
The British Government has announced that, from this summer, a trial will be run at UK ports and airports. In place of the old routine, in which an immigration officer looks at you, looks at your photograph, swipes the passport and lets you through, you will now have to gaze into the all-seeing lens of a machine. Face-recognition software is going to be put in place.
One might wonder a little about the logic, or the timing, of this move. After all, it's only a matter of weeks since huge numbers of passengers were being held up by what were genteelly termed "teething problems" at Heathrow Terminal 5. I doubt whether all those travellers have now been, or ever will be, reunited with their baggage after it was entrusted to a similarly super-duper piece of technology. As time goes on, security alerts, and the wholesale collapse of systems that follows them, will only increase in frequency and severity. Why, one might ask, are these changes being made now?
You might ask again when you consider the track record of this technology. The key point is what the industry calls "false negatives" - incidents where the technology fails to identify the match between photograph and human being. You can imagine the chaos created by this happening in any numbers at all - queuing, failing, queuing again, failing better ...
The numbers suggested differ very markedly. Those in charge of the technology say that these "false negatives" amount to only 3 to 5 per cent of faces inspected. But even the industry's own figures mean that, under optimum conditions, as many as one passenger in 20 will have his or her journey seriously disrupted; we might be right to look more carefully at the case.
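The arithmetic behind that "one in 20" can be sketched in a few lines of Python. The daily passenger figure below is purely illustrative, not an official one:

```python
# Back-of-envelope arithmetic for the failure rates quoted above.

def expected_false_rejects(passengers: int, false_negative_rate: float) -> int:
    """Expected number of genuine passengers wrongly rejected by the gate."""
    return round(passengers * false_negative_rate)

daily_passengers = 100_000  # hypothetical daily throughput for a large airport

# The industry's own range: 3 to 5 per cent of faces fail to match.
low = expected_false_rejects(daily_passengers, 0.03)
high = expected_false_rejects(daily_passengers, 0.05)

print(f"{low:,} to {high:,} wrongly rejected passengers per day")
# → 3,000 to 5,000 wrongly rejected passengers per day
```

Even at the flattering end of the range, a 5 per cent failure rate is one rejected passenger for every 20 who step up to the gate.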
Other, more independent, observers think that even 5 per cent is absurdly low. Facial-recognition software was introduced at the Super Bowl in America to pick out terrorists and criminals in general. It threw up so enormous a number of false alarms as to be completely useless and had to be abandoned; it doesn't seem to have been used in the past seven years. Other experts have given a still bleaker estimate: that the systems succeed in making a match in only 40 per cent of cases. And, I have to say, if a technology this half-baked produces so many false negatives - the failures that will hugely inconvenience most of us - then the question of false positives must arise too, and terrorists and child-nappers will sail blithely through the wonderful mass of face-mapping machinery.
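The Super Bowl episode illustrates a base-rate effect: against a large, overwhelmingly innocent crowd, even a small false-positive rate drowns the genuine hits. A minimal sketch, with entirely hypothetical crowd sizes and rates:

```python
# Base-rate arithmetic for a watchlist search of a crowd.
# All numbers here are hypothetical, chosen only for illustration.

def alarms(crowd: int, targets: int, hit_rate: float, false_pos_rate: float):
    """Return (true alarms, false alarms) for one screening pass."""
    innocents = crowd - targets
    true_alarms = targets * hit_rate
    false_alarms = innocents * false_pos_rate
    return true_alarms, false_alarms

# 70,000 spectators, 10 genuinely wanted faces, a flattering 90 per cent
# hit rate, and a false-positive rate of just 1 per cent.
true_a, false_a = alarms(70_000, 10, 0.90, 0.01)
print(f"{round(true_a)} true alarms, {round(false_a)} false alarms")
```

Under these assumptions the false alarms outnumber the real ones by roughly 75 to one - which is why such a system becomes useless in practice long before it misses anyone dangerous.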
Why on earth are we doing any of this? What is wrong with a man at a desk, assessing the demeanour and features of a human being, and making a judgment? Well, in most cases, it will still come down to that. You wonder who, apart from people who are simply in love with technology, is in favour of a measure which will slow everything down and increase levels of error hitherto undreamed of.
The whole affair is reminiscent of Hardy's poem on the loss of the Titanic, The Convergence of the Twain, in which the ship sails unknowing towards the iceberg. Any fool can, without even the benefit of hindsight, see the Government and an almighty cock-up on collision course. The two seem to have a delirious mutual enchantment; the rubbishy mechanism of government, the rubbishy mechanisms of technology. Here comes the wished-for cock-up; and there's nothing anyone can do about it.
- INDEPENDENT