Others justify their actions on the grounds that they are being forced to do the work of a checkout operator, that they have had to put up with problems in the checkout process, or even that the mis-scanning of an item was a mistake or accident.
This classification, however, assumes that people actually cheated as a consequence of these motivations, rather than conveniently excusing something that they, along with a large number of other people, were doing.
Nobel prize-winning economist Gary Becker proposed the "Simple Model of Rational Crime" to explain this type of behaviour. He put forward the view that people do a simple cost-benefit analysis of any given situation to decide whether they are going to be dishonest.
In deciding whether to park illegally, for example, they will weigh the benefit of free parking against the likelihood of getting caught and the size of the fine if that does, in fact, happen.
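As a rough sketch only (the function and the figures below are hypothetical, not taken from Becker's or Ariely's work), the comparison the rational model assumes people make can be written as a simple expected-value check:

```python
# Illustrative sketch of the cost-benefit comparison assumed by Becker's rational model.
# All numbers are made up purely to show the shape of the reasoning.

def worth_breaking_the_rule(benefit, fine, chance_of_being_caught):
    """Return True if the expected gain outweighs the expected penalty."""
    expected_penalty = fine * chance_of_being_caught
    return benefit > expected_penalty

# Parking example: $5 saved, $50 fine, 5% chance of being caught.
# The expected penalty is $2.50, so the "rational" choice is to park illegally.
print(worth_breaking_the_rule(benefit=5.0, fine=50.0, chance_of_being_caught=0.05))  # True
```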
What behavioural economist Dan Ariely has discovered, however, is that cheating is an irrational process that a large number of us will engage in anyway.
However, this type of dishonesty is always for small amounts, what Ariely calls the "fudge factor": an amount small enough to be dismissed as inconsequential in comparison to the overall value of a transaction. This type of cheating is independent of the potential reward and the likelihood of being caught, undermining Becker's rational model of crime.
More disturbingly, in other experiments Ariely showed that the further the cheating is removed from a direct connection with a financial reward, the more likely it is to happen, and by a greater amount. In other words, the distance "abstracts" the dishonesty the individual is engaging in.
Of course, this is exactly what happens when technology is put between people and the actions they are carrying out: in this case, using a computer to check out our shopping rather than having a person do it for us. Distancing the consumer from the organisation they are dealing with makes it very easy for a great number of people to be dishonest and not to consider what they are doing as stealing.
This is exactly the same type of behaviour seen when people download movies, get around a news site's paywall, or even cheat in an online test.
Because this behaviour is irrational, it can be manipulated to reduce how often it happens. Ariely has found that if you simply get people to sign a statement saying that they will behave morally and won't cheat, they do in fact cheat less. Just getting people to think about the Ten Commandments turned out to have a similar effect, regardless of whether they were religious or not.
What is important with this practice, however, is that it needs to be done before the task is carried out, and its effect only lasts for a limited time.
Asking students to agree to act honestly before an online quiz is likely to be effective, whereas getting them to sign a statement after they have written an essay and attach it to their submitted work is not.
In the case of self-service checkout systems, a simple introductory screen asking shoppers to agree to be honest would likely be effective in reducing cheating at the checkout. Another way would be to have a staff member greet every shopper as they come to the checkout and remind them that they will be there to help if needed.
These staff are trained to spot people trying to cheat the machines and will intervene if necessary, but this is not always effective. Reminding customers of a moral code before they use the system is a much cheaper way of reaching everyone. It is also a different approach from the largely ineffective warnings that films sometimes display about illegal copying being a crime.
The dishonesty that shoppers are engaging in is not a rationally calculated act, and so appealing to rational arguments to stop people behaving in this way is not going to work.
This article was originally published on The Conversation. Read the original article.
David Glance is director of the UWA Centre for Software Practice at the University of Western Australia.