A puzzle that has confounded mathematicians for almost a century is closer than ever to being solved, it has emerged. But there's one slight problem.
The calculations that prove part of what is known as the "Erdős discrepancy problem" have been worked out by a computer. And the sheer amount of data - a proof file of around 13 gigabytes, more than the entire text of Wikipedia - is so vast that it would be practically impossible for a human brain to check.
The "discrepancy problem" was posed in the 1930s by renowned Hungarian mathematician Paul Erdos. It revolves around the properties of infinite sequences of numbers containing nothing but +1s and -1s. Patterns in such sequences can be measured by creating finite sub-sequences.
Professor Enrico Scalas, of the University of Sussex, explained the premise: "You have a sequence of 1s and -1s (for instance, generated by tossing a coin) and a constant C. One is looking for a finite subsequence long enough so that the sum of the elements of the subsequence is larger than C."

One detail matters here: the sub-sequences in question are not arbitrary but evenly spaced - every element, every second element, every third, and so on. (Were any sub-sequence allowed, simply picking out enough +1s would beat any C trivially.) Erdős conjectured that, no matter how large C is, such a sub-sequence can always be found.
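In code, the premise is easy to state. The following sketch (an illustration of the definition, not the researchers' software) computes the discrepancy of a finite +1/-1 sequence: the largest absolute value of a sum x_d + x_2d + ... + x_kd, taken over every step size d and every cut-off k.

```python
def discrepancy(x):
    """Max |x_d + x_2d + ... + x_kd| over all step sizes d and lengths k."""
    n = len(x)
    best = 0
    for d in range(1, n + 1):
        s = 0
        for i in range(d, n + 1, d):  # 1-based positions d, 2d, 3d, ...
            s += x[i - 1]
            best = max(best, abs(s))
    return best

print(discrepancy([1, -1, -1, 1]))          # 1: every such sum stays within +/-1
print(discrepancy([1, -1, 1, -1, 1, -1]))   # 3: the even positions are all -1s
```

The second example shows why the problem is subtle: a sequence that looks perfectly balanced term by term can still drift badly along one of its evenly spaced sub-sequences.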
The difficulty lies in actually proving this is the case mathematically. That's where computers and their capacity for brute-force search come in. Using a SAT solver, computer scientists Dr Alexei Lisitsa and Dr Boris Konev of the University of Liverpool managed to demonstrate that an infinite sequence will always have a discrepancy (the largest such sub-sequence sum) larger than two - indeed, every finite sequence of 1161 or more +1s and -1s already does.
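The C = 2 case is far beyond pencil and paper, but the analogous C = 1 question is small enough to settle on a laptop and shows the flavour of the computation. The sketch below (my own reconstruction, not Konev and Lisitsa's SAT encoding) backtracks over all +1/-1 sequences whose evenly spaced sums stay within ±1, and finds that none survives past length 11.

```python
def still_ok(x, bound):
    """After appending position n = len(x), only the sums whose step size d
    divides n gain a new term, so only those need re-checking."""
    n = len(x)
    for d in range(1, n + 1):
        if n % d == 0 and abs(sum(x[d - 1::d])) > bound:
            return False
    return True

def longest(bound):
    """Length of the longest +/-1 sequence with discrepancy <= bound."""
    best = 0

    def extend(x):
        nonlocal best
        best = max(best, len(x))
        for v in (1, -1):
            x.append(v)
            if still_ok(x, bound):   # prune as soon as some sum exceeds the bound
                extend(x)
            x.pop()

    extend([])
    return best

print(longest(1))  # 11: no sequence of length 12 keeps every sum within +/-1
```

For C = 2 this kind of exhaustive search explodes, which is why the Liverpool pair translated the question into a Boolean satisfiability problem and handed it to a SAT solver; the machine-generated certificate of that search is the Wikipedia-dwarfing proof described above.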