http://phys.org/news/2013-08-encryption-thought.html

"In information theory, the concept of information is intimately entwined with that of entropy. Two digital files might contain the same amount of information, but if one is shorter, it has more entropy."
"The problem, Médard explains, is that information-theoretic analyses of secure systems have generally used the wrong notion of entropy. They relied on so-called Shannon entropy"
"But in cryptography, the real concern isn't with the average case but with the worst case. A codebreaker needs only one reliable correlation between the encrypted and unencrypted versions of a file in order to begin to deduce further correlations."
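The average-case versus worst-case distinction the article draws is the difference between Shannon entropy and min-entropy. As an illustration (not the authors' analysis, just a minimal sketch), the two measures agree on a uniform distribution but diverge sharply once one outcome dominates, and the attacker's best single guess is governed by the min-entropy:

```python
import math

def shannon_entropy(p):
    # Average-case uncertainty: -sum(p_i * log2(p_i))
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def min_entropy(p):
    # Worst-case uncertainty (for the defender): -log2(max p_i),
    # i.e. how hard the single most likely outcome is to guess
    return -math.log2(max(p))

# Uniform distribution over 8 symbols: both measures give 3 bits
uniform = [1 / 8] * 8

# Skewed distribution: one symbol carries half the probability mass
skewed = [0.5] + [0.5 / 7] * 7

print(shannon_entropy(uniform), min_entropy(uniform))  # 3.0, 3.0
print(shannon_entropy(skewed), min_entropy(skewed))    # ~2.40, 1.0
```

On the skewed source the Shannon entropy still reports about 2.4 bits of average uncertainty, but the min-entropy is only 1 bit: a guesser who always tries the most likely symbol first succeeds half the time, which is the kind of gap the article says uniformity assumptions paper over.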
""It's still exponentially hard, but it's exponentially easier than we thought," Duffy says."
"Bloch doubts that the failure of the uniformity assumption means that cryptographic systems in wide use today are fundamentally insecure."