Year | Definition | Source |
---|---|---|
1985 | "In information theory, entropy is the uncertainity or "lack of organization" in a passage. In a connected passage the choice of word at any point is restricted by its context. If there is total certainty as to the word used, entropy is zero; total uncertainty means 100 percent entropy. " | Robinson, C. G. (1985, p.772) |