An information-theoretic measure (usually stated as a number of bits) of the amount of uncertainty that an attacker faces to determine the value of a secret. [SP63] (See: strength.)
Example: If a password is said to contain at least 20 bits of entropy, then finding the password must be at least as hard as guessing a 20-bit random number, i.e., searching through 2^20 (about 1,000,000) possibilities.
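The arithmetic behind that example can be sketched as follows; this is an illustrative computation, not part of the glossary, and the function name is assumed. For a secret chosen uniformly at random from N equally likely values, the entropy is log2(N) bits, so a password of L symbols drawn uniformly from an alphabet of size A has L * log2(A) bits:

```python
import math

def uniform_entropy_bits(alphabet_size: int, length: int) -> float:
    # Entropy of a secret chosen uniformly at random:
    # each symbol contributes log2(alphabet_size) bits.
    return length * math.log2(alphabet_size)

# A 20-bit random number (20 independent fair coin flips):
# uniform_entropy_bits(2, 20) gives 20.0 bits.
# A 4-character password over the 94 printable ASCII characters:
# uniform_entropy_bits(94, 4) gives about 26.2 bits,
# since log2(94) is roughly 6.55 bits per character.
```

Note that this uniform-choice formula is an upper bound: if some passwords are more likely than others (as with human-chosen passwords), the attacker's uncertainty is lower.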
An information-theoretic measure (usually stated as a number of bits) of the amount of information in a message; i.e., the minimum number of bits needed to encode all possible meanings of that message. [Schn] (See: uncertainty.)
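The second sense can be illustrated with Shannon's formula, H = -sum(p_i * log2(p_i)), applied to the observed symbol frequencies of a message; this sketch and its function name are assumptions for illustration, not part of the glossary:

```python
import math
from collections import Counter

def shannon_entropy_bits_per_symbol(message: str) -> float:
    # H = -sum(p_i * log2(p_i)), where p_i is the observed
    # relative frequency of each distinct symbol in the message.
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A message using only one symbol carries no information:
# shannon_entropy_bits_per_symbol("aaaa") is 0.0 bits per symbol.
# Two equally frequent symbols need 1 bit per symbol:
# shannon_entropy_bits_per_symbol("abab") is 1.0 bit per symbol.
```

Multiplying the per-symbol entropy by the message length gives the minimum number of bits needed, on average, to encode messages with that symbol distribution.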