[Discuss] ssh keys question
Kent Borg
kentborg at borg.org
Fri Jun 17 22:27:45 EDT 2016
On 06/17/2016 09:05 PM, Bill Horne wrote:
> Out of curiosity, please tell me how entropy is measured,
I count decisions.
In my case, I take 32 bits from urandom and map them into dictionary
words. Those same words could be mapped back into those exact 32 bits of
binary, like base64 and back, only in this case the mapping is
specifically designed to be read out loud. There is a company mapping
every little square on earth into three-word phrases, but the fact that
GPS coordinates can be mapped into words doesn't make finding a hidden
easter egg dropped randomly on the globe any easier.
In my case I am using mnencode, something I ran across years ago.
Essentially Diceware.
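A rough sketch of what that kind of reversible mapping looks like (this
is not mnencode itself; the 1626-word list size is borrowed from
mnemonicode, where three words cover just over 32 bits, and the
placeholder words are only stand-ins):

import os

# Stand-in wordlist; mnemonicode-style tools use about 1626 short,
# distinct words so that three words cover a bit more than 32 bits
# (1626**3 is just over 2**32).
WORDS = ["word%d" % i for i in range(1626)]

def bits_to_words(raw):
    """Map 4 bytes (32 bits) of randomness to 3 words, reversibly."""
    n = int.from_bytes(raw, "big")
    out = []
    for _ in range(3):
        n, idx = divmod(n, len(WORDS))
        out.append(WORDS[idx])
    return out

def words_to_bits(words):
    """Invert the mapping: the same words give back the same 32 bits."""
    n = 0
    for w in reversed(words):
        n = n * len(WORDS) + WORDS.index(w)
    return n.to_bytes(4, "big")

raw = os.urandom(4)                  # 32 bits of kernel randomness
phrase = bits_to_words(raw)
assert words_to_bits(phrase) == raw  # round-trips, like base64 and back
print("-".join(phrase))

The words add nothing to the 32 bits of entropy; they just make those
bits pronounceable.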
> and how many bits of entropy are in the string "ysywlmtihtg". TIA.
I have no idea. It has everything to do with how it was generated. It
might be the initial letters of a catchphrase from The Hobbit (low
entropy), or it might have been randomly generated (high entropy). It is
the number of bits that drove your choices in selecting "ysywlmtihtg"
that matters. Did you choose between your four favorite Hobbit quotes?
Then it is maybe only 2 bits of entropy.
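Rough arithmetic, just to illustrate (assuming each choice is equally
likely, entropy is log2 of the number of choices you actually made):

from math import log2

# Entropy measures the choice, not the resulting string.
print(log2(4))         # 2.0 bits: one pick among 4 favorite quotes
print(11 * log2(26))   # ~51.7 bits: 11 truly random lowercase letters

The same 11-character string scores either way; only the generation
process decides which number applies.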
In the case of a password like coral-iceberg-neptune, if one does not
know how it was generated and assumes each character was picked from
lowercase letters plus other ASCII symbols, it might appear to be
roughly 56^21, or well over 100 bits of entropy. But if one knows the
pattern it is far fewer. Know how you generated it and count the lower
bound: at worst this example is only 32 bits. Assume the worst. "Seems
random" is not as good as "actually random".
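The same comparison as rough arithmetic (the ~56-character alphabet and
the 1626-word list size are my assumptions, just for illustration):

from math import log2

password = "coral-iceberg-neptune"

# Apparent strength: attacker guesses blindly, one pick from a
# ~56-character alphabet for every position.
apparent = len(password) * log2(56)   # roughly 120 bits

# Actual strength: attacker knows it is 3 words from a 1626-word
# mnemonicode-style list, i.e. one 32-bit draw from urandom.
actual = 3 * log2(1626)               # roughly 32 bits

print("looks like ~%.0f bits, is really ~%.0f bits" % (apparent, actual))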
-kb