From: Paul Rubin on
Peter Fairbrother <zenadsl6186(a)zen.co.uk> writes:
> Gaussian) and a proportion of plaintexts give the same ciphertext
> under say four different keys... Do a whole lot of work, and find
> these. Note that that doesn't technically count as work to break an
> instance of the cipher, as you only have to do it once for every
> instance.

If that doesn't "count", then building a complete codebook for AES
doesn't count either. I think by most reasonable criteria, it counts.
From: Peter Fairbrother on
Paul Rubin wrote:
> Peter Fairbrother <zenadsl6186(a)zen.co.uk> writes:
>>> There is a famous theorem going all the way
>>> back to Shannon, that says that a random boolean function on n inputs
>>> takes O(exp(n)) gates to implement (i.e. takes exponential time to
>>> compute).
>> I don't know about the history, or the definitions, but speaking for
>> every randomly chosen boolean function that's simply not true - some
>> of them will be implementable in less.
>>
>> It's also untrue as an average, for the same reason.
>
> Stated more precisely (if I have it right): for any c < 1, as n
> increases, the probability that a randomly selected boolean function on
> n inputs can be implemented in less than 2**(cn) gates approaches zero.
> I shouldn't have said O(exp(n)) instead of "exponential in n".
>
> I spent a little time looking up the theorem, which I'd known secondhand
> but had never looked at carefully. The proof is a simple counting
> argument (Shannon 1949 [1]; the version below is basically copied from
> Kulikov [2]). An n-input boolean function can be viewed as a truth
> table with 2**n entries of 1 bit each. So there are 2**(2**n) such
> functions.
>
> Now given some number t, suppose you have t 2-input gates each of which
> computes an arbitrary 2-input function (so there are 16 types of gates).
> Each input comes from either 1) one of the t gate outputs; 2) a
> constant (1 or 0); or 3) one of the n input variables. So there are
> t+n+2 possible input sources

yes, so far

> or (t+n+2)**2 assignments of inputs
> to gates.

?? I don't follow that.

There are 2t gate inputs, and t+n+2 possible sources, shouldn't it be
2t(t+n+2) possible assignments?

(that's ignoring that a gate can't use its output as its own input, so
it's 2t(t+n+1), and there can't be any cycles, and so on - so it should
be even less)

> Since there are 16 choices of gate the total number of
> possible t-gate circuits is <= F(t,n) = (16*(t+n+2)**2)**t.

??

>
> If t=2**n/(10n) then F(t,n) is approx 2**(2**n/5) which is vanishingly
> small compared with 2**2**n when n is large enough.


Where does the 10 come from?

I'm confused.
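
Still, for anyone who wants to push the numbers around, here's a rough
sketch in Python of the bound as I read it (how the 10 turns into the 5
in the exponent is my guess, not something spelled out in the quoted
proof):

    # Rough numeric check of the counting bound quoted above.  My guess is
    # that the 10 is just a convenient constant: log2 F(t,n) is about
    # 2*t*log2(t), and with t = 2**n/(10n) that is roughly 2**n/5 - far
    # smaller than the 2**n bits needed to index all 2**(2**n) functions.
    from math import log2

    def log2_circuits(t, n):
        # log2 of F(t,n) = (16 * (t + n + 2)**2)**t
        return t * (4 + 2 * log2(t + n + 2))

    for n in (20, 30, 40):
        t = 2**n // (10 * n)
        print(n, t, log2_circuits(t, n), 2**n)  # circuits vs functions, both as log2

So circuits with that many gates can only cover a vanishingly small
fraction of all the functions, which is presumably the point.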

This may be because I'm not very well, which is both good news and bad
news - I mostly only post here when I'm not well, as I haven't the time
otherwise - the good news is that you get to see my lovely posts, and
the bad news is that I'm not at my best when I post.





On another point, can anyone remember the formula for calculating the
entropy of a "random" permutation?

What I mean is that a permutation can be described by a list of the
letters in it, but the last letter doesn't have to be described, as it's
already fixed; the second-to-last is a choice of two, and so on.

I did know it once, but have forgotten, my google-fu is weak today, and
I don't feel like working it out from scratch.

Tom, perhaps? I think I remember you posting something on these lines.





-- Peter Fairbrother
>
> Obviously the average case is also exponential. But, it shows what a
> horrible state complexity theory is in, since we don't have any
> *specific* functions (those whose inverses are in NP) whose complexity
> is proven to be worse than linear.
>
> References
>
> 1. C. E. Shannon, "The Synthesis of Two-Terminal Switching Circuits,"
> Bell System Technical Journal, Vol. 28 (Jan. 1949), pp. 59-98.
>
> 2. http://cs.ioc.ee/~tarmo/tday-meintack/kulikov-slides.pdf (found by
> google search), see the first few slides.
From: WTShaw on
On Jan 25, 8:32 am, Peter Fairbrother <zenadsl6...(a)zen.co.uk> wrote:

>
> On another point, can anyone remember the formula for calculating the
> entropy of a "random" permutation?
>
Since the chance of drawing a given permutation (a series of n
different items) from a pile, etc., is the same as that of any other
permutation, anything like entropy, randomness, etc., is equivalent for
all options.
From: Peter Fairbrother on
WTShaw wrote:
> On Jan 25, 8:32 am, Peter Fairbrother <zenadsl6...(a)zen.co.uk> wrote:
>
>> On another point, can anyone remember the formula for calculating the
>> entropy of a "random" permutation?
>>
> Since the chance of drawing a given permutation (a series of n
> different items) from a pile, etc., is the same as that of any other
> permutation, anything like entropy, randomness, etc., is equivalent for
> all options.

Let's agree to disagree on that - it doesn't answer my question.

Let me rephrase - can anyone remember the formula for calculating the
Shannon entropy of a "random" permutation?

If you permute an alphabet, you can represent the permutation as an
ordered list of the letters, which would take n.log_2 n bits (where n is
the number of letters in the alphabet).

But you don't have to include the last letter in the list, as you know
what it has to be - the second-last letter only takes one bit, as there
are only two choices - and so on. It's less than the above figure.

I think the formula is something like (1-1/e) n.log_2 n bits. In fact it
may be exactly that, which I just worked out from scratch, but I may
have made a mistake.
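
Here's a quick sketch in Python of what I mean, just counting the bits
in that encoding (nothing below assumes the closed form is right):

    # The "shrinking choices" encoding: the first letter is one of n
    # choices, the next one of n-1, ..., and the last is forced.  The
    # total information is sum(log2 k) = log2(n!), which is the quantity
    # I'm trying to put in closed form.
    from math import log2

    def permutation_bits(n):
        return sum(log2(k) for k in range(2, n + 1))  # = log2(n!)

    for n in (26, 64, 256):
        print(n, permutation_bits(n), n * log2(n))    # actual bits vs n.log_2 n

For n = 26 that comes to a shade under 90 bits, against about 122 for
the plain n.log_2 n list.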

-- Peter Fairbrother
From: Paul Rubin on
Peter Fairbrother <zenadsl6186(a)zen.co.uk> writes:
> Let me rephrase - can anyone remember the formula for calculating the
> Shannon entropy of a "random" permutation?

Basically you use Stirling's approximation of n! for that:

n! (approx) = (n/e)^n sqrt(2 pi n)

So log2 (n!) (approx) = n log2 (n/e) + 0.5 log2 (2 pi n)

That is equal to the Shannon entropy of the uniform probability
distribution on the space of permutations on n objects, which I
think is what you are asking.
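
For concreteness, a quick numeric check in Python (math.lgamma(n + 1)
gives ln(n!), which is exact enough for this purpose):

    # Compare exact log2(n!) with the Stirling approximation above.
    from math import e, lgamma, log, log2, pi

    def exact_bits(n):
        return lgamma(n + 1) / log(2)                   # log2(n!)

    def stirling_bits(n):
        return n * log2(n / e) + 0.5 * log2(2 * pi * n)

    for n in (26, 64, 256):
        print(n, exact_bits(n), stirling_bits(n))

Already at n = 26 the two agree to within a small fraction of a bit.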