From: Tom St Denis on
On Jan 15, 7:11 am, Sebastian Garth <sebastianga...(a)gmail.com> wrote:
> AES is a good algorithm, and so are RSA and Diffie-Hellman. Primary
> strength, again, being key length. Those three work with large
> integers and primes, and if the key length is long enough, they are
> more than secure. Using an 8192 bit prime, for example, would be ample
> for the next few thousand years (or at least until quantum computing
> becomes viable)! The algorithms to avoid are cypher-type schemes.
> These are worthless, and a dime a dozen...

You watched Swordfish recently, haven't you?

AES/RSA/DH are not even the same types of algorithms. AES is
symmetric and the other two are asymmetric. RSA works with composite
moduli, while DH typically works with prime moduli. AES technically
uses prime-like elements (an irreducible polynomial over GF(2)), but
not over the integers; it's also not an algorithm where you can scale
the key length by just making the numbers bigger.

And there is no rational means to conclude that an 8192-bit prime will
be secure for the next thousand years. Recall that's exactly what
people thought of RSA-129 [before the invention of QS and MPQS]. Assume
factoring doesn't advance: there isn't enough energy or space in the
universe to run the GNFS algorithm [the fastest known thus far] on an
8192-bit RSA key [assuming that's what you meant...]. Assume factoring
does advance: then it's not reasonable to conclude it couldn't cross
some order-of-magnitude boundary and make 8192-bit numbers tractable.
Either way, it's not a valid conclusion that we'd only have to worry
about 8192-bit factorizations after a thousand years.
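To put rough numbers on that gulf (a sketch, not part of the argument
above: the standard heuristic GNFS cost L_N[1/3, (64/9)^(1/3)], asymptotic
only, so treat the output purely as a relative work factor):

```python
import math

def gnfs_work(bits):
    # Heuristic GNFS cost L_N[1/3, (64/9)^(1/3)] for an n-bit modulus N;
    # ignores constant factors, so only ratios between sizes mean anything.
    ln_n = bits * math.log(2)                  # ln N for N ~ 2^bits
    c = (64.0 / 9.0) ** (1.0 / 3.0)
    return math.exp(c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0))

# How much harder is an 8192-bit modulus than the factored 768-bit one?
ratio = gnfs_work(8192) / gnfs_work(768)
print(f"~2^{math.log2(ratio):.0f} times the work")
```

The point being: the obstacle is the exponent of that ratio, not a
timetable measured in years.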

To this day 1024-bit RSA keys are still considered secure against the
GNFS, under the premise that they can be revoked. Even before the 768-
bit factorization people were recommending 2048-bit RSA keys for CA
roles [and root-of-trust sorts of things]. This factorization changes
nothing. Heck, for ephemeral applications I'd still use a 768-bit key
[not by design, but I wouldn't scoff at it]. It's not practical to
factor that size for most "private" transactions [e.g. signing on to a
website to buy a $12 album].

Tom
From: Peter Fairbrother on
Mok-Kong Shen wrote:
> Peter Fairbrother wrote:
>
>> I don't know about screwing up the implementation, which could do
>> horrible things - but yes, there is established theory that multiple
>> encryption with statistically independent keys is at least as secure as
>> the first encryption, see:
>>
>> U. Maurer and J. L. Massey, Cascade ciphers: The importance of being
>> first, Journal of Cryptology, vol. 6, no. 1, pp. 55–61, 1993.
>>
>> www.isiweb.ee.ethz.ch/archive/massey_pub/pdf/BI434.pdf
>>
>> so if one pass is secure enough you're okay.
>>
>> (I have a tiny doubt about their result, which I've been working on for
>> some years now - but in practice it won't mean much even if it pans out)
>
> Wouldn't a 'definition' of 'statistically independent keys' (not very
> simple to verify absolutely in practice, I surmise) by itself
> ('logically') imply that the 2nd cipher couldn't weaken the work of the
> 1st cipher? (For otherwise one could say that the keys aren't
> 'independent' after all.)
>
> When your work bears fruit, please let us know as soon as possible, for
> that would be very interesting and valuable.

I don't think that's going to happen for some time, if ever, and it
probably won't be very valuable (although it might be interesting), but
here's a rough outline of my thoughts ... my doubts are more intuitive
than clearly expressible, and originally based only on a nebulous doubt
about the proof-by-example in the paper mentioned ...

... if anyone wants to finish it, feel free - but please credit me in
the paper! ...

... and this is probably going to get long, and I'll be including some
beginner's stuff. And I'm only looking at block ciphers here, not stream
ciphers.


Consider a block cipher as a set of permutations of plaintext into
ciphertext.

[ In simple terms, we could write the possible plaintexts out in
numerical order, then write out the corresponding ciphertexts in order -
that ordered list is a permutation, and there is a different permutation
for each key. ]

If there are B different possible plaintexts in a block, there are B! (B
factorial) possible permutations. If there are K possible keys then
there are K actual permutations.

The cipher, in effect, chooses some of the possible permutations and
assigns one to each key. For a "secure" cipher, one usual requirement is
that these choices and assignments "look" random for all purposes.

So far, so normal.
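A toy example makes the counting concrete (a sketch with made-up
parameters: 3-bit blocks, a 4-bit key space, and the K permutations
drawn at random rather than derived algorithmically from a key, which a
real cipher would do):

```python
import math
import random
from itertools import permutations

B = 8                    # 3-bit blocks: 8 possible plaintexts
K = 16                   # a 4-bit key space: 16 keys

# Sample K *distinct* permutations of the plaintext space - one per key.
rng = random.Random(1)
all_perms = list(permutations(range(B)))     # the B! possible permutations
chosen = rng.sample(all_perms, K)            # the K actual permutations

def encrypt(key, plaintext):
    return chosen[key][plaintext]

# B! = 40320 here, vastly more than K = 16; and each key's permutation
# is a bijection, so every ciphertext decrypts uniquely under that key.
assert math.factorial(B) == len(all_perms) == 40320
assert all(sorted(encrypt(k, p) for p in range(B)) == list(range(B))
           for k in range(K))
```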

[ There is another requirement for a cipher - in order to be able to
uniquely decrypt, the cipher cannot assign the same permutation to two
different keys.

[]-> Therefore it is impossible to have a truly random choice over the
entire set of possible permutations.

This also implies that B! must be greater than K - not usually a
problem; for the average block cipher B! is very, very much larger than K.

Later we'll consider a case where B! is about equal to K. This doesn't
happen with today's real block ciphers, which is why it isn't likely to
be a valuable result in practice (although it isn't a result at all, as
yet). ]


First, let's go back a bit - for a secure cipher the choices of
permutations should look random for all purposes.

But for any real cipher the choices are fixed by the cipher itself, in
the sense that they don't change - and there are patterns in them even
if they are truly chosen at random.

In any finite fixed set of random numbers there are patterns of some
kind, if you look for them.

[]-> It is therefore impossible to create a real cipher where the
permutations look random for all purposes - in some sense patterns can
be detected in them.

Incidentally, this is why rainbow tables can be used to break most
(all?) ciphers, although in practice they may often need more resources
than brute force.



On to cascaded ciphers. The second cipher also has patterns in it, and
these can interact with the patterns of the first cipher.

Usually, for real ciphers, this isn't a problem, in the sense that it
takes more resources than brute force to break them, and it's usually
possible to prove that an attack based on this kind of interaction is
statistically overwhelmingly likely to take more resources than brute
force on the first cipher, when the keys are independently chosen.

But for cases where B! is about equal to K in both ciphers, I *think*
birthday problems negate that statistical proof. I haven't worked out
the details though, and I may very well be wrong.
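The birthday intuition can at least be quantified (a sketch only: the
standard 1 - exp(-K^2/2M) birthday bound for K uniform draws from M
permutations, which says nothing about real ciphers by itself):

```python
import math

def collision_probability(K, num_perms):
    # Birthday bound: probability that K independently, uniformly chosen
    # permutations (out of num_perms possibilities) are not all distinct.
    return 1.0 - math.exp(-(K * K) / (2.0 * num_perms))

M = math.factorial(8)                # B! for a toy 3-bit block: 40320

# Ordinary regime: K far below B!, coincidences are negligible.
print(collision_probability(16, M))          # ~0.003
# The regime where B! is about equal to K: coincidences are near-certain.
print(collision_probability(M, M))           # ~1.0
```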



-- Peter Fairbrother
From: Simon Johnson on
> People need to separate what feels good from what actually is good.  And
> there are in fact good reasons not to homebrew a multi-cipher solution
> yourself.  So all in all it's better to stick to known usage models
> and/or protocols.
>

You're obviously not smart enough to be a cryptographer.

AES is horribly weak. I personally use a 32-round Feistel with an F-
function comprising two random 32x32-bit permutations, mixed with a
PHT. The round keys are chosen entirely at random. This eliminates the
need for those pesky key schedules, which seem to be the source of so
many breaks.

In a world with ubiquitous, cheap RAM, why would I settle for these
inferior ciphers with such a clear algebraic structure?
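For what it's worth, the construction is easy to sketch at toy scale
(my own scaling, not the description above: 8-bit halves instead of
32-bit, so each random permutation table has 256 entries rather than
2^32, and the PHT pair is folded to a single byte for simplicity):

```python
import random

ROUNDS = 32
rng = random.Random(42)

# Two random 8->8-bit permutations stand in for the 32x32-bit tables.
# Round keys are independent and random - no key schedule at all.
P1 = rng.sample(range(256), 256)
P2 = rng.sample(range(256), 256)
round_keys = [rng.randrange(256) for _ in range(ROUNDS)]

def F(x, k):
    a, b = P1[x ^ k], P2[x]
    # Pseudo-Hadamard Transform (a, b) -> (a + b, a + 2b), folded down
    # to one byte for this sketch.
    return ((a + b) ^ (a + 2 * b)) & 0xFF

def encrypt(left, right):
    for k in round_keys:
        left, right = right, left ^ F(right, k)
    return left, right

def decrypt(left, right):
    for k in reversed(round_keys):
        left, right = right ^ F(left, k), left
    return left, right

assert decrypt(*encrypt(0xAB, 0xCD)) == (0xAB, 0xCD)
```

The Feistel structure makes the round function invertible regardless of
what F computes, which is why random tables are usable here at all.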

From: Tom St Denis on
On Jan 15, 10:45 am, Simon Johnson <simon.john...(a)gmail.com> wrote:
> > People need to separate what feels good from what actually is good.  And
> > there are in fact good reasons not to homebrew a multi-cipher solution
> > yourself.  So all in all it's better to stick to known usage models
> > and/or protocols.
>
> You're obviously not smart enough to be a cryptographer.
>
> AES is horribly weak. I personally use a 32-round Feistel with an F-
> function comprising two random 32x32-bit permutations, mixed with a
> PHT. The round keys are chosen entirely at random. This eliminates the
> need for those pesky key schedules, which seem to be the source of so
> many breaks.
>
> In a world with ubiquitous, cheap RAM, why would I settle for these
> inferior ciphers with such a clear algebraic structure?

Yeah but is it bijective so you can outwit the clowns at the NSA who
fool the "sheeples" into using AES?

Also for a purely random F function iirc you need 7 rounds for
complete security. So that's only 7 * 2 * 2^32 * 4 or 28GiB of
storage. Easily doable :-)

Tom
From: Tom St Denis on
On Jan 15, 10:50 am, Tom St Denis <t...(a)iahu.ca> wrote:
> On Jan 15, 10:45 am, Simon  Johnson <simon.john...(a)gmail.com> wrote:
>
> > > People need to separate what feels good from what actually is good.  And
> > > there are in fact good reasons not to homebrew a multi-cipher solution
> > > yourself.  So all in all it's better to stick to known usage models
> > > and/or protocols.
>
> > You're obviously not smart enough to be a cryptographer.
>
> > AES is horribly weak. I personally use a 32-round Feistel with an F-
> > function comprising two random 32x32-bit permutations, mixed with a
> > PHT. The round keys are chosen entirely at random. This eliminates the
> > need for those pesky key schedules, which seem to be the source of so
> > many breaks.
>
> > In a world with ubiquitous, cheap RAM, why would I settle for these
> > inferior ciphers with such a clear algebraic structure?
>
> Yeah but is it bijective so you can outwit the clowns at the NSA who
> fool the "sheeples" into using AES?
>
> Also for a purely random F function iirc you need 7 rounds for
> complete security.  So that's only 7 * 2 * 2^32 * 4 or 28GiB of
> storage.  Easily doable :-)

Pre-lunch unit fail: it's 2^32 * 4 bytes per table, so it's really 56 *
4 GiB or 224GiB.
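The corrected arithmetic checks out (a few lines of sanity check,
nothing more):

```python
# 7 rounds, two random 32->32-bit permutation tables per round,
# each table 2**32 entries of 4 bytes.
GIB = 2 ** 30
table_bytes = 2 ** 32 * 4        # one table: 16 GiB
total = 7 * 2 * table_bytes
print(total // GIB)              # 224
```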

Remember kids, no sarcastic math on an empty stomach...

Tom