From: Maaartin on 5 Oct 2009 09:00

On Oct 5, 10:58 am, Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
> Maaartin wrote:
> > Sure... the number of ones is fine after the transposition, the only
> > problem could be the bijectivity.
> > It passed a small test I wrote. It found no solution for even n. So
> > I'm happy with it for now.
> I don't quite understand your problem.

Maybe just because I don't have a problem.

> The bijectivity is in this
> case the same as the non-singularity of the matrix, which however
> is particularly easy to check with gaussian elimination, since the
> matrix is boolean.

Exactly what I did. But I'm not absolutely sure I made no error there.

> Doing the computation of the avalanche measure for the function
> like the one I found -- I mean when the function is given -- is
> actually very straightforward and fast. Thus I don't see why this
> apparently useful information about any such functions used in
> crypto should be ignored.

That's simple: there are many known attacks against existing crypto
algorithms. The two simplest (and maybe most important) are the
differential and the linear attack. The higher DPmax (LPmax) is, the
better the differential (linear) attack works (not exactly, but that
doesn't matter here).

This means that DPmax and LPmax are very important. AFAIK, there's no
attack related to avalanche, so avalanche is not important. OTOH, I
think that optimizing DPmax and LPmax leads to good avalanche, too.
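[The non-singularity check discussed above can be sketched in a few lines. This is a minimal illustration, not code from the thread; the function name `is_nonsingular_gf2` and the bitmask row encoding are my own choices.]

```python
def is_nonsingular_gf2(rows, n):
    """Gaussian elimination over GF(2) on an n x n boolean matrix.

    Each row is an integer bitmask; bit i of rows[r] is entry (r, i).
    The linear map x -> M*x over GF(2) is bijective iff M is
    non-singular, i.e. iff elimination finds a pivot in every column.
    """
    rows = list(rows)  # work on a copy
    for col in range(n):
        bit = 1 << col
        # find a pivot row with a 1 in this column
        pivot = next((r for r in range(col, n) if rows[r] & bit), None)
        if pivot is None:
            return False  # no pivot: the matrix is singular
        rows[col], rows[pivot] = rows[pivot], rows[col]
        # clear the column in all other rows; XOR is addition mod 2
        for r in range(n):
            if r != col and rows[r] & bit:
                rows[r] ^= rows[col]
    return True

print(is_nonsingular_gf2([0b001, 0b010, 0b100], 3))  # True (identity)
print(is_nonsingular_gf2([0b011, 0b011, 0b100], 3))  # False (repeated row)
```

Because every entry is 0 or 1, no division is needed: row reduction is just XOR of bitmasks, which is why the boolean case is so easy to check.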
From: Mok-Kong Shen on 5 Oct 2009 10:05
Maaartin wrote:
> Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
[snip]
>> The bijectivity is in this
>> case the same as the non-singularity of the matrix, which however
>> is particularly easy to check with gaussian elimination, since the
>> matrix is boolean.
> Exactly what I did. But I'm not absolutely sure I made no error there.

This is a basic problem of using computers to 'prove' something: how
can one be sure of the correctness of the programming? Despite advances
in CS, almost no software in practice can be verified in the strict
sense of the word. Among the best achievements in that direction is the
recent news that some 7000 lines of code were verified with a
verification system plus roughly six man-years of effort.

>> Doing the computation of the avalanche measure for the function
>> like the one I found -- I mean when the function is given -- is
>> actually very straightforward and fast. Thus I don't see why this
>> apparently useful information about any such functions used in
>> crypto should be ignored.
>
> That's simple: There're many known attacks against existing crypto
> algorithms. The two simplest (and maybe most important) are
> differential and linear attack. The higher DPMax (LPMax) the better
> the differential (linear) attack work (not exactly, but that doesn't
> matter).
>
> This means, that DPMax and LPMax are very important. AFAIK, there's no
> attack related to avalanche. So avalanche is not important. OTOH, I
> think that optimizing DPMax and LPMax leads to good avalanche, too.

I mean that how good the avalanche is constitutes an issue that should
not be ignored. Otherwise, why was considerable effort spent in that
direction during the design of DES? If a cipher is very poor in
avalanche, then it can plausibly be considered weak. That, conversely,
very good avalanche does not by itself imply strong encryption is
equally evident.
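[The DPmax quantity Maaartin refers to can be made concrete with a short sketch. This is an illustration of the standard definition, not code from the thread; the name `dp_max` and the example S-boxes are my own.]

```python
def dp_max(sbox, n):
    """Largest differential probability of an n-bit S-box.

    DPmax = max over input differences a != 0 and all output
    differences b of #{x : S(x) ^ S(x ^ a) = b} / 2^n.  The smaller
    DPmax is, the weaker the best differential characteristic.
    """
    size = 1 << n
    best = 0
    for a in range(1, size):
        count = [0] * size  # distribution of output differences for this a
        for x in range(size):
            count[sbox[x] ^ sbox[x ^ a]] += 1
        best = max(best, max(count))
    return best / size

# For any affine map the output difference is determined by the input
# difference alone, so DPmax = 1 -- the worst possible value.
affine = [x ^ 0b101 for x in range(8)]
print(dp_max(affine, 3))  # 1.0
```

This is the quantity a differential attack exploits: a high DPmax means some input difference propagates to some output difference with high probability.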
If computing an avalanche measure were extremely costly, I could imagine cases where one would like to economize. But, as I said, it isn't. Thanks, M. K. Shen
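[As an illustration of how cheap the avalanche computation is for a given small function: a minimal sketch, not code from the thread; the name `avalanche` and the exhaustive-enumeration approach are my own choices.]

```python
def avalanche(f, n):
    """Mean fraction of output bits flipped per single input-bit flip.

    f maps n-bit integers to n-bit integers; the ideal value is 0.5.
    Exhaustive over all 2^n inputs, so practical only for small n.
    """
    total = 0
    for x in range(1 << n):
        fx = f(x)
        for i in range(n):
            # Hamming weight of the output difference for one flipped bit
            total += bin(fx ^ f(x ^ (1 << i))).count("1")
    return total / (n * n * (1 << n))

# The identity map flips exactly one output bit per input-bit flip,
# so its avalanche is only 1/n -- far from the ideal 0.5.
print(avalanche(lambda x: x, 4))  # 0.25
```

The whole measurement is n * 2^n evaluations of f, which supports the point above: for any function small enough to tabulate, there is no cost argument for skipping it.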