From: unruh on 12 Jan 2010 18:21

On 2010-01-12, Sebastian Garth <sebastiangarth(a)gmail.com> wrote:
> On Jan 12, 11:06 am, Michael B Allen <iop...(a)gmail.com> wrote:
>> I need to encrypt some data and give the password to an escrow
>> attorney so that only under certain conditions (e.g. dirt nap) a list
>> of beneficiaries will have the ability to recover this data. But I am
>> going to make the encrypted package publicly available along with the
>> source code of the decryption program. So I need the encryption method
>> used to be particularly good.
>>
>> My first thought is to simply encrypt the data multiple times using
>> different algorithms and key sizes (e.g. AES128 -> RC4 -> AES256)
>> using different segments of a randomly generated 32 character
>> alphanumeric password. The rationale is that if / when an algorithm is
>> broken, the enclosed encrypted layer would look random and thus not
>> give the attacker any feedback as to their success. They would have to
>> successfully crack all layers simultaneously. Is this reasoning valid?
>>
>> Mike
>
> Short answer: probably...but it would likely be overkill. Running the
> data through, say, a 4096-bit RSA would be more than sufficient. If in
> doubt, though, just increase the key length.

That is about as secure as a single pass through AES 128, and it will take
you a long time to encrypt. RSA is pretty useless for the actual encryption
of a document.
From: Mok-Kong Shen on 12 Jan 2010 18:23

Michael B Allen wrote:
> I need to encrypt some data and give the password to an escrow
> attorney so that only under certain conditions (e.g. dirt nap) a list
> of beneficiaries will have the ability to recover this data. But I am
> going to make the encrypted package publicly available along with the
> source code of the decryption program. So I need the encryption method
> used to be particularly good.
>
> My first thought is to simply encrypt the data multiple times using
> different algorithms and key sizes (e.g. AES128 -> RC4 -> AES256)
> using different segments of a randomly generated 32 character
> alphanumeric password. The rationale is that if / when an algorithm is
> broken, the enclosed encrypted layer would look random and thus not
> give the attacker any feedback as to their success. They would have to
> successfully crack all layers simultaneously. Is this reasoning valid?

I think that, if one cascades two algorithms A and B that are of a
different nature, and the keys employed for the two are random and
independent, then the combined strength is evidently greater than that of
either component (assuming, of course, that neither component has
strength 0). Perhaps some expert could give a literature reference where
there is a "clear" proof of this intuitively clear fact.

M. K. Shen
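The independence of the keys in the argument above is doing real work. As a
toy illustration (not from the thread; the hash-derived keystream below is
only a stand-in for an XOR-based stream cipher such as RC4), reusing the
same key in two XOR layers makes the second layer cancel the first, whereas
an independent second key does not:

import hashlib

def toy_stream_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR data with a keystream produced by iterated hashing of the key.
    A stand-in for an XOR-based stream cipher; decryption is the same call."""
    stream = bytearray()
    block = key
    while len(stream) < len(data):
        block = hashlib.sha256(block).digest()
        stream.extend(block)
    return bytes(d ^ s for d, s in zip(data, stream))

plaintext = b"attack at dawn"

once = toy_stream_encrypt(b"master key", plaintext)
twice = toy_stream_encrypt(b"master key", once)          # same key for both layers
assert twice == plaintext                                # the layers cancel: no gain

layered = toy_stream_encrypt(b"independent key", once)   # independent second key
assert layered != plaintext                              # no cancellation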
From: unruh on 12 Jan 2010 18:25

On 2010-01-12, Michael B Allen <ioplex(a)gmail.com> wrote:
> On Jan 12, 4:24 pm, Sebastian Garth <sebastianga...(a)gmail.com> wrote:
>> On Jan 12, 11:06 am, Michael B Allen <iop...(a)gmail.com> wrote:
>>
>>> I need to encrypt some data and give the password to an escrow
>>> attorney so that only under certain conditions (e.g. dirt nap) a list
>>> of beneficiaries will have the ability to recover this data. But I am
>>> going to make the encrypted package publicly available along with the
>>> source code of the decryption program. So I need the encryption method
>>> used to be particularly good.
>>>
>>> My first thought is to simply encrypt the data multiple times using
>>> different algorithms and key sizes (e.g. AES128 -> RC4 -> AES256)
>>> using different segments of a randomly generated 32 character
>>> alphanumeric password. The rationale is that if / when an algorithm is
>>> broken, the enclosed encrypted layer would look random and thus not
>>> give the attacker any feedback as to their success. They would have to
>>> successfully crack all layers simultaneously. Is this reasoning valid?
>>>
>>> Mike
>>
>> Short answer: probably...but it would likely be overkill. Running the
>> data through, say, a 4096-bit RSA would be more than sufficient. If in
>> doubt, though, just increase the key length.
>
> Hi Sebastian,
>
> It is highly desirable that the decryption process be very simple as
> it will be exercised occasionally by legal types who may have limited
> technical savvy. In particular I want to:
>
> 1. Use an alphanumeric password that can be easily communicated
> using trivial methods such as in a document or email or perhaps even
> verbally. Using a certificate is troubling.

Those are your weak points, not the encryption technique.

> 2. The decryption program needs to be highly portable. So I need to
> use whatever crypto is available on a Microsoft Windows machine, which
> probably amounts to RC4, AES128, AES192, AES256, SHA1 and MD5 (I don't
> think XP has "BICOM" and I have no idea what that is anyway).

It is Scott's own encryption program that he has been flogging here for
years. Unfortunately there is only one person in the world who likes it.

> So combined with Maaartin's recommendation not to derive keys from the
> same master key, I'm thinking of something like the following
> encryption procedure:
>
> 1. Generate a random 64 character alphanumeric password P.
> 2. Use P[0-15] to generate a 16 byte hash H1.
> 3. Encrypt the plaintext using AES128 initialized with H1 to yield
> encrypted data E1.

The hashes add no strength whatsoever.

> 4. Use P[16-31] to generate a 16 byte hash H2.
> 5. Encrypt E1 using RC4 initialized with H2 to yield encrypted data E2.
> 6. Use P[32-63] to generate a 32 byte hash H3.
> 7. Encrypt E2 using AES256 initialized with H3 to yield encrypted data E3.
>
> So no part of the password is reused and I'm using 2 different
> algorithms with 3 different key sizes. If this is in fact overkill,
> that is fine. Writing a program to encrypt something 3 times is only
> fractionally more difficult than encrypting it once.

Why not do it 1948 times instead of just three times? After all, you do not
care if it is overkill. From a security point of view you are like the
viewer of the magician with the rabbit: staring fixedly in entirely the
wrong direction.

> My only concern would be whether the cyphertext at each step will look
> completely random. Meaning, if someone successfully decrypted the
> AES256 outer layer, could they know that it was successful, or would an
> invalid decryption product look just as random as the correct RC4
> cyphertext?
>
> Mike
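Since the thread never goes past the verbal description, here is a minimal
sketch of the three-layer procedure quoted above. It is an illustration
only, not anything from the thread: the key derivation (SHA-256 of each
password segment, truncated), the choice of CTR mode with random nonces,
the hand-rolled RC4, and the Python `cryptography` package are all
assumptions; a real design would use a proper password-based KDF and an
authenticated mode.

import os
import hashlib
import secrets
import string
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def rc4(key: bytes, data: bytes) -> bytes:
    """Plain RC4 (KSA + PRGA); encryption and decryption are the same operation."""
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for b in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(b ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def aes_ctr(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """AES in CTR mode; the same key/nonce both encrypts and decrypts."""
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return enc.update(data) + enc.finalize()

# 1. Random 64-character alphanumeric password, split into three segments.
alphabet = string.ascii_letters + string.digits
password = "".join(secrets.choice(alphabet) for _ in range(64))
h1 = hashlib.sha256(password[0:16].encode()).digest()[:16]   # 16-byte key, AES-128
h2 = hashlib.sha256(password[16:32].encode()).digest()[:16]  # 16-byte key, RC4
h3 = hashlib.sha256(password[32:64].encode()).digest()       # 32-byte key, AES-256

plaintext = b"beneficiary instructions go here"
n1, n3 = os.urandom(16), os.urandom(16)  # CTR nonces, stored with the ciphertext

e1 = aes_ctr(h1, n1, plaintext)   # layer 1: AES-128
e2 = rc4(h2, e1)                  # layer 2: RC4
e3 = aes_ctr(h3, n3, e2)          # layer 3: AES-256

# Decryption peels the layers in reverse order with the same derived keys.
recovered = aes_ctr(h1, n1, rc4(h2, aes_ctr(h3, n3, e3)))
assert recovered == plaintext

As unruh notes, hashing the password segments adds no strength in itself;
any benefit of the layering depends on the segments being long enough and
genuinely independent.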
From: biject on 12 Jan 2010 20:12

On Jan 12, 4:25 pm, unruh <un...(a)wormhole.physics.ubc.ca> wrote:
> On 2010-01-12, Michael B Allen <iop...(a)gmail.com> wrote:
> [...]
>> 2. The decryption program needs to be highly portable. So I need to
>> use whatever crypto is available on a Microsoft Windows machine, which
>> probably amounts to RC4, AES128, AES192, AES256, SHA1 and MD5 (I don't
>> think XP has "BICOM" and I have no idea what that is anyway).
>
> It is Scott's own encryption program that he has been flogging here for
> years. Unfortunately there is only one person in the world who likes it.
> [...]

I am not sure what Unruh is on, but BICOM is not my encryption program. It
was written by someone who wanted to make a bijective PPM compressor with
full Rijndael, AKA AES. I mention it because it is the only full-blown
bijective implementation of AES that I am aware of; again, it is not my
code.

My own encryption program is SCOTT19U, and it is hard to use correctly and
so twisted that it will not work on all machines. It is the kind of
nightmare code that may not compile whenever the compiler changes. You
would need to know a lot about C to use it on other machines, and you would
need help in creating a good password. It is so twisted that even a famous
PhD crypto person here thought he had an attack to defeat it. Another
expert tried the attack and failed. The first expert had to admit he really
didn't take a good look at the code. So I suspect you would have trouble
following it.

The other program I mentioned is BWTS. This is a bijective, indexless BWT
that would mix the complete document; it would be used after the first and
second encryption passes. This would increase the unicity distance of
whatever you encrypted. Even with 3 separate and different encryption
programs, the unicity distance using modern ciphers is quite short. I am
the author of this code, and it is also used in some other libraries, so it
may become a standard compared to my other code.

At one time unicity distance was of great concern. Nowadays most don't know
much about it, but it is a valid concern for those serious about safe,
secure encryption. Or you can do what the herd here does: use what they say
and hope it's safe. Good luck with that.

Whatever you use for your first pass, the later passes should not change
the length of your file.

David A. Scott

--
My Crypto code
http://bijective.dogma.net/crypto/scott19u.zip
http://www.jim.com/jamesd/Kong/scott19u.zip old version
My Compression code http://bijective.dogma.net/
**TO EMAIL ME drop the roman "five" **
Disclaimer: I am in no way responsible for any of the statements made in
the above text. For all I know I might be drugged. As a famous person once
said "any cryptographic system is only as strong as its weakest link"
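For reference, since unicity distance comes up in the post above (this is
Shannon's classical estimate, not something stated in the thread): the
unicity distance U is roughly the key entropy divided by the per-character
redundancy of the plaintext,

U \approx \frac{H(K)}{D}

where H(K) is the key entropy in bits and D is the plaintext redundancy in
bits per character (about 3.2 bits per character for English text).
Cascading ciphers with independent keys increases H(K), and compressing the
plaintext before encrypting decreases D; both raise U, i.e. more ciphertext
is needed before the correct key is uniquely determined.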
From: Peter Fairbrother on 13 Jan 2010 03:42
Michael B Allen wrote:
> Nonsense. Encrypting a file in multiple passes is hardly much more
> complex than doing one pass. My impression from reading the answers to
> my question is that, if I completely screwed up the implementation,
> multiple passes would simply be no more secure than one pass.

I don't know about screwing up the implementation, which could do horrible
things - but yes, there is established theory that multiple encryption with
statistically independent keys is at least as secure as the first
encryption; see:

U. M. Maurer and J. L. Massey, "Cascade ciphers: The importance of being
first", Journal of Cryptology, vol. 6, no. 1, pp. 55-61, 1993.
www.isiweb.ee.ethz.ch/archive/massey_pub/pdf/BI434.pdf

So if one pass is secure enough, you're okay.

(I have a tiny doubt about their result, which I've been working on for
some years now - but in practice it won't mean much even if it pans out.)

--
Peter Fairbrother
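Stated informally (the notation below is mine, not the paper's): for a
cascade of ciphers keyed with independent keys, any attack on the cascade
can be converted into an attack on the first cipher alone that succeeds at
least as well, so

\max_{\mathcal{A}} \mathrm{Adv}_{\mathcal{A}}\!\left(E^{(n)}_{k_n} \circ \cdots \circ E^{(1)}_{k_1}\right)
\;\le\;
\max_{\mathcal{A}'} \mathrm{Adv}_{\mathcal{A}'}\!\left(E^{(1)}_{k_1}\right),
\qquad k_1, \ldots, k_n \text{ independent.}

That is, the cascade is at least as hard to break as its first component,
though, as the paper's title hints, not necessarily as hard as its
strongest component.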