From: David Mark on 23 May 2010 14:34

nick wrote:
> On May 23, 7:43 am, Sean Kinsey <okin...(a)gmail.com> wrote:
>
>> I'm sorry to say that your attempt to 'compress' code has failed. Did
>> you ever take into consideration that gzip (used to serve compressed
>> files) also uses LZW (and in a more efficient way than you are)?
>
> Yeah, I thought about that but I figured the point of javascript
> compressors was that they would be used in environments where gzip
> compression on the server is not an option (many shared hosts, which
> many people seem content to use, for some reason don't use gzip).

Mine doesn't; still I wouldn't use something like this. The largest
beneficiaries of GZIP are dial-up users, and modems have built-in
compression. ;)

>> A quick test I did with an input file of 56.3KB:
>> Direct compression using 7-Zip into a .gz archive = 12KB
>> Compression using pressjs and then compressed into a .gz archive:
>> 20.9KB
>>
>> And the same using a minified version of the same script:
>> Direct compression using 7-Zip into a .gz archive = 4.51KB
>> Compression using pressjs and then compressed into a .gz archive:
>> 7.68KB
>
> I wonder if encoding to base64 would yield better compression ratios
> afterwards?

I seriously doubt it.

> Maybe still not as good as using gzip on the uncompressed
> file though.

Almost certainly not.

> I just did a similar test with Dean Edwards' "packer" with the "Base62
> encode" and "Shrink variables" options on and it manages to get a
> similar gzip-compressed size to the gzip-compressed size of the
> original... If I can achieve a similar gzip-compressed size after
> pressing, I think this should be at least as useful as packer (not
> sure what this group's opinion of packer is, though).

Packer is a complete waste of time.

>> Not to mention the added overhead of having to decompress the file
>> after the UA has downloaded the file.
>
> True, although the size overhead is only about 1200 bytes (and
> shrinking), and the processing overhead is negligible.

Define negligible.

>> The only scenario where this method would be beneficial is where gzip
>> is not used on the server, bad caching directives are used causing the
>> file to be downloaded in full each time, and the extra time used
>> downloading is higher than the extra time needed to decompress.
>> Hopefully that isn't a too-common scenario.
>
> It's more common than you might think (shared hosting).

Shared hosting doesn't automatically fit that bill. Mine doesn't have
GZIP, but I don't use bad "caching directives". And again, modem-based
compression makes all of these "packers" a waste of time.
From: nick on 23 May 2010 15:00

On May 23, 2:34 pm, David Mark <dmark.cins...(a)gmail.com> wrote:
> nick wrote:
>> On May 23, 7:43 am, Sean Kinsey <okin...(a)gmail.com> wrote:
>>> I'm sorry to say that your attempt to 'compress' code has failed. Did
>>> you ever take into consideration that gzip (used to serve compressed
>>> files) also uses LZW (and in a more efficient way than you are)?
>> Yeah, I thought about that but I figured the point of javascript
>> compressors was that they would be used in environments where gzip
>> compression on the server is not an option (many shared hosts, which
>> many people seem content to use, for some reason don't use gzip).
> Mine doesn't; still I wouldn't use something like this. The largest
> beneficiaries of GZIP are dial-up users, and modems have built-in
> compression. ;)

I had never heard of it before, but I found a good article on it here:

http://ixbtlabs.com/articles/compressv44vsv42bis/

It looks like text compresses particularly well in their tests. Is
this kind of thing usually enabled by default, or do modem users have
to jump through a bunch of hoops to set it up? I also found a lot of
instructions for enabling modem compression.

>> I wonder if encoding to base64 would yield better compression ratios
>> afterwards?
> I seriously doubt it.

Only one way to find out. ;)

>> Maybe still not as good as using gzip on the uncompressed
>> file though.
> Almost certainly not.

Might be close. Packer-packed scripts can be slightly smaller than
their non-packed equivalents when gzipped.

>> I just did a similar test with Dean Edwards' "packer" with the "Base62
>> encode" and "Shrink variables" options on and it manages to get a
>> similar gzip-compressed size to the gzip-compressed size of the
>> original... If I can achieve a similar gzip-compressed size after
>> pressing, I think this should be at least as useful as packer (not
>> sure what this group's opinion of packer is, though).
> Packer is a complete waste of time.

Heh, thought you might say that. Packer has failed to work properly
with at least one script I've written... not sure whose fault that
was, but I've never felt comfortable using it.

>> [...] the processing overhead is negligible.
> Define negligible.

I don't notice any time going by at all, and I'm using an old laptop
from 2003 with one gig of RAM downclocked to 1.07 GHz so it doesn't
catch on fire. I guess that's not a very scientific test though.
From: nick on 23 May 2010 15:02

GG did something weird with the quotes, sorry about that.
From: David Mark on 23 May 2010 15:07

nick wrote:
> On May 23, 2:34 pm, David Mark <dmark.cins...(a)gmail.com> wrote:
>> nick wrote:
>>> On May 23, 7:43 am, Sean Kinsey <okin...(a)gmail.com> wrote:
>>>> I'm sorry to say that your attempt to 'compress' code has failed. Did
>>>> you ever take into consideration that gzip (used to serve compressed
>>>> files) also uses LZW (and in a more efficient way than you are)?
>>> Yeah, I thought about that but I figured the point of javascript
>>> compressors was that they would be used in environments where gzip
>>> compression on the server is not an option (many shared hosts, which
>>> many people seem content to use, for some reason don't use gzip).
>> Mine doesn't; still I wouldn't use something like this. The largest
>> beneficiaries of GZIP are dial-up users, and modems have built-in
>> compression. ;)
>
> I had never heard of it before, but I found a good article on it here:
>
> http://ixbtlabs.com/articles/compressv44vsv42bis/
>
> It looks like text compresses particularly well in their tests.

Yes, extremely well.

> Is this kind of thing usually enabled by default, or do modem users
> have to jump through a bunch of hoops to set it up? I also found a lot
> of instructions for enabling modem compression.

It typically works right out of the box. Has for decades. Skip the
articles about modem init strings. They haven't been a concern for the
average user in decades.

>>> I wonder if encoding to base64 would yield better compression ratios
>>> afterwards?
>> I seriously doubt it.
>
> Only one way to find out. ;)
>
>>> Maybe still not as good as using gzip on the uncompressed
>>> file though.
>> Almost certainly not.
>
> Might be close. Packer-packed scripts can be slightly smaller than
> their non-packed equivalents when gzipped.

But then you have to download Packer and wait for it to decompress the
content. It's a waste of time.

>>> I just did a similar test with Dean Edwards' "packer" with the "Base62
>>> encode" and "Shrink variables" options on and it manages to get a
>>> similar gzip-compressed size to the gzip-compressed size of the
>>> original... If I can achieve a similar gzip-compressed size after
>>> pressing, I think this should be at least as useful as packer (not
>>> sure what this group's opinion of packer is, though).
>> Packer is a complete waste of time.
>
> Heh, thought you might say that. Packer has failed to work properly
> with at least one script I've written... not sure whose fault that
> was, but I've never felt comfortable using it.

Even if it worked flawlessly, it would still be a waste of time. The
fact that it introduces an additional point of failure is just a
"bonus". ;)

>>> [...] the processing overhead is negligible.
>> Define negligible.
>
> I don't notice any time going by at all, and I'm using an old laptop
> from 2003 with one gig of RAM downclocked to 1.07 GHz so it doesn't
> catch on fire. I guess that's not a very scientific test though.

No.
From: David Mark on 23 May 2010 15:10
nick wrote:
> GG did something weird with the quotes, sorry about that.

That's alright. Thunderbird did something very weird with the post as
well. It threw an error (too many IP connections to the server or some
such BS), put it in the Sent folder, but didn't send it. Just so
happens I noticed and pasted it into GG. That's probably what caused
whatever weirdness you are referring to.