Bitbreaker
Registered: Oct 2002 Posts: 508
Release id #117852 : Doynax LZ
Has anyone else discovered misbehaviour beyond what I described in the goofs? If so, I'd then add a bunch of features and release it in a fixed and improved version.
chatGPZ
Registered: Dec 2001 Posts: 11386
\o/
tlr
Registered: Sep 2003 Posts: 1790
I noticed it broke on some of my test data. Thumbs up for debugging it!
HCL
Registered: Feb 2003 Posts: 728
Why not let Doynax himself release a new version of his cruncher? He was around here just a few days ago, and probably still is..
Bitbreaker
Registered: Oct 2002 Posts: 508
The source is there and it's open, so why not develop it further, all the more since the bug renders it unusable on certain files? Doynax has not responded to Axis's mail so far, so I was just fixing it on my own in the meantime. It's not as if we coders couldn't help ourselves :-)
HCL
Registered: Feb 2003 Posts: 728
Well, of course you can.. and if it's just a bugfix you want to release, then sure, go ahead.. but further improvements.. (?) Best would of course be if Doynax could give his opinion himself..
When Doynax first started to develop his cruncher, he based it on the source code of ByteBoozer. That was also free/open, but I appreciated that he asked me about it first..
Krill
Registered: Apr 2002 Posts: 2980
Didn't spot any serious bugs so far, but I noticed that you can only decrunch to an address with the same lo-byte as the original file's loading address, meaning the destination address is not arbitrary. This may be a bug, but it may also be the result of an optimization.
Bitbreaker
Registered: Oct 2002 Posts: 508
So far it works okay when the depack address % 256 == 0. It should also handle other address low bytes, but the compressed files will then differ, because the cruncher makes sure that a type bit is read again at every (output) page crossing.
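Roughly, the per-page decrunch step behaves like this minimal C sketch (not the real Doynax LZ bitstream; get_bit()/get_byte() and the run encoding are made-up stand-ins, just to illustrate why no run may cross an output page boundary):

    #include <stdint.h>

    extern int     get_bit(void);   /* next control/type bit          */
    extern uint8_t get_byte(void);  /* next byte from the packed data */

    /* Decrunch exactly one 256-byte page of output per call. */
    void decrunch_page(uint8_t *dst)
    {
        uint8_t *end = dst + 256;

        while (dst < end) {
            if (get_bit()) {                /* assumed: 1 = literal run */
                unsigned len = get_byte();
                while (len--)
                    *dst++ = get_byte();
            } else {                        /* assumed: 0 = match       */
                unsigned len = get_byte();
                unsigned off = get_byte();  /* short offsets only here  */
                while (len--) {
                    *dst = dst[-(int)off];
                    dst++;
                }
            }
        }
        /* The cruncher guarantees that no run crosses 'end', so the loop
         * can stop here and the next call starts cleanly with a fresh
         * type bit.  Decrunching to an address with a different low byte
         * shifts where these page boundaries fall in the output, which
         * breaks that guarantee. */
    }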
Krill
Registered: Apr 2002 Posts: 2980
So you confirm that it is not a bug but the result of an optimization.
Since no match or literal run may cross a page boundary in the output buffer (if I understood you correctly), the same data will compress differently depending on its offset relative to the page boundaries.
With all 256 possibilities for a given data file, the maximum difference in pack ratio should not be that big, should it? Just wondering.. :)
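To get a rough feel for that spread, one could take a fixed parse and count, for each of the 256 possible destination low bytes, how many runs would have to be split at a page boundary; each split costs roughly one extra run header. A self-contained C sketch with invented run lengths (nothing here comes from the actual cruncher):

    #include <stdio.h>
    #include <stddef.h>

    int main(void)
    {
        /* hypothetical run lengths from some parse of a data file */
        static const unsigned runs[] = { 17, 3, 60, 5, 120, 33, 8, 200, 12 };
        const size_t nruns = sizeof runs / sizeof runs[0];

        for (unsigned lo = 0; lo < 256; lo++) {        /* every possible low byte */
            unsigned pos = lo, splits = 0;
            for (size_t i = 0; i < nruns; i++) {
                unsigned end = pos + runs[i];
                /* page boundaries falling inside the run force a split */
                splits += (end - 1) / 256 - pos / 256;
                pos = end;
            }
            printf("lo=%3u: %u extra run headers\n", lo, splits);
        }
        return 0;
    }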
Bitbreaker
Registered: Oct 2002 Posts: 508
Well, this page wrapping thingy is the result of the "feature" of always being able to render one new page of output per call of lz_decrunch. As there is just a single entry point, a type bit has to be fetched there. If you change the depacker to depack everything in a single call, the problem can be solved easily. It can even be solved while keeping the feature (though making the depacker bigger) by remembering whether we exited from a literal run or a match.
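That second option could look roughly like this in C (an illustrative sketch, not the actual implementation; the state struct and reader functions are assumed): the depacker keeps the one-page-per-call entry point but carries the current run across calls, so runs may cross page boundaries.

    #include <stdint.h>

    extern int     get_bit(void);
    extern uint8_t get_byte(void);

    struct lz_state {
        uint8_t *dst;   /* current output position        */
        unsigned len;   /* bytes left in the current run  */
        unsigned off;   /* match offset, 0 = literal run  */
    };

    void decrunch_page(struct lz_state *s)
    {
        uint8_t *end = s->dst + 256;

        while (s->dst < end) {
            if (s->len == 0) {              /* start a new run         */
                if (get_bit()) {            /* assumed: 1 = literal    */
                    s->len = get_byte();
                    s->off = 0;
                } else {                    /* assumed: 0 = match      */
                    s->len = get_byte();
                    s->off = get_byte();
                }
            }
            /* emit bytes until the run or the page ends, whichever comes
             * first; leftover state is simply picked up by the next call */
            while (s->len && s->dst < end) {
                *s->dst = s->off ? s->dst[-(int)s->off] : get_byte();
                s->dst++;
                s->len--;
            }
        }
    }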
HCL
Registered: Feb 2003 Posts: 728
Personally I would not see this as a feature :P. Probably that's because of my lack of intelligence.