tar-package was: Re: modules of cabal-install
duncan.coutts at worc.ox.ac.uk
Mon Feb 23 06:34:44 EST 2009
On Mon, 2009-02-23 at 12:03 +0100, Christian Maeder wrote:
> Antoine Latter wrote:
> > On Wed, Feb 18, 2009 at 5:20 AM, Christian Maeder
> > <Christian.Maeder at dfki.de> wrote:
> >> Hi,
> >> I would like to reuse the module Distribution.Client.Tar
> >> from the cabal-install package.
> > There's a 'tar' package on Hackage:
> > http://hackage.haskell.org/cgi-bin/hackage-scripts/package/tar
> Thanks for pointing this out. This package depends (in contrast to
> cabal-install) on the additional packages binary and unix-compat, which
> isn't a real problem but seems unnecessary.
Yes, I agree, I've already removed them.
> The tar package does also not support compression (via zlib), but maybe
> this could/should be a separate package.
Yes, it's easy to compose. The docs in the new code give examples. I
don't think it's right to have the package depend on zlib and bzlib and
lzma and ... etc. when it is trivial to just compose (de)compression:
Tar.unpack dir . Tar.read . GZip.decompress =<< BS.readFile tar
BS.writeFile tar . GZip.compress . Tar.write =<< Tar.pack base dir
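Spelled out as a small standalone program, those two one-liners might look
like this (a sketch assuming the tar and zlib packages; note that in the
current API Tar.pack takes a list of paths, and the command names here are
made up for illustration):

```haskell
import qualified Codec.Archive.Tar as Tar
import qualified Codec.Compression.GZip as GZip
import qualified Data.ByteString.Lazy as BS
import System.Environment (getArgs)

main :: IO ()
main = do
  args <- getArgs
  case args of
    -- decompress, parse and unpack, all composed lazily
    ["extract", tarball, dir] ->
      Tar.unpack dir . Tar.read . GZip.decompress =<< BS.readFile tarball
    -- pack the directory, serialise the entries and compress
    ["create", tarball, base, dir] ->
      BS.writeFile tarball . GZip.compress . Tar.write =<< Tar.pack base [dir]
    _ ->
      putStrLn "usage: prog extract TARBALL DIR | create TARBALL BASE DIR"
```

Because everything is composed over lazy ByteStrings, the archive is
streamed rather than held in memory.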
> (There's also an older package "htar" with "compression".)
Oh, that's really just a test program for the tar package. The current
one uses the zlib and bzlib packages for compression.
> The sources in cabal-install seem the most up to date (because of
> cabal-install-0.6.2), and it would make sense to take these sources and
> replace those in the tar package.
Yes, that's what I was doing over the weekend.
> Does the tar-package have any advantage (i.e. speed or portability) over
It does now! :-) It's now better structured, has better error checking
and is better documented.
> The module structure Codec.Archive.Tar looks a bit nicer, but
> re-exporting the internal data structures seems unnecessary to me.
> Where are the actual repositories with the most recent sources?
darcs get http://code.haskell.org/tar/
Let me know what you think about the API and documentation. You mention
above the re-exporting of internal data structures. As far as I can see,
everything that is exported in the current code is needed. Let me know
if you think it is too much or too little.
Currently I get round-trip byte-for-byte compatibility with about 50% of
the .tar.gz packages on my system (I'm on Gentoo, so there are lots of
those). The ones that are not byte-for-byte equal after reading/writing
are still readable by other tools (and are probably normalised and closer
to standards-compliant), but this needs investigating in more detail.
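A round-trip check of that kind could be sketched roughly as follows
(a hypothetical roundTrips helper, assuming the Codec.Archive.Tar API in
the repository above):

```haskell
import qualified Codec.Archive.Tar as Tar
import qualified Data.ByteString.Lazy as BS

-- Does reading an archive and re-serialising its entries reproduce the
-- original bytes exactly? A parse failure counts as a failed round trip.
roundTrips :: BS.ByteString -> Bool
roundTrips bs =
  case Tar.foldEntries (\e acc -> fmap (e :) acc) (Right []) Left (Tar.read bs) of
    Left _        -> False
    Right entries -> Tar.write entries == bs
```

Archives produced by Tar.write itself should always round-trip, since
write uses the same padding and end-of-archive blocks that read expects;
the interesting cases are archives produced by other tools.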
The checking API is incomplete (security, tarbombs, portability) and
there are no tests for the lazy streaming properties yet (i.e. that we
can process arbitrarily large archives in constant space).
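One way such a streaming test might eventually look (a sketch with a
hypothetical countEntries helper, assuming the Codec.Archive.Tar API):
because read, write and foldEntries are all lazy, counting the entries of
a large generated archive should run in space independent of its size.

```haskell
import qualified Codec.Archive.Tar as Tar
import qualified Data.ByteString.Lazy.Char8 as BS

-- Count the entries in a tar archive, consuming it lazily.
-- A parse error yields -1.
countEntries :: BS.ByteString -> Int
countEntries = Tar.foldEntries (\_ n -> 1 + n) 0 (const (-1)) . Tar.read

main :: IO ()
main = do
  let Right path = Tar.toTarPath False "file.txt"
      entry      = Tar.fileEntry path (BS.pack "hello")
      -- a multi-megabyte archive, generated and consumed as a stream
      archive    = Tar.write (replicate 10000 entry)
  print (countEntries archive)
```

A real test would additionally bound the heap (e.g. run with a small
+RTS -M limit) to confirm the constant-space claim rather than just the
count.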