[Haskell-cafe] workarounds for Codec.Compression.Zlib errors
duncan.coutts at worc.ox.ac.uk
Wed Nov 26 17:28:21 EST 2008
On Wed, 2008-11-26 at 14:38 +0000, Eric Kow wrote:
> Hi everybody,
> This advisory is for people who have installed darcs 2.1.2 via the
> Cabal build method. As you may have noticed, the cabalised darcs
> sometimes fails with errors like
> Codec.Compression.Zlib: incorrect data check
> Why this happens
> Older versions of darcs could produce gzipped files with broken CRCs.
> We never noticed this because our homegrown wrapper around the C libz
> library does not pick up these errors.
I should note that one moral of this story is to check that your FFI
imports are correct. That is, check they import the foreign functions at
the right Haskell types. In this case the mistake was that the foreign
function returned a C int, but the Haskell foreign import declaration
stated that the C function returned IO () rather than IO CInt.
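As a sketch of the bug pattern (not the actual darcs wrapper code), here is what the mismatch looks like, using zlib's inflateEnd, which returns an int status code. GHC never reads the C header, so it compiles both declarations without complaint:

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
import Foreign.Ptr (Ptr)
import Foreign.C.Types (CInt(..))

-- Opaque stand-in for zlib's z_stream struct.
data ZStream

-- Wrong: the C prototype is "int inflateEnd(z_streamp)", but this
-- import claims IO (), so the error code is silently discarded.
foreign import ccall unsafe "zlib.h inflateEnd"
  c_inflateEnd_wrong :: Ptr ZStream -> IO ()

-- Right: import at the type the C function actually has, and let
-- callers check the result (zlib's Z_OK is 0).
foreign import ccall unsafe "zlib.h inflateEnd"
  c_inflateEnd :: Ptr ZStream -> IO CInt
```

Nothing in GHC's type checker can tell these two apart, because the foreign import declaration is the only source of type information it has.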
This is where a tool really helps. hsc2hs cannot check cross-language type consistency, but c2hs can: it reads the C header files and generates the FFI imports at the correct Haskell types.
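For illustration, a c2hs binding for the same function might look roughly like this in a .chs file (a sketch only; the marshalling details and names here are assumptions, not the darcs binding):

```haskell
-- Zlib.chs: c2hs parses zlib.h, so the generated foreign import
-- is derived from the real C prototype rather than hand-written.
#include <zlib.h>

-- If the C return type were mistranscribed (e.g. dropped to ()),
-- c2hs would complain when processing the hook against the header.
{#fun unsafe inflateEnd as c_inflateEnd
  { castPtr `Ptr ZStream' } -> `Int' #}
```

The point is that the header, not the programmer, is the source of truth for the foreign type, which is exactly the check that was missing in the hand-written import above.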
The downsides are that c2hs is not shipped with GHC, it is a bit slower, and it is not quite as good with structures.
I think there is a need for a tool like c2hs but that works in a
checking mode rather than in a generating mode. It would use much of the
same code as c2hs but it would read the C header files and the .hs file
(via ghc api) and check that the FFI imports are using the right types.
That way it could be run to check a package without the checker tool
being needed at build time on every platform. The downside is that some C header files differ between platforms; c2hs handles this fine, whereas a checker tool might pass a package on one platform in a way that does not carry over to another. Still, it would be an improvement on raw FFI imports (or hsc2hs, which is really the same thing).