[web-devel] http-enumerator : Issue with chunked encoding and rawBody
Erik de Castro Lopo
mle+hs at mega-nerd.com
Mon Oct 17 05:24:25 CEST 2011
Hi all,
I've been working on a simple HTTP proxy written using Warp, Wai and
http-enumerator. This has led to a number of patches being pushed
into the last of those three packages.
This proxy is supposed to be a pure pass-through, for which I added
the rawBody field to the http-enumerator Response data type. However,
I have now figured out that while this works for gzipped data, it
doesn't work for data with Transfer-Encoding set to chunked.
The symptom of 'not working' is that any client which tries to
pull data from a server that serves up chunked data just hangs and
eventually times out. This happens both with a client written using
http-enumerator and with wget.
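For reference (this is just my own toy example, not output from the
proxy), a chunked body looks like this on the wire, and the receiver
only knows the body has ended when it sees the zero-size chunk:

    HTTP/1.1 200 OK
    Transfer-Encoding: chunked

    5
    Hello
    6
     World
    0

(every line after the headers is CRLF-terminated, the chunk sizes are
in hex, and the final zero-size chunk is followed by one more CRLF).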
Digging around a bit in the http-enumerator sources I find this:
    let body' x =
            if not rawBody && ("transfer-encoding", "chunked") `elem` hs'
                then joinI $ chunkedEnumeratee $$ x
                else case mcl >>= readMay . S8.unpack of
                    Just len -> joinI $ takeLBS len $$ x
                    Nothing -> x
The problem seems to be with the "Nothing -> x" case: when rawBody is
set and the response carries no Content-Length, that branch hands the
raw socket data straight to the iteratee, so nothing ever parses the
terminating zero-size chunk and (I think) http-enumerator never learns
that the body has finished. As an experiment I replaced that last line
with:
    Nothing -> joinI $ chunkedEnumeratee $$ x
which solves the hang/timeout, but means the data received by the
client is no longer chunked even though the header says it should
be. I therefore think I need something like:
    Nothing -> joinI $ chunkedPassthruEnumeratee $$ x
which requires a new Enumeratee that reads chunked data and passes
it on still chunked. Basically something with a signature like this:
    chunkedPassthruEnumeratee :: MonadIO m =>
        Enumeratee S.ByteString S.ByteString m a
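To make that concrete, here is a rough sketch of just the framing
logic such an enumeratee would need. It is written over plain lazy
ByteStrings rather than as a real Enumeratee, it ignores trailer
headers, and passThruChunked is simply a name I made up:

    import qualified Data.ByteString.Lazy.Char8 as L
    import Numeric (readHex)

    -- Split a chunked body into (bytes to pass through verbatim,
    -- whatever follows the terminating chunk).
    -- Nothing on a malformed size line.
    passThruChunked :: L.ByteString -> Maybe (L.ByteString, L.ByteString)
    passThruChunked = go L.empty
      where
        go acc input = do
            let (sizeLine, _) = L.break (== '\r') input
            size <- case readHex (L.unpack sizeLine) of
                        [(n, _)] -> Just n
                        _        -> Nothing
            -- one frame = size line, CRLF, chunk data, trailing CRLF
            let frameLen = L.length sizeLine + 2 + fromIntegral size + 2
                (frame, rest) = L.splitAt frameLen input
                acc' = acc `L.append` frame
            if size == (0 :: Int)
                then Just (acc', rest)  -- terminating chunk seen, stop
                else go acc' rest

The real version would of course have to be an Enumeratee so it can
work incrementally on whatever the network delivers, but the framing
above is all it needs to track: pass every chunk through untouched and
stop as soon as the zero-size chunk has been forwarded.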
Does this make sense or am I missing a much easier way to do this?
Cheers,
Erik
--
----------------------------------------------------------------------
Erik de Castro Lopo
http://www.mega-nerd.com/