[web-devel] http-enumerator : Any way to get the headers first?
Michael Snoyman
michael at snoyman.com
Mon May 16 20:35:57 CEST 2011
On Mon, May 16, 2011 at 11:23 AM, Erik de Castro Lopo
<mle+hs at mega-nerd.com> wrote:
> Hi all,
>
> I'm continuing work on the HTTP proxy I'm writing. The absolute
> bare basics are working with Warp, Wai and http-enumerator as long
> as the upstream web server doesn't send gzipped or chunked data. For
> these latter two cases, http-enumerator helpfully gunzips/unchunks
> the data. That, however, causes a problem.
>
> If my proxy simply takes the HTTP headers and data it gets from
> http-enumerator and passes them on to the client, the client barfs
> because the headers claim the data is gzipped or chunked when the
> data actually isn't.
>
> There are a number of possible solutions to this:
>
> a) Strip the Content-Encoding/Transfer-Encoding header and add a
> Content-Length header instead. I think this is probably
> possible with the API as it is, but I haven't figured out
> how yet.
>
> b) Rechunk or re-gzip the data. This seems rather wasteful
> of CPU resources.
>
> c) Modify the Network.HTTP.Enumerator.http function so that
> de-chunking/gunzipping is optional.
>
> d) Expose the iterHeaders function that is internal to the
> http-enumerator package so that client code can grab the
> headers before deciding how to handle the body.
>
> Are there any other options I haven't thought of yet?
>
> Of the options I have, I actually think d) makes the most sense.
> Would a patch exposing iterHeaders be accepted?
>
> Cheers,
> Erik
>
>
Short answer is that I'm fine exposing iterHeaders, as long as we put a big
fat "advanced users only" comment on it. I agree with you that (b) seems
like a bad idea. (a) is definitely possible, if a bit tricky, but it defeats
the whole purpose of chunking and gzipping, so it probably shouldn't be
considered.
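For reference, the header surgery (a) would involve is roughly the
following. This is only a rough sketch: it assumes plain (name, value)
ByteString pairs, while the actual WAI/http-enumerator header types use
case-insensitive names, so the comparison would need adjusting.

    {-# LANGUAGE OverloadedStrings #-}
    import qualified Data.ByteString.Char8 as S8
    import Data.Char (toLower)

    -- Drop the headers that no longer describe the decoded body we are
    -- about to forward; the server (Warp) can then add its own
    -- Content-Length or chunked framing for the response it sends.
    stripEncodingHeaders :: [(S8.ByteString, S8.ByteString)]
                         -> [(S8.ByteString, S8.ByteString)]
    stripEncodingHeaders = filter keep
      where
        keep (name, _) = S8.map toLower name `notElem` dropped
        dropped = ["content-encoding", "transfer-encoding", "content-length"]
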
I would guess that (c) is really the best option, though I suspect you
shied away from it because it involves more substantial changes to
http-enumerator. Maybe we should consider adding an extra httpAdvanced
function that takes additional settings, such as whether or not to
automatically de-chunk/de-gzip. I wouldn't be surprised if we come up with
more such cases in the future.
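Off the top of my head, I'm picturing something along these lines. This
is purely hypothetical; none of these names exist in http-enumerator
today:

    -- A hypothetical settings record for an httpAdvanced-style function.
    data HttpSettings = HttpSettings
        { settingsDecompress :: Bool  -- gunzip a gzip-encoded body?
        , settingsDechunk    :: Bool  -- strip chunked transfer framing?
        }

    -- Defaults matching what 'http' does now.
    defaultHttpSettings :: HttpSettings
    defaultHttpSettings = HttpSettings
        { settingsDecompress = True
        , settingsDechunk    = True
        }

A record like that would also leave room for the other knobs we will
probably want eventually.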
Michael