Tue, 29 Apr 2003 17:38:01 +0100
> The attached short program (compile with "ghc VServer.hs -o v -package net")
> is supposed to set up a server on port 15151, wait for a connection, read
> the first character from the connection, and print it out.
> If I test it by running it, starting up "telnet [machine] 15151"
> somewhere else, and then typing some random text, e.g. "foo[RETURN]",
> it does not work. It looks as if the problem is that VServer.hs issues
>     hSetBuffering handle (BlockBuffering (Just 4096))
> on the connection, because when I change it to
>     hSetBuffering handle NoBuffering
> the program works.
> However, this is not what I want! Setting NoBuffering on the
> handle means that when the server *outputs* something, it will
> potentially be done very expensively, character by character. How do I
> get block buffering on the server's output without having input to the
> server held up?
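For reference, the program described above presumably looks something like the following sketch. This is a hypothetical reconstruction, not the actual attached VServer.hs; it uses the old Network API that "-package net" suggests, and all names beyond those quoted above are assumptions.

```haskell
module Main where

import Network (listenOn, accept, withSocketsDo, PortID(..))
import System.IO

main :: IO ()
main = withSocketsDo $ do
  sock <- listenOn (PortNumber 15151)
  (handle, _host, _port) <- accept sock
  -- This is the buffering call quoted in the report: with block
  -- buffering set, hGetChar appears to stall until the buffer fills;
  -- changing this to NoBuffering makes the read succeed, at the cost
  -- of character-at-a-time output.
  hSetBuffering handle (BlockBuffering (Just 4096))
  c <- hGetChar handle
  putStrLn ("First character: " ++ [c])
  hClose handle
```

Since this opens a listening socket and blocks on a remote connection, it can only be exercised interactively (e.g. via telnet as described).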
Hmm. I rather think that hGetChar should always return a character
immediately if there is one available, regardless of the buffering mode.
Looking at the source, it appears that hGetLine behaves like this, as
does lazy reading with hGetContents. I can't see any reason for waiting
for the buffer to be completely full before returning anything.
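Given that observation, one workaround available today (without patching hGetChar) is to read line-wise: hGetLine returns as soon as a full line arrives, even under block buffering, so output stays cheaply buffered while input is not held up. A minimal sketch, with all names my own assumptions:

```haskell
import System.IO

-- Read the first character of the next non-empty line from the
-- handle, keeping block buffering in force for cheap output.
firstCharOf :: Handle -> IO Char
firstCharOf handle = do
  hSetBuffering handle (BlockBuffering (Just 4096))
  line <- hGetLine handle      -- returns on newline, not on a full buffer
  case line of
    (c:_) -> return c
    []    -> firstCharOf handle  -- blank line: keep waiting for input
```

This only helps if the protocol is line-oriented, of course; a true fix for single-character reads needs the hGetChar change below.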
If you have a source tree handy, try the enclosed patch. If not, make a
copy of hGetChar from the sources in libraries/base/GHC/IO.hs, apply the
patch, and compile it separately (you'll need to import GHC.Handle
explicitly, amongst other things).