Bug in IO libraries when sending data through a pipe?

Jens Petersen petersen@redhat.com
11 Mar 2002 10:59:30 +0900

Volker Wysk <post@volker-wysk.de> writes:

>         (zu, von) <- createPipe
>         vonh <- fdToHandle von
>         hSetBuffering vonh NoBuffering
>         mpid <- forkProcess
>         case mpid of
>            Nothing -> do   -- child
>               -- connect pipe's read end to stdin
>               -- and close its write end
>               dupTo zu (intToFd 0)
>               fdClose zu
>               hClose vonh
>               executeFile prog True par Nothing
>               ... -- (print error message)
>            Just pid -> do   -- parent
>               fdClose zu          -- close pipe's read end
>               -- ** here **
>               hPutStr vonh txt    -- write text to forked process
>               hClose vonh         -- close pipe's write end
>               -- wait for child process to finish
>               (Just ps) <- getProcessStatus True True pid
>               if ps == Exited ExitSuccess
>                   then return ()
>                   else ...) -- (error message)
> The problem is that the child process doesn't receive all the data which
> the parent sends. It's as if "hPutStr vonh txt" sends the data lazily
> somehow, and "hClose vonh" closes the pipe prematurely.
> It varies from run to run exactly which data gets through. If I cause the
> child process to read all its input immediately, the problem doesn't
> seem to occur. Normally, it does so gradually, which takes a few seconds.
> I'm using GHC 5.02.2
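(An aside before replying: the "sends the data lazily" feeling Volker describes on the write side has a well-known mirror image on the read side, which is easy to demonstrate in a few lines. A self-contained sketch, using a plain file instead of a pipe; the file name is just illustrative:)

```haskell
import System.IO

-- hGetContents hands back the contents lazily; closing the handle
-- before the string has been demanded discards whatever was not yet
-- read.  The same semi-closed-handle behaviour applies to pipe ends.
lazyReadThenClose :: FilePath -> IO Int
lazyReadThenClose path = do
  writeFile path (replicate 10000 'x')
  h <- openFile path ReadMode
  s <- hGetContents h
  hClose h              -- closed before s is demanded
  return (length s)     -- far fewer than 10000 chars survive (usually 0)

main :: IO ()
main = lazyReadThenClose "lazy-demo.txt" >>= print
```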

Quite possibly it could be a bug.  Lazy IO is rather subtle I
think, especially when done across pipes.  I faced a similar
problem in POpen recently.  You can see how I solved it
(worked round it?) by comparing the latest release 1.00 with
the previous one 0.00.1:
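One way to make the write eager (a sketch of the general idea, not necessarily what POpen does, and written against today's System.Posix names, which differ from the 5.02 Posix library): force the whole string and flush explicitly before closing the write end. Here the child reports the byte count it received through its exit code, so the parent can check nothing was dropped:

```haskell
import Control.Exception (evaluate)
import System.Exit (ExitCode (..))
import System.IO
import System.Posix.IO (closeFd, createPipe, fdToHandle)
import System.Posix.Process
  (ProcessStatus (..), exitImmediately, forkProcess, getProcessStatus)

-- Fork a child that counts the bytes arriving on the pipe and reports
-- the count back through its exit code.
sendViaPipe :: String -> IO (Maybe ProcessStatus)
sendViaPipe txt = do
  (readEnd, writeEnd) <- createPipe
  pid <- forkProcess $ do              -- child
    closeFd writeEnd                   -- close the unused write end
    rh <- fdToHandle readEnd
    s <- hGetContents rh               -- read until EOF
    exitImmediately (ExitFailure (length s `mod` 256))
  closeFd readEnd                      -- parent: close the unused read end
  wh <- fdToHandle writeEnd
  _ <- evaluate (length txt)           -- force the whole string first
  hPutStr wh txt
  hFlush wh                            -- drain the Handle buffer...
  hClose wh                            -- ...before EOF reaches the child
  getProcessStatus True False pid      -- block until the child exits

main :: IO ()
main = sendViaPipe "hello" >>= print   -- Just (Exited (ExitFailure 5))
```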


In comparison, Posix.runProcess allows attaching file handles
to the in, out and error pipes, which I suppose can be
written to and read from eagerly.
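(In today's terms, an assumption on my part since the old Posix library has been superseded: the process package's System.Process is the closest analogue, and attaching handles and using them eagerly looks like this, piped through an external `cat`:)

```haskell
import Control.Exception (evaluate)
import System.IO
import System.Process

-- Round-trip a string through `cat`, using the Handles that
-- createProcess attaches to the child's stdin and stdout.
catRoundTrip :: String -> IO String
catRoundTrip txt = do
  (Just hin, Just hout, _, ph) <-
    createProcess (proc "cat" []) { std_in = CreatePipe, std_out = CreatePipe }
  hPutStr hin txt
  hClose hin                  -- EOF on the child's stdin
  out <- hGetContents hout
  _ <- evaluate (length out)  -- force the output before waiting
  _ <- waitForProcess ph
  return out

main :: IO ()
main = catRoundTrip "hello pipe" >>= putStrLn
```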