[Haskell-cafe] Network.Curl and thread safety

Iustin Pop iustin at google.com
Wed Jan 5 10:14:21 CET 2011


Hi all,

I haven't been able to figure out how to use Network.Curl safely with
the threaded runtime.

I have this simple example:

import Network.Curl
import Control.Concurrent
import Control.Concurrent.MVar

-- Fetch a URL; wrap the body with 'return' on success, or 'fail' with
-- a description of the curl error otherwise.
getUrl :: (Monad m) => String -> IO (m String)
getUrl url = do
  (code, body) <- curlGetString url [CurlSSLVerifyPeer False,
                                     CurlSSLVerifyHost 0,
                                     CurlTimeout 15,
                                     CurlConnectTimeout 5]
  return $ case code of
             CurlOK -> return body
             _ -> fail $ "Curl error for " ++ url ++ " error " ++ show code

-- Fetch a URL in a worker thread and hand the result back via an MVar.
runGet :: String -> MVar (Maybe String) -> IO ()
runGet url mv = do
  body <- getUrl url
  putMVar mv body

main :: IO ()
main = withCurlDo $ do
  let urls = replicate 10 "https://google.com/"
  mvs <- mapM (\_ -> newEmptyMVar) urls
  -- one thread per URL, then wait for and print every result
  mapM_ (\(mv, url) -> forkIO (runGet url mv)) $ zip mvs urls
  mapM_ (\mv -> takeMVar mv >>= print) mvs
  threadDelay 10000000


When curl is linked against GnuTLS and the program is run with the
threaded runtime, it fails immediately with:
  $ ./main 
  main: ath.c:193: _gcry_ath_mutex_lock: Assertion `*lock ==
  ((ath_mutex_t) 0)' failed.
  Aborted (core dumped)

Reading the Network.Curl docs, it seems that wrapping everything in
withCurlDo should be enough to make this work, although I'm not sure
how to interpret the function's description and its "no forking or lazy
returns" requirement.
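
If the "lazy returns" part is the problem, I suppose a variant of
getUrl that fully forces the body before handing it back would address
it; a sketch (using deepseq from the deepseq package, an extra
dependency on top of the imports above):

import Control.DeepSeq (deepseq)

-- Like getUrl, but forces the whole body to normal form before
-- returning it, so no lazy computation that still refers to curl can
-- escape withCurlDo.
getUrlStrict :: (Monad m) => String -> IO (m String)
getUrlStrict url = do
  (code, body) <- curlGetString url [CurlSSLVerifyPeer False,
                                     CurlSSLVerifyHost 0,
                                     CurlTimeout 15,
                                     CurlConnectTimeout 5]
  body `deepseq` return (case code of
                           CurlOK -> return body
                           _ -> fail $ "Curl error for " ++ url ++
                                       " error " ++ show code)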

Using Network.Curl built against the OpenSSL headers does successfully
retrieve all the URLs, but the program still fails a bit later (during
the threadDelay, probably in GC), this time with a segmentation fault
in libcurl.4.so.

I've also tried changing forkIO to forkOS, but still no luck with
either SSL library.
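
The only other workaround I can think of is to serialize every curl
call behind a single process-wide lock, so that no two threads are
ever inside libcurl (or its SSL backend) at the same time. A sketch of
that idea, using a plain MVar as a mutex (it needs an extra import of
System.IO.Unsafe and obviously gives up the concurrency):

import System.IO.Unsafe (unsafePerformIO)

-- A process-wide lock; unsafePerformIO + NOINLINE is the usual idiom
-- for a global MVar.
{-# NOINLINE curlLock #-}
curlLock :: MVar ()
curlLock = unsafePerformIO (newMVar ())

-- Run an action while holding the lock, so the wrapped curl calls from
-- different threads never overlap.
withCurlLock :: IO a -> IO a
withCurlLock act = withMVar curlLock (const act)

-- Drop-in replacement for getUrl in runGet.
getUrlLocked :: (Monad m) => String -> IO (m String)
getUrlLocked = withCurlLock . getUrl

That would only be papering over the thread-safety problem rather than
fixing it, though.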

Any ideas what I'm doing wrong?

regards,
iustin


