Stop using "Int" for microsecond delays in "base"

Bas van Dijk v.dijk.bas at
Sat Mar 26 20:44:24 CET 2011

On 26 March 2011 20:16, Henning Thielemann
<schlepptop at> wrote:
> Paul Johnson schrieb:
>> The "base" library has the "threadDelay" primitive, which takes an Int
>> argument in microseconds.  Unfortunately this means that the longest
>> delay you can get on a 32 bit machine with GHC is just under 36 minutes
>> (2^31 uSec), and a hypothetical compiler that only used 30 bit integers
>> (as per the standard) would get under 10 minutes.  It is a bit tricky to
>> write general-purpose libraries with this.
> Isn't it just a
> waitLong :: Integer -> IO ()
> waitLong n =
>   let stdDelay = 10^7
>       (q,r) = divMod n stdDelay
>   in  replicateM_ (fromInteger q) (threadDelay (fromInteger stdDelay))
>        >> threadDelay (fromInteger r)
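Henning's helper works as written once the imports are in place; for reference, a self-contained version (module layout mine):

```haskell
import Control.Concurrent (threadDelay)
import Control.Monad (replicateM_)

-- Sleep for an Integer number of microseconds, so the requested delay is
-- not capped by the platform's Int range.  The delay is split into q
-- chunks of 10^7 microseconds (10 seconds each) plus a remainder r; each
-- individual threadDelay argument then fits comfortably in an Int.
waitLong :: Integer -> IO ()
waitLong n =
  let stdDelay = 10 ^ (7 :: Int) :: Integer
      (q, r)   = n `divMod` stdDelay
  in  replicateM_ (fromInteger q) (threadDelay (fromInteger stdDelay))
        >> threadDelay (fromInteger r)
```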

Or take the one from concurrent-extra:

However, I agree that it would be nice to reduce the need for these
functions by having a larger sized Delay type.
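One way to carry a larger delay without going all the way to Integer would be a 64-bit microsecond count; the type and helper names below are hypothetical, not an actual base API:

```haskell
import Data.Int (Int64)

-- Hypothetical Delay type: microseconds stored in 64 bits, so the maximum
-- expressible delay (about 292,000 years) no longer depends on the
-- platform's Int width.
newtype Delay = Delay { delayMicroseconds :: Int64 }
  deriving (Eq, Ord, Show)

-- Smart constructors for common units.
microseconds, milliseconds, seconds :: Int64 -> Delay
microseconds = Delay
milliseconds = Delay . (* 1000)
seconds      = Delay . (* 1000000)
```

A threadDelay taking such a type could still chunk internally on 32-bit platforms, but callers would no longer need wrappers like waitLong.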

I don't think it's that hard to change. When using the threaded RTS,
threadDelay is implemented by the GHC event manager, which stores time
as seconds since the epoch (1970) in a Double and gets the current time
from the gettimeofday function[1]. When not using the threaded RTS,
threadDelay is implemented inside the RTS itself; I'm not sure how hard
that one is to change.
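A quick back-of-the-envelope check of the figures involved (my arithmetic, not from the thread): the 32-bit Int limit really is just under 36 minutes, and a Double's 53-bit mantissa can hold microsecond-resolution times for centuries, so the event manager's time representation itself is not the bottleneck:

```haskell
-- Longest threadDelay with a 32-bit Int, in minutes: 2^31 - 1 microseconds.
maxDelayMinutes32 :: Double
maxDelayMinutes32 = fromIntegral (2 ^ (31 :: Int) - 1 :: Integer) / 1e6 / 60
-- roughly 35.8 minutes

-- Span a Double can cover at full microsecond precision, in years:
-- 2^53 distinct microsecond ticks.
doubleMicrosecondYears :: Double
doubleMicrosecondYears = 2 ^ (53 :: Int) / 1e6 / (365.25 * 24 * 3600)
-- roughly 285 years
```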



More information about the Libraries mailing list