Stop using "Int" for microsecond delays in "base"

Wolfgang Jeltsch g9ks157k at acme.softbase.org
Wed Mar 30 14:31:33 CEST 2011


On Saturday, 26.03.2011, at 17:20 +0000, Paul Johnson wrote:
> The "base" library has the "threadDelay" primitive, which takes an Int
> argument in microseconds.  Unfortunately this means that the longest
> delay you can get on a 32-bit machine with GHC is just under 36 minutes
> (2^31 uSec), and a hypothetical compiler that only used 30-bit integers
> (as per the standard) would get under 10 minutes.  It is a bit tricky to
> write general-purpose libraries with this.  I think that there should be a
> 
>     type Delay = Int64
> 
> declaration, and that threadDelay and related functions should take that 
> as an argument type.
> 
> Paul.
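
For reference, a minimal sketch of the workaround this currently forces
on library code: chunk a long delay into Int-sized pieces and call
threadDelay repeatedly.  The name threadDelayLong is made up purely for
illustration.

    import Control.Concurrent (threadDelay)
    import Data.Int (Int64)

    -- Sleep for the given number of microseconds, even when the value
    -- does not fit into Int, by issuing several threadDelay calls.
    threadDelayLong :: Int64 -> IO ()
    threadDelayLong us
      | us <= 0   = return ()
      | otherwise = do
          let chunk = min us (fromIntegral (maxBound :: Int))
          threadDelay (fromIntegral chunk)
          threadDelayLong (us - chunk)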

Maybe we should stop using Int altogether. Why not use Integer whenever
we want integers, and try to implement Integer and its operations more
efficiently? Int8, Int16, Int32, and Int64 are good for systems
programming, CInt is good for interfacing with C, but what is Int good
for?
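
As a sketch of what that could mean for delays (hypothetical, not part
of base): an Integer-based entry point whose implementation splits the
delay into Int-sized pieces internally, so callers never have to think
about the word size.

    import Control.Concurrent (threadDelay)

    -- Hypothetical Integer-based delay: the caller passes an exact
    -- microsecond count; the implementation sleeps in chunks that fit
    -- into Int.
    threadDelayInteger :: Integer -> IO ()
    threadDelayInteger us
      | us <= 0   = return ()
      | otherwise = do
          let chunk = min us (toInteger (maxBound :: Int))
          threadDelay (fromInteger chunk)
          threadDelayInteger (us - chunk)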

Best wishes,
Wolfgang



