[Haskell-cafe] Ord methods too strict?

V.Liepelt V.Liepelt at kent.ac.uk
Fri Jan 4 15:45:38 UTC 2019


Hi Li-yao,

Thanks for your interesting thoughts.

> There is also an opposite situation, where evaluating the second argument allows us to free a lot of memory, as that argument would otherwise remain a thunk with references to many things.

Good point, but this is one of the trade-offs with laziness. In the case of GHC, will the runtime not eventually free the memory of the second argument and everything it points to? It surely must do this one way or another, since otherwise laziness would mean that a running process keeps leaking space.
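
For what it’s worth, here is a tiny runnable illustration of the point (a sketch only: `lazyLe` is a made-up standalone function, not the actual Ord method, which, as discussed in this thread, currently forces both arguments):

-- Hypothetical lazy comparison on Bool: when the first argument is
-- False the second is never inspected.
lazyLe :: Bool -> Bool -> Bool
lazyLe False _ = True
lazyLe True  q = q

-- This is True without `last (replicate 10000000 True)` ever being
-- forced; once nothing refers to that thunk, any memory it retains is
-- reclaimed by GHC's garbage collector like any other heap object.
example :: Bool
example = lazyLe False (last (replicate 10000000 True))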

> But to compare two pairs `(a,b) <= (x,y)` we would have to first traverse `(a,b)` to decide whether it's the smallest element, before evaluating the second argument to WHNF and comparing component-wise. Besides the extra complexity (both in code and in run time), this would not be less strict than the current default, since the first argument could get completely evaluated before even looking at the second one: `(False, _|_) <= (True, True) = _|_` (currently, this is `True`).

So with the laziest possible implementation (definitions below; do check for accuracy, I might have missed something) these are the cases where the result is defined with the standard (Bool, Bool) instance and undefined with the lazy one:

(F,u) <= (T,F)
(F,u) <= (T,T)
(F,u) <= (T,u)

vs. (undefined with the standard instance and defined with the lazy one)

(F,F) <= (F,u)
(F,F) <= (u,F)
(F,F) <= (u,T)
(F,F) <= (u,u)
(T,F) <= (T,u)

So it does seem that we can make it marginally lazier overall. I’m not saying we should; I was just genuinely surprised that Ord is strict.

Definitions (F/T abbreviate False/True; /\ and \/ stand for && and ||):

F <= _ = T
T <= q = q
F <  q = q
T < _  = F

(p, q) == (r, s) = p == r /\ q == s
(F, F) <= _ = T
(p, q) <= (r, s) = p < r \/ p == r /\ q <= s
(T, T) <  _ = F
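
For concreteness, here is one runnable rendering of the pair case (a sketch only: the names leB, ltB, eqB and lePair are mine, written as plain functions rather than class methods, and only (<=) for pairs is spelled out, since that is what the case analysis above uses):

-- Bool comparisons, least element first (False before True), matching
-- the single-Bool equations above.
leB, ltB, eqB :: Bool -> Bool -> Bool
leB False _ = True
leB True  q = q
ltB False q = q
ltB True  _ = False
eqB p q = p == q            -- ordinary (==), strict in both arguments

-- Pairs of Bool: the first clause recognises the least element (F, F)
-- without touching the right-hand side at all.  Matching that clause
-- can force the left argument's second component (when its first
-- component is False), which is exactly why the (F, u) <= (T, _)
-- cases above become undefined.
lePair :: (Bool, Bool) -> (Bool, Bool) -> Bool
lePair (False, False) _      = True
lePair (p, q)         (r, s) = ltB p r || (eqB p r && leB q s)

In GHCi this reproduces, for example, the (F,F) <= (u,F) row from the lists above:

ghci> lePair (False, False) (undefined, False)
True
ghci> (False, False) <= ((undefined, False) :: (Bool, Bool))
*** Exception: Prelude.undefined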

> We also would not know whether the type (a, b) has a smallest element without stronger constraints than (Ord a, Ord b), so this optimization is really only practically feasible for types with a nullary first constructor.

Hm, perhaps we are thinking about this in different ways. From the definition above, we don’t need any stronger constraints than Ord.

Best,

Vilem

On 2 Jan 2019, at 15:06, Li-yao Xia <lysxia at gmail.com> wrote:

Hi Vilem,

> This is not just about crashing. (I’m using `undefined` as a way of
> making strictness explicit.) `False >= veryExpensiveComputation`
> should return `True` immediately without any unnecessary computation.

There is also an opposite situation, where evaluating the second argument allows us to free a lot of memory, as that argument would otherwise remain a thunk with references to many things.

> Also it doesn’t seem like a special case: this makes sense for any partially ordered Type with a top and/or bottom element.

That may be acceptable with Bool because its values are small. But to compare two pairs `(a,b) <= (x,y)` we would have to first traverse `(a,b)` to decide whether it's the smallest element, before evaluating the second argument to WHNF and comparing component-wise. Besides the extra complexity (both in code and in run time), this would not be less strict than the current default, since the first argument could get completely evaluated before even looking at the second one: `(False, _|_) <= (True, True) = _|_` (currently, this is `True`). We also would not know whether the type (a, b) has a smallest element without stronger constraints than (Ord a, Ord b), so this optimization is really only practically feasible for types with a nullary first constructor.

Li-yao
