[Haskell-cafe] "Quantization" of computations, or step-wise termination, as an extension of Logic Monads
Juan Casanova
juan.casanova at ed.ac.uk
Fri Jul 12 02:56:46 UTC 2019
Thank you for your very detailed and very relevant response. It made
me think about some things.
> I am curious to have a look at your library.
I guess I'll just attach it here and end up sharing it with the whole
list. I just hope no one feels overwhelmed by too much code to look at.
> I think all of these points can be addressed by starting from an
> infinite binary tree.
>
>
> data Tree a = Empty | Leaf a | Branch (Tree a) (Tree a)
>
> instance Monad Tree where ...
>
>
> In particular:
>
> - infinite branching can be modeled by an infinite sequence of
> finite branches: Branch 1 (Branch 2 (Branch 3 (...)));
This is correct, but not complete in general, although the logic monad
library does deal with it. The problem is that if you have conceptually
infinite branching, where each branch is also infinite in depth, then
naively expressing the infinite branching as an infinite sequence of
finite branchings will be unfair towards later branches. For example, if
I want to search all multiples of all powers of 2, the naive
transformation will concatenate those lists / branches; but because the
first one is infinite, any search performed that way will never reach
the later ones. In other words, things that were at depth 2 before are
now at depth infinity, because an infinite number of finite branches has
been prepended to them. This can be overcome by performing the
transformation from infinite branching to binary/finite branching
diagonally, as I described and as the logic monad library does.
So, yes, it can be modelled that way, but the translation process is
crucial for fairness.
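To make the fairness point concrete, here is a small self-contained sketch (the names `multiplesOfPowersOf2`, `naive` and `diagonal` are mine, purely for illustration; this is not code from the logic monad library or from my attachment):

```haskell
-- Each inner list is infinite: the multiples of one power of 2.
multiplesOfPowersOf2 :: [[Integer]]
multiplesOfPowersOf2 = [ [ k * 2 ^ n | k <- [1 ..] ] | n <- [0 ..] ]

-- Naive concatenation: because the first inner list is infinite, no
-- element contributed by the second (or any later) list ever appears
-- at a finite position.
naive :: [Integer]
naive = concat multiplesOfPowersOf2

-- Diagonal interleaving: element (i, j) appears at a finite position,
-- so every inner list is visited fairly. (Inefficient due to (!!),
-- but it shows the idea.)
diagonal :: [[a]] -> [a]
diagonal xss = concat [ [ xss !! i !! (d - i) | i <- [0 .. d] ] | d <- [0 ..] ]
```

With `diagonal`, the head of the second list (2 * 1 = 2) shows up at position 2 of the output, whereas with `naive` it never shows up at all.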
> - the problem of completeness is addressed by, uh, postponing it.
> Okay that's not really addressed, but the point is that the data is
> all there at least (not lost in a diverging computation), and the
> search problem is now a very concrete tree traversal problem, which
> does not require any fancy monadic abstraction a priori, but also
> does not prevent you from building one up as other requirements make
> themselves known.
>
>
> Regarding the issue of fairness you mention, a key point is that
> "filter" should only replace Leaf with Empty, and not try to do any
> merging like rewriting (Branch Empty t) to t. More precisely, as
> soon as a function satisfies the equation
>
> f (Branch Empty t) = f t,
>
> then it has to diverge on trees with no leaves (or it has to discard
> leaves, so it's useless if completeness is a concern). I'm guessing
> that's where the logic monad goes astray.
This is a very good point, one I had not thought of. In some sense, the
problem is precisely that the logic monad library does not represent
the computation itself (neither as a tree nor as a list), but only its
results; therefore a filter must satisfy that equation, and therefore
it will diverge.
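For the record, here is a minimal sketch of the productive filter you describe, reusing the Tree type from your message (the name `filterTree` is my own):

```haskell
data Tree a = Empty | Leaf a | Branch (Tree a) (Tree a)
  deriving (Eq, Show)

-- A productive filter: it only replaces pruned leaves with Empty, and
-- crucially never rewrites (Branch Empty t) to t. It therefore emits a
-- constructor at every level without looking arbitrarily deep, even on
-- infinite trees with no surviving leaves.
filterTree :: (a -> Bool) -> Tree a -> Tree a
filterTree _ Empty = Empty
filterTree p (Leaf x)
  | p x       = Leaf x
  | otherwise = Empty
filterTree p (Branch l r) = Branch (filterTree p l) (filterTree p r)
```

Since the output tree has exactly the shape of the input tree, the function does not satisfy f (Branch Empty t) = f t, and so it escapes the divergence argument above.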
>
> One way to think of this problem is in terms of "productivity":
> "filter" for Logic may not be productive if its argument represents
> an infinite search space. In total programming languages, i.e.,
> those which enforce productivity, Agda and Coq being chief examples,
> you would actually be stopped because the Logic monad cannot
> represent infinite search spaces (such as log_multiples in your
> sample).
>
> Coinductive types are a promising alternative here, with two main
> solutions off the top of my head being coinductive trees, as I
> sketched above, and coinductive lists/streams with an extra "delay"
> constructor. I am curious to see how your library relates to that.
Indeed, my library relies essentially on lists/streams with an extra
"delay" constructor. I must say the word "coinductive" puzzled me a bit,
but after looking it up I think I understand what you mean. I was
familiar with induction; I just did not know there was such a concept
as coinduction.
You can see the library in the attachment, but the definition of my
data structure is as follows. (I differentiate Empty from Halt, but
that is fairly inconsequential; it just alters how I deal with joining
several infinite processes. In a sense it is a kind of cut. I also wrap
errors within the structure, which may be bad practice.)

data EnumProc t = Empty | Halt | Error String
                | Produce t (EnumProc t) | Continue (EnumProc t)
Here the Continue constructor is the extra "delay" one. Whenever a
function that works on this structure needs to make a recursive call
(or a mutually recursive call, etc.), instead of making the call
directly I prepend it with Continue, so that the result is always in
head normal form and I can perform the computation one step at a time.
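As an illustration of that discipline (the names `enumFilter`, `takeProduced` and `nats` are hypothetical and are not necessarily what the attached library uses):

```haskell
data EnumProc t = Empty | Halt | Error String
                | Produce t (EnumProc t) | Continue (EnumProc t)

-- Filter over EnumProc: the recursive call in the "element discarded"
-- case is guarded by Continue, so even a long run of rejected elements
-- yields one constructor (one computation step) at a time.
enumFilter :: (t -> Bool) -> EnumProc t -> EnumProc t
enumFilter _ Empty     = Empty
enumFilter _ Halt      = Halt
enumFilter _ (Error e) = Error e
enumFilter p (Produce x r)
  | p x       = Produce x (enumFilter p r)
  | otherwise = Continue (enumFilter p r)
enumFilter p (Continue r) = Continue (enumFilter p r)

-- Collect the first n produced values, skipping over delay steps.
takeProduced :: Int -> EnumProc t -> [t]
takeProduced 0 _             = []
takeProduced _ Empty         = []
takeProduced _ Halt          = []
takeProduced _ (Error _)     = []
takeProduced n (Produce x r) = x : takeProduced (n - 1) r
takeProduced n (Continue r)  = takeProduced n r

-- An infinite process producing the naturals, with Continue prepended
-- to each recursive call as described above.
nats :: EnumProc Int
nats = go 1 where go k = Produce k (Continue (go (k + 1)))
```

For example, takeProduced 3 (enumFilter even nats) steps through the odd numbers one Continue at a time and yields [2,4,6].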
Now, I see why you feel that trees are a more natural way to describe
the computation than infinite lists with delay, but I am not sure there
are any actual advantages to that approach over mine. In other words, I
think what you are suggesting is essentially what I did, and what you
are saying amounts to better-argued reasons for why it needs to be
done. What I wonder is whether there is a way to do these kinds of
things with the logic monad library (and perhaps other standard
libraries) without having to implement my own.
Thanks again,
Juan
--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: EnumProc.hs
Type: text/x-java
Size: 40671 bytes
Desc: not available
URL: <http://mail.haskell.org/pipermail/haskell-cafe/attachments/20190712/0eedf353/attachment.java>