[Haskell-cafe] Stacking monads

David Menendez dave at zednenem.com
Fri Oct 3 16:51:13 EDT 2008


On Fri, Oct 3, 2008 at 3:43 PM, Andrew Coppin
<andrewcoppin at btinternet.com> wrote:
> David Menendez wrote:
>>
>> It wasn't until fairly recently that people realized
>> that you could do useful things if you had "return" and "ap", but not
>> (>>=), which is why we have some unfortunate limitations in the Haskell
>> prelude, like Applicative not being a superclass of Monad.
>>
>> This leads to all the duplication between Applicative and Monad. In a
>> perfect world, we would only need the Applicative versions.
>>
>
> OK. So it's broken "for compatibility" then? (Presumably any time you change
> something from the Prelude, mass breakage ensues!)

Exactly. Since the Prelude is specified in the Haskell 98 report, you
can't add or subtract things without losing Haskell 98 compatibility.

We *could* define a new Prelude that did things more sensibly, but
then code either has to pick which Prelude to support or else jump
through extra hoops to be cross-compatible.

>> Would you be willing to share the implementation of ResultSet? If
>> you're relying on a list somewhere, then it should be possible to
>> switch the implementation to one of the nondeterminism monad
>> transformers, which would give you the exception behavior you want.
>>
>
> Consider the following:
>
>  factorise n = do
>   x <- [1..]
>   y <- [1..]
>   if x*y == n then return (x,y) else fail "not factors"
>
> This is a very stupid way to factorise an integer. (But it's also very
> general...) As you may already be aware, this fails miserably because it
> tries all possible values for y before trying even one new value for x. And
> since both lists there are infinite, this causes an endless loop that
> produces (almost) nothing.
>
> My ResultSet monad works the same way as a list, except that the above
> function discovers all finite solutions in finite time. The result is still
> infinite, but all the finite solutions are within a finite distance of the
> beginning. Achieving this was Seriously Non-Trivial. (!) As in, it's several
> pages of seriously freaky code that took me days to develop.
>
> AFAIK, nothing like this already exists in the standard libraries.

Now I'm even more curious to see how you did it. I spent some time a
few months ago developing a monad that does breadth-first search. It
would be able to handle the example you gave almost without change.

Some other possibilities:

(1) logict <http://hackage.haskell.org/cgi-bin/hackage-scripts/package/logict>

This defines a backtracking monad transformer (the NondetT I mentioned
in my previous message), and provides a "fair" variant of (>>=) that
you could use to define factorise. It's not as foolproof as the other
options.

(2) control-monad-omega
<http://hackage.haskell.org/cgi-bin/hackage-scripts/package/control-monad-omega>

This is a monad similar to [] that uses a "diagonal" search pattern.
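
For the factorise example, a sketch with Omega might look like the
following (untested; I believe each and runOmega are the package's
conversions to and from plain lists). Filtering the diagonalised list
of pairs, rather than failing inside the monad, keeps the enumeration
productive:

    import Control.Monad.Omega (each, runOmega)

    -- Enumerate all pairs along diagonals, then filter; every factor
    -- pair of n shows up at a finite position in the result.
    factorise :: Integer -> [(Integer, Integer)]
    factorise n =
        filter (\(x, y) -> x * y == n) . runOmega $ do
            x <- each [1..]
            y <- each [1..]
            return (x, y)

    -- take 6 (factorise 12) returns all six factor pairs;
    -- asking for a seventh would search forever.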

(3) Oleg Kiselyov's fair and backtracking monad
<http://okmij.org/ftp/Computation/monads.html#fair-bt-stream>

This uses a search pattern that I don't fully understand, and only
satisfies the Monad and MonadPlus laws if you ignore the order of
results, but I think it's at least as robust as Omega.



>>> If you look at run_or, you'll see that this is _not_ a simple state
>>> monad,
>>> as in that function I run two actions starting from _the same_ initial
>>> state
>>> - something which, AFAIK, is impossible (or at least very awkward) with a
>>> state monad.
>>>
>>> Really, it's a function that takes a state and generates a new state, but
>>> it
>>> may also happen to generate *multiple* new states. It also consumes a Foo
>>> or
>>> two in the process.
>>>
>>
>> That's what happens if you apply a state monad transformer to a
>> nondeterminism monad.
>>
>> So it might be possible to rewrite your code along these lines:
>>
>>    type M = StateT State []
>>
>>    run :: Foo -> M ()
>>
>>    runOr :: Foo -> Foo -> M ()
>>    runOr x y = mplus (run x) (run y)
>>
>>    runAnd :: Foo -> Foo -> M ()
>>    runAnd x y = run x >> run y
>>
>> The type "StateT State [] alpha" is isomorphic to "State -> [(alpha,
>> State)]", which means that each of the computations in mplus gets its
>> own copy of the state.
>>
>
> What does mplus do in this case? (I know what it does for Maybe, but not for
> any other monad.)

"mplus a b" returns all the results returned by "a" and "b". For
lists, it returns all the results of "a" before the results of "b". I
suspect it corresponds to "merge" in your code.
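
To make that concrete, here's a small sketch with the StateT State []
stack from above (using Int for the state, just for illustration): both
branches of mplus start from the same initial state, and runStateT
returns one (result, final state) pair per branch.

    import Control.Monad (mplus)
    import Control.Monad.State (StateT, runStateT, modify)

    branches :: StateT Int [] String
    branches = (modify (+1) >> return "left")
       `mplus` (modify (+2) >> return "right")

    -- runStateT branches 0 == [("left",1),("right",2)]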

For true backtracking monads (that is, not Maybe), mplus also has this property:

    mplus a b >>= f == mplus (a >>= f) (b >>= f)

There is a school of thought that Maybe (and Error/ErrorT) should not
be instances of MonadPlus because they do not satisfy that law.
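
For example, with Maybe the two sides of that law can disagree:

    import Control.Monad (mplus)

    -- f throws away the result that mplus picked from the left branch.
    f :: Int -> Maybe Int
    f 1 = Nothing
    f x = Just x

    lhs, rhs :: Maybe Int
    lhs = mplus (Just 1) (Just 2) >>= f            -- Nothing
    rhs = mplus (Just 1 >>= f) (Just 2 >>= f)      -- Just 2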

>> 2. StateT State (NondetT (Either ErrorType)) alpha
>>
>> (NondetT isn't in the standard libraries, but I can provide code if
>> needed.)
>>
>> Left uncaught, an exception raised in any branch will cause all
>> branches to fail.
>>
>
> That looks more like it, yes.

That's what I figured. You'll need a transformer, then, which rules
out Omega. Since you don't care about catching exceptions, you can
just do something like

    type M = StateT State (LogicT (Either ErrorType))

    throwM :: ErrorType -> M a
    throwM = lift . lift . Left
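
Sketching that out a bit further (untested, with placeholder State and
ErrorType types), running the stack with observeAllT gives the
behaviour you asked for: a Left raised in any branch makes the whole
run fail.

    import Control.Monad (mplus)
    import Control.Monad.Logic (LogicT, observeAllT)
    import Control.Monad.State (StateT, runStateT)
    import Control.Monad.Trans.Class (lift)

    type State = Int          -- placeholder for your real State
    type ErrorType = String   -- placeholder for your real ErrorType

    type M = StateT State (LogicT (Either ErrorType))

    throwM :: ErrorType -> M a
    throwM = lift . lift . Left

    -- Every branch's (result, final state) pair, or the first
    -- uncaught error.
    runM :: M a -> State -> Either ErrorType [(a, State)]
    runM m s0 = observeAllT (runStateT m s0)

    -- One branch succeeds, the other throws; the whole run is
    -- Left "boom".
    demo :: Either ErrorType [((), State)]
    demo = runM (return () `mplus` throwM "boom") 0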

Or, if you want to try my breadth-first monad, I can send you a copy.
It supports exception handling out of the box.

-- 
Dave Menendez <dave at zednenem.com>
<http://www.eyrie.org/~zednenem/>

