DDC compiler and effects;
better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)
Gregory Michael Travis
mito at panix.com
Thu Aug 13 12:03:54 EDT 2009
"John A. De Goes" <john at n-brain.net> said:
> Hmmm, bad example. Assume memory instead. That said, reordering/
> parallelization of *certain combinations of* writes/reads to
> independent files under whole program analysis is no less safe than
> sequential writes/reads. It just "feels" less safe, but the one thing
> that will screw both up is interference from outside programs.
This reminds me of something I've heard of: the commutative monad. I
might have seen the phrase in some presentation by Simon Peyton-Jones.
I took it to mean some sort of monad that sequenced computations, but
was also capable of reordering them according to rules specified by
the implementer of the monad. Thus, you could re-order operations on
separate files, because they can't possibly interfere with each other.
This strikes me as an important idea, something that would allow you
to incrementally optimize while keeping safety.
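To make the idea concrete, here is a minimal, hypothetical sketch (not DDC or any real library): effects are a trace of file writes, writes to distinct files are declared to commute, and a trace is canonicalized by reordering maximal runs of pairwise-independent operations. Two programs that differ only in the order of independent writes then normalize to the same trace. All names here (Effect, commutes, normalize) are made up for illustration.

```haskell
import Data.List (sortBy)
import Data.Ord (comparing)

-- A trace of side-effects: each write is tagged with the file it touches.
data Effect = Write FilePath String
  deriving (Eq, Show)

file :: Effect -> FilePath
file (Write f _) = f

-- The commutativity rule: two effects commute iff they touch different files.
commutes :: Effect -> Effect -> Bool
commutes a b = file a /= file b

-- Canonicalize a trace: take maximal runs of pairwise-independent
-- effects and put each run into a fixed (sorted-by-file) order.
-- Effects on the same file never cross each other.
normalize :: [Effect] -> [Effect]
normalize [] = []
normalize es =
  let (run, rest) = independentRun es
  in sortBy (comparing file) run ++ normalize rest

-- Longest prefix in which every pair of effects commutes.
independentRun :: [Effect] -> ([Effect], [Effect])
independentRun = go []
  where
    go acc (e:rest)
      | all (commutes e) acc = go (e:acc) rest
    go acc rest = (reverse acc, rest)
```

For example, `normalize [Write "b" "x", Write "a" "y"]` and `normalize [Write "a" "y", Write "b" "x"]` produce the same canonical trace, while two writes to the same file keep their original order.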
For example, let's say you implement a complex distributed system,
using a single distributed state monad to manage global state. This
would be very slow, since you would no doubt have to have a central
node controlling the order of all side-effects. But this node could
re-group and re-distribute the side-effects. This would be done by
using the type system to carefully separate effects that couldn't
interfere with each other.
I'm not expressing this as well as I'd like. From the programmer's
perspective, though, it could have a nice property: you start with a
slow, but trivial state monad. Then you add rules and clauses that
indicate when two effects can have their order switched; all the
while, the meaning of the program is unchanged.
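That incremental path might look like the following sketch (again hypothetical, with invented names): an explicit predicate decides when two effects may be swapped, and you refine it clause by clause, starting from "nothing commutes" — the slow but trivially correct ordering.

```haskell
-- Effects are reads and writes tagged by file.
data Op = ReadF FilePath | WriteF FilePath String
  deriving (Eq, Show)

target :: Op -> FilePath
target (ReadF f)    = f
target (WriteF f _) = f

-- Version 0: never reorder anything (maximally sequential, trivially safe).
commute0 :: Op -> Op -> Bool
commute0 _ _ = False

-- Version 1: add a clause — operations on distinct files commute.
commute1 :: Op -> Op -> Bool
commute1 a b = target a /= target b

-- Version 2: add another clause — two reads of the same file also commute.
commute2 :: Op -> Op -> Bool
commute2 a b = commute1 a b || bothReads
  where
    bothReads = case (a, b) of
      (ReadF _, ReadF _) -> True
      _                  -> False
```

Each version permits strictly more reorderings than the last, yet a scheduler consulting any of them preserves the program's meaning, since it only ever swaps effects the predicate licenses.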
This is not unlike the nodes of a distributed database, which have to
receive updates from other nodes and figure out how to apply all the
updates in a coherent order, or throw an exception if they cannot.