[Haskell-cafe] Game of life in haskell.

Jon Harrop jon at ffconsultancy.com
Tue Feb 2 22:45:52 EST 2010


On Wednesday 03 February 2010 01:08:33 Richard O'Keefe wrote:
> On Feb 3, 2010, at 3:04 PM, Jon Harrop wrote:
> > Fair enough. If you want to model either an infinite board or a very
> > sparse one then a sparse data structure will be asymptotically faster. But
> > they won't be simpler and asymptotically efficient mutable data
> > structures will be faster again.
>
> Did I say a word about mutability?  The whole point of my message
> was that the original message WASN'T about mutable or immutable
> data structures but about dense (2d array) or sparse data structures.

When Serguey said that arrays were not idiomatic Haskell because they 
are "hard to update functionally", I assumed he was talking about mutable vs. 
immutable data structures.
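
For concreteness, the point as I understand it (the helper below is my own 
illustration, not Serguey's code) is that Data.Array's pure update (//) 
copies the whole array, so toggling even one cell costs O(board size):

  import Data.Array

  -- Toggle one cell. (//) builds a complete new array, so this is
  -- O(size of board) per update rather than O(1).
  flipCell :: Array (Int, Int) Bool -> (Int, Int) -> Array (Int, Int) Bool
  flipCell b p = b // [(p, not (b ! p))]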

> >> A simple Haskell analogue, which has the right asymptotic cost,
> >
> > That will be a lot faster than using Map or Set but you're still
> > paying a lot for the indirections between cons cells. Mutating an array
> > in-place would be significantly faster and no more difficult to code
> > correctly because this is such a trivial algorithm.
>
> Again, you seem to be responding to something that wasn't said.
> No claim was made that the list of (row,[col...]) pairs approach
> was *FAST*,

Were you not talking about performance when you said "the right asymptotic 
cost"?

> or indeed that the code to manipulate it was notable for simplicity, only
> that the data structure in itself is simple. 

Ok.
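
For concreteness, the structure I take Richard to mean looks something like 
this (a sketch based on my reading of his message, not his actual code):

  -- Live cells as an ascending list of (row, live columns) pairs.
  type Row         = Int
  type Col         = Int
  type SparseBoard = [(Row, [Col])]

  -- Example: a glider near the origin.
  glider :: SparseBoard
  glider = [(0, [1]), (1, [2]), (2, [0, 1, 2])]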

> I once had Prolog code running faster than the Fortran code it was
> based on.  Mutable arrays are not in themselves a guarantee of
> efficiency, and while I don't imagine that you believe that, the
> paragraph I quoted came uncomfortably close to saying it.

My statements were specifically about Conway's Game of Life.

> In fact mutating an array in place probably would be a bit trickier
> because you _can't_ mutate the row you are just calculating; you will
> need the old version for the next row.

Yes, of course. You would read from one mutable data structure and write to 
another, swapping their roles each generation, like double buffering or the 
to- and from-spaces in Cheney's copying collector.
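
In code, the scheme I have in mind looks something like this (a sketch with 
my own names, not anyone's posted code), using two unboxed mutable arrays on 
a fixed n-by-n board with cells beyond the edge counted as dead:

  import Data.Array.IO

  type Grid = IOUArray (Int, Int) Bool

  -- Read generation g from src, write generation g+1 into dst.
  stepIO :: Int -> Grid -> Grid -> IO ()
  stepIO n src dst =
    sequence_
      [ do alive <- readArray src (i, j)
           k     <- liveNeighbours n src (i, j)
           -- standard Life rule: birth on 3, survival on 2 or 3
           writeArray dst (i, j) (k == 3 || (alive && k == 2))
      | i <- [0 .. n - 1], j <- [0 .. n - 1] ]

  -- Count the live neighbours of (i, j), treating off-board as dead.
  liveNeighbours :: Int -> Grid -> (Int, Int) -> IO Int
  liveNeighbours n g (i, j) =
    fmap (length . filter id) $ sequence
      [ readArray g (i + di, j + dj)
      | di <- [-1, 0, 1], dj <- [-1, 0, 1]
      , (di, dj) /= (0, 0)
      , i + di >= 0, i + di < n
      , j + dj >= 0, j + dj < n ]

One generation is stepIO n cur nxt, the next is stepIO n nxt cur, and so on, 
so the steady state allocates nothing.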

> And in a sparse-compressed representation, the new and old rows will seldom
> be the same size. 

Just use a conventional resizeable array and it will amortize all those costly 
allocations for you without complicating the code at all.
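
The amortization argument in miniature (a hypothetical helper, not code from 
this thread): track the number of used slots alongside the buffer and double 
the capacity whenever it fills, so n appends cost O(n) writes overall. With 
the vector package, something like:

  import qualified Data.Vector.Unboxed.Mutable as M

  -- Append x to a buffer with `used` filled slots, doubling the
  -- capacity when full. Assumes the buffer starts with capacity >= 1.
  push :: M.IOVector Int -> Int -> Int -> IO (M.IOVector Int, Int)
  push buf used x
    | used < M.length buf = do
        M.write buf used x
        return (buf, used + 1)
    | otherwise = do
        buf' <- M.grow buf (M.length buf)  -- grow by current length
        M.write buf' used x
        return (buf', used + 1)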

> For what it's worth, the sparse-compressed code that someone pointed
> to, and which I alluded to, is written in C.  Remember, the original
> message didn't say *arrays* were a poor choice,

He said:

  "Arrays are not fully "idiomatic" for Haskell as they are hard to
update functionally."

Is that not saying that arrays were a poor choice?

> but that the "obvious" *2d array of boolean* was a poor choice.  And that's
> the (only) claim I defended.

The OP was solving a dense problem on a finite board. Is a 2d array of 
booleans really such a bad choice in that context? Even if you change the 
problem to a sparse one, how large must it be before a solution like 
Serguey's becomes competitively efficient in practice?
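
For comparison, the "obvious" dense version is only a few lines (a sketch 
with my own names, not the OP's code; cells off the edge count as dead):

  import Data.Array

  type DenseBoard = Array (Int, Int) Bool

  stepDense :: DenseBoard -> DenseBoard
  stepDense b =
    listArray bds [ rule (b ! p) (neighbours p) | p <- range bds ]
    where
      bds = bounds b
      -- birth on 3 live neighbours, survival on 2 or 3
      rule alive k = k == 3 || (alive && k == 2)
      neighbours (i, j) =
        length [ () | di <- [-1, 0, 1], dj <- [-1, 0, 1]
                    , (di, dj) /= (0, 0)
                    , let q = (i + di, j + dj)
                    , inRange bds q, b ! q ]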

-- 
Dr Jon Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?e

