[Haskell-cafe] repa parallelization results

Dominic Steinitz dominic at steinitz.org
Tue Mar 17 10:21:55 UTC 2015


Anatoly Yakovenko <aeyakovenko <at> gmail.com> writes:

> 
> https://gist.github.com/aeyakovenko/bf558697a0b3f377f9e8
> 
> so I am seeing results with -N4 that are basically no better than
> sequential computation on my MacBook for the matrix multiply
> algorithm.  Any idea why?
> 
> Thanks,
> Anatoly
> 

Hi Anatoly,

repa is good for things that live on a grid, e.g. forward / backward
Euler, Crank-Nicolson, convolution (e.g. of images) and multi-grid
methods, where each cell is updated based on local information (so we
are in the world of comonads). I imagine it would also be good for
Ising models (but maybe using Swendsen-Wang or Wolff). It is not good
where the update is based on global information, e.g. simulating the
solar system.
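To make the local-update point concrete, here is a minimal sketch
(plain Haskell lists, for illustration only -- in repa proper you
would use an unboxed array with a stencil and computeP): one
forward-Euler step of the 1-D heat equation, where each cell reads
only its two neighbours.

```haskell
-- One forward-Euler step for the 1-D heat equation with fixed
-- (Dirichlet) boundaries: u'(i) = u(i) + k*(u(i-1) - 2*u(i) + u(i+1)).
-- Every interior cell depends only on its immediate neighbours, which
-- is exactly the pattern repa's stencils parallelise well; contrast an
-- n-body force calculation, where every body needs every other body.
heatStep :: Double -> [Double] -> [Double]
heatStep k u = head u : zipWith3 upd u (drop 1 u) (drop 2 u) ++ [last u]
  where
    upd l c r = c + k * (l - 2 * c + r)
```

For example, heatStep 0.5 [0,0,1,0,0] diffuses the central spike into
its two neighbours while the boundary cells stay fixed.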

You might compare your results in repa against yarr:
https://hackage.haskell.org/package/yarr.

Here are some examples of repa / yarr that could be of use

https://idontgetoutmuch.wordpress.com/2014/02/10/laplaces-equation-in-haskell-using-a-dsl-for-stencils-3/

https://idontgetoutmuch.wordpress.com/2013/08/06/planetary-simulation-with-excursions-in-symplectic-manifolds-6/

https://idontgetoutmuch.wordpress.com/2013/02/10/parallelising-path-dependent-options-in-haskell-2/

https://readerunner.wordpress.com/2014/04/30/multigrid-methods-with-repa/

If I knew how to cross-post this to

https://mail.haskell.org/cgi-bin/mailman/listinfo/numeric

when using gmane I would do so.

Dominic.




