[Haskell] Announce: Adaptive Simulated Annealing interface for haskell

John Meacham john at repetae.net
Wed Jan 19 18:59:45 EST 2005


I have made a basic binding to Lester Ingber's Adaptive Simulated
Annealing (ASA) code. Adaptive simulated annealing is a general-purpose
non-linear optimization algorithm which can (with statistical
guarantees) find the minimum of arbitrary many-dimensional functions.
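To give a flavour of the technique for those who haven't seen it, here is a
toy, non-adaptive simulated annealing loop over a one-dimensional function.
It is only a sketch of the general idea; the real ASA code this binding
wraps does far more (per-dimension adaptive temperature schedules,
re-annealing, and so on).

import System.Random (randomRIO)

-- Toy simulated annealing: repeatedly propose a random step, always
-- accept improvements, accept worse points with probability
-- exp(-delta/temperature), and cool the temperature geometrically.
anneal :: (Double -> Double) -> Double -> IO Double
anneal f x0 = go x0 1.0
  where
    go x t
      | t < 1e-4  = return x
      | otherwise = do
          dx <- randomRIO (-1, 1)            -- propose a random step
          let x'    = x + t * dx
              delta = f x' - f x
          u <- randomRIO (0, 1)
          let keep = delta < 0 || u < exp (negate delta / t)
          go (if keep then x' else x) (t * 0.99)

main :: IO ()
main = anneal (\x -> (x - 3) ^ 2) 0 >>= print   -- should end up near 3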

I am hoping this will turn into the beginning of a general, very
easy-to-use optimization library for Haskell. The idea is that general
optimization is incredibly useful for things like presentation,
quick-and-dirty code generators, graph drawing, and a lot of other
things, but getting a problem into a form suitable for an optimizer is
usually a lot of work. If a general, always-available, pretty good
optimizer were at hand, it could drastically speed up the development
of complex Haskell apps.

I am mainly interested in feedback on my Optimize.Parameter class, which
I think may be a novel method of interfacing with an optimizer from a
strongly typed language. Based on the type of the function you pass it,
which can even include Haskell union types such as Maybe and Either, it
will transform your problem into a suitable function of many properly
constrained continuous and integral variables. This function can then be
passed to any off-the-shelf non-linear solver. I include a very basic
interface to a C implementation of Adaptive Simulated Annealing, as it
tends to be a robust general-purpose algorithm (and is free), but a
commercial or specialized back end would work just as well.
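To give a rough idea of the flattening involved, a very simplified
illustration follows. This is not the real Optimize.Parameter code; the
Param class, the decode function, and the thresholding scheme are made up
for the example.

-- Each type consumes some raw solver variables and decodes them into a
-- value, returning the leftover variables for the rest of the structure.
class Param a where
  decode :: [Double] -> (a, [Double])

instance Param Double where
  decode (x:xs) = (x, xs)
  decode []     = error "decode: ran out of variables"

instance Param Int where
  -- integral parameters round the underlying continuous variable
  decode (x:xs) = (round x, xs)
  decode []     = error "decode: ran out of variables"

instance Param Bool where
  -- one continuous variable thresholded at 0.5 stands in for a Bool
  decode (x:xs) = (x >= 0.5, xs)
  decode []     = error "decode: ran out of variables"

instance (Param a, Param b) => Param (a, b) where
  decode xs = let (a, xs')  = decode xs
                  (b, xs'') = decode xs'
              in ((a, b), xs'')

instance (Param a, Param b) => Param (Either a b) where
  -- a selector variable picks the branch; a real encoding would also keep
  -- the overall vector length and bounds fixed, which this sketch ignores
  decode (s:xs)
    | s < 0.5   = let (a, rest) = decode xs in (Left a, rest)
    | otherwise = let (b, rest) = decode xs in (Right b, rest)
  decode []     = error "decode: ran out of variables"

-- A typed objective then becomes a plain [Double] -> Double that any
-- off-the-shelf solver understands.
flatten :: Param a => (a -> Double) -> [Double] -> Double
flatten f = f . fst . decode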

The main interface to the solver consists of the single function

minimize :: Parameter z x => z -> (x -> Double) -> IO x

which, given a function of type x -> Double, will return the x at which
that function takes on its globally minimal value.

Parameter z x means that your parameter is of type x and that it takes
configuration constraints of type z.

An example of use is:

f :: (Bool,(Double,Double)) -> Double
f (tf,(x,y)) = if tf then x*x + y*y else x + x*x - log y

g :: Either (Int,Int) Int -> Double
g (Left (x,y)) = realToFrac $ x * y
g (Right r) = pi*(realToFrac r)^2

v1 <- minimize (empty,(limit (-10) 3, limit 4 7)) f
v2 <- minimize (empty,(limit (-10) 3, limit 4 7)) (negate . f)
v3 <- minimize ((limit 10 100,limit 3 1000), limit 4 1000) g
v4 <- minimize ((limit 10 100,limit 3 1000), limit 4 1000) (negate . g)

results in:
print v1 => (False,(-0.5000008601544335,6.999999999568565))
print v2 => (True,(-9.99999999999775,6.9999999999934595))
print v3 => Left (10,3)
print v4 => Right 1000

(This run was deliberately done with a low iteration count.)

See the Parameter instances for details on the various constraints that
can be passed as the first argument of minimize.
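For instance, handling a Maybe-valued parameter should look much the same;
the constraint shape in the snippet below is only indicative, and the
Parameter instance for Maybe defines the exact form of the first argument.

h :: Maybe Double -> Double
h Nothing  = 1
h (Just x) = (x - 2)^2

-- constraint shape shown for illustration only; see the Maybe instance
v5 <- minimize (empty, limit 0 10) h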

This is a work in progress; I thought I'd get some early feedback to see
whether the ideas are interesting at all or whether I am repeating someone
else's work. I am in general a big fan of simple interfaces to complex
back ends and am quite enjoying the ability of Haskell types to convey all
the meta-information you usually need to specify explicitly in other
languages.


My code:
http://repetae.net/john/recent/out/HsASA.html
Lester's ASA page:
http://www.ingber.com/

        John



-- 
John Meacham - ⑆repetae.net⑆john⑈ 

