[Haskell-cafe] Re: A suggestion for the next high profile Haskell project

Dan Weston westondan at imageworks.com
Tue Dec 19 18:28:25 EST 2006


Maybe we should take to heart lessons learned from the Erlang 
development experience in Joe Armstrong's "Erlang Tutorial":

1  Concentrate on essential features
2  You need a protector
3  You will never displace an existing technology if it works
    -- Wait for the failures
4  Move quickly into the vacuum after a failure
5  Develop new unchallenged application areas
6  5% of all real system software sucks
    -- Don't worry: ship it and improve it later

Although lessons 4-6 favor a high-level language like Haskell, 
lessons 2-3 are the real deal-killers for my "real-world" employer, 
a leading visual effects company.

Protectors won't stake their reputation on anything less than a sure 
thing (and will claim credit retroactively on success if they can!). 
And most development expands the functionality of an existing code 
base, already written in some language you are expected to continue 
using (less risky), rather than launching bold new initiatives (very 
risky).

Haskell tutorials (and this mailing list) don't prepare you for lesson 
1. The emphasis is understandably on the new/arcane/obscure/clever, 
not on the you-can't-do-this-in-C++/Java/Ruby/...

That's why I use only Haskell at home, and (unfortunately) only C++ at work.

One way to crack the door open is a robust, easy-to-use, actively 
maintained, well-documented FFI, so I can drop Haskell functionality 
into existing applications (e.g. hooking up Qt signals to Haskell 
slots). My experience with greencard (couldn't build with cabal), 
H/Direct (abandoned?), and c2hs (bidirectional?) is not encouraging. 
Even marshalling a simple list seemed cumbersome.
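
For the calling-into-Haskell direction, GHC's own FFI does work today; 
here is a minimal sketch (module and function names are my own 
invention) of exporting a Haskell function that marshals a C int array 
into a Haskell list and sums it:

   {-# LANGUAGE ForeignFunctionInterface #-}
   -- SumList.hs -- sketch only: marshal a C int array to a Haskell
   -- list, sum it, and hand the result back to the C caller.
   module SumList where

   import Foreign.C.Types       (CInt)
   import Foreign.Marshal.Array (peekArray)
   import Foreign.Ptr           (Ptr)

   sumList :: Ptr CInt -> CInt -> IO CInt
   sumList arr len = do
     xs <- peekArray (fromIntegral len) arr   -- Ptr CInt -> [CInt]
     return (sum xs)

   foreign export ccall sumList :: Ptr CInt -> CInt -> IO CInt

GHC generates a SumList_stub.h for the C/C++ side to include, so the 
Qt-signals-to-Haskell-slots glue is at least conceivable; but anything 
more structured than scalars and arrays still means writing the 
marshalling by hand, which is exactly my complaint.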

Dan

ls-haskell-developer-2006 at m-e-leypold.de wrote:
> Tomasz Zielonka <tomasz.zielonka at gmail.com> writes:
> 
>> Anyway, don't concentrate on this particular example. All I'm saying is:
>> - sometimes I get efficient programs in Haskell right away (in my case
>>   quite often, but YMMV)
>> - sometimes efficiency doesn't matter
>> I don't think it is contradictory, especially because the two
>> "sometimes" can have a non-empty symmetric difference.
> 
> I'd like to add that speed/efficiency is rather less important in
> "real-world programming" than most people realize. Assume I write a
> program for some clients with a total install base of N machines.
> 
> Assume I don't optimize.
> 
> If the program runs fast enough but could run faster, optimizing is a
> waste of my time: I'd have to charge the licensees / clients more
> money without actually delivering a better experience.
> 
> Assume now the unoptimized program doesn't run fast enough -- it's
> slow enough to seriously affect the user experience and the clients
> start grumbling.
> 
> Now, something has to be done: Either the customers have to upgrade
> their machines (I call that the MS experience ;-) or I have to
> optimize. The latter also incurs a cost which finally has to be
> factored into the licensing / service fees for the program (if not,
> I'd get no revenue from the program and would have to close my shop :-).
> 
> Now let's ask: When is it better to optimize instead of urging the
> customer to upgrade?
> 
> Obviously when
> 
>   Cost(Upgrade) > Cost(Optimisation)
> 
> for the customer. Those costs are
> 
>   Cost(Upgrade)      = Price of more memory or of a new PC that is fast enough.
> 
>   Cost(Optimisation) = Hours_spent * Fee_per_Hour / N
> 
> (remember: N was the number of licensed Nodes).
> 
> So finally for optimisation to be the better decision, we must have:
> 
>   Price_New_PC > Hours_spent * Fee_per_Hour / N
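>
> To make that concrete (the numbers are invented): with Price_New_PC =
> 500 EUR, Fee_per_Hour = 100 EUR, and N = 25 licensed nodes,
> optimisation only wins while
>
>   Hours_spent < Price_New_PC * N / Fee_per_Hour = 500 * 25 / 100 = 125
>
> i.e. anything beyond roughly three working weeks of tuning is better
> spent on new hardware.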
> 
> With hardware prices being as cheap as they are, and hourly rates
> being anything but cheap (thanks to that :-), optimization only pays
> when it is either easy (done quickly) or when you have a large number
> of customers / clients.
> 
> The same of course applies to using C instead of a type-safe and/or
> garbage-collected language: You have to weigh the win against the
> increased cost in development time and debugging. It seldom pays. If
> it does, using the FFI from Haskell and optimizing the hot spots would
> perhaps have paid even better.
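>
> For the hot-spot case that might look like the sketch below (the C
> function and all Haskell names are invented): keep the inner loop in
> C and call it through the FFI.
>
>   {-# LANGUAGE ForeignFunctionInterface #-}
>   -- HotSpot.hs -- sketch: call a C routine for the inner loop.
>   module HotSpot where
>
>   import Foreign.C.Types       (CDouble, CInt)
>   import Foreign.Marshal.Array (withArray)
>   import Foreign.Ptr           (Ptr)
>
>   -- Assumed C side:
>   --   double dot_product(const double *xs, const double *ys, int n);
>   foreign import ccall unsafe "dot_product"
>     c_dotProduct :: Ptr CDouble -> Ptr CDouble -> CInt -> IO CDouble
>
>   -- Marshal two Haskell lists to temporary C arrays and call in.
>   dot :: [Double] -> [Double] -> IO Double
>   dot xs ys =
>     withArray (map realToFrac xs) $ \px ->
>     withArray (map realToFrac ys) $ \py -> do
>       r <- c_dotProduct px py (fromIntegral (min (length xs) (length ys)))
>       return (realToFrac r)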
> 
> Generally users are more likely to like smart programs (that take a
> bit longer to run but do everything right without requiring constant
> interaction) than fast programs that are not so smart. Haskell (FP in
> general) is good for writing smart programs; C isn't.
> 
> The call for efficiency in constant factors ("C is n to k times faster
> than <type safe garbage collected language>") is in my opinion almost
> always pure snake oil. I consider myself a rather experienced and good
> C programmer (but not only C) and have fallen for it often enough. I
> regretted it almost every time.
> 
> Regards -- Markus
> 
> 
> PS: Talking about smart programs: Is there a library anywhere that
>     could be used to implement expert systems in Haskell or to
>     evaluate Horn clauses in Prolog style?
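>
>     For propositional Horn clauses, at least, naive evaluation is
>     only a few lines of Haskell. A forward-chaining sketch (all
>     names invented; a real library would add unification and
>     backtracking):
>
>       -- Horn.hs -- naive forward chaining over propositional Horn
>       -- clauses: (body, hd) encodes  b1 & ... & bn => hd.
>       module Horn where
>
>       import Data.List (nub)
>
>       type Atom   = String
>       type Clause = ([Atom], Atom)
>
>       -- Fire every clause whose body holds until no new facts
>       -- appear (a least fixed point; atoms are finite, so it stops).
>       consequences :: [Clause] -> [Atom] -> [Atom]
>       consequences clauses facts
>         | null new  = facts
>         | otherwise = consequences clauses (facts ++ new)
>         where
>           new = nub [ hd | (body, hd) <- clauses
>                          , all (`elem` facts) body
>                          , hd `notElem` facts ]
>
>       -- consequences [(["rain"],"wet"), (["wet"],"slippery")] ["rain"]
>       --   ==> ["rain","wet","slippery"]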
> 



