[Haskell-cafe] Imperative vs. declarative (was: Bool is not...safe?!)
PY
aquagnu at gmail.com
Mon Jul 9 06:09:48 UTC 2018
Hello, Olaf!
08.07.2018 23:56, Olaf Klinke wrote:
> Your own example of factorial is very much declarative in the above sense, because it only declares what the factorial function is, in terms of the relationship between factorial(n) and factorial(n-1). Of course the functional programmer must have a mental model of the runtime's behaviour in mind. (Recursively calling the function, in this case.) But what happens on the lower, imperative level when computing factorial(n) is not relevant for the definition of the function.
My point was that in Haskell we define how to calculate the result from
the arguments, exactly as in C# and with the same pattern matching. But
in Prolog I coded a relation, so Prolog knows how to calculate not only
the factorial from the argument but also the argument from the result,
as if two different evaluations were coded at once. In that case the
classification is obvious. I only showed what I myself studied as a
student :)
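Just to make the Haskell side concrete, here is a minimal sketch
(nothing more than the textbook definition): it only says how to
compute the result from the argument, and says nothing about the
reverse direction:

    -- Textbook factorial: a recipe from argument to result only.
    factorial :: Integer -> Integer
    factorial 0 = 1
    factorial n = n * factorial (n - 1)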
Another interesting question: are XML, HTML, CSS declarative
languages? When I was a student they were called formats, not
languages. Haskell's execution/evaluation is based on the lambda
calculus, classical Prolog on first-order predicate logic, but on what
computational model is XML based? There are lots of XML or CSS parsers
in every language :) So they don't prescribe an evaluation model, only
data.
But here is a counter-example: XML -> DocBook -> PostScript. Is that a
format or a language? :) I think there are currently a lot of hybrid
languages: OOP+FP (F#, C#, OCaml, CL...), FP+LP (Mercury, Curry...).
There are also a lot of logic-programming libraries, for example
yieldProlog for Python :) So there are many cases where it's difficult
to make the right classification. I understand that the classification
becomes more unclear and difficult, that's true; there may be different
ways to classify them.
Olaf, I have another question. You were talking about commutative
monads. I checked, and it is something like this:
    do a <- ma
       b <- mb
       f a b

is equal to:

    do b <- mb
       a <- ma
       f a b
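(A quick sketch of my own to see where this can fail: IO is not a
commutative monad, because swapping the two binds changes the
observable result.)

    import Data.IORef

    -- Swapping the binds of ma and mb changes the result,
    -- so IO is not a commutative monad.
    notCommutative :: IO (Int, Int)
    notCommutative = do
      r <- newIORef (0 :: Int)
      let ma = modifyIORef r (+ 1) >> readIORef r
          mb = readIORef r
      a <- ma
      b <- mb
      pure (a, b)   -- (1, 1); with the two binds swapped we get (1, 0)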
(&&) and (||) are commutative on Bool values, sure. But the question
is: why does the order matter in C/C++, Bash (and others), and why,
even more, is the order fixed by the standard? `e1 && e2` behaves like
`if e1 then e2` (short-circuit evaluation), and there is a lot of code
which relies on this. Why do they implement boolean operations this
way? Order does not matter for +, -, *, etc. in the same languages
(they are commutative). Why do so many languages have non-commutative
boolean operations?
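Even in Haskell the asymmetry is visible in the definition itself: the
Prelude defines (&&) roughly like this, pattern matching only on the
first argument:

    -- Roughly the Prelude definition: the second argument is only
    -- demanded when the first one is True.
    (&&) :: Bool -> Bool -> Bool
    True  && x = x
    False && _ = False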
When I think about it, I find the following example: a Haskell
function is pure, but is that really true? :) In the practical world we
can have two functions, one like `f a b = a + b`, and another `g` that
might be a wavelet transform or the computation of some huge fractal.
There are no side effects (no effects on the external world), but when
you evaluate `f` you cannot observe any effect, while when you evaluate
`g` you can even touch the effect on the CPU case with your fingers (it
will be hot!) :-) So there is a difference between writing `f && g` and
`g && f`. If some code relies on the order of execution and uses `&&`
instead of `if`, then order matters.
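A tiny sketch of what I mean (the names and the busy computation are
just for illustration):

    -- Both predicates are pure, but one of them burns a lot of CPU.
    cheap :: Integer -> Bool
    cheap n = n > 0

    expensive :: Integer -> Bool
    expensive n = odd (sum [1 .. 100000000 + n])

    -- Thanks to short-circuiting, `cheap 0 && expensive 0` returns
    -- False immediately, while `expensive 0 && cheap 0` heats the CPU
    -- before giving the same answer.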
Maybe boolean operations were not implemented commutatively in those
languages because that allows writing "multi-ifs" (a && b && c && d ...)
in a short-circuit way? I never thought about this before :) I remember
that there were orelse and andalso in Basic and OCaml... So it seems
there is a tradition in CS: to have mandatory non-commutative and's/or's
and optionally commutative and's/or's?
===
Best regards, Paul