Negation
johndearle at cox.net
Tue Feb 9 08:39:53 EST 2010
Monadic operators are atomic in that they form an atom; binary operators do not. Perhaps I should have used the word unary instead of monadic, hmm. It is best sometimes to never turn back. What is done is done! There is an ambiguity: one is a partial order whereas the other is a total order. Despite the apparent clarity, the question is whether there are mitigating factors.
I do not wish to reveal all the mysteries of the universe in one sitting (in other words, I have no intention of discussing the precise mechanisms involved), but having multiple uses for a symbol complicates the grammar, and the hyphen is badly overloaded. The rules as they are may serve to discourage certain patterns. OK, I'll spell it out. Ambiguity is not a one-way street. In the usual course of things, something might be unambiguous with respect to the compiler. The compiler exhibits what I shall call direction bias, and this is why it appears, in a sense, to be unambiguous. We usually explain this away by saying that though it is unambiguous, it is unclear. That is merely informal speech resulting from a lack of understanding of the nature of the problem.
On occasion, despite the direction bias of the machine, in real-world problems we encounter this ambiguity running in the opposite direction. Typically, we dismiss it as not even being a legitimate instance of ambiguity once we realize that in the conventional direction it is unambiguous. We conclude that we were confused when in fact we were not. Our confusion is our conclusion that we were confused.
So in one sense it is unambiguous, and in another it is ambiguous in a manner that is context sensitive. For example, if you are trying to extend the grammar of the language, you may have to account for the various ways in which hyphens are used; in other words, you have to account for the ambiguities. This has been an area of research for me. As a practical matter it is often possible to account for them if you grok the language and how it was implemented, and have nothing better to do with your time than to work out all the possible implications of a proposed change to the language, which is what all of you are doing. Since this sort of thing only crops up on occasion, we dismiss it as unreal.
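A concrete sketch of the overloading in Haskell itself (my example, not from the original discussion): the same `-` token serves as both binary subtraction and unary negation, and the parser resolves it in one direction only.

```haskell
f :: Int -> Int
f x = x + 1

-- With parentheses, '-' is unary negation: f applied to (-1).
applied :: Int
applied = f (-1)    -- 0

-- Without them, f - 1 parses in the other direction, as binary
-- subtraction between the function f and the literal 1, which is
-- a type error. The parser commits to one reading of the hyphen;
-- that is the "direction bias" in miniature.
main :: IO ()
main = print applied
```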
You (or we) could use tilde for the minus sign, much as Standard ML does. It was a brilliant stroke, and it isn't heresy. It is conceivable that an alternative, albeit inferior, approach to achieving a similar outcome was taken that everyone is now stuck with, but there is more to the story.
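For comparison: Standard ML spells unary negation with a tilde (`~1`), so the two uses of the symbol never collide. In Haskell the collision is conventionally worked around with the `negate` and `subtract` functions, since `-` cannot be used as a one-argument section. A small sketch:

```haskell
-- (- 2) denotes negative two, not "subtract from two", so negation
-- as a function is spelled negate, and "minus two" as a function
-- is spelled subtract 2.
negated :: [Int]
negated = map negate [1, 2, 3]         -- [-1,-2,-3]

shifted :: [Int]
shifted = map (subtract 2) [1, 2, 3]   -- [-1,0,1]

main :: IO ()
main = print (negated, shifted)
```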
Someone gave an example involving modular arithmetic. If negation were meaningless with respect to an operation, that operation could be regarded as more atomic, as in more primitive, than negation: you essentially skip over the expression, concluding that negation cannot meaningfully apply. But negation is meaningful (though not wholly meaningful) with respect to modular arithmetic, and so there is no reason for it to be regarded as more primitive than additive-inverse "negation". There are no type distinctions: an integer is an integer is an integer, though I can see how someone might think of modular arithmetic as the arithmetic of the finite, and therefore as something smaller that fits inside the infinite. Yet the type of the result of modular arithmetic is not a pure integer. It has a more restrictive type, even though the distinction is easily overlooked. The domain and codomain do not form the Cartesian product of the integers; they are bounded by the modulus, and thus form a dependent type.
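The point about negation remaining meaningful can be checked directly: in Z/5, every residue has an additive inverse. A sketch using a helper of my own, not anything from the thread:

```haskell
-- Additive inverse modulo m: the unique residue y in [0, m) with
-- (x + y) `mod` m == 0. Haskell's `mod` already yields a result in
-- [0, m) for positive m, so negate composes cleanly with it.
negMod :: Int -> Int -> Int
negMod m x = negate x `mod` m

main :: IO ()
main = print (negMod 5 3)   -- 2, because (3 + 2) `mod` 5 == 0
```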
Can the degree to which a type is broad or narrow be used to determine the default order of evaluation, known as precedence? There is reason to believe so. Since one type is more restrictive than another, on some occasions the operation will be meaningful and on others meaningless. By way of analogy (and efficiency), more restrictive types should be evaluated first and therefore have a higher precedence than their less restrictive counterparts, even if the type distinctions are invisible to the compiler.
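To make the "more restrictive type" visible to the compiler rather than easily overlooked, one could wrap residues in a newtype with a smart constructor. This is a hypothetical illustration of my own: the modulus is fixed at 7 for simplicity, where a true dependent type would carry it in the type itself.

```haskell
newtype Mod7 = Mod7 Int deriving (Eq, Show)

-- Smart constructor: every Mod7 value lies in [0, 7), so the
-- domain and codomain are bounded by the modulus rather than
-- ranging over the full Cartesian product of the integers.
mkMod7 :: Int -> Mod7
mkMod7 x = Mod7 (x `mod` 7)

addM :: Mod7 -> Mod7 -> Mod7
addM (Mod7 a) (Mod7 b) = mkMod7 (a + b)

negM :: Mod7 -> Mod7
negM (Mod7 a) = mkMod7 (negate a)

main :: IO ()
main = print (negM (mkMod7 3))   -- Mod7 4, since 3 + 4 == 0 (mod 7)
```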
It needs to be appreciated that the Haskell language was created by type theorists who were not necessarily concerned with how they do it in C.
More information about the Haskell-prime mailing list