John A. De Goes john at n-brain.net
Thu Feb 19 09:24:38 EST 2009

On Feb 14, 2009, at 11:28 PM, wren ng thornton wrote:
> John A. De Goes wrote:
>> On Feb 13, 2009, at 2:11 PM, Jonathan Cast wrote:
>>> The compiler should fail when you tell it two mutually contradictory
>>> things, and only when you tell it two mutually contradictory things.
>> By definition, it's not a contradiction when the symbol is
>> unambiguously typeable. Do you think math textbooks are filled with
>> contradictions when they give '+' a different meaning for vectors
>> than matrices or real numbers???
>
> Yes. Yes, I do.

If you really think you have discovered a contradiction in tens of
thousands of mathematical textbooks, then you should write a paper and
submit it to the AJM.

Now me, I DON'T think you've discovered a contradiction. I don't even
think YOU believe that. Rather, you're fixated on giving each symbol a
unique, precise meaning, which you somehow associate with some notion
of "purity".

But I'd guess that if I looked at all the source code you have ever
written in your entire life, it would not contain as many unique
symbols as it has functions and operators. You probably reuse names
and operators just like the rest of us.

> It is precisely this abuse of notation which makes, for instance,
> [...] material).

Hmmm, I don't find statistics books difficult to read.

> Scalars, vectors, and matrices are fundamentally different here and
> the operations on them should be unambiguous, regardless of context.

It's customary to use a distinct typeface for each domain, so you know
the type of a variable by inspection, and the meaning of the operators
flows from that.

Matrices, for example, are generally denoted in uppercase (sometimes
small caps), in an italic font, and often with the letters 'M' or 'N'
and subscripts. Vectors are usually lowercase and italic, sometimes
with small arrows above them, and usually written with the letters u,
v, and w (and subscripted versions thereof).

With unique domains, reuse of the symbols such as '+' for vector and
matrix addition is unambiguous and perfectly sensible because it
suggests that at some level, the operation reduces to scalar addition
(which is correct).
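This is exactly what a typed language makes mechanical. A minimal
sketch in Haskell (the class `Add`, the operator `.+.`, and the types
`Vec` and `Mat` are my own illustrative names, not anything from this
thread): the types of the operands select the meaning of the shared
addition symbol, and the vector and matrix instances bottom out in
scalar addition, just as argued above.

```haskell
newtype Vec = Vec [Double]   deriving (Eq, Show)
newtype Mat = Mat [[Double]] deriving (Eq, Show)

-- One shared symbol for addition across several domains.
class Add a where
  (.+.) :: a -> a -> a

-- Scalar addition is the base case.
instance Add Double where
  (.+.) = (+)

-- Vector addition is elementwise, i.e. it reduces to scalar addition.
instance Add Vec where
  Vec xs .+. Vec ys = Vec (zipWith (.+.) xs ys)

-- Matrix addition likewise reduces, row by row, to scalar addition.
instance Add Mat where
  Mat xs .+. Mat ys = Mat (zipWith (zipWith (.+.)) xs ys)
```

Given this, `Vec [1,2] .+. Vec [3,4]` and `(1 :: Double) .+. 2` use
the same symbol with no ambiguity: the compiler, like the reader of a
math text, resolves the meaning from the types.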

Compare that to using a unique symbol for every possible operator and
function. That would overload the brain, because you would have to
memorize each symbol and function separately.
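For concreteness, here is what that alternative looks like (the names
`addScalar`, `addVec`, and `addMat` are hypothetical, chosen only to
illustrate the point): one monomorphic name per domain, each of which
the reader must memorize and match to its type by hand.

```haskell
-- Without symbol reuse, every domain gets its own addition name,
-- even though all three are "the same" operation at some level.
addScalar :: Double -> Double -> Double
addScalar = (+)

addVec :: [Double] -> [Double] -> [Double]
addVec = zipWith (+)

addMat :: [[Double]] -> [[Double]] -> [[Double]]
addMat = zipWith (zipWith (+))
```

Three names for one idea, and the count grows with every new domain;
reusing one symbol, disambiguated by type, scales better.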

> For another example, consider matrices vs their transposes. Many
> folks can't be bothered to type a single character to clarify when
> things should be transposed before multiplying.

Now that's just plain sloppiness, and is quite orthogonal to this
discussion.

Regards,

John A. De Goes
N-BRAIN, Inc.
The Evolution of Collaboration

http://www.n-brain.net    |    877-376-2724 x 101
