# gcd 0 0 = 0

**Alan Bawden**
Alan@LCS.MIT.EDU

*Sun, 16 Dec 2001 02:34:05 -0500 (EST)*

> From: "Simon Peyton-Jones" <simonpj@microsoft.com>
> Date: Fri, 14 Dec 2001 01:18:56 -0800
> ...
> If someone could write a sentence or two to explain why gcd 0 0 = 0
> (ideally, brief ones I can put in the report by way of explanation),
> I think that might help those of us who have not followed the details
> of the discussion.
Well, Serge and I have both offered variations on the real reason why
mathematicians agree that gcd 0 0 = 0. I prefer my simpler
specialized-for-integers version, but I suspect even that is more
mathematics than you want. At the other extreme, you could follow Knuth
and simply state: "it is convenient to set gcd(0, 0) = 0", but that seems a
bit unconvincing.
If I were in your shoes, I'd simply pass the buck to the authorities by
saying something like: "Mathematicians agree that gcd(0, 0) = 0".
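The "real reason" alluded to above can be stated in one line: the defining property of the gcd is that d divides gcd(a, b) exactly when d divides both a and b. Every integer divides 0, so gcd(0, 0) must be a number that every integer divides, and 0 is the only such number. A small sketch checks this defining property on examples, using Python's `math.gcd` purely for illustration (it happens to follow the same convention):

```python
from math import gcd

def divides(d, n):
    """d | n in the integers; by convention every integer divides 0."""
    return n == 0 or (d != 0 and n % d == 0)

def is_gcd(g, a, b, sample=range(-10, 11)):
    """Check the defining property on a finite sample of candidate divisors:
    d | g  iff  d | a and d | b."""
    return all(divides(d, g) == (divides(d, a) and divides(d, b))
               for d in sample)

assert gcd(0, 0) == 0      # Python agrees with the mathematicians
assert is_gcd(0, 0, 0)     # 0 satisfies the defining property for (0, 0)
assert is_gcd(6, 12, 18)   # sanity check on an ordinary case
```

Under this characterization gcd(0, 0) = 0 is not a convention at all; any other value would fail the defining property, since no nonzero integer is divisible by every integer.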
Incidentally, somebody wrote here that Common Lisp defines `(gcd 0 0)` to
be 0, but as far as I can tell, all the language definition really does is
state that `(gcd)` (with no arguments) is 0, because 0 "is an identity for
this operation" (which is technically false, but never mind); from that it
is natural for the reader to conclude that `(gcd 0 0)` must be 0.
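For the record, the technicality: 0 is an identity for gcd only on the non-negative integers, because gcd(0, n) = |n|, which differs from n when n is negative. Python's `math.gcd` (again used just for illustration; Common Lisp's `gcd` likewise returns a non-negative result) makes the point:

```python
from math import gcd

# gcd(0, n) is |n|, so 0 behaves like an identity only on non-negatives:
assert gcd(0, 7) == 7    # looks like an identity here...
assert gcd(0, -4) == 4   # ...but gcd(0, -4) is 4, not -4
assert gcd(0, 0) == 0    # and the corner case under discussion
```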