#ifdef considered harmful
Alastair Reid
alastair at reid-consulting-uk.ltd.uk
Wed Apr 7 02:09:45 EDT 2004
> Will such a preprocessor work on source code like cpp does or on a syntax
> tree? The latter would be better, IMO; the former would probably have
> little or no advantage over cpp.
A bunch of people have answered this already but your comment reminded me of
one of the cpp problems we might want to avoid (by omitting a large chunk of
cpp functionality). One of cpp's flaws is that it has absolutely no notion
of scope, so if you define a macro 'bar', then code like this can do
surprising things:
foo bar = baz
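For instance (the macro here is made up), suppose a header somewhere contains:

    #define bar (-1)

Then the line above expands, silently and without any warning from cpp, to:

    foo (-1) = baz

A perfectly good variable binding has turned into a pattern match against -1.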
What do people use macros for?
1) Aliases and abstraction:
Macros are useful in C because C is quite poor at defining aliases for
existing things: typedefs can't be parameterized, you can't define an
alias for a variable, and creating an alias for a function is expensive
unless you can mark it inline.
Haskell has great ways of creating aliases for things: type synonyms and
function definitions. There's no strong need for more. (A few exceptions
are listed at the end of this mail.)
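To illustrate (Prelude-only code; the names Pair and len are invented):

    type Pair a = (a, a)    -- a parameterized type synonym: no macro needed

    len :: [a] -> Int
    len = length            -- an alias for an existing function, at no cost

In C you would need a macro for the first (typedefs can't be parameterized)
and might worry about call overhead for the second.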
2) Defining symbols to be used in #ifs
This is useful for conditional compilation in both C and Haskell.
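The canonical Haskell use looks something like the sketch below.
__GLASGOW_HASKELL__ really is defined by GHC when it runs cpp, but the two
right-hand sides are placeholders:

    #if __GLASGOW_HASKELL__
    sort = ghcSort        -- placeholder: GHC-specific implementation
    #else
    sort = portableSort   -- placeholder: code for other compilers
    #endif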
So, I'd like to float the idea that the Haskell preprocessor should _only_
support conditional compilation, that macro expansion should only occur in
lines that begin with '#', not in "normal" lines, and that macros should
always evaluate to expressions with simple scalar types (ints, strings, etc.).
Similarly, I'd like to drop token splicing (e.g., foo##bar in ANSI C or
foo/**/bar in K&R C). It's mostly useful for overcoming limitations in C
and is rarely needed when writing #ifs.
These two changes make it possible to replace the usual horrible mix
of macro expansion, parsing and evaluation with a conventional interpreter:
#define assigns values to variables, and
#if evaluates expressions involving those 'variables'.
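Concretely, the language the preprocessor has to understand shrinks to
something like this (DEBUG_LEVEL and TESTING are invented symbols):

    #define DEBUG_LEVEL 2

    #if DEBUG_LEVEL >= 1 && defined(TESTING)
    tracing = True
    #else
    tracing = False
    #endif

The #define is an assignment, the #if condition is evaluated by an ordinary
expression interpreter, and the Haskell lines in between are passed through
or dropped without ever being inspected.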
I'm not sure whether it would still be useful to be able to define
parameterized macros like
#define max(x,y) ...
but I doubt it would cost much to allow it.
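For example, one could still write something like this sketch entirely
within # lines (GHC_AT_LEAST is a name I've just invented; GHC encodes
version x.y as x*100+y in __GLASGOW_HASKELL__):

    #define GHC_AT_LEAST(maj,min) (__GLASGOW_HASKELL__ >= (maj)*100 + (min))

    #if GHC_AT_LEAST(6,2)
    -- code that needs a recent GHC
    #endif

Since the macro only ever appears inside a #if, none of the usual expansion
subtleties arise.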
Advantages:
1) hspp implementation is a lot simpler.
[It's probably simple enough that the preprocessor could
easily be built into Hugs if we wanted. This isn't the primary
motivation for this suggestion but it would be nice.]
2) hspp semantics are easier to understand, so programmers are not caught
out by name capture and all the other problems that cpp can cause.
3) hspp only affects lines that start with # in the first column.
This further lessens the chance of a programmer being caught out.
4) No need to add extra parentheses around every use of a macro argument
and around macro bodies, as in:
#define max(x,y) ((x)>(y)?(x):(y))
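To see why cpp forces the parentheses, expand the naive version by hand;
cpp substitution is purely textual:

    #define max(x,y) x>y?x:y

    max(a,b)+1    expands to    a>b?a:b+1

Because ?: binds more loosely than +, the result parses as a>b ? a : (b+1),
which is wrong whenever a>b holds. With expansion confined to # lines, this
whole class of bug disappears.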
There's only one place I know of where this restriction would have any
significant effect. The FFI libraries contain a lot of very similar
definitions giving Storable instances to all the new FFI types. IIRC, a cpp
macro is used to significantly reduce the amount of code that has to be
written and to make it easier to read/maintain. It would be a shame to lose
this use of cpp but, if this is the only loss, I think it would be worth it.
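From memory, the macro is roughly of the shape sketched below; the name
INSTANCE_STORABLE is invented and the real FFI code differs in detail:

    #define INSTANCE_STORABLE(T) \
    instance Storable T where { \
        sizeOf    (T x) = sizeOf x ; \
        alignment (T x) = alignment x ; \
        peek p          = fmap T (peek (castPtr p)) ; \
        poke p (T x)    = poke (castPtr p) x }

    INSTANCE_STORABLE(CInt)
    INSTANCE_STORABLE(CDouble)

Each use saves several lines of boilerplate, once per FFI type; without some
replacement mechanism those instances would have to be written out by hand.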
--
Alastair Reid www.reid-consulting-uk.ltd.uk
ps Inessential syntax tweak: If we are willing to change cpp syntax instead of
just restricting it, we might require "=" between the lhs and rhs of a macro
definition. This makes parsing of macros a little easier. cpp macros are a
bit funny in that you get very different results if you write a space between
the macro name and the arguments. For example:
#define max(x,y) x>y?x:y
and
#define max (x,y) x>y?x:y
The first is a parameterized macro; the second isn't. With this change, the
following two would be equivalent:
#define max(x,y) = x>y?x:y
and
#define max (x,y) = x>y?x:y