compiling on solaris 9
skaller at users.sourceforge.net
Sat Sep 15 07:58:54 EDT 2007
On Sat, 2007-09-15 at 11:09 +0200, Sven Panne wrote:
> On Friday 14 September 2007 16:33, Robert Andersson wrote:
> > stdint.h seems to be unavailable on solaris 9. Looking at the
> > rts/posix/OSMem.c file we find
> > /* no C99 header stdint.h on OpenBSD? */
> > #if defined(openbsd_HOST_OS)
> > typedef unsigned long my_uintptr_t;
> > #else
> > #include <stdint.h>
> > typedef uintptr_t my_uintptr_t;
> > #endif
> Uh, oh... Using long doesn't look very 64bit-safe (it could be int, long, long
> long, who knows?). IIRC some systems without <stdint.h> provide at least
> <inttypes.h>, so we should perhaps use this as a fallback?
The RIGHT way to do this is rather messy .. but there is only
one right way to do it.
1. Measure the size (and alignment, while you're at it) of all the
integer types (trial compile and execute).
2. Find out which headers are present (trial compilation)
3. If stdint.h is present, detect which typedefs
it provides (trial compilation).
4. For the ones provided, AND size_t, ptrdiff_t, check
their size (and signedness). (trial execution)
5. For the fundamental types, write a set of C++ functions
overloaded on the types, which print which one was chosen.
Yes, this step is mandatory, and it must be done in C++.
6. Test what the typedefs found are actually aliased to
using C++ (there is no other portable way to do this).
7. NOW, define YOUR OWN typedefs and macros to model the C99
types when available, fall back on whatever else you have,
and otherwise pick something.
There is a (usually valid) assumption that the C and C++
compilers agree (and you actually have a C++ variant).
With that caveat, the processes above will work on a hosted
environment (only), to determine the correct bindings for symbols
without ANY knowledge of the underlying OS or C/C++ compiler:
there's no #if rubbish involved based on the OS.
You end up with 'my_int32_t' 'my_intptr_t' etc etc, which
provide the guarantee that
* they alias any related symbol actually provided by the system
* they're always present
* they're well defined even if there are no long long or
long double (or _Complex etc) types
* the exact-width integer types my_(u)int<N>_t are provided for
N = 8, 16, 32, 64 on all machines.
To ensure the N=64 case, an emulation library may be needed.
It is vital that application software DOES NOT define any of the
standard symbols. NEVER NEVER define intptr_t for example,
even if it is undefined.
It's horrible. But the above is utterly essential, and it is the ONLY
approach that actually works.
Prebuilt configurations can of course be used .. the algorithm above
can be used to prepare them when possible.
In a cross-compilation situation, only half the autoconfig
process works (trial compilation, but not trial execution).
A guess like this:
typedef unsigned long my_uintptr_t;
is almost always wrong. Integer sizes are not determined by the
host OS. They're also not determined by the processor. Nor are
they determined by the (generic) kind of compiler.
Such guesses are fine for deciding what to test, but they don't
replace actually doing the tests.
Please *don't* make the mistake of trying to use conditional
compilation in C code based on supplied macros (_WIN32 etc)
to make choices. Ocaml is currently broken on XP64 for exactly that
reason: the compiler works .. but all the C libraries are bugged
due to the wrong assumption that 'long' is the size of a pointer.
[In 64-bit Windows .. it's 'long long'.]
Don't make assumptions .. use type names defined by Haskell
with specific semantics .. then calculate them in the configure step.
John Skaller <skaller at users dot sf dot net>
Felix, successor to C++: http://felix.sf.net
More information about the Glasgow-haskell-users mailing list