64-bit windows version?
Peter Tanski
p.tanski at gmail.com
Thu Jun 21 11:08:40 EDT 2007
On Jun 21, 2007, at 4:16 AM, Simon Marlow wrote:
> Peter Tanski wrote:
>>> skaller wrote:
>>>
>>>> Why do you need mingw? What's wrong with MSVC++?
>> The largest problem is the build system: GHC uses autoconf with
>> custom makefiles.
>
> So autoconf won't work with MSVC++, that is indeed a problem. But
> this doesn't mean we have to stop using Makefiles and GNU make -
> the rest of the build system will work fine, provided it's told
> about the different conventions for names of object files etc. I
> don't see a compelling enough reason to stop using GNU make. The
> build system doesn't even need to invoke CL directly, since we can
> use GHC as a "driver" (isn't this the way we agreed to do it before?).
>
> We use autoconf in a pretty limited way (no automake), so I don't
> think it will be hard to work around, even if we have to just hard-
> code all the configuration results for Windows.
The make system does work well and must be kept for porting GHC to
new POSIX platforms--too many parallel projects (pun intended) rely
on the current system. I have not kept a careful count of
configuration-related bugs, but there are at least a few a month,
even on known platforms, including OS X (a significant user base)
and MinGW. If I could change one feature of the current system, I
would set up a wiki page with the specific build requirements (I
mean location, and each program/library with its function
declarations), and for known systems use autoconf only to determine
what the $(build) system is and to ensure those programs are
available, then jump straight into make, which would pull in
pre-set configuration makefiles for that system.
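Roughly what I have in mind, with purely illustrative file and
variable names (not a proposal for the actual layout):

    # written by the trimmed-down configure step; it records only
    # what the $(build) system is, e.g.
    #   BuildPlatform = i386-unknown-mingw32
    include mk/config.mk

    # hand-maintained presets, one per known platform:
    #   mk/i386-unknown-mingw32.mk, mk/powerpc-apple-darwin.mk, ...
    include mk/$(BuildPlatform).mk

    # each preset simply pins down what configure currently guesses:
    #   CC      = gcc
    #   EXEEXT  = .exe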
I spent a good amount of time writing the replacement library build
system in GNU Make (min. 3.80--the current minimum is 3.79.1) to
blend seamlessly with the current system. It does use a custom
"configure" script written in Python (more consistently portable,
and it leaves no temporary files of any kind in $(srcdir))--John,
that is where I originally used Interscript: to bake configuration
settings into the setup files.

The configure step determines what system it is on and the
relative-path locations of certain requirements if they are not
already available. For testing the processor type and OS support
(when it can't read from something cool like /proc/cpuinfo) it does
build small test programs, but all building is done in the build
directory, which may be located anywhere you want. It then writes
those parameters into configuration files that already contain
other presets for that platform; general guesses may go into the
main GHC autoconf, and I will keep them very simple (new
architectures get the generic C library by default--see the rough
sketch below). I simply can't convince myself that it is better to
use a guess-based system for architectures I already know,
especially when it also makes cross-compiling more complex than
necessary.

For Windows it uses a VS project and calls that from a DOS batch
file (for setup parameters), so you can run it from the command
line.
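To make the preset idea concrete, here is roughly how a baked
platform file and the generic-C default might fit together; every
variable and file name here is made up for illustration:

    # build/config.mk, written once by the Python configure step:
    #   TargetArch = powerpc
    #   TargetOS   = darwin
    include $(BUILDDIR)/config.mk

    # known architectures pick up their tuned sources; anything the
    # build has never heard of falls back to the portable C
    # implementation of the library
    ifeq "$(filter $(TargetArch),i386 x86_64 powerpc sparc)" ""
    ARITH_SRCS := $(wildcard generic/*.c)
    else
    ARITH_SRCS := $(wildcard $(TargetArch)/*.S) $(wildcard generic/*.c)
    endif

The point is that a known platform never goes near the guessing
machinery; only a genuinely new one drops into the generic branch.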
What I hope you would agree on for Windows-GHC is a build that runs
parallel to the autoconf-make system. Of course that would require
some maintenance when things change in the main system but I could
write update scripts for trivial changes; I believe anything more
complex should be carefully checked in any case. VS is troublesome
(its project files are written in XML, but that may be automated).
If you would rather use a Make-like system I could do it in Jam and
then you would add only a few extra Jamfiles to the current system.
As a bonus, either VS or Jam would reduce build times, especially
re-build times, and would probably reduce the number of
configuration bugs we see around here. I would not suggest CMake,
SCons or WAF;
John wisely advised against anything invasive.
Cheers,
Pete