64-bit windows version?
Simon Marlow
simonmarhaskell at gmail.com
Thu Jun 21 11:48:23 EDT 2007
Peter Tanski wrote:
> The make system does work well and must be kept in order to port GHC to
> a new POSIX platform--too many parallel projects (pun intended) work
> with the current system. I have not kept a good count of monthly
> configuration-based bugs, but there are at least a few a month for known
> platforms, including OS X (a significant user base) and MinGW. If I
> could change one feature of the current system, I would set up a wiki
> page with specific build requirements (I mean the location of each
> required program or library, down to the function declarations), and
> for known systems use autoconf only to determine what the $(build)
> system is and to ensure those programs are available, then jump into
> make, which would call pre-set configuration makefiles for that system.
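Just so we're sure we're talking about the same thing: as I read it, the
proposal boils down to a per-platform settings file that make includes up
front, something like this (the file name and variables are mine, purely
illustrative):

# mk/platforms/i386-unknown-mingw32.mk -- hypothetical preset, not a real file
HaveLibGmp      = YES
GmpLibDir       = c:/mingw/lib
HaveLibReadline = NO
SRC_CC_OPTS    += -mno-cygwin

# and in the top-level makefile:
include mk/platforms/$(TARGETPLATFORM).mk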
So you'd hard-wire a bunch of things based on the platform name? That sounds
like entirely the wrong approach to me. It makes the build system too brittle
to changes in its environment: exactly the problem that autoconf was designed to
solve.
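To make that concrete, the difference in make terms is roughly the
following (the variable names and the flag are illustrative, not the
actual contents of mk/config.mk):

# Hard-wired from the platform name: wrong as soon as one mingw
# installation doesn't match the preset.
ifeq "$(TARGETPLATFORM)" "i386-unknown-mingw32"
HaveLibGmp = YES
endif

# Probed: configure tests for the library on the machine at hand and
# substitutes the answer into mk/config.mk (HaveLibGmp = @HaveLibGmp@),
# so the conditional keeps working whatever the environment looks like.
ifeq "$(HaveLibGmp)" "YES"
SRC_HC_OPTS += -DUSE_GMP
endif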
> I spent a good amount of time writing the replacement library build
> system in GNU Make (min. 3.80; the current minimum is 3.79.1) to blend
> seamlessly with the current system. It does use a custom "configure"
> script written in Python (more consistently portable, and no temporary
> files of any kind in $(srcdir))--John, that is where I originally used
> Interscript: to bake configuration settings into the setup files. The
> configuration determines what system it is on and the relative-path
> location of certain requirements if they are not already available. For
> testing the processor type and OS support (when it can't read from
> something cool like /proc/cpuinfo) it does build small programs, but
> all building is done in the build directory, which may be located
> anywhere you want. It then sets those parameters in configuration files
> that already contain other presets for that platform; general guesses
> may go into the main GHC autoconf, and I will keep them very simple
> (new architectures get the generic C library by default). I simply
> can't convince myself that it is better to use a guess-based system for
> architectures I already know, especially when it also makes
> cross-compiling more complex than necessary. For Windows it uses a VS
> project and calls that from a DOS batch file (for setup parameters) so
> you can run it from the command line.
For a start, adding a dependency on Python is something I want to avoid. One
way we try to keep the GHC build system sane is by keeping the external
dependencies to a minimum (yes, I know the testsuite requires Python, but the
build itself doesn't).
However, I admit I don't fully understand the problem you're trying to solve,
not having tried to do this myself. The GHC build system now uses Cabal to
build libraries (actually Cabal + make + a bit of autoconf for some libraries).
Why can't this method work for building libraries on native Windows? We have
to port Cabal to native Windows anyway, and at that point you have a library
build system.
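For what it's worth, the make glue needed per library is then quite small;
something along these lines (the targets, flags and compiler variable are
illustrative, and the recipe lines must be tab-indented):

# Drive Cabal's Setup from make for one library
boot ::
	$(GHC) --make Setup.hs -o setup
	./setup configure --with-compiler=$(GHC) --prefix=$(prefix)

all ::
	./setup build

install ::
	./setup install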
> What I hope you would agree on for Windows-GHC is a build that ran
> parallel to the autoconf-make system.
What I hope is that we don't have to do this :-)
> Of course that would require some
> maintenance when things change in the main system, but I could write
> update scripts for trivial changes; I believe anything more complex
> should be carefully checked in any case. VS is troublesome (its project
> files are written in XML, though that could be automated). If you would
> rather use a Make-like system, I could do it in Jam and then you would
> only need to add a few extra Jamfiles to the current system.
If we were to use something other than GNU make, we should do it wholesale, or
not at all, IMO.
Cheers,
Simon