Fwd: Is anything being done to remedy the soul crushing compile times of GHC?
ttuegel at gmail.com
Thu Feb 18 13:27:15 UTC 2016
On Thu, Feb 18, 2016 at 6:43 AM, Herbert Valerio Riedel
<hvriedel at gmail.com> wrote:
> On 2016-02-18 at 13:32:59 +0100, Andrey Mokhov wrote:
>> Interesting! In the new Shake-based build system we also need to
>> automagically generate .hs files using Alex et al. My first
>> implementation was slow but then I realised that it is possible to
>> scan the source tree only once and remember where all .hs/.x/etc files
>> are. This brought down the complexity from quadratic to linear in my
>> case -- maybe this could be reused in cabal too?
> This sounds very risky; are you suggesting to blindly traverse the
> hs-src-dirs *before* you even know what is searched for?
> If so, this will cause problems; we're already seeing problems with this
> in the cabal nix-local-branch, which currently (it's a bug) recursively
> traverses the project folder for computing a source-hash over content
> that would not even make it into the source-dist...
> There must be a better way to solve this...
I think what Andrey meant was, the first time we run the
pre-processors, cache the locations of all the files that need to be
pre-processed. On subsequent runs, we only need to run the
pre-processors on the files in the cache. I've been thinking along these
lines, too. This does break if you change what pre-processor is used
to build a file, but I think that's OK because it happens so rarely.
We would give the user a way to invalidate the cache and re-run the
pre-processors manually, e.g. via a `cabal pre-process` command.
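To illustrate the idea (a rough sketch only, not the actual cabal or Shake
code; the extension list and the preference rule are my own assumptions):
walk the source tree once, index every candidate file by its
extension-stripped path, and answer later "which file provides this module,
and does it need Alex/Happy?" questions with a map lookup instead of
another directory scan. That is one linear pass plus O(log n) lookups,
rather than a fresh scan per module.

```haskell
import qualified Data.Map.Strict as Map
import System.FilePath (takeExtension, dropExtension)

-- Extensions the pre-processing step cares about (illustrative list).
sourceExts :: [String]
sourceExts = [".hs", ".lhs", ".x", ".y"]

-- Build the index in one linear pass over the file list (which would
-- come from a single traversal of the hs-src-dirs).
buildIndex :: [FilePath] -> Map.Map FilePath FilePath
buildIndex paths =
  Map.fromListWith keepPreferred
    [ (dropExtension p, p) | p <- paths, takeExtension p `elem` sourceExts ]
  where
    -- If both Lexer.x and an already-generated Lexer.hs exist,
    -- prefer the .hs (an assumed policy, for illustration).
    keepPreferred new old
      | takeExtension old == ".hs" = old
      | otherwise                  = new

main :: IO ()
main = do
  let idx = buildIndex ["src/Lexer.x", "src/Parser.y", "src/Main.hs", "src/Lexer.hs"]
  print (Map.lookup "src/Lexer" idx)   -- the generated .hs wins
  print (Map.lookup "src/Parser" idx)  -- still needs Happy
```

The cache-invalidation concern above maps onto this directly: the index is
only valid as long as the set of files (and the pre-processor assignment)
does not change, which is why an explicit way to rebuild it is needed.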
More information about the ghc-devs mailing list