Again: Uniques in GHC

p.k.f.holzenspies at utwente.nl
Tue Oct 7 06:32:21 UTC 2014


Dear Joachim,

> Some of this may not be true, but to my knowledge (part of) that
> interface reading code is (or was?) used by haddock when generating
> its .haddock file.

Ah, well, I didn't know this. How does Haddock use this code? If Haddock uses the GHC API to do this, the problem is solved, because we're back at the specific compiler version that generated it. Otherwise... we may be in trouble.

> Why? You can just serialize Uniques always as 64 bit numbers, even on
> 32-bit systems. This way, the data format is the same across
> architectures, with little cost.

Ah, the cost is this: if we start relying on the full 64 bits for uniqueness (which may well happen; there are categories - currently characters - used for only four compile-time-known Uniques, wasting 30 - 8 - 2 = 20 bits), this will break the 32-bit compilers. Arguably, their breakage should cause the change that wastes those Uniques to be rejected in the first place, but that still seems a risk. It is similar to how Uniques are currently 64 bits, but serialised as 30. Alternatively, 32-bit GHC could use 64-bit Uniques, but that is going to give you quite the performance hit (speculating here).
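
To make that concrete, here is a little sketch of what "always serialise as 64 bits" could look like (the type and field names are hypothetical, not GHC's actual representation): the 8-bit tag character goes in the top byte and the numeric payload in the low 56 bits, so the wire format is identical on 32- and 64-bit hosts.

    import Data.Bits (shiftL, shiftR, (.&.), (.|.))
    import Data.Char (chr, ord)
    import Data.Word (Word64)

    -- Hypothetical model of a Unique: an 8-bit tag character plus a
    -- numeric payload, packed into one machine word.
    data Unique = Unique { uniqTag :: Char, uniqNum :: Int }
      deriving (Eq, Show)

    -- Serialise as a fixed 64-bit word regardless of host word size:
    -- tag in the top 8 bits, payload in the low 56.
    toWord64 :: Unique -> Word64
    toWord64 (Unique tag n) =
      (fromIntegral (ord tag) `shiftL` 56)
        .|. (fromIntegral n .&. 0x00FFFFFFFFFFFFFF)

    fromWord64 :: Word64 -> Unique
    fromWord64 w =
      Unique (chr (fromIntegral (w `shiftR` 56)))
             (fromIntegral (w .&. 0x00FFFFFFFFFFFFFF))

The round trip fromWord64 (toWord64 u) == u holds as long as the payload fits in the host's Int, which is exactly the property that breaks down once 32-bit compilers have to consume Uniques that genuinely need more than 32 bits.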

> But that would only work on 64 bit systems, right?

Yes, this approach to a parallel GHC would only work on 64-bit machines. The idea, I guess, is that we're not going to see massive demand for a parallel GHC running on multi-core 32-bit systems. In other words, 32-bit systems wouldn't get a parallel GHC.
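
For what it's worth, here is a sketch of the kind of scheme I have in mind (all names hypothetical): each worker stamps its ID into the high bits of every Unique it allocates, so parallel threads can never collide. With 64 bits there is room for both the stamp and a generous per-thread counter; with 32 bits there isn't.

    import Data.Bits (shiftL, (.|.))
    import Data.IORef (IORef, newIORef, atomicModifyIORef')
    import Data.Word (Word64)

    -- Hypothetical per-thread Unique supply: the worker's ID lives in
    -- the high 8 bits, a local counter in the low 56, so supplies on
    -- different threads can never allocate the same value.
    data UniqSupply = UniqSupply
      { supplyThread  :: Word64       -- worker ID, 0..255
      , supplyCounter :: IORef Word64 -- next local counter value
      }

    newSupply :: Word64 -> IO UniqSupply
    newSupply tid = fmap (UniqSupply tid) (newIORef 0)

    freshUnique :: UniqSupply -> IO Word64
    freshUnique (UniqSupply tid ref) = do
      n <- atomicModifyIORef' ref (\c -> (c + 1, c))
      return ((tid `shiftL` 56) .|. n)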

Regards,
Philip


