panic when compiling SHA

Ben Lippmeier benl at ouroborus.net
Tue Jan 7 10:27:10 UTC 2014


On 07/01/2014, at 9:26 , Adam Wick <awick at galois.com> wrote:

>> Not if we just have this one test. I'd be keen to blame excessive use of inline pragmas in the SHA library itself, or excessive optimisation flags. It's not really a bug in GHC until there are two tests that exhibit the same problem.
> 
> The SHA library uses SPECIALIZE, INLINE, and bang patterns in fairly standard ways. There’s nothing too exotic in there, I just basically sprinkled hints in places I thought would be useful, and then backed those up with benchmarking.

Ahh. It's the "sprinkled hints in places I thought would be useful" part that concerns me. If you just add pragmas without understanding their effect on the Core program, it'll bite you further down the line. Did you compare the object code size as well as the wall-clock speedup?
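
For what it's worth, a rough way to check is to dump the simplified Core with and without the pragma and compare sizes. Something along these lines, with a made-up module and round function standing in for the real thing:

    -- Hypothetical module with an invented round function, just to illustrate.
    -- Compile with:  ghc -O2 -ddump-simpl -dsuppress-all Step.hs
    -- and compare the amount of Core (and the size of the .o file) with the
    -- pragma present and with it deleted.
    module Step (step) where

    import Data.Bits (rotate, xor)
    import Data.Word (Word32)

    {-# INLINE step #-}
    step :: Word32 -> Word32 -> Word32
    step a b = (a `rotate` 5) `xor` b
    -- Every call site now gets its own copy of the body, which is where
    -- the Core and object code growth comes from.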


> If GHC simply emitted rotten code in this case, I’d agree: wait for more examples, and put the onus on the developer to make it work better. However, right now, GHC crashes on valid input. Which is a bug. So I’d argue that the ticket should be re-opened. I suppose, alternatively, the documentation on SPECIALIZE, INLINE, and bang patterns could be changed to note that using them is not officially supported.

Sadly, "valid input" isn't a well defined concept in practice. You could write a "valid" 10GB Haskell source file that obeyed the Haskell standard grammar, but I wouldn't expect that to compile either. You could also write small (< 1k) source programs that trigger complexity problems in Hindley-Milner style type inference. You could also use compile-time meta programming (like Template Haskell) to generate intermediate code that is well formed but much too big to compile. The fact that a program obeys a published grammar is not sufficient to expect it to compile with a particular implementation (sorry to say).


> If the problem is pretty fundamental, then perhaps instead of panicking and dying, GHC should instead default back to a worse register allocator. Perhaps it could print a warning when that happens, but that’s optional. That would be an easier way to fix this bug if there are deeper algorithmic problems, or if fixing it for SHA would simply move the failure line a little further down the field. (Obviously this route opens a performance regression on my end, but hey, that’s my problem.)

Adding an INLINE pragma is akin to using compile-time meta-programming. I suspect your meta-programming is more broken than GHC in this case, but I'd be happy to be proven wrong. Right now the panic from the register allocator is all the feedback you've got that something is wrong, and the SHA library is the only one I've seen that causes this problem. See the discussion of "valid input" above.
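
To put it another way, each SPECIALIZE pragma asks GHC to generate and optimise a separate copy of the whole definition, much as a preprocessor would. A rough sketch with an invented worker:

    -- Invented worker, standing in for a big inner loop. Each SPECIALIZE
    -- below makes GHC emit and optimise a full extra copy of 'rounds',
    -- exactly as if the copies had been written (or generated) by hand, so a
    -- handful of pragmas on a large worker multiplies the Core handed to the
    -- code generator.
    module Rounds (rounds) where

    import Data.Bits (Bits, xor)
    import Data.Word (Word32, Word64)

    {-# SPECIALIZE rounds :: Int -> Word32 -> Word32 #-}
    {-# SPECIALIZE rounds :: Int -> Word64 -> Word64 #-}
    rounds :: (Bits a, Num a) => Int -> a -> a
    rounds 0 acc = acc
    rounds n acc = rounds (n - 1) (acc `xor` (acc + 1))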

Ben.
