[GHC] #12492: internal error: allocation of 1048608 bytes too large
GHC
ghc-devs at haskell.org
Mon Aug 15 00:47:42 UTC 2016
#12492: internal error: allocation of 1048608 bytes too large
-------------------------------------+-------------------------------------
Reporter: erikd | Owner:
Type: bug | Status: new
Priority: normal | Milestone: 8.0.2
Component: Runtime | Version: 8.0.1
System |
Keywords: | Operating System: Unknown/Multiple
Architecture: | Type of failure: None/Unknown
Unknown/Multiple |
Test Case: | Blocked By:
Blocking: | Related Tickets:
Differential Rev(s): | Wiki Page:
-------------------------------------+-------------------------------------
I'm getting this error when trying to use the `ghc-datasize` package
from Hackage to calculate the in-memory size of a data structure. The
full error is:
{{{
load-test: internal error: allocation of 1048608 bytes too large (GHC
should have complained at compile-time)
(GHC version 8.0.1 for x86_64-unknown-linux)
Please report this as a GHC bug:
http://www.haskell.org/ghc/reportabug
}}}
However, this message is bogus because this is not a static data structure
known at compile time, but data loaded from disk. Currently, my test
program just loads this complex data structure (about 27 megabytes of CSV
with embedded JSON on disk) into memory and then calls `recursiveSize`
from `GHC.Datasize` on it. I get this same error message with both
ghc-7.10.3 and ghc-8.0.1.
If I remove the call to `recursiveSize` there is no error.
If I `Control.Deepseq.force` the data structures before calling
`recursiveSize` there is no error.
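The ticket does not include the test program itself; the following is a
minimal sketch of the usage pattern described above, assuming the
`ghc-datasize` package's API (`recursiveSize :: a -> IO Word`, exported
from module `GHC.DataSize`) and a placeholder data structure in place of
the reporter's 27 MB CSV/JSON data:
{{{
-- Hypothetical reproduction sketch, not the reporter's actual program.
-- Requires the ghc-datasize and deepseq packages.
import Control.DeepSeq (force)
import Control.Exception (evaluate)
import GHC.DataSize (recursiveSize)

main :: IO ()
main = do
  -- Stand-in for the large structure loaded from disk:
  let rows = replicate 500000 "field1,field2,{\"key\":\"value\"}" :: [String]
  -- Calling recursiveSize on the unforced structure is where the
  -- "allocation of ... bytes too large" RTS error was observed.
  -- Forcing to normal form first reportedly avoids it:
  rows' <- evaluate (force rows)
  sz <- recursiveSize rows'
  print sz
}}}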
So, who is to blame? `ghc-datasize` is just Haskell code that calls public
APIs so I think the blame lies with GHC.
I'll work on getting a small reproducible test case.
--
Ticket URL: <http://ghc.haskell.org/trac/ghc/ticket/12492>