[GHC] #9675: Unreasonable memory usage on large data structures
GHC
ghc-devs at haskell.org
Mon Oct 13 09:02:33 UTC 2014
#9675: Unreasonable memory usage on large data structures
-------------------------------------+-------------------------------------
              Reporter:  Polarina    |             Owner:
                  Type:  bug         |            Status:  new
              Priority:  normal      |         Milestone:
             Component:  Compiler    |           Version:  7.8.3
            Resolution:              |          Keywords:
      Operating System:  Linux       |      Architecture:  x86_64 (amd64)
       Type of failure:              |        Difficulty:  Unknown
  Compile-time performance bug       |        Blocked By:
             Test Case:              |  Related Tickets:
              Blocking:              |
Differential Revisions:              |
-------------------------------------+-------------------------------------
Comment (by nomeata):
Thanks for your input.
The way it breaks optimisation is what I meant by “making some code more
complicated”: with a sparse list of bound variables, it would have to
check which fields the optimisation would like to use and then generate
them on demand. I see that this might make the code very ugly.
The compulsory unfolding you mention would still have the shape of a
huge pattern match, right? So the quadratic behaviour wouldn’t be
eliminated.
A comparison of the heap profiles with and without `-dverbose-core2core`
(using this as a poor man’s deepseq after each phase) shows that there
might be a space leak, as you guessed. (Uploading both graphs.)
@Polarina: A workaround for you might be not to use a data constructor,
but rather a newtype around a `Vector (Int -> Int)` (the inner type is
mostly irrelevant). You would store the functions in it, and your accessor
functions would use `unsafeIndex` and `unsafeCoerce` to get them out
again. You can wrap that highly unsafe code in a module that can be used
in a totally safe way.
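To illustrate, here is a minimal sketch of what such a wrapper module
could look like. The module name `BigRecord`, the two example fields and
their types are made up for illustration; the point is only that the
constructor stays hidden and the indices/types in the accessors match the
smart constructor exactly.

```haskell
-- Hypothetical sketch: many function-typed "fields" stored untyped in a
-- boxed Vector instead of a huge data constructor.  Only this module
-- knows the real types; the exported interface is safe to use.
module BigRecord (BigRecord, mkBigRecord, getFoo, getBar) where

import qualified Data.Vector as V
import Unsafe.Coerce (unsafeCoerce)

-- The inner element type is mostly irrelevant; Int -> Int is just the
-- placeholder type every stored function is coerced to and from.
newtype BigRecord = BigRecord (V.Vector (Int -> Int))

-- Smart constructor: the only place values go in, so it fixes the
-- index/type correspondence relied on by the accessors below.
mkBigRecord :: (Int -> Bool) -> (String -> Int) -> BigRecord
mkBigRecord foo bar = BigRecord $ V.fromList
  [ unsafeCoerce foo   -- index 0 :: Int -> Bool
  , unsafeCoerce bar   -- index 1 :: String -> Int
  ]

-- Accessors coerce back to the original type; this is sound only
-- because index and type here match mkBigRecord exactly.
getFoo :: BigRecord -> (Int -> Bool)
getFoo (BigRecord v) = unsafeCoerce (V.unsafeIndex v 0)

getBar :: BigRecord -> (String -> Int)
getBar (BigRecord v) = unsafeCoerce (V.unsafeIndex v 1)
```

Because the `BigRecord` constructor is not exported, client code can only
go through the smart constructor and the accessors, so the unsafe
coercions stay confined to this one module.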
--
Ticket URL: <http://ghc.haskell.org/trac/ghc/ticket/9675#comment:8>
GHC <http://www.haskell.org/ghc/>
The Glasgow Haskell Compiler