[Haskell-cafe] Garbage collecting pointers
mads_lindstroem at yahoo.dk
Fri Mar 26 16:28:11 EDT 2010
For some time I have been thinking about an idea which could limit
Haskell's memory footprint. I don't know if the idea is crazy or clever,
but I would love to hear people's thoughts about it. The short story is,
I propose that the garbage collector should not just reclaim unused
memory, it should also diminish the need for pointers by replacing
nested data structures with larger chunks of consecutive memory. In
other words, I would diminish the need for pointers for arbitrary
recursive data types, just as replacing linked lists with arrays
eliminates the need for pointers.
I will explain my idea with an example, using a data type we all know
and love:
data List a = Cons a (List a) | Nil
Each Cons cell uses two pointers: one for the element and one for the
rest of the list. If we could somehow merge the element and the rest of
the list into consecutive memory, we would be able to eliminate the
pointer to the rest of the list. On 64-bit architectures each merge
would save us 8 bytes of "wasted" memory. If we could merge n elements
together, we would save n*8 bytes of memory.
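As a back-of-the-envelope check, the arithmetic can be sketched like
this (the three-word cell layout is an assumption about a typical
GHC-style heap, not a measured figure):

```haskell
-- Hypothetical per-cell layout on a 64-bit machine: one Cons
-- marker word, one pointer to the element, one pointer to the tail.
wordSize, consCellBytes :: Int
wordSize      = 8            -- bytes per word on a 64-bit machine
consCellBytes = 3 * wordSize -- marker + element ptr + tail ptr

-- Bytes saved by merging n Cons cells into one consecutive chunk:
-- each merged cell no longer needs its 8-byte tail pointer.
bytesSaved :: Int -> Int
bytesSaved n = n * wordSize
```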
64-bit pointers are wasteful in another way, as nobody has anywhere
near 2^64 bytes of memory, and nobody is going to have that much memory
for decades to come. Thus, we could use the 8 most significant bits for
something besides pointing to memory, without losing any significant
ability to address our memory. This would be similar to how GHC uses
some of the least significant bits for pointer tagging.
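The low-bit trick can be sketched with plain word arithmetic (a
simplified model, not GHC's actual representation): heap objects are
word-aligned, so the bottom 3 bits of a pointer are always zero and can
carry a small constructor tag.

```haskell
import Data.Bits (complement, (.&.), (.|.))
import Data.Word (Word64)

-- Word-aligned addresses have their low 3 bits clear, so those bits
-- can hold a small tag (e.g. which constructor the object is).
tagLow :: Word64 -> Word64 -> Word64
tagLow ptr tag = ptr .|. (tag .&. 0x7)

-- Recover the clean address and the tag from a tagged pointer.
untagLow :: Word64 -> (Word64, Word64)
untagLow p = (p .&. complement 0x7, p .&. 0x7)
```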
I suggest using the 8 most significant bits for what could be termed
the length of the pointer. A length of zero would mean that the pointer
points to a Cons cell just like we have them today. A length of one
would mean that one extra element is embedded into the Cons cell. That
is, we would have this block of consecutive memory:

<Cons marker> <pointer to 1st element> <Cons marker> <pointer to 2nd
element> <pointer to rest of list>

We could also call this a chunk of length two. A pointer of length two
would mean two embedded list elements (a chunk of length three). And so
on.
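The length field can be modelled the same way at the other end of the
word (a sketch only; the 8-bit width is the one suggested above, and
the function names are mine):

```haskell
import Data.Bits (shiftL, shiftR, (.&.), (.|.))
import Data.Word (Word64)

-- Pack a chunk length (0..255) into the top 8 bits of a pointer.
withLength :: Word64 -> Word64 -> Word64
withLength ptr len = (len `shiftL` 56) .|. (ptr .&. 0x00FFFFFFFFFFFFFF)

-- Recover the address and the length field.
address :: Word64 -> Word64
address p = p .&. 0x00FFFFFFFFFFFFFF

pointerLength :: Word64 -> Word64
pointerLength p = p `shiftR` 56

-- A pointer of length k addresses a chunk of k+1 consecutive elements.
chunkElems :: Word64 -> Word64
chunkElems p = pointerLength p + 1
```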
By embedding the "length" of the list in the pointer, it becomes
possible for one pointer to point to the beginning of a chunk of n
consecutive elements, while another points into the middle of the same
chunk (with a different length). This would enable sharing without
having to break chunks up into smaller pieces.
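Sharing then falls out of the representation: a second pointer into the
same chunk simply has a higher address and a shorter length. A
self-contained sketch of that arithmetic, where the 16-byte cell size
(marker word plus element pointer) is an assumption for illustration:

```haskell
import Data.Bits (shiftL, shiftR, (.&.), (.|.))
import Data.Word (Word64)

-- Top 8 bits hold the length, the remaining bits the address.
encode :: Word64 -> Word64 -> Word64
encode addr len = (len `shiftL` 56) .|. (addr .&. 0x00FFFFFFFFFFFFFF)

-- A pointer to the i-th element of a shared chunk: advance the
-- address by i cells and shorten the length by i. Both pointers
-- refer to the same block of memory, so nothing is copied.
suffixPointer :: Word64 -> Word64 -> Word64
suffixPointer p i = encode (addr + i * 16) (len - i)
  where
    addr = p .&. 0x00FFFFFFFFFFFFFF
    len  = p `shiftR` 56
```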
How should chunks be created? And if a function creates a lot of
length-one chunks, how are they turned into longer chunks? I imagine
that it would be the job of the garbage collector to merge the
different elements in a list into larger chunks. Of course, some code,
like reading a file, could create chunks directly.
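What such a collector pass would do can be modelled at the value level
(a sketch only: real merging would operate on raw heap words, and the
chunk-size budget here is an arbitrary assumption):

```haskell
-- A list stored as chunks of consecutive elements rather than one
-- cell per element.
data ChunkedList a = Chunk [a] (ChunkedList a) | End
  deriving (Eq, Show)

-- Coalesce adjacent chunks up to a maximum chunk size, as a GC pass
-- over the heap might do with neighbouring Cons cells.
mergeChunks :: Int -> ChunkedList a -> ChunkedList a
mergeChunks _     End             = End
mergeChunks limit (Chunk xs rest) = go xs rest
  where
    go acc (Chunk ys more)
      | length acc + length ys <= limit = go (acc ++ ys) more
    go acc more = Chunk acc (mergeChunks limit more)
```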
I have used lists as an example, but I imagine that GHC could do this
for any recursively defined data type.
I can also see a few, but big, disadvantages. Mainly, my scheme would
complicate dereferencing a pointer, which would make the compiled code
larger, and it might make branch prediction harder. The latter would
only be a problem on pipelined architectures, but popular processors
from Intel and AMD are both heavily pipelined.