Profiling and Data.HashTable
Ketil Malde
ketil+haskell at ii.uib.no
Mon Oct 17 03:07:11 EDT 2005
Jan-Willem Maessen <jmaessen at alum.mit.edu> writes:
> The practical upshot is that, for a hash table with (say) 24
> entries, the GC must scan an additional 1000 pointers and discover
> that each one is [].
Would a smaller default size help? In my case, I just wanted hash
tables for very sparse tables.
> [Curious: what (if anything) is being used to test Data.HashTable?
> I'd be willing to undertake very small amounts of fiddling if I could
> be sure I wasn't slowing down something which mattered.]
I'd be happy to test it (or provide my test code). My program isn't
too beautiful at the moment, but it can be tuned to distribute the
word counts over an arbitrary number of hash tables.
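For context, a word-count workload like the one described might look
roughly like the sketch below, written against the GHC-6.x-era
Data.HashTable API (new takes an equality test and a hash function;
this module was later removed from base, so treat the exact signatures
as assumptions):

```haskell
import qualified Data.HashTable as HT

-- Count word occurrences in an IO hash table (old Data.HashTable API).
countWords :: [String] -> IO (HT.HashTable String Int)
countWords ws = do
  ht <- HT.new (==) HT.hashString
  mapM_ (\w -> do
           mv <- HT.lookup ht w
           case mv of
             Nothing -> HT.insert ht w 1
             Just n  -> HT.update ht w (n + 1) >> return ())
        ws
  return ht

main :: IO ()
main = do
  ht <- countWords (words "a b a c a b")
  HT.toList ht >>= print
```

With many such tables holding only a handful of entries each, the cost
of scanning each table's mostly-empty bucket array during GC is what
dominates, per the discussion above.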
BTW, could one cheat by introducing a write barrier manually in some
way? Perhaps by (unsafely?) thawing and freezing the arrays when they
are modified?
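The thaw/freeze idea above might be sketched like this, using the
unsafe conversions from the array library (in modern GHC these live in
Data.Array.Unsafe; in 2005 they were in Data.Array.Base, so the module
name is an assumption). The hope expressed above is that the freeze
step re-registers the array with the GC, acting as a manual write
barrier:

```haskell
import Data.Array (Array, (!), listArray)
import Data.Array.IO (IOArray)
import Data.Array.MArray (readArray, writeArray)
import Data.Array.Unsafe (unsafeThaw, unsafeFreeze)

-- Hypothetical sketch: "mutate" an immutable array in place by
-- unsafely thawing it, writing one slot, then unsafely freezing it
-- again. Whether the freeze actually triggers a useful write barrier
-- is exactly the open question in this thread.
bumpAt :: Int -> Array Int Int -> IO (Array Int Int)
bumpAt i arr = do
  marr <- unsafeThaw arr :: IO (IOArray Int Int)
  v    <- readArray marr i
  writeArray marr i (v + 1)
  unsafeFreeze marr
```

Note that unsafeThaw/unsafeFreeze are only safe if the original
immutable array is never used again after thawing.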
-k
--
If I haven't seen further, it is by standing in the footprints of giants
More information about the Glasgow-haskell-users
mailing list