Benchmarking a lexical scan during the downsweep for the ImplicitQualifiedImport proposal

Tristan Cacqueray tdecacqu at redhat.com
Sun Aug 28 18:54:31 UTC 2022


Hello all,

In the context of https://github.com/ghc-proposals/ghc-proposals/pull/500,
I am looking for a strategy to measure the impact of resolving module
dependencies with a lexical scan during the downsweep phase.
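
The idea is that a qualified name occurring in the module body, e.g.
`Data.Map.lookup`, implies a dependency on `Data.Map`. To give a flavour
of the scan, here is a self-contained toy version (plain word splitting
instead of the real GHC lexer, so comments and strings are not handled;
every name below is made up for this sketch):

  -- Toy version of the scan: find the module prefixes of qualified
  -- names in a source text.
  module ScanQualified (qualifiedModules) where

  import Data.Char (isAlphaNum, isUpper)
  import Data.List (intercalate, nub)

  -- Split a dotted name into components:
  -- "Data.Map.lookup" -> ["Data", "Map", "lookup"].
  components :: String -> [String]
  components s = case break (== '.') s of
    (c, [])       -> [c]
    (c, _ : rest) -> c : components rest

  -- True when a component starts with a capital letter.
  startsUpper :: String -> Bool
  startsUpper (c : _) = isUpper c
  startsUpper _       = False

  -- The module part of a qualified name, if any: the leading run of
  -- capitalised components.  "Data.Map.lookup" -> Just "Data.Map";
  -- for a fully capitalised chain like "Data.Map.Map" the last
  -- component is taken to be the qualified type or constructor
  -- (crude: a bare "Data.Map" in an import line hits this case too).
  modulePrefix :: String -> Maybe String
  modulePrefix w = case span startsUpper (components w) of
    ([], _)                -> Nothing                      -- unqualified
    (mods, _ : _)          -> Just (intercalate "." mods)
    (mods@(_ : _ : _), []) -> Just (intercalate "." (init mods))
    _                      -> Nothing                      -- lone name

  -- Crude tokenizer: keep identifier characters, split on the rest.
  tokensOf :: String -> [String]
  tokensOf = words . map keep
    where
      keep c | isAlphaNum c || c `elem` "._'" = c
             | otherwise                      = ' '

  -- All module names occurring as qualifiers in the given source.
  qualifiedModules :: String -> [String]
  qualifiedModules src =
    nub [ m | w <- tokensOf src, Just m <- [modulePrefix w] ]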

For example, in this branch: https://gitlab.haskell.org/TristanCacqueray/ghc/-/tree/make-lexical-analysis
I've added a tokenizer pass to the GHC.Driver.Make.getPreprocessedImports function.
Using `/usr/bin/time -o /dev/stdout -v _build/stage1/bin/ghc compiler/GHC/Tc/TyCl.hs 2> /dev/null`
I measure:

  LexicalAnalysis found 91 qualified name, and lasted: 150msec
  Elapsed (wall clock) time (h:mm:ss or m:ss): 0:00.72
  Maximum resident set size (kbytes): 142396

Using a clean build (without the tokenizer pass) I get:

  Elapsed (wall clock) time (h:mm:ss or m:ss): 0:00.68
  Maximum resident set size (kbytes): 140012

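For reference, here is roughly how such an inline timing can be done
(GHC.Clock.getMonotonicTimeNSec is in base; the `timed` helper and the
driver below are illustrative sketches, not the code in the branch):

  import GHC.Clock (getMonotonicTimeNSec)
  import Text.Printf (printf)

  -- Hypothetical helper: run an IO action and report its wall-clock
  -- duration, in the spirit of the "lasted: 150msec" trace above.
  timed :: String -> IO a -> IO a
  timed label act = do
    t0 <- getMonotonicTimeNSec
    r  <- act
    t1 <- getMonotonicTimeNSec
    printf "%s lasted: %dmsec\n" label ((t1 - t0) `div` 1000000)
    pure r

  main :: IO ()
  main = do
    -- The file measured above; note that with lazy I/O the file read
    -- is timed together with the scan.
    src <- readFile "compiler/GHC/Tc/TyCl.hs"
    n   <- timed "LexicalAnalysis" (pure $! length (words src))
    printf "found %d words\n" n
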

Now my question is: how would you measure the time and space cost of
such a change? For example, what profiling tool would you recommend,
and how can I use a modified GHC to build GHC itself?

Thanks in advance,
-Tristan