[GHC] #11293: Compiler plugins don't work with profiling
GHC
ghc-devs at haskell.org
Tue Nov 29 00:19:01 UTC 2016
-------------------------------------+-------------------------------------
        Reporter:  ezyang            |                 Owner:
            Type:  bug               |                Status:  new
        Priority:  normal            |             Milestone:
       Component:  Profiling         |               Version:  7.11
      Resolution:                    |              Keywords:
Operating System:  Unknown/Multiple  |          Architecture:  Unknown/Multiple
 Type of failure:  None/Unknown      |             Test Case:
      Blocked By:                    |              Blocking:
 Related Tickets:                    |   Differential Rev(s):
       Wiki Page:                    |
-------------------------------------+-------------------------------------
Comment (by ezyang):
A similar problem applies to Template Haskell, although there you can
solve the problem by using `-fexternal-interpreter` to ship the code to a
GHC built with profiling.
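For instance, compiling with something like
`ghc -prof -fexternal-interpreter Foo.hs` (assuming a profiled external
interpreter is available for your GHC installation; `Foo.hs` is just a
placeholder) runs the splices outside the unprofiled compiler process.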
Let's talk about how to fix this.
1. The first design decision we have to make is whether or not we should
maintain a separate package database for profiled libraries. A split is
clearer and more direct: an entry in the traditional package database
indicates `hi`/`o` files, while an entry in the profiling package database
indicates `p_hi`/`p_o` files. It also brings a bit of clarity on the
implementation side, because it is more obvious whether you are doing a
lookup in the host or the target database, and so it is clearer that you
need to do something different in each case. On the other hand, changing
this would cause a lot of churn on the tool side. For now, I think we
should not split it up; if we did, `Packages` would need a bit of
changing.
2. `Packages` knows nothing about profiling versus non-profiling loading;
the key player here is `LoadIface`. Here things get tricky. Right now,
both `loadPluginInterface` and `loadSysInterface` go through the same code
path, `loadInterface` (via `loadInterfaceWithException`). We need to split
these into two distinct code paths. The normal, non-plugin code path
should query and update the HPT and EPS as usual. However, the plugin code
path needs to query its OWN copy of the EPS (and of the HPT, I suppose, if
you want to support loading a plugin from the same package you are
building). The hacky way to do this is to add some extra fields to
`HscEnv` (but it is unclear to me whether you also want to play this trick
for dynamic/non-dynamic, so maybe you want a sub-structure for this
business, so you can be polymorphic in the choice). The right distinction
is probably to have a host HPT/EPS (things that can be loaded into this
GHC, i.e. plugins) and a target HPT/EPS (the thing you are compiling, plus
other interfaces); see the sketch after this list.
3. Adding a new copy of the HPT and EPS has deep implications: all code
that interacts with them needs to be updated to look in the correct
HPT/EPS. Of particular interest is `compiler/main/DynamicLoading.hs`; it
interacts with interface loading via the plugin interfaces, but it gets a
`TyThing` using `lookupTypeHscEnv`. With two copies of the HPT/EPS, it
will need a `lookupHostTypeHscEnv` invocation (sketched below) instead.
Additionally, in `GhcMake` the HPT update must go only to the copy that is
actually being built for. There is a bit of trickery needed here, because
a common pattern is for Cabal to build a package once without profiling
and then once again with profiling; the local interfaces from the
non-profiling pass somehow need to get shoved into the host/plugin HPT if
we are to see them. Perhaps it is best not to do that at first. In any
case, lots of functions need to be looked at, and a decision made for
each.
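To make points 2 and 3 concrete, here is a minimal, self-contained sketch
of the host/target split. It uses toy stand-ins rather than the real
`HscEnv`, HPT and EPS types, and the names `IfaceWorld`, `HscEnv'`,
`lookupTypeHscEnv'`, `lookupHostTypeHscEnv'` and `lookupWorld` are
hypothetical, for illustration only:
{{{
import qualified Data.Map.Strict as Map

-- Toy stand-ins for GHC's Name and TyThing.
type ModName = String
type Name    = String
data TyThing = AnId Name | ATyCon Name
  deriving Show

-- One interface-loading "world": a home package table (HPT) plus an
-- external package state (EPS), as in point 2.
data IfaceWorld = IfaceWorld
  { iwHPT :: Map.Map ModName (Map.Map Name TyThing)
  , iwEPS :: Map.Map Name TyThing
  }

-- The environment carries two worlds: the target (the code being
-- compiled) and the host (plugins loadable into this GHC).
data HscEnv' = HscEnv'
  { hscTargetWorld :: IfaceWorld
  , hscHostWorld   :: IfaceWorld
  }

-- Ordinary lookups (today's lookupTypeHscEnv) consult the target world...
lookupTypeHscEnv' :: HscEnv' -> Name -> Maybe TyThing
lookupTypeHscEnv' env = lookupWorld (hscTargetWorld env)

-- ...while plugin resolution (DynamicLoading) would go through a
-- host-side variant instead, as in point 3.
lookupHostTypeHscEnv' :: HscEnv' -> Name -> Maybe TyThing
lookupHostTypeHscEnv' env = lookupWorld (hscHostWorld env)

-- Check the HPT first, then fall back to the EPS.
lookupWorld :: IfaceWorld -> Name -> Maybe TyThing
lookupWorld (IfaceWorld hpt eps) n =
  case [ t | modEnv <- Map.elems hpt, Just t <- [Map.lookup n modEnv] ] of
    (t:_) -> Just t
    []    -> Map.lookup n eps
}}}
In the real compiler the polymorphism mentioned in point 2 would
presumably live in a sub-structure of `HscEnv` rather than in two
hard-coded fields, but the lookup discipline is the same: the plugin
machinery must only ever consult the host world.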
That's it, I think.
--
Ticket URL: <http://ghc.haskell.org/trac/ghc/ticket/11293#comment:1>