[GHC] #11501: Building nofib/fibon returns permission denied
GHC
ghc-devs at haskell.org
Fri Dec 23 20:39:03 UTC 2016
#11501: Building nofib/fibon returns permission denied
-------------------------------------+-------------------------------------
Reporter: rem | Owner:
Type: bug | Status: new
Priority: normal | Milestone:
Component: NoFib benchmark | Version: 7.10.3
suite |
Resolution: | Keywords:
Operating System: Linux | Architecture: x86_64
| (amd64)
Type of failure: None/Unknown | Test Case:
Blocked By: | Blocking:
Related Tickets: | Differential Rev(s):
Wiki Page: |
-------------------------------------+-------------------------------------
Comment (by bgamari):
Replying to [comment:19 gracjan]:
> @bgamari: `ghc --make` will only be used to test generated code.
Compilation speed will have its own tests that compile single files,
unless we want to test compilation speed for many modules at once.
>
Alright, that sounds reasonable to me.
> We can also go fully high-level and see whether compiling programs with
the latest compiler and latest package set has suddenly become slower than
with the previous compiler and package set. This is very high level, but
it is what constitutes the perception that 'GHC compiles slower than it
used to'.
>
> > However, I'm not sure that nofib is that place.
>
> This is what `fibon` is, and I'm not sure either whether this is a good
place.
>
Indeed; I've largely written `fibon` off.
> I like gipeda very much; this is more or less what I had in mind. I
would advise using off-the-shelf tools for this purpose, as it is a lot of
work to create one from scratch. There is a relevant industry that is very
concerned with performance and performance variation over time: network
monitoring, and operations in general. They have developed a lot of tools
to collect performance data points and then dig into issues by finding
exceptional values, high variation, correlated causes and effects, and so
on. The tools I like are Grafana with a backend database for the data
points, which could be Elasticsearch or InfluxDB. Together those more or
less form gipeda; well, rather more than gipeda :)
>
Before setting out to write my hack I did a survey of the tools available
and was pretty disappointed in what I found. While there are many tools
which can give you a qualitative sense of the behavior of a small number
of performance metrics over the course of a day or so, I found nothing
which gave me the quantitative insight that I needed.
> Anyway, I was thinking of a data point that would be a tuple: (operating
system, machine arch, ghc version, test name, date of last merged commit,
METRIC), where METRIC is one of the metrics you listed.
>
That is similar to the encoding I use. I call the tuple of `(operating
system, machine arch, ghc version)` a "test environment". Instead of "date
of last merged commit" I use a commit sequence number with respect to a
topologically linearized view of git history.
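To illustrate the encoding above, a commit sequence number can be derived from a topological linearization of the commit graph, so that every commit sorts after all of its ancestors. This is only a sketch: the tiny history below is invented, and the real tooling presumably works over actual git metadata (e.g. `git rev-list --topo-order`).

```python
# Sketch: assign each commit a sequence number via a topological
# linearization of git history (Kahn's algorithm). The parent map
# below is invented for illustration; real code would read git.
from collections import deque

def linearize(parents):
    """parents: {commit: [parent commits]} -> {commit: sequence number}"""
    children = {c: [] for c in parents}
    indegree = {c: len(ps) for c, ps in parents.items()}
    for c, ps in parents.items():
        for p in ps:
            children[p].append(c)
    queue = deque(sorted(c for c, d in indegree.items() if d == 0))
    order, seq = {}, 0
    while queue:
        c = queue.popleft()
        order[c] = seq
        seq += 1
        for child in sorted(children[c]):
            indegree[child] -= 1
            if indegree[child] == 0:
                queue.append(child)
    return order

# A tiny made-up history: root -> a -> b, root -> c, merge m of (b, c).
history = {"root": [], "a": ["root"], "b": ["a"],
           "c": ["root"], "m": ["b", "c"]}
print(linearize(history))
```

The useful property is that a parent always gets a smaller sequence number than any of its descendants, so plotting a metric against this number shows results in (a consistent) commit order even across merges, which a commit date cannot guarantee.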
> Can you share your database of results? I'd like to try importing the
data into InfluxDB, connecting Grafana, and seeing whether this is useful.
>
Of course. It's backed by PostgreSQL. I've uploaded a dump
[[http://home.smart-cactus.org/~ben/ghc_perf.sql|here]]. Note that it's
rather big.
--
Ticket URL: <http://ghc.haskell.org/trac/ghc/ticket/11501#comment:20>
GHC <http://www.haskell.org/ghc/>
The Glasgow Haskell Compiler