[Git][ghc/ghc][wip/extensible-interface-files] 13 commits: hadrian: Use --export-dynamic when linking iserv

Josh Meredith gitlab at gitlab.haskell.org
Fri Apr 10 03:21:57 UTC 2020



Josh Meredith pushed to branch wip/extensible-interface-files at Glasgow Haskell Compiler / GHC


Commits:
ce481361 by Ben Gamari at 2020-04-09T16:17:21-04:00
hadrian: Use --export-dynamic when linking iserv

As noticed in #17962, the make build system currently does this (see
3ce0e0ba) but the change was never ported to Hadrian.

- - - - -
fa66f143 by Ben Gamari at 2020-04-09T16:17:21-04:00
iserv: Don't pass --export-dynamic on FreeBSD

This is definitely a hack but it's probably the best we can do for now.
Hadrian does the right thing here by passing --export-dynamic only to
the linker.

- - - - -
39075176 by Ömer Sinan Ağacan at 2020-04-09T16:18:00-04:00
Fix CNF handling in compacting GC

Fixes #17937

Previously the compacting GC simply ignored CNFs. This is mostly fine: most
CNF objects (see "What about small compacts?" below) don't have outgoing
pointers and are "large" (allocated in large blocks), and large objects are
never moved or compacted.

However, if we do GC *during* sharing-preserving compaction, the CNF will
have a hash table mapping objects that have already been moved into the CNF
to their locations in the CNF, so that sharing can be preserved.

This case is handled in the copying collector, in `scavenge_compact`,
where we evacuate hash table entries and then rehash the table.

Compacting GC ignored this case.

We now visit CNFs in all generations when threading pointers to the
compacted heap, and we thread their hash table keys as well. A visited CNF is
added to the list `nfdata_chain`. After compaction is done, we re-visit the
CNFs in that list and rehash their tables.

The overhead is minimal: the list is static in `Compact.c`, and a link field
is added to the `StgCompactNFData` closure. Programs that don't use CNFs
should not be affected.

To test this, the CNF tests are now also run in a new way, 'compacting_gc',
which simply passes `-c` to the RTS, enabling compacting GC for the oldest
generation. Before this patch the result would be:

    Unexpected failures:
       compact_gc.run          compact_gc [bad exit code (139)] (compacting_gc)
       compact_huge_array.run  compact_huge_array [bad exit code (1)] (compacting_gc)

With this patch all tests pass. I can also pass `-c -DS` without any
failures.
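
For reference, a self-contained program in the spirit of the updated
compact_gc test (its diff is included below) that keeps a sharing-preserving
compaction's hash table live across GCs; run it with `+RTS -c` to exercise
the compacting collector (requires the ghc-compact package):

    import Control.Monad (foldM)
    import qualified Data.Map as Map
    import GHC.Compact (compactWithSharing, getCompact, compactSize)

    -- Repeatedly re-compact a structure with sharing preserved, so that a
    -- CNF with a live hash table exists while GC runs (+RTS -c).
    main :: IO ()
    main = do
      let m = Map.fromList [(x, show x) | x <- [1 .. (10000 :: Int)]]
      c0 <- compactWithSharing m
      print =<< compactSize c0
      c  <- foldM (\c _ -> compactWithSharing (getCompact c)) c0 [1 .. 10 :: Int]
      print (length (show (getCompact c)))
      print =<< compactSize c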

What about small compacts? Small CNFs are still not handled by the
compacting GC. However, so far I have been unable to write a test that
triggers a runtime panic ("update_fwd: unknown/strange object") by allocating
a small CNF in a compacted heap. It's possible that I'm missing something and
that a small CNF cannot actually occur.

NoFib Results:

--------------------------------------------------------------------------------
        Program           Size    Allocs    Instrs     Reads    Writes
--------------------------------------------------------------------------------
             CS          +0.1%      0.0%      0.0%     +0.0%     +0.0%
            CSD          +0.1%      0.0%      0.0%      0.0%      0.0%
             FS          +0.1%      0.0%      0.0%      0.0%      0.0%
              S          +0.1%      0.0%      0.0%      0.0%      0.0%
             VS          +0.1%      0.0%      0.0%      0.0%      0.0%
            VSD          +0.1%      0.0%     +0.0%     +0.0%     -0.0%
            VSM          +0.1%      0.0%     +0.0%     -0.0%      0.0%
           anna          +0.0%      0.0%     -0.0%     -0.0%     -0.0%
           ansi          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
           atom          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         awards          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         banner          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
     bernouilli          +0.1%      0.0%      0.0%     -0.0%     +0.0%
   binary-trees          +0.1%      0.0%     -0.0%     -0.0%      0.0%
          boyer          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         boyer2          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
           bspt          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
      cacheprof          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
       calendar          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       cichelli          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
        circsim          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       clausify          +0.1%      0.0%     -0.0%     +0.0%     +0.0%
  comp_lab_zift          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       compress          +0.1%      0.0%     +0.0%     +0.0%      0.0%
      compress2          +0.1%      0.0%     -0.0%      0.0%      0.0%
    constraints          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
   cryptarithm1          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
   cryptarithm2          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
            cse          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
   digits-of-e1          +0.1%      0.0%     +0.0%     -0.0%     -0.0%
   digits-of-e2          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
         dom-lt          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
          eliza          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
          event          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
    exact-reals          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         exp3_8          +0.1%      0.0%     +0.0%     -0.0%      0.0%
         expert          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
 fannkuch-redux          +0.1%      0.0%     -0.0%      0.0%      0.0%
          fasta          +0.1%      0.0%     -0.0%     +0.0%     +0.0%
            fem          +0.1%      0.0%     -0.0%     +0.0%      0.0%
            fft          +0.1%      0.0%     -0.0%     +0.0%     +0.0%
           fft2          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       fibheaps          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
           fish          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
          fluid          +0.0%      0.0%     +0.0%     +0.0%     +0.0%
         fulsom          +0.1%      0.0%     -0.0%     +0.0%      0.0%
         gamteb          +0.1%      0.0%     +0.0%     +0.0%      0.0%
            gcd          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
    gen_regexps          +0.1%      0.0%     -0.0%     +0.0%      0.0%
         genfft          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
             gg          +0.1%      0.0%      0.0%     +0.0%     +0.0%
           grep          +0.1%      0.0%     -0.0%     +0.0%     +0.0%
         hidden          +0.1%      0.0%     +0.0%     -0.0%      0.0%
            hpg          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
            ida          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
          infer          +0.1%      0.0%     +0.0%      0.0%     -0.0%
        integer          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
      integrate          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
   k-nucleotide          +0.1%      0.0%     +0.0%     +0.0%      0.0%
          kahan          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
        knights          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
         lambda          +0.1%      0.0%     +0.0%     +0.0%     -0.0%
     last-piece          +0.1%      0.0%     +0.0%      0.0%      0.0%
           lcss          +0.1%      0.0%     +0.0%     +0.0%      0.0%
           life          +0.1%      0.0%     -0.0%     +0.0%     +0.0%
           lift          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         linear          +0.1%      0.0%     -0.0%     +0.0%      0.0%
      listcompr          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       listcopy          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       maillist          +0.1%      0.0%     +0.0%     -0.0%     -0.0%
         mandel          +0.1%      0.0%     +0.0%     +0.0%      0.0%
        mandel2          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
           mate          +0.1%      0.0%     +0.0%      0.0%     +0.0%
        minimax          +0.1%      0.0%     -0.0%      0.0%     -0.0%
        mkhprog          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
     multiplier          +0.1%      0.0%     +0.0%      0.0%      0.0%
         n-body          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       nucleic2          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
           para          +0.1%      0.0%      0.0%     +0.0%     +0.0%
      paraffins          +0.1%      0.0%     +0.0%     -0.0%      0.0%
         parser          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
        parstof          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
            pic          +0.1%      0.0%     -0.0%     -0.0%      0.0%
       pidigits          +0.1%      0.0%     +0.0%     -0.0%     -0.0%
          power          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         pretty          +0.1%      0.0%     -0.0%     -0.0%     -0.1%
         primes          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
      primetest          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
         prolog          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
         puzzle          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
         queens          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
        reptile          +0.1%      0.0%     -0.0%     -0.0%     +0.0%
reverse-complem          +0.1%      0.0%     +0.0%      0.0%     -0.0%
        rewrite          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
           rfib          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
            rsa          +0.1%      0.0%     -0.0%     +0.0%     -0.0%
            scc          +0.1%      0.0%     -0.0%     -0.0%     -0.1%
          sched          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
            scs          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         simple          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
          solid          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
        sorting          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
  spectral-norm          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
         sphere          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
         symalg          +0.1%      0.0%     -0.0%     -0.0%     -0.0%
            tak          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
      transform          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
       treejoin          +0.1%      0.0%     +0.0%     -0.0%     -0.0%
      typecheck          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
        veritas          +0.0%      0.0%     +0.0%     +0.0%     +0.0%
           wang          +0.1%      0.0%      0.0%     +0.0%     +0.0%
      wave4main          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
   wheel-sieve1          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
   wheel-sieve2          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
           x2n1          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
--------------------------------------------------------------------------------
            Min          +0.0%      0.0%     -0.0%     -0.0%     -0.1%
            Max          +0.1%      0.0%     +0.0%     +0.0%     +0.0%
 Geometric Mean          +0.1%     -0.0%     -0.0%     -0.0%     -0.0%

Bumping numbers of nonsensical perf tests:

Metric Increase:
    T12150
    T12234
    T12425
    T13035
    T5837
    T6048

It's simply not possible for this patch to increase allocations, and I've
wasted enough time on these tests in the past (see #17686). I think these
tests should not be perf tests, but for now I'll bump the numbers.

- - - - -
dce50062 by Sylvain Henry at 2020-04-09T16:18:44-04:00
Rts: show errno on failure (#18033)

- - - - -
045139f4 by Hécate at 2020-04-09T23:10:44-04:00
Add an example to liftIO and explain its purpose

- - - - -
101fab6e by Sebastian Graf at 2020-04-09T23:11:21-04:00
Special case `isConstraintKindCon` on `AlgTyCon`

Previously, the `tyConUnique` record selector would unfold into a huge case
expression that was inlined at all call sites, such as the `INLINE`-annotated
`coreView` (see #18026). `constraintKindTyConKey` only occurs as the `Unique`
of an `AlgTyCon` anyway, so we can make the code a lot more compact, but we
have to move it to GHC.Core.TyCon.
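
For reference, the shape of the change (the old definition in GHC.Core.Type
versus the new one in GHC.Core.TyCon, as in the hunks below):

    -- Old (GHC.Core.Type): goes through the tyConUnique record selector,
    -- whose unfolding is a case over every constructor of TyCon.
    isConstraintKindCon :: TyCon -> Bool
    isConstraintKindCon tc = tyConUnique tc == constraintKindTyConKey

    -- New (GHC.Core.TyCon): match on AlgTyCon directly, the only constructor
    -- whose Unique can be constraintKindTyConKey.
    isConstraintKindCon :: TyCon -> Bool
    isConstraintKindCon AlgTyCon { tyConUnique = u } = u == constraintKindTyConKey
    isConstraintKindCon _                            = False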

Metric Decrease:
    T12150
    T12234

- - - - -
f5212dfc by Sebastian Graf at 2020-04-09T23:11:57-04:00
DmdAnal: No need to attach a StrictSig to DataCon workers

In GHC.Types.Id.Make we were giving a strictness signature to every data
constructor worker Id, even though we weren't looking at it in demand
analysis anyway. We used to use its CPR info, but that has its own CPR
signature now.

`Note [Data-con worker strictness]` then felt very out of place, so I
moved it to GHC.Core.DataCon.
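
The point of the Note, in a minimal (compilable) example:

    -- If the *worker* MkT were recorded as strict, the simplifier would
    -- treat y as "sure to be evaluated" in f and drop the case, losing the
    -- eval of x.  The strictness lives in the *wrapper* $WMkT, whose
    -- unfolding does the eval before calling the worker.
    data T = MkT !(Int, Int)

    f :: (Int, Int) -> T
    f x = case x of y -> MkT y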

- - - - -
75a185dc by Sylvain Henry at 2020-04-09T23:12:37-04:00
Hadrian: fix --summary

- - - - -
20536990 by Josh Meredith at 2020-04-09T23:21:50-04:00
Implement extensible interface files
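
The user-facing API (documented in the extending_ghc.rst change below) can be
used along these lines from a plugin or GHC API client; this is only a
sketch, and the "my-tool/ids" field name and [String] payload are made-up
examples:

    import GHC.Driver.Types ( ModIface, readIfaceField, writeIfaceField
                            , deleteIfaceField )

    -- Store a payload under a field name of our choosing; any type with an
    -- instance of GHC's Binary class can be used as the payload.
    storeIds :: [String] -> ModIface -> IO ModIface
    storeIds ids iface = writeIfaceField "my-tool/ids" ids iface

    -- Read it back; Nothing if the interface carries no such field.
    loadIds :: ModIface -> IO (Maybe [String])
    loadIds iface = readIfaceField "my-tool/ids" iface

    -- Remove the field again.
    dropIds :: ModIface -> ModIface
    dropIds = deleteIfaceField "my-tool/ids"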

- - - - -
4b7c99e3 by Josh Meredith at 2020-04-09T23:21:50-04:00
Change expected stdout for hi file Docs tests

- - - - -
b3a7a09a by Josh Meredith at 2020-04-09T23:21:50-04:00
Add comment subtitle section for BinData

- - - - -
a2ef8a67 by Josh Meredith at 2020-04-09T23:21:50-04:00
Add some discussion about extensible interfaces to extending_ghc.rst

- - - - -
0142093c by Josh Meredith at 2020-04-09T23:21:50-04:00
Link to the extensible interface files wiki page from extending_ghc.rst

- - - - -


30 changed files:

- compiler/GHC/Core/DataCon.hs
- compiler/GHC/Core/Op/DmdAnal.hs
- compiler/GHC/Core/Op/Simplify.hs
- compiler/GHC/Core/TyCon.hs
- compiler/GHC/Core/Type.hs
- compiler/GHC/Driver/Types.hs
- compiler/GHC/Iface/Binary.hs
- compiler/GHC/Iface/Load.hs
- compiler/GHC/Iface/Make.hs
- compiler/GHC/Types/Demand.hs
- compiler/GHC/Types/Id/Make.hs
- compiler/prelude/TysWiredIn.hs
- compiler/utils/Binary.hs
- docs/users_guide/extending_ghc.rst
- hadrian/src/Settings/Builders/RunTest.hs
- hadrian/src/Settings/Packages.hs
- includes/rts/storage/Closures.h
- libraries/base/Control/Monad/IO/Class.hs
- libraries/ghc-compact/tests/all.T
- libraries/ghc-compact/tests/compact_gc.hs
- rts/Hash.c
- rts/Hash.h
- rts/StgMiscClosures.cmm
- rts/posix/itimer/Pthread.c
- rts/sm/CNF.c
- rts/sm/Compact.c
- testsuite/config/ghc
- testsuite/tests/showIface/DocsInHiFile0.stdout
- testsuite/tests/showIface/DocsInHiFile1.stdout
- utils/iserv/ghc.mk


Changes:

=====================================
compiler/GHC/Core/DataCon.hs
=====================================
@@ -614,7 +614,7 @@ data DataConRep
                                  -- and *including* all evidence args
 
         , dcr_stricts :: [StrictnessMark]  -- 1-1 with dcr_arg_tys
-                -- See also Note [Data-con worker strictness] in GHC.Types.Id.Make
+                -- See also Note [Data-con worker strictness]
 
         , dcr_bangs :: [HsImplBang]  -- The actual decisions made (including failures)
                                      -- about the original arguments; 1-1 with orig_arg_tys
@@ -715,8 +715,26 @@ filterEqSpec eq_spec
 instance Outputable EqSpec where
   ppr (EqSpec tv ty) = ppr (tv, ty)
 
-{- Note [Bangs on data constructor arguments]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+{- Note [Data-con worker strictness]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Notice that we do *not* say the worker Id is strict even if the data
+constructor is declared strict
+     e.g.    data T = MkT !(Int,Int)
+Why?  Because the *wrapper* $WMkT is strict (and its unfolding has case
+expressions that do the evals) but the *worker* MkT itself is not. If we
+pretend it is strict then when we see
+     case x of y -> MkT y
+the simplifier thinks that y is "sure to be evaluated" (because the worker MkT
+is strict) and drops the case.  No, the workerId MkT is not strict.
+
+However, the worker does have StrictnessMarks.  When the simplifier sees a
+pattern
+     case e of MkT x -> ...
+it uses the dataConRepStrictness of MkT to mark x as evaluated; but that's
+fine... dataConRepStrictness comes from the data con not from the worker Id.
+
+Note [Bangs on data constructor arguments]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 Consider
   data T = MkT !Int {-# UNPACK #-} !Int Bool
 


=====================================
compiler/GHC/Core/Op/DmdAnal.hs
=====================================
@@ -454,7 +454,7 @@ dmdTransform :: AnalEnv         -- The strictness environment
 
 dmdTransform env var dmd
   | isDataConWorkId var                          -- Data constructor
-  = dmdTransformDataConSig (idArity var) (idStrictness var) dmd
+  = dmdTransformDataConSig (idArity var) dmd
 
   | gopt Opt_DmdTxDictSel (ae_dflags env),
     Just _ <- isClassOpId_maybe var -- Dictionary component selector


=====================================
compiler/GHC/Core/Op/Simplify.hs
=====================================
@@ -2758,7 +2758,7 @@ a case pattern.  This is *important*.  Consider
 
 We really must record that b is already evaluated so that we don't
 go and re-evaluate it when constructing the result.
-See Note [Data-con worker strictness] in GHC.Types.Id.Make
+See Note [Data-con worker strictness] in GHC.Core.DataCon
 
 NB: simplLamBndrs preserves this eval info
 


=====================================
compiler/GHC/Core/TyCon.hs
=====================================
@@ -45,7 +45,7 @@ module GHC.Core.TyCon(
         noTcTyConScopedTyVars,
 
         -- ** Predicates on TyCons
-        isAlgTyCon, isVanillaAlgTyCon,
+        isAlgTyCon, isVanillaAlgTyCon, isConstraintKindCon,
         isClassTyCon, isFamInstTyCon,
         isFunTyCon,
         isPrimTyCon,
@@ -1868,6 +1868,15 @@ isVanillaAlgTyCon :: TyCon -> Bool
 isVanillaAlgTyCon (AlgTyCon { algTcParent = VanillaAlgTyCon _ }) = True
 isVanillaAlgTyCon _                                              = False
 
+-- | Returns @True@ for the 'TyCon' of the 'Constraint' kind.
+isConstraintKindCon :: TyCon -> Bool
+-- NB: We intentionally match on AlgTyCon, because 'constraintKindTyCon' is
+-- always an AlgTyCon (see 'pcTyCon' in TysWiredIn) and the record selector
+-- for 'tyConUnique' would generate unreachable code for every other data
+-- constructor of TyCon (see #18026).
+isConstraintKindCon AlgTyCon { tyConUnique = u } = u == constraintKindTyConKey
+isConstraintKindCon _                            = False
+
 isDataTyCon :: TyCon -> Bool
 -- ^ Returns @True@ for data types that are /definitely/ represented by
 -- heap-allocated constructors.  These are scrutinised by Core-level


=====================================
compiler/GHC/Core/Type.hs
=====================================
@@ -2924,9 +2924,6 @@ distinct uniques, they are treated as equal at all times except
 during type inference.
 -}
 
-isConstraintKindCon :: TyCon -> Bool
-isConstraintKindCon tc = tyConUnique tc == constraintKindTyConKey
-
 -- | Tests whether the given kind (which should look like @TYPE x@)
 -- is something other than a constructor tree (that is, constructors at every node).
 -- E.g.  True of   TYPE k, TYPE (F Int)


=====================================
compiler/GHC/Driver/Types.hs
=====================================
@@ -147,7 +147,14 @@ module GHC.Driver.Types (
 
         -- * COMPLETE signature
         CompleteMatch(..), CompleteMatchMap,
-        mkCompleteMatchMap, extendCompleteMatchMap
+        mkCompleteMatchMap, extendCompleteMatchMap,
+
+        -- * Exstensible Iface fields
+        ExtensibleFields(..), FieldName,
+        emptyExtensibleFields,
+        readField, readIfaceField, readIfaceFieldWith,
+        writeField, writeIfaceField, writeIfaceFieldWith,
+        deleteField, deleteIfaceField,
     ) where
 
 #include "HsVersions.h"
@@ -215,8 +222,10 @@ import GHC.Serialized   ( Serialized )
 import qualified GHC.LanguageExtensions as LangExt
 
 import Foreign
-import Control.Monad    ( guard, liftM, ap )
+import Control.Monad    ( guard, liftM, ap, forM, forM_, replicateM )
 import Data.IORef
+import Data.Map         ( Map )
+import qualified Data.Map as Map
 import Data.Time
 import Exception
 import System.FilePath
@@ -1090,9 +1099,17 @@ data ModIface_ (phase :: ModIfacePhase)
         mi_arg_docs :: ArgDocMap,
                 -- ^ Docs on arguments.
 
-        mi_final_exts :: !(IfaceBackendExts phase)
+        mi_final_exts :: !(IfaceBackendExts phase),
                 -- ^ Either `()` or `ModIfaceBackend` for
                 -- a fully instantiated interface.
+
+        mi_ext_fields :: ExtensibleFields
+                -- ^ Additional optional fields, where the Map key represents
+                -- the field name, resulting in a (size, serialized data) pair.
+                -- Because the data is intended to be serialized through the
+                -- internal `Binary` class (increasing compatibility with types
+                -- using `Name` and `FastString`, such as HIE), this format is
+                -- chosen over `ByteString`s.
      }
 
 -- | Old-style accessor for whether or not the ModIface came from an hs-boot
@@ -1164,6 +1181,9 @@ instance Binary ModIface where
                  mi_doc_hdr   = doc_hdr,
                  mi_decl_docs = decl_docs,
                  mi_arg_docs  = arg_docs,
+                 mi_ext_fields = _ext_fields, -- Don't `put_` this in the instance so we
+                                              -- can deal with it's pointer in the header
+                                              -- when we write the actual file
                  mi_final_exts = ModIfaceBackend {
                    mi_iface_hash = iface_hash,
                    mi_mod_hash = mod_hash,
@@ -1264,6 +1284,8 @@ instance Binary ModIface where
                  mi_doc_hdr     = doc_hdr,
                  mi_decl_docs   = decl_docs,
                  mi_arg_docs    = arg_docs,
+                 mi_ext_fields  = emptyExtensibleFields, -- placeholder because this is dealt
+                                                         -- with specially when the file is read
                  mi_final_exts = ModIfaceBackend {
                    mi_iface_hash = iface_hash,
                    mi_mod_hash = mod_hash,
@@ -1307,7 +1329,9 @@ emptyPartialModIface mod
                mi_doc_hdr     = Nothing,
                mi_decl_docs   = emptyDeclDocMap,
                mi_arg_docs    = emptyArgDocMap,
-               mi_final_exts        = () }
+               mi_final_exts  = (),
+               mi_ext_fields  = emptyExtensibleFields
+             }
 
 emptyFullModIface :: Module -> ModIface
 emptyFullModIface mod =
@@ -3279,7 +3303,105 @@ phaseForeignLanguage phase = case phase of
 -- avoid major space leaks.
 instance (NFData (IfaceBackendExts (phase :: ModIfacePhase)), NFData (IfaceDeclExts (phase :: ModIfacePhase))) => NFData (ModIface_ phase) where
   rnf (ModIface f1 f2 f3 f4 f5 f6 f7 f8 f9 f10 f11 f12
-                f13 f14 f15 f16 f17 f18 f19 f20 f21 f22 f23) =
+                f13 f14 f15 f16 f17 f18 f19 f20 f21 f22 f23 f24) =
     rnf f1 `seq` rnf f2 `seq` f3 `seq` f4 `seq` f5 `seq` f6 `seq` rnf f7 `seq` f8 `seq`
     f9 `seq` rnf f10 `seq` rnf f11 `seq` f12 `seq` rnf f13 `seq` rnf f14 `seq` rnf f15 `seq`
     rnf f16 `seq` f17 `seq` rnf f18 `seq` rnf f19 `seq` f20 `seq` f21 `seq` f22 `seq` rnf f23
+    `seq` rnf f24
+
+{-
+************************************************************************
+*                                                                      *
+\subsection{Extensible Iface Fields}
+*                                                                      *
+************************************************************************
+-}
+
+type FieldName = String
+
+newtype ExtensibleFields = ExtensibleFields { getExtensibleFields :: (Map FieldName BinData) }
+
+instance Binary ExtensibleFields where
+  put_ bh (ExtensibleFields fs) = do
+    put_ bh (Map.size fs :: Int)
+
+    -- Put the names of each field, and reserve a space
+    -- for a payload pointer after each name:
+    header_entries <- forM (Map.toList fs) $ \(name, dat) -> do
+      put_ bh name
+      field_p_p <- tellBin bh
+      put_ bh field_p_p
+      return (field_p_p, dat)
+
+    -- Now put the payloads and use the reserved space
+    -- to point to the start of each payload:
+    forM_ header_entries $ \(field_p_p, dat) -> do
+      field_p <- tellBin bh
+      putAt bh field_p_p field_p
+      seekBin bh field_p
+      put_ bh dat
+
+  get bh = do
+    n <- get bh :: IO Int
+
+    -- Get the names and field pointers:
+    header_entries <- replicateM n $ do
+      (,) <$> get bh <*> get bh
+
+    -- Seek to and get each field's payload:
+    fields <- forM header_entries $ \(name, field_p) -> do
+      seekBin bh field_p
+      dat <- get bh
+      return (name, dat)
+
+    return . ExtensibleFields . Map.fromList $ fields
+
+instance NFData ExtensibleFields where
+  rnf (ExtensibleFields fs) = rnf fs
+
+emptyExtensibleFields :: ExtensibleFields
+emptyExtensibleFields = ExtensibleFields Map.empty
+
+--------------------------------------------------------------------------------
+-- | Reading
+
+readIfaceField :: Binary a => FieldName -> ModIface -> IO (Maybe a)
+readIfaceField name = readIfaceFieldWith name get
+
+readField :: Binary a => FieldName -> ExtensibleFields -> IO (Maybe a)
+readField name = readFieldWith name get
+
+readIfaceFieldWith :: FieldName -> (BinHandle -> IO a) -> ModIface -> IO (Maybe a)
+readIfaceFieldWith name read iface = readFieldWith name read (mi_ext_fields iface)
+
+readFieldWith :: FieldName -> (BinHandle -> IO a) -> ExtensibleFields -> IO (Maybe a)
+readFieldWith name read fields = sequence $ ((read =<<) . dataHandle) <$>
+  Map.lookup name (getExtensibleFields fields)
+
+--------------------------------------------------------------------------------
+-- | Writing
+
+writeIfaceField :: Binary a => FieldName -> a -> ModIface -> IO ModIface
+writeIfaceField name x = writeIfaceFieldWith name (`put_` x)
+
+writeField :: Binary a => FieldName -> a -> ExtensibleFields -> IO ExtensibleFields
+writeField name x = writeFieldWith name (`put_` x)
+
+writeIfaceFieldWith :: FieldName -> (BinHandle -> IO ()) -> ModIface -> IO ModIface
+writeIfaceFieldWith name write iface = do
+  fields <- writeFieldWith name write (mi_ext_fields iface)
+  return iface{ mi_ext_fields = fields }
+
+writeFieldWith :: FieldName -> (BinHandle -> IO ()) -> ExtensibleFields -> IO ExtensibleFields
+writeFieldWith name write fields = do
+  bh <- openBinMem (1024 * 1024)
+  write bh
+  --
+  bd <- handleData bh
+  return $ ExtensibleFields (Map.insert name bd $ getExtensibleFields fields)
+
+deleteField :: FieldName -> ExtensibleFields -> ExtensibleFields
+deleteField name (ExtensibleFields fs) = ExtensibleFields $ Map.delete name fs
+
+deleteIfaceField :: FieldName -> ModIface -> ModIface
+deleteIfaceField name iface = iface { mi_ext_fields = deleteField name (mi_ext_fields iface) }


=====================================
compiler/GHC/Iface/Binary.hs
=====================================
@@ -148,7 +148,15 @@ readBinIface_ dflags checkHiWay traceBinIFaceReading hi_path ncu = do
     wantedGot "Way" way_descr check_way ppr
     when (checkHiWay == CheckHiWay) $
         errorOnMismatch "mismatched interface file ways" way_descr check_way
-    getWithUserData ncu bh
+
+    extFields_p <- get bh
+
+    mod_iface <- getWithUserData ncu bh
+
+    seekBin bh extFields_p
+    extFields <- get bh
+
+    return mod_iface{mi_ext_fields = extFields}
 
 
 -- | This performs a get action after reading the dictionary and symbol
@@ -200,8 +208,16 @@ writeBinIface dflags hi_path mod_iface = do
     let way_descr = getWayDescr dflags
     put_  bh way_descr
 
+    extFields_p_p <- tellBin bh
+    put_ bh extFields_p_p
 
     putWithUserData (debugTraceMsg dflags 3) bh mod_iface
+
+    extFields_p <- tellBin bh
+    putAt bh extFields_p_p extFields_p
+    seekBin bh extFields_p
+    put_ bh (mi_ext_fields mod_iface)
+
     -- And send the result to the file
     writeBinMem bh hi_path
 


=====================================
compiler/GHC/Iface/Load.hs
=====================================
@@ -48,6 +48,7 @@ import GHC.Driver.Types
 import GHC.Types.Basic hiding (SuccessFlag(..))
 import GHC.Tc.Utils.Monad
 
+import Binary   ( BinData(..) )
 import Constants
 import PrelNames
 import PrelInfo
@@ -83,6 +84,7 @@ import GHC.Driver.Plugins
 import Control.Monad
 import Control.Exception
 import Data.IORef
+import Data.Map ( toList )
 import System.FilePath
 import System.Directory
 
@@ -1159,6 +1161,7 @@ pprModIface iface at ModIface{ mi_final_exts = exts }
         , text "module header:" $$ nest 2 (ppr (mi_doc_hdr iface))
         , text "declaration docs:" $$ nest 2 (ppr (mi_decl_docs iface))
         , text "arg docs:" $$ nest 2 (ppr (mi_arg_docs iface))
+        , text "extensible fields:" $$ nest 2 (pprExtensibleFields (mi_ext_fields iface))
         ]
   where
     pp_hsc_src HsBootFile = text "[boot]"
@@ -1248,6 +1251,11 @@ pprIfaceAnnotation :: IfaceAnnotation -> SDoc
 pprIfaceAnnotation (IfaceAnnotation { ifAnnotatedTarget = target, ifAnnotatedValue = serialized })
   = ppr target <+> text "annotated by" <+> ppr serialized
 
+pprExtensibleFields :: ExtensibleFields -> SDoc
+pprExtensibleFields (ExtensibleFields fs) = vcat . map pprField $ toList fs
+  where
+    pprField (name, (BinData size _data)) = text name <+> text "-" <+> ppr size <+> text "bytes"
+
 {-
 *********************************************************
 *                                                       *


=====================================
compiler/GHC/Iface/Make.hs
=====================================
@@ -268,7 +268,8 @@ mkIface_ hsc_env
           mi_doc_hdr     = doc_hdr,
           mi_decl_docs   = decl_docs,
           mi_arg_docs    = arg_docs,
-          mi_final_exts  = () }
+          mi_final_exts  = (),
+          mi_ext_fields  = emptyExtensibleFields }
   where
      cmp_rule     = comparing ifRuleName
      -- Compare these lexicographically by OccName, *not* by unique,


=====================================
compiler/GHC/Types/Demand.hs
=====================================
@@ -1666,17 +1666,15 @@ dmdTransformSig (StrictSig dmd_ty@(DmdType _ arg_ds _)) cd
   = postProcessUnsat (peelManyCalls (length arg_ds) cd) dmd_ty
     -- see Note [Demands from unsaturated function calls]
 
-dmdTransformDataConSig :: Arity -> StrictSig -> CleanDemand -> DmdType
+dmdTransformDataConSig :: Arity -> CleanDemand -> DmdType
 -- Same as dmdTransformSig but for a data constructor (worker),
 -- which has a special kind of demand transformer.
 -- If the constructor is saturated, we feed the demand on
 -- the result into the constructor arguments.
-dmdTransformDataConSig arity (StrictSig (DmdType _ _ con_res))
-                             (JD { sd = str, ud = abs })
+dmdTransformDataConSig arity (JD { sd = str, ud = abs })
   | Just str_dmds <- go_str arity str
   , Just abs_dmds <- go_abs arity abs
-  = DmdType emptyDmdEnv (mkJointDmds str_dmds abs_dmds) con_res
-                -- Must remember whether it's a product, hence con_res, not TopRes
+  = DmdType emptyDmdEnv (mkJointDmds str_dmds abs_dmds) topDiv
 
   | otherwise   -- Not saturated
   = nopDmdType


=====================================
compiler/GHC/Types/Id/Make.hs
=====================================
@@ -506,10 +506,9 @@ mkDataConWorkId wkr_name data_con
     tycon  = dataConTyCon data_con  -- The representation TyCon
     wkr_ty = dataConRepType data_con
 
-        ----------- Workers for data types --------------
+    ----------- Workers for data types --------------
     alg_wkr_info = noCafIdInfo
                    `setArityInfo`          wkr_arity
-                   `setStrictnessInfo`     wkr_sig
                    `setCprInfo`            mkCprSig wkr_arity (dataConCPR data_con)
                    `setUnfoldingInfo`      evaldUnfolding  -- Record that it's evaluated,
                                                            -- even if arity = 0
@@ -518,27 +517,7 @@ mkDataConWorkId wkr_name data_con
                      -- setNeverLevPoly
 
     wkr_arity = dataConRepArity data_con
-    wkr_sig   = mkClosedStrictSig (replicate wkr_arity topDmd) topDiv
-        --      Note [Data-con worker strictness]
-        -- Notice that we do *not* say the worker Id is strict
-        -- even if the data constructor is declared strict
-        --      e.g.    data T = MkT !(Int,Int)
-        -- Why?  Because the *wrapper* $WMkT is strict (and its unfolding has
-        -- case expressions that do the evals) but the *worker* MkT itself is
-        --  not. If we pretend it is strict then when we see
-        --      case x of y -> MkT y
-        -- the simplifier thinks that y is "sure to be evaluated" (because
-        -- the worker MkT is strict) and drops the case.  No, the workerId
-        -- MkT is not strict.
-        --
-        -- However, the worker does have StrictnessMarks.  When the simplifier
-        -- sees a pattern
-        --      case e of MkT x -> ...
-        -- it uses the dataConRepStrictness of MkT to mark x as evaluated;
-        -- but that's fine... dataConRepStrictness comes from the data con
-        -- not from the worker Id.
-
-        ----------- Workers for newtypes --------------
+    ----------- Workers for newtypes --------------
     univ_tvs = dataConUnivTyVars data_con
     arg_tys  = dataConRepArgTys  data_con  -- Should be same as dataConOrigArgTys
     nt_work_info = noCafIdInfo          -- The NoCaf-ness is set by noCafIdInfo


=====================================
compiler/prelude/TysWiredIn.hs
=====================================
@@ -639,6 +639,7 @@ typeNatKind    = mkTyConTy typeNatKindCon
 typeSymbolKind = mkTyConTy typeSymbolKindCon
 
 constraintKindTyCon :: TyCon
+-- 'TyCon.isConstraintKindCon' assumes that this is an AlgTyCon!
 constraintKindTyCon = pcTyCon constraintKindTyConName Nothing [] []
 
 liftedTypeKind, typeToTypeKind, constraintKind :: Kind


=====================================
compiler/utils/Binary.hs
=====================================
@@ -27,6 +27,8 @@ module Binary
     {-type-}  BinHandle,
     SymbolTable, Dictionary,
 
+   BinData(..), dataHandle, handleData,
+
    openBinMem,
 --   closeBin,
 
@@ -73,6 +75,7 @@ import Fingerprint
 import GHC.Types.Basic
 import GHC.Types.SrcLoc
 
+import Control.DeepSeq
 import Foreign
 import Data.Array
 import Data.ByteString (ByteString)
@@ -95,6 +98,44 @@ import GHC.Serialized
 
 type BinArray = ForeignPtr Word8
 
+
+
+---------------------------------------------------------------
+-- BinData
+---------------------------------------------------------------
+
+data BinData = BinData Int BinArray
+
+instance NFData BinData where
+  rnf (BinData sz _) = rnf sz
+
+instance Binary BinData where
+  put_ bh (BinData sz dat) = do
+    put_ bh sz
+    putPrim bh sz $ \dest ->
+      withForeignPtr dat $ \orig ->
+        copyBytes dest orig sz
+  --
+  get bh = do
+    sz <- get bh
+    dat <- mallocForeignPtrBytes sz
+    getPrim bh sz $ \orig ->
+      withForeignPtr dat $ \dest ->
+        copyBytes dest orig sz
+    return (BinData sz dat)
+
+dataHandle :: BinData -> IO BinHandle
+dataHandle (BinData size bin) = do
+  ixr <- newFastMutInt
+  szr <- newFastMutInt
+  writeFastMutInt ixr 0
+  writeFastMutInt szr size
+  binr <- newIORef bin
+  return (BinMem noUserData ixr szr binr)
+
+handleData :: BinHandle -> IO BinData
+handleData (BinMem _ ixr _ binr) = BinData <$> readFastMutInt ixr <*> readIORef binr
+
 ---------------------------------------------------------------
 -- BinHandle
 ---------------------------------------------------------------


=====================================
docs/users_guide/extending_ghc.rst
=====================================
@@ -749,6 +749,18 @@ NOT be invoked with your own modules.
 In the ``ModIface`` datatype you can find lots of useful information, including
 the exported definitions and type class instances.
 
+The ``ModIface`` datatype also contains facilities for extending it with extra
+data, stored in a ``Map`` of serialised fields, indexed by field names and using
+GHC's internal ``Binary`` class. The interface to work with these fields is:
+
+::
+
+    readIfaceField :: Binary a => FieldName -> ModIface -> IO (Maybe a)
+    writeIfaceField :: Binary a => FieldName -> a -> ModIface -> IO ModIface
+    deleteIfaceField :: FieldName -> ModIface -> ModIface
+
+To read an interface file from an external tool without linking to GHC, the format
+is described at `Extensible Interface Files<https://gitlab.haskell.org/ghc/ghc/wikis/Extensible-Interface-Files>`_.
 
 Source plugin example
 ^^^^^^^^^^^^^^^^^^^^^


=====================================
hadrian/src/Settings/Builders/RunTest.hs
=====================================
@@ -165,7 +165,7 @@ getTestArgs = do
         brokenTestArgs = concat [ ["--broken-test", t] | t <- brokenTests args ]
         speedArg     = ["-e", "config.speed=" ++ setTestSpeed (testSpeed args)]
         summaryArg   = case testSummary args of
-                           Just filepath -> Just $ "--summary-file " ++ show filepath
+                           Just filepath -> Just $ "--summary-file=" ++ filepath
                            Nothing -> Just $ "--summary-file=testsuite_summary.txt"
         junitArg     = case testJUnit args of
                            Just filepath -> Just $ "--junit=" ++ filepath


=====================================
hadrian/src/Settings/Packages.hs
=====================================
@@ -122,6 +122,14 @@ packageArgs = do
           [ notStage0 ? builder (Cabal Flags) ? arg "ghci"
           , cross ? stage0 ? builder (Cabal Flags) ? arg "ghci" ]
 
+        --------------------------------- iserv --------------------------------
+        -- Add -Wl,--export-dynamic enables GHCi to load dynamic objects that
+        -- refer to the RTS.  This is harmless if you don't use it (adds a bit
+        -- of overhead to startup and increases the binary sizes) but if you
+        -- need it there's no alternative.
+        , package iserv ? mconcat
+          [ builder (Ghc LinkHs) ? arg "-optl-Wl,--export-dynamic" ]
+
         -------------------------------- haddock -------------------------------
         , package haddock ?
           builder (Cabal Flags) ? arg "in-ghc-tree"


=====================================
includes/rts/storage/Closures.h
=====================================
@@ -486,4 +486,7 @@ typedef struct StgCompactNFData_ {
     StgClosure *result;
       // Used temporarily to store the result of compaction.  Doesn't need to be
       // a GC root.
+    struct StgCompactNFData_ *link;
+      // Used by compacting GC for linking CNFs with threaded hash tables. See
+      // Note [CNFs in compacting GC] in Compact.c for details.
 } StgCompactNFData;


=====================================
libraries/base/Control/Monad/IO/Class.hs
=====================================
@@ -30,8 +30,42 @@ module Control.Monad.IO.Class (
 
 class (Monad m) => MonadIO m where
     -- | Lift a computation from the 'IO' monad.
+    -- This allows us to run IO computations in any monadic stack, so long as it supports these kinds of operations
+    -- (i.e. 'IO' is the base monad for the stack).
+    --
+    -- === __Example__
+    --
+    --
+    -- > import Control.Monad.Trans.State -- from the "transformers" library
+    -- >
+    -- > printState :: Show s => StateT s IO ()
+    -- > printState = do
+    -- >   state <- get
+    -- >   liftIO $ print state
+    --
+    --
+    -- Had we omitted @'liftIO'@, we would have ended up with this error:
+    --
+    -- > • Couldn't match type ‘IO’ with ‘StateT s IO’
+    -- >  Expected type: StateT s IO ()
+    -- >    Actual type: IO ()
+    --
+    -- The important part here is the mismatch between @StateT s IO ()@ and @'IO' ()@.
+    --
+    -- Luckily, we know of a function that takes an @'IO' a@ and returns an @(m a)@: @'liftIO'@,
+    -- enabling us to run the program and see the expected results:
+    --
+    -- @
+    -- > evalStateT printState "hello"
+    -- "hello"
+    --
+    -- > evalStateT printState 3
+    -- 3
+    -- @
+    --
     liftIO :: IO a -> m a
 
 -- | @since 4.9.0.0
 instance MonadIO IO where
     liftIO = id
+


=====================================
libraries/ghc-compact/tests/all.T
=====================================
@@ -1,4 +1,4 @@
-setTestOpts(extra_ways(['sanity']))
+setTestOpts(extra_ways(['sanity', 'compacting_gc']))
 
 test('compact_simple', normal, compile_and_run, [''])
 test('compact_loop', normal, compile_and_run, [''])


=====================================
libraries/ghc-compact/tests/compact_gc.hs
=====================================
@@ -6,6 +6,8 @@ main = do
   let m = Map.fromList [(x,show x) | x <- [1..(10000::Int)]]
   c <- compactWithSharing m
   print =<< compactSize c
-  c <- foldM (\c _ -> do c <- compactWithSharing (getCompact c); print =<< compactSize c; return c) c [1..10]
+  c <- foldM (\c _ -> do c <- compactWithSharing (getCompact c)
+                         print =<< compactSize c
+                         return c) c [1..10]
   print (length (show (getCompact c)))
   print =<< compactSize c


=====================================
rts/Hash.c
=====================================
@@ -444,17 +444,13 @@ freeHashTable(HashTable *table, void (*freeDataFun)(void *) )
 void
 mapHashTable(HashTable *table, void *data, MapHashFn fn)
 {
-    long segment;
-    long index;
-    HashList *hl;
-
     /* The last bucket with something in it is table->max + table->split - 1 */
-    segment = (table->max + table->split - 1) / HSEGSIZE;
-    index = (table->max + table->split - 1) % HSEGSIZE;
+    long segment = (table->max + table->split - 1) / HSEGSIZE;
+    long index = (table->max + table->split - 1) % HSEGSIZE;
 
     while (segment >= 0) {
         while (index >= 0) {
-            for (hl = table->dir[segment][index]; hl != NULL; hl = hl->next) {
+            for (HashList *hl = table->dir[segment][index]; hl != NULL; hl = hl->next) {
                 fn(data, hl->key, hl->data);
             }
             index--;
@@ -464,6 +460,25 @@ mapHashTable(HashTable *table, void *data, MapHashFn fn)
     }
 }
 
+void
+mapHashTableKeys(HashTable *table, void *data, MapHashFnKeys fn)
+{
+    /* The last bucket with something in it is table->max + table->split - 1 */
+    long segment = (table->max + table->split - 1) / HSEGSIZE;
+    long index = (table->max + table->split - 1) % HSEGSIZE;
+
+    while (segment >= 0) {
+        while (index >= 0) {
+            for (HashList *hl = table->dir[segment][index]; hl != NULL; hl = hl->next) {
+                fn(data, &hl->key, hl->data);
+            }
+            index--;
+        }
+        segment--;
+        index = HSEGSIZE - 1;
+    }
+}
+
 /* -----------------------------------------------------------------------------
  * When we initialize a hash table, we set up the first segment as well,
  * initializing all of the first segment's hash buckets to NULL.


=====================================
rts/Hash.h
=====================================
@@ -34,8 +34,10 @@ int keyCountHashTable (HashTable *table);
 int keysHashTable(HashTable *table, StgWord keys[], int szKeys);
 
 typedef void (*MapHashFn)(void *data, StgWord key, const void *value);
+typedef void (*MapHashFnKeys)(void *data, StgWord *key, const void *value);
 
 void mapHashTable(HashTable *table, void *data, MapHashFn fn);
+void mapHashTableKeys(HashTable *table, void *data, MapHashFnKeys fn);
 
 /* Hash table access where the keys are C strings (the strings are
  * assumed to be allocated by the caller, and mustn't be deallocated


=====================================
rts/StgMiscClosures.cmm
=====================================
@@ -686,11 +686,11 @@ INFO_TABLE_CONSTR(stg_MVAR_TSO_QUEUE,2,0,0,PRIM,"MVAR_TSO_QUEUE","MVAR_TSO_QUEUE
    compaction is in progress and the hash table needs to be scanned by the GC.
    ------------------------------------------------------------------------- */
 
-INFO_TABLE( stg_COMPACT_NFDATA_CLEAN, 0, 8, COMPACT_NFDATA, "COMPACT_NFDATA", "COMPACT_NFDATA")
+INFO_TABLE( stg_COMPACT_NFDATA_CLEAN, 0, 9, COMPACT_NFDATA, "COMPACT_NFDATA", "COMPACT_NFDATA")
     ()
 { foreign "C" barf("COMPACT_NFDATA_CLEAN object (%p) entered!", R1) never returns; }
 
-INFO_TABLE( stg_COMPACT_NFDATA_DIRTY, 0, 8, COMPACT_NFDATA, "COMPACT_NFDATA", "COMPACT_NFDATA")
+INFO_TABLE( stg_COMPACT_NFDATA_DIRTY, 0, 9, COMPACT_NFDATA, "COMPACT_NFDATA", "COMPACT_NFDATA")
     ()
 { foreign "C" barf("COMPACT_NFDATA_DIRTY object (%p) entered!", R1) never returns; }
 


=====================================
rts/posix/itimer/Pthread.c
=====================================
@@ -110,13 +110,13 @@ static void *itimer_thread_func(void *_handle_tick)
 
     timerfd = timerfd_create(CLOCK_MONOTONIC, TFD_CLOEXEC);
     if (timerfd == -1) {
-        barf("timerfd_create");
+        barf("timerfd_create: %s", strerror(errno));
     }
     if (!TFD_CLOEXEC) {
         fcntl(timerfd, F_SETFD, FD_CLOEXEC);
     }
     if (timerfd_settime(timerfd, 0, &it, NULL)) {
-        barf("timerfd_settime");
+        barf("timerfd_settime: %s", strerror(errno));
     }
 #endif
 
@@ -124,7 +124,7 @@ static void *itimer_thread_func(void *_handle_tick)
         if (USE_TIMERFD_FOR_ITIMER) {
             if (read(timerfd, &nticks, sizeof(nticks)) != sizeof(nticks)) {
                 if (errno != EINTR) {
-                    barf("Itimer: read(timerfd) failed");
+                    barf("Itimer: read(timerfd) failed: %s", strerror(errno));
                 }
             }
         } else {
@@ -170,7 +170,7 @@ initTicker (Time interval, TickProc handle_tick)
         pthread_setname_np(thread, "ghc_ticker");
 #endif
     } else {
-        barf("Itimer: Failed to spawn thread");
+        barf("Itimer: Failed to spawn thread: %s", strerror(errno));
     }
 }
 
@@ -204,7 +204,7 @@ exitTicker (bool wait)
     // wait for ticker to terminate if necessary
     if (wait) {
         if (pthread_join(thread, NULL)) {
-            sysErrorBelch("Itimer: Failed to join");
+            sysErrorBelch("Itimer: Failed to join: %s", strerror(errno));
         }
         closeMutex(&mutex);
         closeCondition(&start_cond);


=====================================
rts/sm/CNF.c
=====================================
@@ -381,6 +381,7 @@ compactNew (Capability *cap, StgWord size)
     self->nursery = block;
     self->last = block;
     self->hash = NULL;
+    self->link = NULL;
 
     block->owner = self;
 


=====================================
rts/sm/Compact.c
=====================================
@@ -473,6 +473,67 @@ thread_TSO (StgTSO *tso)
     return (P_)tso + sizeofW(StgTSO);
 }
 
+/* ----------------------------------------------------------------------------
+    Note [CNFs in compacting GC]
+    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+    CNF hash table keys point outside of the CNF so those need to be threaded
+    and updated during compaction. After compaction we need to re-visit those
+    hash tables for re-hashing. The list `nfdata_chain` is used for that
+    purpose. When we thread keys of a CNF we add the CNF to the list. After
+    compacting is done we re-visit the CNFs in the list and re-hash their
+    tables. See also #17937 for more details.
+   ------------------------------------------------------------------------- */
+
+static StgCompactNFData *nfdata_chain = NULL;
+
+static void
+thread_nfdata_hash_key(void *data STG_UNUSED, StgWord *key, const void *value STG_UNUSED)
+{
+    thread_((void *)key);
+}
+
+static void
+add_hash_entry(void *data, StgWord key, const void *value)
+{
+    HashTable *new_hash = (HashTable *)data;
+    insertHashTable(new_hash, key, value);
+}
+
+static void
+rehash_CNFs(void)
+{
+    while (nfdata_chain != NULL) {
+        StgCompactNFData *str = nfdata_chain;
+        nfdata_chain = str->link;
+        str->link = NULL;
+
+        HashTable *new_hash = allocHashTable();
+        mapHashTable(str->hash, (void*)new_hash, add_hash_entry);
+        freeHashTable(str->hash, NULL);
+        str->hash = new_hash;
+    }
+}
+
+static void
+update_fwd_cnf( bdescr *bd )
+{
+    while (bd) {
+        ASSERT(bd->flags & BF_COMPACT);
+        StgCompactNFData *str = ((StgCompactNFDataBlock*)bd->start)->owner;
+
+        // Thread hash table keys. Values won't be moved as those are inside the
+        // CNF, and the CNF is a large object and so won't ever move.
+        if (str->hash) {
+            mapHashTableKeys(str->hash, NULL, thread_nfdata_hash_key);
+            ASSERT(str->link == NULL);
+            str->link = nfdata_chain;
+            nfdata_chain = str;
+        }
+
+        bd = bd->link;
+    }
+}
 
 static void
 update_fwd_large( bdescr *bd )
@@ -489,7 +550,6 @@ update_fwd_large( bdescr *bd )
     switch (info->type) {
 
     case ARR_WORDS:
-    case COMPACT_NFDATA:
       // nothing to follow
       continue;
 
@@ -968,6 +1028,7 @@ compact(StgClosure *static_objects,
             update_fwd(gc_threads[n]->gens[g].part_list);
         }
         update_fwd_large(gen->scavenged_large_objects);
+        update_fwd_cnf(gen->live_compact_objects);
         if (g == RtsFlags.GcFlags.generations-1 && gen->old_blocks != NULL) {
             debugTrace(DEBUG_gc, "update_fwd:  %d (compact)", g);
             update_fwd_compact(gen->old_blocks);
@@ -983,4 +1044,8 @@ compact(StgClosure *static_objects,
                    gen->no, gen->n_old_blocks, blocks);
         gen->n_old_blocks = blocks;
     }
+
+    // 4. Re-hash hash tables of threaded CNFs.
+    // See Note [CNFs in compacting GC] above.
+    rehash_CNFs();
 }


=====================================
testsuite/config/ghc
=====================================
@@ -29,7 +29,9 @@ config.other_ways         = ['prof', 'normal_h',
                              'ext-interp',
                              'nonmoving',
                              'nonmoving_thr',
-                             'nonmoving_thr_ghc']
+                             'nonmoving_thr_ghc',
+                             'compacting_gc',
+                             ]
 
 if ghc_with_native_codegen:
     config.compile_ways.append('optasm')
@@ -105,6 +107,7 @@ config.way_flags = {
     'nonmoving'    : [],
     'nonmoving_thr': ['-threaded'],
     'nonmoving_thr_ghc': ['+RTS', '-xn', '-N2', '-RTS', '-threaded'],
+    'compacting_gc': [],
    }
 
 config.way_rts_flags = {
@@ -146,6 +149,7 @@ config.way_rts_flags = {
     'nonmoving'        : ['-xn'],
     'nonmoving_thr'    : ['-xn', '-N2'],
     'nonmoving_thr_ghc': ['-xn', '-N2'],
+    'compacting_gc': ['-c'],
    }
 
 # Useful classes of ways that can be used with only_ways(), omit_ways() and


=====================================
testsuite/tests/showIface/DocsInHiFile0.stdout
=====================================
@@ -2,3 +2,4 @@ module header:
   Nothing
 declaration docs:
 arg docs:
+extensible fields:


=====================================
testsuite/tests/showIface/DocsInHiFile1.stdout
=====================================
@@ -33,4 +33,4 @@ arg docs:
   p:
     0:
       " An argument"
-
+extensible fields:


=====================================
utils/iserv/ghc.mk
=====================================
@@ -30,8 +30,9 @@ endif
 # refer to the RTS.  This is harmless if you don't use it (adds a bit
 # of overhead to startup and increases the binary sizes) but if you
 # need it there's no alternative.
+# Don't do this on FreeBSD to work around #17962.
 ifeq "$(TargetElf)" "YES"
-ifneq "$(TargetOS_CPP)" "solaris2"
+ifeq "$(findstring $(TargetOS_CPP), solaris2 freebsd)" ""
 # The Solaris linker does not support --export-dynamic option. It also
 # does not need it since it exports all dynamic symbols by default
 utils/iserv_stage2_MORE_HC_OPTS += -optl-Wl,--export-dynamic



View it on GitLab: https://gitlab.haskell.org/ghc/ghc/-/compare/e3f6acc45263b6963a5eb508c7883d81970fb999...0142093c65f35415e80a1e720fddf73f01ad0482


